About Neural Networks and Writing Definitions

Timothee Mickus, Mathieu Constant, Denis Paperno

Research output: Contribution to journal › Article › Academic › peer-review

Abstract

In this article, we describe the current state of the field of NLP (Natural Language Processing) and detail the applications and trends that are of interest to lexicographers. We begin with a brief overview of how dictionaries have been used in the NLP community, particularly to introduce semantic knowledge in NLP systems. We follow up with a detailed account of one of the most well-known types of NLP semantic representations, namely word embeddings, and some of their limitations—in particular, how they fail to relate words to real-world objects. We then argue that the task of Definition Modeling, which consists of generating dictionary definitions from word embeddings, is well suited to studying these limitations and highlight how current issues in automated evaluation of NLP systems specifically hinder this investigation.
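To make the Definition Modeling task mentioned above concrete, the sketch below shows what "generating a dictionary definition from a word embedding" can look like in code. This is a toy illustration only: the PyTorch GRU decoder, the vocabulary size, and the random data are assumptions made for this example, not the architecture or settings discussed in the article.

# A minimal, illustrative sketch of Definition Modeling: produce a
# definition conditioned on the embedding of the word being defined.
# Hypothetical toy model; not the system described in the article.
import torch
import torch.nn as nn

class ToyDefinitionModel(nn.Module):
    def __init__(self, vocab_size, emb_dim=64, hidden_dim=128):
        super().__init__()
        self.token_emb = nn.Embedding(vocab_size, emb_dim)
        # The headword's embedding initializes the decoder hidden state.
        self.init_state = nn.Linear(emb_dim, hidden_dim)
        self.decoder = nn.GRU(emb_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, word_vec, definition_tokens):
        # word_vec: (batch, emb_dim) embedding of the word to define
        # definition_tokens: (batch, seq_len) gold definition (teacher forcing)
        h0 = torch.tanh(self.init_state(word_vec)).unsqueeze(0)
        dec_in = self.token_emb(definition_tokens)
        hidden, _ = self.decoder(dec_in, h0)
        return self.out(hidden)  # (batch, seq_len, vocab_size) logits

# Toy usage with random tensors, just to show the shapes involved.
vocab_size = 1000
model = ToyDefinitionModel(vocab_size)
word_vec = torch.randn(2, 64)                       # embeddings of 2 headwords
definition = torch.randint(0, vocab_size, (2, 10))  # 10-token definitions
logits = model(word_vec, definition)
print(logits.shape)  # torch.Size([2, 10, 1000])

In a sketch like this, the model is trained to maximize the likelihood of human-written definitions given the headword's embedding; the article's point is that evaluating the quality of such generated definitions automatically remains an open problem.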
Original language: English
Pages (from-to): 95-117
Journal: Dictionaries
Volume: 42
Issue number: 2
DOIs
Publication status: Published - 2021

Keywords

  • Natural Language Processing (NLP)
  • distributional semantics
  • word embeddings
  • Definition Modeling
  • semantic grounding
