Changing the Geometry of Representations: <i>α</i>-Embeddings for NLP Tasks

Word embeddings based on a conditional model are commonly used in Natural Language Processing (NLP) tasks to embed the words of a dictionary in a low-dimensional linear space. Their computation is based on the maximization of the likelihood of a conditional probability distribution for each word of...
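The conditional model mentioned in the abstract can be illustrated with a toy sketch. The following is a hypothetical skip-gram-style example (not the paper's α-embedding method): each word w gets an input vector V[w] and an output vector U[c], the conditional distribution is p(c | w) = softmax(U[c] · V[w]), and the embeddings are fit by gradient ascent on the conditional log-likelihood. The corpus, dimensions, and learning rate are all made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab = ["the", "cat", "sat", "mat"]
# Toy (word, context) index pairs standing in for a corpus of co-occurrences.
pairs = [(1, 2), (2, 1), (1, 3), (3, 1)]

dim = 4
V = rng.normal(scale=0.1, size=(len(vocab), dim))  # input ("word") vectors
U = rng.normal(scale=0.1, size=(len(vocab), dim))  # output ("context") vectors


def log_likelihood(V, U, pairs):
    """Sum of log p(c | w) over the observed pairs, p = softmax(V @ U.T)."""
    scores = V @ U.T  # scores[w, c] = V[w] . U[c]
    log_probs = scores - np.log(np.exp(scores).sum(axis=1, keepdims=True))
    return sum(log_probs[w, c] for w, c in pairs)


lr = 0.1
before = log_likelihood(V, U, pairs)
for _ in range(500):
    scores = V @ U.T
    probs = np.exp(scores)
    probs /= probs.sum(axis=1, keepdims=True)  # p(c | w) for every (w, c)
    dV = np.zeros_like(V)
    dU = np.zeros_like(U)
    for w, c in pairs:
        # Gradient of log p(c|w) w.r.t. V[w]: U[c] minus the model's
        # expected context vector under p(. | w).
        dV[w] += U[c] - probs[w] @ U
        # Gradient w.r.t. U: V[w] on the observed context, minus
        # p(c'|w) * V[w] on every context c'.
        dU[c] += V[w]
        dU -= np.outer(probs[w], V[w])
    V += lr * dV
    U += lr * dU
after = log_likelihood(V, U, pairs)
```

After training, `after > before`: gradient ascent has increased the conditional likelihood of the observed pairs, which is the optimization principle the abstract refers to.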


Bibliographic Details
Main Authors: Riccardo Volpi, Uddhipan Thakur, Luigi Malagò
Format: Article
Language: English
Published: MDPI AG 2021-02-01
Series: Entropy
Online Access: https://www.mdpi.com/1099-4300/23/3/287