Changing the Geometry of Representations: α-Embeddings for NLP Tasks
Word embeddings based on a conditional model are commonly used in Natural Language Processing (NLP) tasks to embed the words of a dictionary in a low-dimensional linear space. Their computation is based on maximizing the likelihood of a conditional probability distribution for each word of...
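The abstract describes the standard setup that these α-embeddings build on: embeddings learned by maximizing the likelihood of a conditional distribution p(context | word). Since the truncated abstract does not specify the exact model, the following is a minimal sketch of that objective assuming a skip-gram-style softmax; the names `U`, `W`, `V`, and `d` are illustrative, not taken from the paper.

```python
# Minimal sketch of a conditional word-embedding objective (skip-gram style).
# Assumption: the conditional model is a softmax over inner products of
# input and output embeddings, as in word2vec; the paper's actual model
# may differ.
import numpy as np

rng = np.random.default_rng(0)
V, d = 1000, 50                          # vocabulary size, embedding dimension
U = rng.normal(scale=0.1, size=(V, d))   # input ("word") embeddings
W = rng.normal(scale=0.1, size=(V, d))   # output ("context") embeddings

def log_likelihood(word, context):
    """log p(context | word) under a softmax conditional model."""
    scores = W @ U[word]                 # inner products with every vocabulary word
    scores = scores - scores.max()       # shift for numerical stability
    log_z = np.log(np.exp(scores).sum()) # log partition function
    return scores[context] - log_z

# Training maximizes the sum of log p(context | word) over observed
# word-context pairs (e.g. by SGD on the negative log-likelihood);
# the learned rows of U are the word embeddings.
print(log_likelihood(word=3, context=17))
```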
Main Authors: Riccardo Volpi, Uddhipan Thakur, Luigi Malagò
Format: Article
Language: English
Published: MDPI AG, 2021-02-01
Series: Entropy
Online Access: https://www.mdpi.com/1099-4300/23/3/287
Similar Items
- On Monotone Embedding in Information Geometry
  by: Jun Zhang
  Published: (2015-06-01)
- A Collection of Swedish Diachronic Word Embedding Models Trained on Historical Newspaper Data
  by: Simon Hengchen, et al.
  Published: (2021-01-01)
- Toroidal Embeddings and Desingularization
  by: NGUYEN, LEON
  Published: (2018)
- Word and Relation Embedding for Sentence Representation
  Published: (2017)
- Boosting Arabic Named-Entity Recognition With Multi-Attention Layer
  by: Mohammed Nadher Abdo Ali, et al.
  Published: (2019-01-01)