Language modeling and bidirectional encoder representations: an overview of key technologies
The article is an essay on the development of natural language processing technologies that formed the basis of BERT (Bidirectional Encoder Representations from Transformers), a language model from Google that shows high results on a whole class of problems associated with the understanding of...
| Main Author: | D. I. Kachkou |
|---|---|
| Format: | Article |
| Language: | Russian |
| Published: | The United Institute of Informatics Problems of the National Academy of Sciences of Belarus, 2021-01-01 |
| Series: | Informatika |
| Subjects: | |
| Online Access: | https://inf.grid.by/jour/article/view/1080 |
Similar Items

- The Power of Selecting Key Blocks with Local Pre-ranking for Long Document Information Retrieval
  by: Chagnon, J., et al.
  Published: (2023)
- Leveraging Pre-Trained Language Model for Summary Generation on Short Text
  by: Shuai Zhao, et al.
  Published: (2020-01-01)
- Zero‐anaphora resolution in Korean based on deep language representation model: BERT
  by: Youngtae Kim, et al.
  Published: (2020-10-01)
- French AXA Insurance Word Embeddings: Effects of Fine-tuning BERT and Camembert on AXA France's data
  by: Zouari, Hend
  Published: (2020)
- The language of proteins: NLP, machine learning & protein sequences
  by: Dan Ofer, et al.
  Published: (2021-01-01)