Language modeling and bidirectional coders representations: an overview of key technologies

The article surveys the development of natural language processing technologies that formed the basis of BERT (Bidirectional Encoder Representations from Transformers), a language model from Google that shows strong results on a whole class of problems associated with natural language understanding. Two key ideas implemented in BERT are transfer learning and the attention mechanism. The model is pre-trained on two problems over a large unlabeled data set and can reuse the language patterns it identifies for efficient training on a specific text processing problem.
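For context on the two pre-training problems mentioned above: in BERT these are masked language modeling (recovering hidden tokens) and next sentence prediction. Below is a minimal sketch of the masked-language-modeling input corruption, following the 15% / 80-10-10 scheme from the BERT paper; the toy whitespace tokenizer and vocabulary are illustrative assumptions, not code from the article.

```python
# Sketch of BERT-style input corruption for masked language modeling (MLM).
# Toy vocabulary and tokenization are assumptions for illustration only.
import random

MASK = "[MASK]"
VOCAB = ["the", "model", "reads", "text", "tokens", "fast"]

def mask_tokens(tokens, mask_prob=0.15, seed=None):
    rng = random.Random(seed)
    corrupted = list(tokens)
    labels = [None] * len(tokens)       # None = position is not predicted
    for i, tok in enumerate(tokens):
        if rng.random() >= mask_prob:
            continue
        labels[i] = tok                 # the model must recover this token
        r = rng.random()
        if r < 0.8:
            corrupted[i] = MASK               # 80%: replace with [MASK]
        elif r < 0.9:
            corrupted[i] = rng.choice(VOCAB)  # 10%: replace with a random token
        # remaining 10%: leave the token unchanged
    return corrupted, labels

print(mask_tokens("the model reads text tokens".split(), mask_prob=0.5, seed=1))
```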

The Transformer architecture is built on the attention mechanism, i.e. it evaluates the relationships between input tokens. In addition, the article notes the strengths and weaknesses of BERT and directions for further improvement of the model.
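The sketch below illustrates this token-to-token evaluation with scaled dot-product self-attention; the NumPy implementation, shapes, and toy data are illustrative assumptions rather than code from the article.

```python
# Minimal sketch of scaled dot-product attention (illustrative).
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q, K, V: (seq_len, d_k) query/key/value matrices from token embeddings.
    Returns the (seq_len, d_k) output and the (seq_len, seq_len) weight matrix."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)     # pairwise token-to-token relevance
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over each row
    return weights @ V, weights

# Toy usage: 4 tokens with 8-dimensional embeddings.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
out, w = scaled_dot_product_attention(x, x, x)  # self-attention: Q = K = V = x
print(w.round(2))   # each row sums to 1: how much each token attends to the others
```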

Bibliographic Details
Main Author: D. I. Kachkou
Format: Article
Language: Russian
Published: The United Institute of Informatics Problems of the National Academy of Sciences of Belarus, 2021-01-01
Series: Informatika, vol. 17, no. 4, pp. 61-72
ISSN: 1816-0301
DOI: 10.37661/1816-0301-2020-17-4-61-72
Subjects: informatics; information technology; language models; natural language processing; attention mechanism; Transformer architecture; BERT model
Online Access: https://inf.grid.by/jour/article/view/1080