Review of Pre-training Techniques for Natural Language Processing
Most published reviews of natural language pre-training technology either cover only neural network pre-training techniques or give traditional pre-training techniques just a brief introduction, which risks artificially severing the development of natural language pre-training from natural language processing as a whole. To avoid this, this paper surveys the course of natural language pre-training in four parts. First, traditional and neural network pre-training techniques are introduced in the order in which pre-training technology evolved; by analyzing and comparing the characteristics of the related techniques, the paper summarizes the development context and trends of natural language processing technology. Second, starting from improvements to BERT (bidirectional encoder representations from transformers), the paper introduces the latest natural language processing models from two aspects and summarizes them in terms of pre-training mechanism, strengths and weaknesses, performance, and related criteria; the main application fields of natural language processing are also presented. Third, the paper explores the challenges facing natural language processing models and the corresponding solutions. Finally, the paper summarizes this work and looks ahead to future directions, helping researchers understand the development of natural language pre-training techniques more comprehensively and offering ideas for the design of new models and pre-training methods.
Main Authors: | CHEN Deguang, MA Jinlin, MA Ziping, ZHOU Jie |
---|---|
Format: | Article |
Language: | zho |
Published: | Journal of Computer Engineering and Applications Beijing Co., Ltd., Science Press, 2021-08-01 |
Series: | Jisuanji kexue yu tansuo |
Subjects: | pre-training techniques; natural language processing; neural network |
Online Access: | http://fcst.ceaj.org/CN/abstract/abstract2823.shtml |
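The review's pivot point, BERT-style pre-training, rests on the masked language modelling objective: tokens are hidden from the model, which must reconstruct them from bidirectional context. The sketch below illustrates that mechanism with a pre-trained checkpoint; it assumes the Hugging Face transformers library and the bert-base-uncased model, and is an illustration of the general technique rather than code from the paper under review.

```python
# Minimal masked-language-modelling sketch with a pre-trained BERT checkpoint.
# Assumes torch and transformers are installed (illustration only).
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

# Hide one token; BERT must recover it from both left and right context,
# which is the "bidirectional" part of the acronym.
text = f"Pre-training learns general {tokenizer.mask_token} representations from raw text."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, seq_len, vocab_size)

# Take the most likely vocabulary item at the masked position.
mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_pos].argmax(dim=-1)
print(tokenizer.decode(predicted_id))  # a plausible fill such as "language"
```

Fine-tuning for a downstream task reuses the same pre-trained encoder with a task-specific head, which is what gives pre-training its transfer value.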
id |
doaj-2b8c4b95ba4d495d9902e4fb7c59d53b |
record_format |
Article |
spelling |
Jisuanji kexue yu tansuo, Vol. 15, No. 8 (2021-08-01), pp. 1359-1389; ISSN 1673-9418; DOI 10.3778/j.issn.1673-9418.2012109. CHEN Deguang, MA Jinlin, MA Ziping, ZHOU Jie. Affiliations: 1. School of Computer Science and Engineering, North Minzu University, Yinchuan 750021, China; 2. School of Mathematics and Information Science, North Minzu University, Yinchuan 750021, China; 3. Key Laboratory for Intelligent Processing of Computer Images and Graphics of National Ethnic Affairs Commission of the PRC, Yinchuan 750021, China. Record id doaj-2b8c4b95ba4d495d9902e4fb7c59d53b, indexed 2021-08-09T08:35:58Z. |
collection |
DOAJ |
language |
zho |
format |
Article |
sources |
DOAJ |
author |
CHEN Deguang, MA Jinlin, MA Ziping, ZHOU Jie |
title |
Review of Pre-training Techniques for Natural Language Processing |
publisher |
Journal of Computer Engineering and Applications Beijing Co., Ltd., Science Press |
series |
Jisuanji kexue yu tansuo |
issn |
1673-9418 |
publishDate |
2021-08-01 |
topic |
pre-training techniques; natural language processing; neural network |
url |
http://fcst.ceaj.org/CN/abstract/abstract2823.shtml |