Pretraining Deep Learning Models for Natural Language Understanding
Main Author: | Shao, Han |
---|---|
Language: | English |
Published: | Oberlin College Honors Theses / OhioLINK, 2020 |
Subjects: | |
Online Access: | http://rave.ohiolink.edu/etdc/view?acc_num=oberlin158955297757398 |
Similar Items
- Bloom’s Learning Outcomes’ Automatic Classification Using LSTM and Pretrained Word Embeddings
  by: Sarang Shaikh, et al.
  Published: (2021-01-01)
- Evaluating Statistical Machine Learning and Deep Learning Algorithms for Anomaly Detection in Chat Messages
  by: Freberg, Daniel
  Published: (2018)
- Deep learning for medical report texts
  by: Nelsson, Mikael
  Published: (2018)
- Identifying Military Veterans in a Clinical Research Database using Natural Language Processing
  by: Daniel Leightley, et al.
  Published: (2019-11-01)
- Transformers-sklearn: a toolkit for medical language understanding with transformer-based models
  by: Feihong Yang, et al.
  Published: (2021-07-01)