Using Bidirectional Encoder Representations from Transformers for Conversational Machine Comprehension

Bidirectional Encoder Representations from Transformers (BERT) is a recently proposed language representation model designed to pre-train deep bidirectional representations, with the goal of extracting context-sensitive features from an input text [1]. One of the challenging problems in the field o...
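
To make "context-sensitive features" concrete, the following minimal sketch extracts per-token BERT representations with the Hugging Face transformers library. This is illustrative only and not taken from the thesis; the library and the checkpoint name "bert-base-uncased" are assumptions.

    import torch
    from transformers import BertTokenizer, BertModel

    # Load a pretrained BERT encoder (assumed checkpoint: bert-base-uncased).
    tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
    model = BertModel.from_pretrained("bert-base-uncased")
    model.eval()

    # Because the encoder is bidirectional, each token's vector depends on
    # the whole sentence: the same word gets different features in
    # different contexts.
    inputs = tokenizer("The bank approved the loan.", return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)

    # last_hidden_state has shape (batch, sequence_length, hidden_size):
    # one context-sensitive feature vector per input token.
    features = outputs.last_hidden_state
    print(features.shape)  # e.g. torch.Size([1, 8, 768])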

Bibliographic Details
Main Author: Gogoulou, Evangelina
Format: Others
Language: English
Published: KTH, School of Electrical Engineering and Computer Science (EECS), 2019
Online Access:http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-265656