Linguistic Knowledge Transfer for Enriching Vector Representations

Bibliographic Details
Main Author: Kim, Joo-Kyung
Language: English
Published: The Ohio State University / OhioLINK 2017
Subjects:
Online Access: http://rave.ohiolink.edu/etdc/view?acc_num=osu1500571436042414
id ndltd-OhioLink-oai-etd.ohiolink.edu-osu1500571436042414
record_format oai_dc
spelling ndltd-OhioLink-oai-etd.ohiolink.edu-osu15005714360424142021-08-03T07:03:35Z Linguistic Knowledge Transfer for Enriching Vector Representations Kim, Joo-Kyung Computer Science Artificial Intelligence Transfer Learning Word Embedding Intent Detection Slot Filling POS Tagging Adversarial Training Many state-of-the-art neural network models use huge numbers of parameters and therefore require large numbers of labeled training examples to be trained sufficiently. Such models may not be properly trained when there are not enough training examples for the target tasks. This dissertation focuses on transfer learning methods, which improve performance on target tasks in such situations by leveraging external resources or models from other tasks. Specifically, we introduce transfer learning methods that enrich the word or sentence vector representations of neural network models by transferring linguistic knowledge. Usually, the first layer of a neural network for Natural Language Processing (NLP) is a word embedding layer. Word embeddings represent each word as a real-valued vector, where semantically or syntactically similar words tend to have similar vector representations. The first part of this dissertation is mainly about word embedding enrichment, which is categorized as an inductive transfer learning methodology. We show that word embeddings can represent semantic intensity scales such as "good" < "great" < "excellent" in vector space, and that the semantic intensity orders of words can be used as a knowledge source to adjust word vector positions, improving word semantics as measured on word-level semantic tasks.
Also, we show that word embeddings enriched with linguistic knowledge can improve the performance of a Bidirectional Long Short-Term Memory (BLSTM) model for intent detection, a sentence-level downstream task, especially when only small numbers of training examples are available. The second part of this dissertation concerns sentence-level transfer learning for sequence tagging tasks. We introduce a cross-domain transfer learning model for dialog slot filling, which is an inductive transfer learning method, and a cross-lingual transfer learning model for Part-of-Speech (POS) tagging, which is a transductive transfer learning method. Both models utilize a common BLSTM that enables knowledge transfer from other domains/languages, along with private BLSTMs for domain/language-specific representations. We also use adversarial training and other auxiliary objectives, such as representation separation and bidirectional language models, to further improve transfer learning performance. We show that these sentence-level transfer learning models improve sequence tagging performance without exploiting any other cross-domain or cross-lingual knowledge. 2017-12-12 English text The Ohio State University / OhioLINK http://rave.ohiolink.edu/etdc/view?acc_num=osu1500571436042414 http://rave.ohiolink.edu/etdc/view?acc_num=osu1500571436042414 unrestricted This thesis or dissertation is protected by copyright: all rights reserved. It may not be copied or redistributed beyond the terms of applicable copyright laws.
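The semantic-intensity idea from the abstract can be illustrated with a toy sketch. The vectors below are hypothetical 3-dimensional values chosen for illustration (real embeddings would come from a trained model such as word2vec or GloVe); projecting each word onto a direction running from a mild word to an extreme word on the same scale recovers the intensity ordering "good" < "great" < "excellent":

```python
import numpy as np

# Hypothetical toy "embeddings" for illustration only.
emb = {
    "good":      np.array([0.9, 0.2, 0.1]),
    "great":     np.array([0.9, 0.5, 0.1]),
    "excellent": np.array([0.9, 0.9, 0.1]),
}

# A semantic-intensity direction: the vector pointing from a mild
# word to an extreme word on the same scale.
direction = emb["excellent"] - emb["good"]

def intensity(word):
    """Project a word's vector onto the intensity direction."""
    return float(np.dot(emb[word], direction))

# Sorting by the projection recovers the semantic intensity scale.
scale = sorted(emb, key=intensity)
print(scale)  # ['good', 'great', 'excellent']
```

In the dissertation's setting, such intensity orders serve as the knowledge source: word vectors on the same scale are adjusted so their projections respect the known ordering.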
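The shared/private encoder pattern with adversarial training described above can be sketched in PyTorch as follows. All class names and dimensions here are illustrative assumptions, not the dissertation's actual implementation: a gradient-reversal layer on top of the shared BLSTM pushes it toward domain-invariant features, while a private BLSTM keeps domain-specific ones.

```python
import torch
import torch.nn as nn

class GradReverse(torch.autograd.Function):
    """Identity on the forward pass, negated gradient on the backward pass,
    so the shared encoder is trained to fool the domain classifier."""
    @staticmethod
    def forward(ctx, x):
        return x.view_as(x)
    @staticmethod
    def backward(ctx, grad):
        return -grad

class SharedPrivateTagger(nn.Module):
    # Hypothetical dimensions, for illustration only.
    def __init__(self, emb_dim=8, hid=16, n_tags=5, n_domains=2):
        super().__init__()
        self.shared = nn.LSTM(emb_dim, hid, bidirectional=True, batch_first=True)
        self.private = nn.LSTM(emb_dim, hid, bidirectional=True, batch_first=True)
        self.tagger = nn.Linear(4 * hid, n_tags)         # shared + private features
        self.domain_clf = nn.Linear(2 * hid, n_domains)  # adversarial head

    def forward(self, x):
        s, _ = self.shared(x)   # (batch, seq, 2*hid), shared across domains
        p, _ = self.private(x)  # (batch, seq, 2*hid), domain-specific
        tags = self.tagger(torch.cat([s, p], dim=-1))
        # The domain classifier sees reversed gradients from the shared encoder.
        domain = self.domain_clf(GradReverse.apply(s).mean(dim=1))
        return tags, domain

model = SharedPrivateTagger()
x = torch.randn(3, 7, 8)  # 3 sentences, 7 tokens each, 8-dim word embeddings
tags, domain = model(x)
print(tags.shape, domain.shape)  # torch.Size([3, 7, 5]) torch.Size([3, 2])
```

In a multi-domain setup, each domain would have its own private BLSTM while all domains share `self.shared`; the tag loss and the (gradient-reversed) domain loss are summed during training.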
collection NDLTD
language English
sources NDLTD
topic Computer Science
Artificial Intelligence
Transfer Learning
Word Embedding
Intent Detection
Slot Filling
POS Tagging
Adversarial Training
spellingShingle Computer Science
Artificial Intelligence
Transfer Learning
Word Embedding
Intent Detection
Slot Filling
POS Tagging
Adversarial Training
Kim, Joo-Kyung
Linguistic Knowledge Transfer for Enriching Vector Representations
author Kim, Joo-Kyung
author_facet Kim, Joo-Kyung
author_sort Kim, Joo-Kyung
title Linguistic Knowledge Transfer for Enriching Vector Representations
title_short Linguistic Knowledge Transfer for Enriching Vector Representations
title_full Linguistic Knowledge Transfer for Enriching Vector Representations
title_fullStr Linguistic Knowledge Transfer for Enriching Vector Representations
title_full_unstemmed Linguistic Knowledge Transfer for Enriching Vector Representations
title_sort linguistic knowledge transfer for enriching vector representations
publisher The Ohio State University / OhioLINK
publishDate 2017
url http://rave.ohiolink.edu/etdc/view?acc_num=osu1500571436042414
work_keys_str_mv AT kimjookyung linguisticknowledgetransferforenrichingvectorrepresentations
_version_ 1719452808499953664