AMR-To-Text Generation with Graph Transformer

Abstract meaning representation (AMR)-to-text generation is the challenging task of generating natural language texts from AMR graphs, where nodes represent concepts and edges denote relations. The current state-of-the-art methods use graph-to-sequence models; however, they still cannot significantly outperform the previous sequence-to-sequence models or statistical approaches. In this paper, we propose a novel graph-to-sequence model (Graph Transformer) to address this task. The model directly encodes the AMR graphs and learns the node representations. A pairwise interaction function is used for computing the semantic relations between the concepts. Moreover, attention mechanisms are used for aggregating the information from the incoming and outgoing neighbors, which helps the model capture the semantic information effectively. Our model outperforms the state-of-the-art neural approach by 1.5 BLEU points on LDC2015E86 and 4.8 BLEU points on LDC2017T10, achieving new state-of-the-art performance.
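The abstract describes attention that aggregates information separately from a node's incoming and outgoing neighbors, scored by a pairwise interaction function. A minimal, illustrative sketch of that idea follows; it is not the paper's exact formulation — the function names are hypothetical and the interaction is simplified to a scaled dot product:

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def aggregate_neighbors(node_vecs, edges, d):
    """One simplified attention step over a directed graph.

    Each node attends separately to its incoming and outgoing
    neighbors; the two attention summaries are concatenated.
    node_vecs: (n, d) array of node embeddings
    edges: list of (source, target) index pairs
    """
    n = len(node_vecs)
    out = np.zeros((n, 2 * d))
    for i in range(n):
        for direction, nbrs in (
            (0, [s for (s, t) in edges if t == i]),  # incoming neighbors
            (1, [t for (s, t) in edges if s == i]),  # outgoing neighbors
        ):
            if not nbrs:
                continue
            # Pairwise interaction: scaled dot product between node i
            # and each neighbor, normalized into attention weights
            scores = softmax(
                np.array([node_vecs[i] @ node_vecs[j] for j in nbrs]) / np.sqrt(d)
            )
            # Weighted sum of neighbor vectors fills one half of the output
            out[i, direction * d:(direction + 1) * d] = scores @ node_vecs[nbrs]
    return out
```

In the actual model this step would be stacked, parameterized with learned projections, and combined with edge-relation information; the sketch only shows the direction-aware aggregation pattern the abstract mentions.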

Bibliographic Details
Main Authors: Wang, Tianming; Wan, Xiaojun; Jin, Hanqi
Format: Article
Language: English
Published: The MIT Press, 2020-07-01
Series: Transactions of the Association for Computational Linguistics
Online Access: https://www.mitpressjournals.org/doi/abs/10.1162/tacl_a_00297
ISSN: 2307-387X