Improving neural machine translation with neural syntactic distance

C Ma, A Tamura, M Utiyama, E Sumita, T Zhao - Proceedings of the 2019 Conference of the North American Chapter of …, 2019 - aclanthology.org
Abstract
The explicit use of syntactic information has proved useful for neural machine translation (NMT). However, previous methods resort to either tree-structured neural networks or long linearized sequences, both of which are inefficient. Neural syntactic distance (NSD) enables us to represent a constituent tree using a sequence whose length is identical to the number of words in the sentence. NSD has been used for constituent parsing, but not in machine translation. We propose five strategies to improve NMT with NSD. Experiments show that it is not trivial to improve NMT with NSD; however, the proposed strategies improve translation performance over the baseline model (+2.1 (En–Ja), +1.3 (Ja–En), +1.2 (En–Ch), and +1.0 (Ch–En) BLEU).
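The syntactic-distance encoding the abstract refers to can be illustrated with a minimal sketch. The formulation below follows the constituent-parsing definition of syntactic distance (the distance at each word boundary is the height of the lowest common ancestor of the two adjacent words); the tree format (nested pairs of a binarized tree) is an illustrative assumption, and the paper's exact NSD variant may differ, e.g. it may pad the sequence so its length equals the number of words rather than word boundaries.

```python
# Hedged sketch: syntactic distances from a binarized constituent tree.
# A leaf is a word string; an internal node is a pair (left, right).
# The distance at the boundary between adjacent words is the height of
# the smallest constituent (lowest common ancestor) spanning both words.
# This is an illustration of the general idea, not the paper's code.

def _walk(tree):
    """Return (words, boundary_distances, height) for a subtree."""
    if isinstance(tree, str):        # leaf: a single word
        return [tree], [], 0
    left_words, left_d, hl = _walk(tree[0])
    right_words, right_d, hr = _walk(tree[1])
    h = max(hl, hr) + 1              # height of this constituent
    # The boundary between the two children is governed by this node,
    # so that boundary's distance is this node's height.
    return left_words + right_words, left_d + [h] + right_d, h

def syntactic_distances(tree):
    words, dists, _ = _walk(tree)
    return words, dists

# Example: ((the cat) (sat (on (the mat))))
tree = (("the", "cat"), ("sat", ("on", ("the", "mat"))))
words, dists = syntactic_distances(tree)
# 6 words yield 5 boundary distances: [1, 4, 3, 2, 1]
```

The key property motivating NSD is visible here: the distance sequence is linear in sentence length (one value per word boundary), unlike a linearized bracket sequence, whose length grows with the number of tree nodes.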