A Study of BERT for Context-Aware Neural Machine Translation

Nov 17, 2021

About

Context-aware neural machine translation (NMT), which aims to translate sentences with the help of contextual information, has attracted much attention recently. A key problem in context-aware NMT is how to effectively encode and aggregate the contextual information. BERT has proven to be an effective feature extractor for natural language understanding tasks, but it has not been well studied in context-aware NMT. In this work, we study how to leverage BERT to encode the contextual information for NMT, and we explore three commonly used methods for aggregating the contextual features. We conduct experiments on five translation tasks and find that concatenating all contextual sentences into a single longer sequence and then encoding it with BERT yields the best translation results. In particular, we achieve state-of-the-art BLEU scores on several widely investigated tasks, including IWSLT'14 German-English, News Commentary v11 English-German, and OpenSubtitle English-Russian translation.
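To make the best-performing strategy concrete, below is a minimal sketch of the concatenation approach using the HuggingFace transformers library. The checkpoint name, the example sentences, and the use of BERT's sentence-pair input are illustrative assumptions not specified in the talk, and the aggregation of the resulting features into the NMT model (e.g., via cross-attention) is omitted.

```python
import torch
from transformers import BertTokenizer, BertModel

# Minimal sketch (not the authors' implementation): encode the
# concatenation of context sentences with BERT and expose the
# resulting features for a downstream NMT model to attend over.
# The checkpoint and example sentences below are illustrative.
tokenizer = BertTokenizer.from_pretrained("bert-base-multilingual-cased")
bert = BertModel.from_pretrained("bert-base-multilingual-cased")
bert.eval()

context_sentences = [  # hypothetical preceding source sentences
    "Der Hund bellte die ganze Nacht.",
    "Niemand konnte schlafen.",
]
current_sentence = "Am Morgen war er endlich still."

# Concatenate all context sentences into one longer sequence and
# encode it together with the current sentence as a sentence pair.
context = " ".join(context_sentences)
inputs = tokenizer(
    context,
    current_sentence,
    truncation=True,
    max_length=512,
    return_tensors="pt",
)

with torch.no_grad():
    outputs = bert(**inputs)

# (batch, seq_len, hidden): contextual features that an NMT
# encoder/decoder could aggregate, e.g. through cross-attention.
context_features = outputs.last_hidden_state
print(context_features.shape)
```

One appeal of this design is that BERT's self-attention can relate tokens across sentence boundaries directly, so no separate aggregation step over per-sentence encodings is needed.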

Organizer

About ACML 2021

The 13th Asian Conference on Machine Learning (ACML 2021) aims to provide a leading international forum for researchers in machine learning and related fields to share their new ideas, progress, and achievements.

