Summary of the paper

Title Low-Resource Neural Machine Translation with Transfer Learning
Authors Tao Feng, Miao Li and Lei Chen
Abstract Neural machine translation has achieved great success in recent years when large bilingual corpora are available. However, it does not work well for low-resource language pairs. To address this problem, we present a transfer learning method that improves the BLEU scores of low-resource machine translation. First, we use an attention-based encoder-decoder framework to train a neural machine translation model on a large language pair, and then use some of the trained model's parameters to initialize another neural machine translation model for a language pair with little bilingual parallel data. Our experiments demonstrate that, through weight adjustment and retraining, the proposed method achieves strong performance on low-resource machine translation. On the IWSLT2015 Vietnamese-English translation task, our model improves translation quality by an average of 1.55 BLEU points. We also obtain an increase of 0.99 BLEU points when translating from Mongolian to Chinese. Finally, we analyze the experimental results and summarize our contributions.
Topics Low-Resource, Transfer Learning, Neural Machine Translation
Full paper Low-Resource Neural Machine Translation with Transfer Learning
Bibtex @InProceedings{FENG18.3,
  author = {Tao Feng and Miao Li and Lei Chen},
  title = {Low-Resource Neural Machine Translation with Transfer Learning},
  booktitle = {Proceedings of the Eleventh International Conference on Language Resources and Evaluation (LREC 2018)},
  year = {2018},
  month = {may},
  date = {7-12},
  location = {Miyazaki, Japan},
  editor = {Jinhua Du and Mihael Arcan and Qun Liu and Hitoshi Isahara},
  publisher = {European Language Resources Association (ELRA)},
  address = {Paris, France},
  isbn = {979-10-95546-15-3},
  language = {english}
  }
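
The abstract describes initializing a low-resource translation model with parameters from a model trained on a high-resource pair, followed by retraining. Below is a minimal sketch of that parameter-transfer step, assuming PyTorch; the Seq2SeqNMT class, layer sizes, vocabulary sizes, and the choice of which weights to copy are hypothetical illustrations, not the authors' exact configuration.

```python
# Sketch of transfer-learning initialization for NMT (assumed PyTorch setup).
import torch
import torch.nn as nn


class Seq2SeqNMT(nn.Module):
    """Toy attention-based encoder-decoder used only to illustrate weight transfer."""

    def __init__(self, src_vocab, tgt_vocab, hidden=256):
        super().__init__()
        self.src_embed = nn.Embedding(src_vocab, hidden)
        self.tgt_embed = nn.Embedding(tgt_vocab, hidden)
        self.encoder = nn.GRU(hidden, hidden, batch_first=True)
        self.decoder = nn.GRU(hidden, hidden, batch_first=True)
        self.attn = nn.Linear(hidden * 2, hidden)
        self.out = nn.Linear(hidden, tgt_vocab)


# 1) Parent model trained on the high-resource pair (training loop omitted).
parent = Seq2SeqNMT(src_vocab=32000, tgt_vocab=32000)

# 2) Child model for the low-resource pair, with its own (smaller) vocabularies.
child = Seq2SeqNMT(src_vocab=8000, tgt_vocab=8000)

# 3) Initialize the child with parent parameters wherever the shapes match
#    (encoder, decoder, attention); embeddings and the output projection keep
#    their random initialization because the vocabularies differ.
parent_state = parent.state_dict()
child_state = child.state_dict()
transferred = {
    name: tensor
    for name, tensor in parent_state.items()
    if name in child_state and child_state[name].shape == tensor.shape
}
child_state.update(transferred)
child.load_state_dict(child_state)

# 4) Retrain (fine-tune) the child on the small bilingual corpus; a lower
#    learning rate is one common choice when adjusting transferred weights.
optimizer = torch.optim.Adam(child.parameters(), lr=1e-4)
```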