Summary of the paper

Title Augmenting Sparse Corpora for Enhanced Sign Language Recognition and Generation
Authors Heike Brock, Juliette Rengot and Kazuhiro Nakadai
Abstract The collection of signed utterances for recognition and generation of Sign Language (SL) is a costly and labor-intensive task. As a result, SL corpora are usually considerably smaller than their spoken language or image data counterparts. This is problematic, since the accuracy and applicability of a neural network depend largely on the quality and amount of its underlying training data. Common data augmentation strategies to increase the amount of available training data are usually not applicable to the spatially and temporally constrained motion sequences of a SL corpus. In this paper, we therefore discuss possible data manipulation methods on the basis of a collection of motion-captured SL sentence expressions. Evaluation of differently trained network architectures shows that including the augmented data significantly reduces overfitting. At the same time, the accuracy of both sign recognition and generation improved, indicating that the proposed data augmentation methods are beneficial for constrained and sparse data sets.
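
The paper itself details the manipulation methods applied to the motion-captured corpus; as a rough illustration of the kind of constrained augmentation the abstract refers to, the Python sketch below applies small per-frame jitter and uniform time warping to a motion-capture sequence while preserving its spatial trajectory. The function names, noise level, and warp factors are illustrative assumptions, not the authors' implementation.

import numpy as np

def jitter(sequence, sigma=0.005, rng=None):
    # Add small Gaussian noise to each frame of a motion sequence.
    # sequence: array of shape (frames, channels), e.g. flattened joint
    # coordinates per frame. sigma must stay small so that hand shape
    # and movement path remain recognizable (assumed value).
    rng = np.random.default_rng() if rng is None else rng
    return sequence + rng.normal(0.0, sigma, size=sequence.shape)

def time_warp(sequence, factor):
    # Uniformly stretch or compress the sequence in time via linear
    # interpolation, simulating slower or faster signing speed while
    # keeping the spatial trajectory unchanged.
    n_frames, n_channels = sequence.shape
    new_len = max(2, int(round(n_frames * factor)))
    old_t = np.linspace(0.0, 1.0, n_frames)
    new_t = np.linspace(0.0, 1.0, new_len)
    warped = np.empty((new_len, n_channels))
    for c in range(n_channels):
        warped[:, c] = np.interp(new_t, old_t, sequence[:, c])
    return warped

# Example: generate three augmented variants of one mocap sentence
# (120 frames, 60 channels are placeholder dimensions).
original = np.random.rand(120, 60)
augmented = [time_warp(jitter(original), f) for f in (0.9, 1.0, 1.1)]
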
Full paper Augmenting Sparse Corpora for Enhanced Sign Language Recognition and Generation
Bibtex @InProceedings{BROCK18.18012,
  author = {Heike Brock and Juliette Rengot and Kazuhiro Nakadai},
  title = {Augmenting Sparse Corpora for Enhanced Sign Language Recognition and Generation},
  booktitle = {Proceedings of the Eleventh International Conference on Language Resources and Evaluation (LREC 2018)},
  year = {2018},
  month = {may},
  date = {7-12},
  location = {Miyazaki, Japan},
  editor = {Mayumi Bono and Eleni Efthimiou and Stavroula-Evita Fotinea and Thomas Hanke and Julie Hochgesang and Jette Kristoffersen and Johanna Mesch and Yutaka Osugi},
  publisher = {European Language Resources Association (ELRA)},
  address = {Paris, France},
  isbn = {979-10-95546-01-6},
  language = {english}
  }