Joint Training for Neural Machine Translation - Cheng, Yong
- Format: Hardcover
Presentation: Joint Training for Neural Machine Translation by Yong Cheng, Hardcover (Computer Science)
Summary:
This book presents four approaches to jointly training bidirectional neural machine translation (NMT) models. First, to improve the accuracy of the attention mechanism, it proposes an agreement-based joint training approach that encourages the two complementary models to agree on word alignment matrices for the same training data. Second, it presents a semi-supervised approach that uses an autoencoder to reconstruct monolingual corpora, so as to incorporate these corpora into neural machine translation. It then introduces a joint training algorithm for pivot-based neural machine translation, which can be used to mitigate the data scarcity problem. Lastly, it describes an end-to-end bidirectional NMT model that connects the source-to-target and target-to-source translation models, allowing the two directional models to share and exchange parameters.
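The agreement-based approach described above amounts to a joint objective: each direction's translation loss plus a penalty when the two models' word alignments disagree. A minimal NumPy sketch, using a hypothetical squared-error agreement term (the book's exact formulation may differ):

```python
import numpy as np

def agreement_penalty(attn_s2t, attn_t2s, weight=1.0):
    """Disagreement between the source-to-target attention matrix and the
    transpose of the target-to-source one. Squared error is an illustrative
    choice; the actual agreement term in the book may be different."""
    return weight * np.sum((attn_s2t - attn_t2s.T) ** 2)

def joint_loss(nll_s2t, nll_t2s, attn_s2t, attn_t2s, weight=1.0):
    """Joint objective: both directions' negative log-likelihoods plus an
    agreement term encouraging consistent word alignments."""
    return nll_s2t + nll_t2s + agreement_penalty(attn_s2t, attn_t2s, weight)

# Toy example: 3 source words aligned to 2 target words.
a_s2t = np.array([[0.9, 0.1],
                  [0.2, 0.8],
                  [0.5, 0.5]])
a_t2s = a_s2t.T.copy()  # perfectly agreeing alignments: zero penalty
print(joint_loss(2.0, 2.5, a_s2t, a_t2s))  # → 4.5
```

When the two attention matrices disagree, the penalty grows, pushing both directional models toward a shared alignment during training.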
Biography:
Yong Cheng is currently a software engineer engaged in research at Google. Before joining Google, he worked as a senior researcher at Tencent AI Lab. He obtained his Ph.D. from the Institute for Interdisciplinary Information Sciences (IIIS) at Tsinghua University in 2017. His research interests focus on neural machine translation and natural language processing.
Table of Contents:
1. Introduction
2. Neural Machine Translation
3. Agreement-based Joint Training for Bidirectional Attention-based Neural Machine Translation
4. Semi-supervised Learning for Neural Machine Translation
5. Joint Training for Pivot-based Neural Machine Translation
6. Joint Modeling for Bidirectional Neural Machine Translation with Contrastive Learning
7. Related Work
8. Conclusion