
Exploration on Korean-Chinese collaborative translation method based on recursive recurrent neural network

Original Article, published in Personal and Ubiquitous Computing

Abstract

This paper aims to model the historical sequence of abstract dynamics in Korean-Chinese collaborative translation under machine learning, and to dynamically generate an abstract representation of the translation decoding tree in a recursive model during decoding. Combining the advantages of recursive and recurrent neural networks, we construct a recursive recurrent neural network model that not only models the translation process using traditional machine translation features, but also incrementally builds abstract representations of translation candidates as decoding proceeds, effectively exploiting the language model and other global features of machine translation. The model was trained and evaluated while varying the Korean-Chinese translation vocabulary size, sentence length, and number of language pairs. The test results show that the model effectively improves the performance of the machine translation system. In addition, pre-ordering adjustment of the source word order further optimizes the recursive recurrent neural network model and significantly improves translation performance.
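The abstract describes two components working together: a recurrent network that encodes the source sentence sequentially, and a recursive network that composes an abstract representation for each partial translation candidate as the decoder extends it, with the neural score combined with traditional MT features. The sketch below is a minimal pure-Python illustration of that idea only; the dimensions, random untrained parameters, and all names (`encode_source`, `extend_candidate`, `W_left`, and so on) are illustrative assumptions, not the authors' actual model.

```python
import math
import random

random.seed(0)
DIM = 8  # toy embedding/hidden size (assumption, not from the paper)

def rand_vec(n=DIM):
    return [random.uniform(-0.5, 0.5) for _ in range(n)]

def rand_mat(r=DIM, c=DIM):
    return [rand_vec(c) for _ in range(r)]

def matvec(W, x):
    return [sum(w * xi for w, xi in zip(row, x)) for row in W]

def add(a, b):
    return [x + y for x, y in zip(a, b)]

def tanh_vec(v):
    return [math.tanh(x) for x in v]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# Toy parameters, randomly initialised; a real system would train these.
W_in, W_rec = rand_mat(), rand_mat()      # recurrent (Elman-style) encoder
W_left, W_right = rand_mat(), rand_mat()  # recursive composition weights
embed = {w: rand_vec() for w in
         ["나는", "학교에", "간다", "I", "go", "to", "school"]}

def encode_source(words):
    """Recurrent pass over the source sentence -> final hidden state."""
    h = [0.0] * DIM
    for w in words:
        h = tanh_vec(add(matvec(W_in, embed[w]), matvec(W_rec, h)))
    return h

def extend_candidate(parent_repr, new_word):
    """Recursive composition: fold the next target word into the
    candidate's abstract representation (one node of the decoding tree)."""
    return tanh_vec(add(matvec(W_left, parent_repr),
                        matvec(W_right, embed[new_word])))

def score(candidate_repr, source_repr, trad_feature_score):
    """Neural similarity plus a traditional MT feature score."""
    return dot(candidate_repr, source_repr) + trad_feature_score

src = encode_source(["나는", "학교에", "간다"])
cand = [0.0] * DIM  # root representation of the empty candidate
for w in ["I", "go", "to", "school"]:
    cand = extend_candidate(cand, w)  # one recursive step per decoding step
s = score(cand, src, trad_feature_score=0.0)
print(round(s, 4))
```

The point of the structure is that each decoding step reuses the parent candidate's representation rather than re-encoding the whole prefix, so global features such as a language model score can be read off the growing representation at every step of beam search.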


Figures 1–13 appear in the full article.



Author information


Corresponding author

Correspondence to Lin Zhu.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Zhu, L. Exploration on Korean-Chinese collaborative translation method based on recursive recurrent neural network. Pers Ubiquit Comput 24, 309–318 (2020). https://doi.org/10.1007/s00779-019-01347-5

