{"id":33098452,"url":"https://github.com/alphadl/inspiring_papers","last_synced_at":"2025-11-19T14:02:14.473Z","repository":{"id":75380664,"uuid":"151175984","full_name":"alphadl/inspiring_papers","owner":"alphadl","description":"Papers related to Machine Translation (continuously updating \u0026 welcome Star/Fork/PR)","archived":false,"fork":false,"pushed_at":"2019-03-18T04:54:00.000Z","size":95909,"stargazers_count":7,"open_issues_count":0,"forks_count":0,"subscribers_count":1,"default_branch":"master","last_synced_at":"2024-06-21T10:36:19.600Z","etag":null,"topics":["machine-translation","natural-language-processing","nlp"],"latest_commit_sha":null,"homepage":"","language":"HTML","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":null,"status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/alphadl.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":null,"code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null}},"created_at":"2018-10-01T23:47:06.000Z","updated_at":"2022-12-29T09:19:01.000Z","dependencies_parsed_at":"2023-06-06T08:15:22.983Z","dependency_job_id":null,"html_url":"https://github.com/alphadl/inspiring_papers","commit_stats":null,"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"purl":"pkg:github/alphadl/inspiring_papers","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/alphadl%2Finspiring_papers","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/alphadl%2Finspiring_papers/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/alphadl%2Finspiring_papers/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/alphadl%2Finspiring_papers/manifests","owner_url":"https://repos.ecosyst
e.ms/api/v1/hosts/GitHub/owners/alphadl","download_url":"https://codeload.github.com/alphadl/inspiring_papers/tar.gz/refs/heads/master","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/alphadl%2Finspiring_papers/sbom","scorecard":null,"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":285258051,"owners_count":27140780,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","status":"online","status_checked_at":"2025-11-19T02:00:05.673Z","response_time":65,"last_error":null,"robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":true,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["machine-translation","natural-language-processing","nlp"],"created_at":"2025-11-14T21:00:21.230Z","updated_at":"2025-11-19T14:02:14.448Z","avatar_url":"https://github.com/alphadl.png","language":"HTML","readme":"# papers_about_mt and other inspiring papers\n\n[1] Attention Is All You Need\n\n[2] RvNN preordering En-Jp MT\n\n[3] Curriculum Learning for Natural Answer Generation\n\n[4] Extracting Relational Facts by an End-to-End Neural Model with Copy Mechanism\n\n[5] Generating Natural Answers by Incorporating Copying and Retrieving Mechanisms in Sequence-to-Sequence Learning\n\n[6] Close to Human Quality TTS with Transformer\n\n[7] Cross-lingual Knowledge Projection Using Machine Translation and Target-side Knowledge Base Completion\n\n[7-appendix] poster of paper 7\n\n[8] Unsupervised Cross-lingual Transfer of Word Embedding Spaces\n\n[9] Commonsense Knowledge Base Completion\n\n[10] 
ConceptNet 5.5: An Open Multilingual Graph of General Knowledge\n\n[11] Adversarial learning meets graphs\n\n[12] Answering Cloze-style Software Questions Using Stack Overflow\n\n[13] Neural Machine Translation and Sequence-to-sequence Models: A Tutorial\n\n[14] Google’s Multilingual Neural Machine Translation System: Enabling Zero-Shot Translation\n\n[15] Zero-Shot Dual Machine Translation\n\n[16] Commonsense Knowledge Base Completion\n\n[17] Unsupervised Cross-lingual Transfer of Word Embedding Spaces\n\n[18] An Empirical Study on Development Set Selection Strategy for Machine Translation Learning\n\n[19] Bagging-based System Combination for Domain Adaptation\n\n[20] Cross-Sentence N-ary Relation Extraction with Graph LSTMs\n\n[21] Distant supervision for relation extraction without labeled data\n\n[22] NMT-Keras\n\n[23] Fine-Tuning for Neural Machine Translation with Limited Degradation across In- and Out-of-Domain Data\n\n[24] Improving Neural Machine Translation with Conditional Sequence Generative Adversarial Nets\n\n[25 undocumented paper] Zhirui Zhang, Shujie Liu, Mu Li, Ming Zhou and Enhong Chen, Bidirectional Generative Adversarial Networks for Neural Machine Translation, The SIGNLL Conference on Computational Natural Language Learning (CoNLL 2018).\n\n[26] Unsupervised Neural Machine Translation with Weight Sharing\n\n[27] PHRASE-BASED ATTENTIONS\n\n[28] MULTILINGUAL NEURAL MACHINE TRANSLATION WITH KNOWLEDGE DISTILLATION\n\n[29] A Smorgasbord of Features to Combine Phrase-Based and Neural Machine Translation\n\n[30] Unveiling the Linguistic Weaknesses of Neural Machine Translation\n\n[31] Pre-Translation for Neural Machine Translation\n\n[32] Improving Lexical Choice in Neural Machine Translation\n\n[33] Improving Neural Machine Translation through Phrase-based Forced Decoding \n\n[34] Guiding Neural Machine Translation with Retrieved Translation Pieces\n\n[35] Sentence Weighting for Neural Machine Translation Domain Adaptation\n\n[36] Instance Weighting 
for Neural Machine Translation Domain Adaptation\n\n[37] Sentence Embedding for Neural Machine Translation Domain Adaptation\n\n[38] Cost Weighting for Neural Machine Translation Domain Adaptation\n\n[39] Stanford Neural Machine Translation Systems for Spoken Language Domains\n\n[40] Sequence to Sequence Learning with Neural Networks\n\n[41] Ensemble Distillation for Neural Machine Translation\n\n[42] Automatic Evaluation of Machine Translation Quality Using Longest Common Subsequence and Skip-Bigram Statistics\n\n[43] The Best Lexical Metric for Phrase-Based Statistical MT System Optimization\n\n[44] Efficient Extraction of Oracle-best Translations from Hypergraphs\n\n[45] Distilling the Knowledge in a Neural Network\n\n[46 slide] GAN and its application to NLP\n\n[47 slide] Knowledge Distillation via GAN\n\n[48] Adversarial Generation of Natural Language\n\n[49] Refining Source Representations with Relation Networks for Neural Machine Translation\n\n[50] Neural Machine Translation of Rare Words with Subword Units\n\n[51] Fully Character-Level Neural Machine Translation without Explicit Segmentation\n\n[52] Improving Zero-Shot Translation of Low-Resource Languages\n\n[53] Findings of the Second Shared Task on Multimodal Machine Translation and Multilingual Image Description\n\n[54] MULTILINGUAL IMAGE DESCRIPTION WITH NEURAL SEQUENCE MODELS\n\n[55] WMT 2016 Multimodal Translation: A Shared Task on Multimodal Machine Translation and Crosslingual Image Description\n\n[56] Show, Attend and Tell: Neural Image Caption Generation with Visual Attention\n\n[57] Show and Tell: A Neural Image Caption Generator\n\n[58] CMU system report, first place in the WMT 2017 Multimodal Translation Task\n\n[59] PhD thesis of Raj, Kyoto University, on low-resource multilingual translation\n\n[60] Multi-Task Learning for Multiple Language Translation\n\n[61] THUMT: An Open Source Toolkit for Neural Machine Translation\n\n[62] Zero-Resource Translation with Multi-Lingual Neural Machine Translation\n\n[63] Multi-Source Neural Translation\n\n[64 MSc thesis] Domain Adaptation for 
Multilingual Neural Machine Translation\n\n[65] Transfer Learning for Low-Resource Neural Machine Translation\n\n[66] Multi-Source Neural Machine Translation with Missing Data\n\n[67] A Tree-based Decoder for Neural Machine Translation\n\n[68 dynet NMT] XNMT: The eXtensible Neural Machine Translation Toolkit\n\n[69] (reinforcement learning for NMT) Sequence Level Training with Recurrent Neural Networks\n\n[70] A Study of Reinforcement Learning for Neural Machine Translation\n\n[71] Training Tips for the Transformer Model\n\n[72] Bidirectional Generative Adversarial Networks for Neural Machine Translation\n\n[73] Dual Learning for Machine Translation\n\n[74] A Teacher-Student Framework for Zero-Resource Neural Machine Translation\n\n[75] Dual Transfer Learning for Neural Machine Translation with Marginal Distribution Regularization\n\n[76] When and Why are Pre-trained Word Embeddings Useful for Neural Machine Translation?\n\n[77] Meta-Learning for Low-Resource Neural Machine Translation\n\n[78] Phrase-Based \u0026 Neural Unsupervised Machine Translation\n\n[79] Model-Level Dual Learning\n\n[80] You May Not Need Attention\n\n[81] An Analysis of Encoder Representations in Transformer-Based Machine Translation\n\n[82] An Introductory Survey on Attention Mechanisms in NLP Problems\n\n[83] On Zero-shot Cross-lingual Transfer of Multilingual Neural Machine Translation\n\n[84] Bilingual-GAN: Neural Text Generation and Neural Machine Translation as Two Sides of the Same Coin\n\n[85] GraphSeq2Seq: Graph-Sequence-to-Sequence for Neural Machine Translation\n\n[86] Simplifying Neural Machine Translation with Addition-Subtraction Twin-Gated Recurrent Networks\n\n[87] Recurrent Additive Networks\n\n[88] Factored Neural Language Models\n\n[89] Neural Machine Translation By Generating Multiple Linguistic Factors\n\n[90] Deep Architectures for Neural Machine Translation\n\n[91] A Context-Aware Recurrent Encoder for Neural Machine Translation\n\n[92] Regularization techniques 
for fine-tuning in neural machine translation\n\n[93] Effective Domain Mixing for Neural Machine Translation\n\n[94] Multi-Domain Neural Machine Translation with Word-Level Domain Context Discrimination\n\n[95] Towards Linear Time Neural Machine Translation with Capsule\n\n[96] Agreement on Target Bidirectional LSTMs for Sequence-to-Sequence Learning\n\n[97] Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks\n\n[98] Does String-Based Neural MT Learn Source Syntax?\n\n[99] Graph Convolutional Networks for Text Classification\n\n[100] Incorporating Structural Alignment Biases into an Attentional Neural Translation Model\n\n[101] The Importance of Being Recurrent for Modeling Hierarchical Structure\n\n[102] An Empirical Exploration of Skip Connections for Sequential Tagging\n\n[103] Incorporating Copying Mechanism in Sequence-to-Sequence Learning\n\n[104] Attention Focusing for Neural Machine Translation by Bridging Source and Target Embeddings\n\n[105] Highway Networks\n\n[106] Extreme Adaptation for Personalized Neural Machine Translation\n\n[107] Improved Neural Machine Translation with a Syntax-Aware Encoder and Decoder\n\n[108] Rico Sennrich, NMT: what’s linguistics got to do with it?\n\n[110] A robust self-learning method for fully unsupervised cross-lingual mappings of word embeddings\n\n[111] Unsupervised Neural Machine Translation\n\n[112] Unsupervised Statistical Machine Translation\n\n[113] An Effective Approach to Unsupervised Machine Translation\n\n[114] lkaiser, Tensor2Tensor: Transformers, New Deep Models for NLP\n\n[115] Temporal dynamics of semantic relations in word embeddings: an application to predicting armed conflict participants\n\n[116] A Tutorial on Deep Latent Variable Models of Natural Language\n\n[117] Latent Alignment and Variational Attention\n\n[118] Pervasive Attention: 2D Convolutional Neural Networks for Sequence-to-Sequence Prediction\n\n[119] Semi-Autoregressive Neural Machine Translation\n\n[120] 
Insertion Transformer\n","funding_links":[],"categories":["Other MT Lists 📝"],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Falphadl%2Finspiring_papers","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Falphadl%2Finspiring_papers","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Falphadl%2Finspiring_papers/lists"}