Translate.md

PyTorch Translate: Facebook's PyTorch-based neural machine translation library https://github.com/pytorch/translate

A Python implementation built on OpenNMT, the open-source neural machine translation framework https://github.com/sebastianGehrmann/diverse_ensembling

Transformer neural machine translation implemented in PyTorch https://github.com/MultiPath/Squirrel

CopyTranslator: a copy-to-translate solution for assisted reading of foreign-language text https://github.com/elliottzheng/CopyTranslator/blob/master/README_zh.md

OpenNMT is an open-source neural machine translation system implemented in Torch. It is designed to be simple to use and easy to extend, while maintaining efficiency and state-of-the-art translation accuracy. https://github.com/OpenNMT/OpenNMT

A minimalist neural machine translation implementation (PyTorch) built for teaching https://github.com/joeynmt/joeynmt

Meta learning for Neural Machine Translation https://github.com/MultiPath/MetaNMT

Chinese translation of the BERT paper https://github.com/yuanxiaosc/BERT_Paper_Chinese_Translation

Source code to reproduce the results in the ACL 2019 paper "Syntactically Supervised Transformers for Faster Neural Machine Translation" https://github.com/dojoteef/synst

Source code for the paper https://github.com/ictnlp/OR-NMT

Incorporating BERT into Neural Machine Translation https://github.com/bert-nmt/bert-nmt

Bob: a macOS translation app supporting select-to-translate and screenshot translation, with Youdao, Baidu, and Google Translate as engines https://github.com/ripperhe/Bob

LaNMT: Latent-variable Non-autoregressive Neural Machine Translation with Deterministic Inference https://github.com/zomux/lanmt

Implementation of "Effective Adversarial Regularization for Neural Machine Translation", ACL 2019 https://github.com/pfnet-research/vat_nmt

Reinforcement learning in machine translation: strengths, weaknesses, and shortcomings https://mp.weixin.qq.com/s/yILXP7EJFC_ST75mrhi8ng

The implementation of "Learning Deep Transformer Models for Machine Translation" https://github.com/wangqiangneu/dlcl

OpenNMT-tf: the TensorFlow 2.0 implementation of Harvard's open-source OpenNMT neural machine translation system https://github.com/OpenNMT/OpenNMT-tf

traduora: an open-source translation platform supporting multi-user collaborative online translation, with import/export in JSON, CSV, YAML, and other text formats https://github.com/traduora/traduora
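The import/export round-trip that platforms like traduora support can be sketched with the standard library alone. A minimal illustration (the flat key-to-text JSON layout and the CSV column names are our own assumptions, not traduora's documented formats):

```python
import csv
import io
import json

def translations_json_to_csv(json_text):
    """Convert a flat key->text JSON translation file into a two-column CSV.
    Illustrative only: the column names are an assumption for this sketch."""
    entries = json.loads(json_text)
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["term", "translation"])
    for key in sorted(entries):          # stable order for diff-friendly output
        writer.writerow([key, entries[key]])
    return buf.getvalue()

sample = '{"app.title": "My App", "app.greeting": "Hello"}'
csv_text = translations_json_to_csv(sample)
```

Going the other direction (CSV back to JSON) is the mirror image with `csv.reader` and `json.dumps`.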

Deep Encoder, Shallow Decoder: Reevaluating the Speed-Quality Tradeoff in Machine Translation https://github.com/jungokasai/deep-shallow

NiuTrans.NMT: a lightweight, fast Transformer-based neural machine translation system https://github.com/NiuTrans/NiuTrans.NMT

"Visualizing A Neural Machine Translation Model": an exceptionally clear explainer that illustrates seq2seq-with-attention mechanics through intuitive animations; one of the best walkthroughs of a deep learning algorithm to date http://jalammar.github.io/visualizing-neural-machine-translation-mechanics-of-seq2seq-models-with-attention/
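The core step that post animates — scoring each encoder state against the decoder's query and taking a weighted sum — is small enough to sketch in plain Python (a toy scaled dot-product attention, not code from the post or any library above):

```python
import math

def attention_weights(query, keys):
    """Scaled dot-product attention for one decoder step: score each
    encoder state (key) against the query, then softmax the scores."""
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    m = max(scores)                      # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def context_vector(weights, values):
    """Weighted sum of encoder states (values) -> context vector."""
    return [sum(w * v[i] for w, v in zip(weights, values))
            for i in range(len(values[0]))]

# Toy example: 3 encoder states of dimension 2, one decoder query.
keys = values = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
query = [1.0, 0.0]
w = attention_weights(query, keys)       # highest weight on keys aligned with query
ctx = context_vector(w, values)
```

The weights sum to 1, and states whose direction matches the query get the larger share — exactly the behavior the animations show.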

Argos Translate: an open-source offline machine translation app written in Python, built on OpenNMT https://github.com/argosopentech/argos-translate

deep-translator: a Python multi-language machine translation tool https://github.com/nidhaloff/deep-translator

LibreTranslate: an open-source machine translation API service that can be fully self-hosted, based on Argos Translate https://github.com/uav4geo/LibreTranslate
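Because LibreTranslate exposes a plain HTTP API, a self-hosted instance can be called with nothing but the standard library. A minimal sketch of its `/translate` endpoint (the localhost URL assumes a locally running instance; only request construction is shown, since actually sending it requires a live server):

```python
import json
import urllib.request

def build_translate_request(text, source, target,
                            host="http://localhost:5000"):
    """Build (but do not send) a POST to LibreTranslate's /translate
    endpoint. `host` is an assumption: a locally running instance."""
    payload = json.dumps({
        "q": text,          # text to translate
        "source": source,   # source language code, e.g. "en"
        "target": target,   # target language code, e.g. "es"
        "format": "text",
    }).encode("utf-8")
    return urllib.request.Request(
        f"{host}/translate",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_translate_request("Hello, world!", "en", "es")
# Against a running server: urllib.request.urlopen(req) returns JSON
# with the translated text.
```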

EasyNMT: easy-to-use, state-of-the-art neural machine translation supporting 100+ languages http://easynmt.net/demo/ https://github.com/UKPLab/EasyNMT

The implementation of "Does Multi-Encoder Help? A Case Study on Context-Aware Neural Machine Translation" https://github.com/libeineu/Context-Aware

A dataset of Wikipedia sentences and their translations into 6 languages, along with human-evaluation scores (on a 1-100 scale) representing translation quality. Paper: "Unsupervised Quality Estimation for Neural Machine Translation" https://github.com/facebookresearch/mlqe
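Quality-estimation systems built on datasets like this are commonly evaluated by correlating predicted scores with the human judgments; Pearson's r is the usual headline metric and fits in a few lines of standard-library Python (a generic sketch with made-up numbers, not the paper's evaluation code):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation between predicted and human quality scores."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

human = [90.0, 40.0, 75.0, 20.0]      # 1-100 human quality judgments
predicted = [0.9, 0.5, 0.7, 0.1]      # model scores; scale does not matter
r = pearson_r(predicted, human)        # correlation is scale-invariant
```

Because Pearson's r is invariant to linear rescaling, the model's scores need not be on the 1-100 scale to correlate well.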

DL Translate: a deep learning machine translation library built on the Hugging Face transformers library and mBART-Large github.com/xhlulu/dl-translate

TensorFlow tutorial: attention-based neural machine translation https://www.tensorflow.org/text/tutorials/nmt_with_attention

Cross-Attention is All You Need: Adapting Pretrained Transformers for Machine Translation github.com/MGheini/xattn-transfer-for-mt

MT Conferences & Journals: a list of machine-translation-related conferences and journals github.com/NiuTrans/MTVenues

ChineseNMT: a Transformer-based English-to-Chinese translation model github.com/hemingkx/ChineseNMT

Building Machine Translation Systems for the Next Thousand Languages: training machine translation engines for a thousand-plus languages — especially long-tail ones — from monolingual content rather than bilingual corpora https://arxiv.org/abs/2205.03983
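One standard way this line of work exploits monolingual data is back-translation: run monolingual target-language text through a reverse-direction model to synthesize (source, target) training pairs for the forward model. A toy sketch (the dictionary-based `reverse_model` is a placeholder standing in for a real translation model, and this is not necessarily the exact recipe of the paper above):

```python
def reverse_model(sentence):
    """Placeholder for a target->source translation model; here faked
    with a tiny word-level dictionary purely for illustration."""
    lexicon = {"hallo": "hello", "welt": "world"}
    return " ".join(lexicon.get(w, w) for w in sentence.split())

def back_translate(monolingual_target):
    """Turn monolingual target-language sentences into synthetic
    (source, target) pairs for training the forward model."""
    return [(reverse_model(t), t) for t in monolingual_target]

pairs = back_translate(["hallo welt", "hallo"])
```

The synthetic source side may be noisy, but the target side is genuine human text, which is what makes the trick effective for low-resource languages.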

HighMMT: Towards Modality and Task Generalization for High-Modality Representation Learning — a general-purpose multimodal Transformer designed to generalize across text, image, video, audio, time-series, sensor, tabular, and other modalities while improving the performance-efficiency tradeoff https://arxiv.org/abs/2203.01311