https://github.com/awslabs/sockeye

Sequence-to-sequence framework with a focus on Neural Machine Translation based on PyTorch

Science Score: 33.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
  • DOI references
  • Academic publication links
    Links to: arxiv.org
  • Committers with academic emails
    4 of 60 committers (6.7%) from academic institutions
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (14.9%) to scientific vocabulary

Keywords

attention-is-all-you-need attention-mechanism attention-model deep-learning deep-neural-networks encoder-decoder machine-learning machine-translation neural-machine-translation pytorch seq2seq sequence-to-sequence sequence-to-sequence-models sockeye transformer transformer-architecture transformer-network translation

Keywords from Contributors

mxnet natural-language-understanding gluon gluonnlp natural-language-generation natural-language-inference nlg nlu
Last synced: 5 months ago

Repository

Sequence-to-sequence framework with a focus on Neural Machine Translation based on PyTorch

Basic Info
Statistics
  • Stars: 1,216
  • Watchers: 48
  • Forks: 323
  • Open Issues: 12
  • Releases: 76
Topics
attention-is-all-you-need attention-mechanism attention-model deep-learning deep-neural-networks encoder-decoder machine-learning machine-translation neural-machine-translation pytorch seq2seq sequence-to-sequence sequence-to-sequence-models sockeye transformer transformer-architecture transformer-network translation
Created over 8 years ago · Last pushed over 1 year ago
Metadata Files
Readme Changelog Contributing License Codeowners

README.md

Sockeye

Sockeye has entered maintenance mode and is no longer adding new features. We are grateful to everyone who has contributed to Sockeye throughout its development with pull requests, issue reports, and more.

Sockeye is an open-source sequence-to-sequence framework for Neural Machine Translation built on PyTorch. It implements distributed training and optimized inference for state-of-the-art models, powering Amazon Translate and other MT applications. Recent developments and changes are tracked in our CHANGELOG.

For a quickstart guide to training a standard NMT model on any size of data, see the WMT 2014 English-German tutorial.
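As a rough sketch of what the tutorial builds up to, a minimal training run looks roughly like the following (file names are placeholders, and the exact flags should be double-checked against `sockeye-train --help` for the installed version):

    # Minimal sketch of a training run; train.de/train.en and dev.de/dev.en are
    # placeholder names for preprocessed (tokenized, byte-pair-encoded) parallel data.
    sockeye-train \
        --source train.de --target train.en \
        --validation-source dev.de --validation-target dev.en \
        --output nmt_model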

For questions and issue reports, please file an issue on GitHub.

Version 3.1.x: PyTorch only

With version 3.1.x, we remove support for MXNet 2.x. Models trained with PyTorch and Sockeye 3.0.x remain compatible with Sockeye 3.1.x. Models trained with 2.3.x (using MXNet) and converted to PyTorch with Sockeye 3.0.x's conversion tool can NOT be used with Sockeye 3.1.x.

Version 3.0.0: Concurrent PyTorch and MXNet support

Starting with version 3.0.0, Sockeye is also based on PyTorch. Version 3.0.x maintains backwards compatibility with MXNet models trained with version 2.3.x. If MXNet 2.x is installed, Sockeye can run with either PyTorch or MXNet.

All models trained with 2.3.x (using MXNet) can be converted to models running with PyTorch using the converter CLI (sockeye.mx_to_pt). This creates a PyTorch parameter file (<model>/params.best) and backs up the existing MXNet parameter file to <model>/params.best.mx. Note that this only applies to fully-trained models that are to be used for inference. Continued training of an MXNet model with PyTorch is not supported (because training and optimizer states are not converted). sockeye.mx_to_pt requires MXNet to be installed in the environment.
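A hedged sketch of such a conversion (the --model flag name is an assumption; consult the converter's --help output for the actual interface):

    # Sketch only: convert a fully-trained MXNet model directory in place.
    # Requires MXNet 2.x in the same environment; --model is assumed to point
    # at the model directory.
    python -m sockeye.mx_to_pt --model my_mxnet_model
    # Expected result per the description above:
    #   my_mxnet_model/params.best     -> new PyTorch parameter file
    #   my_mxnet_model/params.best.mx  -> backed-up MXNet parameter file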

All CLIs of Version 3.0.0 now use PyTorch by default, e.g. sockeye-{train,translate,score}. MXNet-based CLIs/modules are still operational and accessible via sockeye-{train,translate,score}-mx.
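For illustration, a hedged pair of invocations showing the default (PyTorch) CLI and its MXNet counterpart; the flag names reflect Sockeye's long-standing interface but should be verified with --help:

    # PyTorch backend (default in 3.0.0+)
    sockeye-translate --models nmt_model --input test.de --output test.hyp.en
    # MXNet backend (only available if MXNet 2.x is installed)
    sockeye-translate-mx --models nmt_model_mx --input test.de --output test.hyp.en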

Sockeye 3 can be installed and run without MXNet, but if MXNet is installed, an extended test suite is executed to ensure equivalence between PyTorch and MXNet models. Note that running Sockeye 3.0.0 with MXNet requires MXNet 2.x to be installed (pip install --pre -f https://dist.mxnet.io/python 'mxnet>=2.0.0b2021').

Installation

Download the current version of Sockeye:

    git clone https://github.com/awslabs/sockeye.git

Install the sockeye module and its dependencies:

    cd sockeye && pip3 install --editable .

For faster GPU training, install NVIDIA Apex. NVIDIA also provides PyTorch Docker containers that include Apex.
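A quick sanity check after installation (not part of the official instructions) is to import the package and confirm the CLIs are on the PATH:

    # Verify that the editable install is importable and the CLIs are available.
    python3 -c "import sockeye; print(sockeye.__version__)"
    sockeye-train --help | head -n 5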

Documentation

Older versions

  • Sockeye 3.0, based on PyTorch & MXNet 2.x, is available in the sockeye_30 branch.
  • Sockeye 2.x, based on the MXNet Gluon API, is available in the sockeye_2 branch.
  • Sockeye 1.x, based on the MXNet Module API, is available in the sockeye_1 branch.

Citation

For more information about Sockeye, see our papers (BibTeX).

Sockeye 3.x

Felix Hieber, Michael Denkowski, Tobias Domhan, Barbara Darques Barros, Celina Dong Ye, Xing Niu, Cuong Hoang, Ke Tran, Benjamin Hsu, Maria Nadejde, Surafel Lakew, Prashant Mathur, Anna Currey, Marcello Federico. Sockeye 3: Fast Neural Machine Translation with PyTorch. ArXiv e-prints.

Sockeye 2.x

Tobias Domhan, Michael Denkowski, David Vilar, Xing Niu, Felix Hieber, Kenneth Heafield. The Sockeye 2 Neural Machine Translation Toolkit at AMTA 2020. Proceedings of the 14th Conference of the Association for Machine Translation in the Americas (AMTA'20).

Felix Hieber, Tobias Domhan, Michael Denkowski, David Vilar. Sockeye 2: A Toolkit for Neural Machine Translation. Proceedings of the 22nd Annual Conference of the European Association for Machine Translation, Project Track (EAMT'20).

Sockeye 1.x

Felix Hieber, Tobias Domhan, Michael Denkowski, David Vilar, Artem Sokolov, Ann Clifton, Matt Post. The Sockeye Neural Machine Translation Toolkit at AMTA 2018. Proceedings of the 13th Conference of the Association for Machine Translation in the Americas (AMTA'18).

Felix Hieber, Tobias Domhan, Michael Denkowski, David Vilar, Artem Sokolov, Ann Clifton and Matt Post. 2017. Sockeye: A Toolkit for Neural Machine Translation. ArXiv e-prints.

Research with Sockeye

Sockeye has been used for both academic and industrial research. A list of known publications that use Sockeye is shown below. If you know more, please let us know or submit a pull request (last updated: May 2022).

2023

  • Zhang, Xuan, Kevin Duh, Paul McNamee. "A Hyperparameter Optimization Toolkit for Neural Machine Translation Research". Proceedings of ACL (2023).

2022

  • Currey, Anna, Maria Nădejde, Raghavendra Pappagari, Mia Mayer, Stanislas Lauly, Xing Niu, Benjamin Hsu, Georgiana Dinu. "MT-GenEval: A Counterfactual and Contextual Dataset for Evaluating Gender Accuracy in Machine Translation". Proceedings of EMNLP (2022).
  • Domhan, Tobias, Eva Hasler, Ke Tran, Sony Trenous, Bill Byrne and Felix Hieber. "The Devil is in the Details: On the Pitfalls of Vocabulary Selection in Neural Machine Translation". Proceedings of NAACL-HLT (2022)
  • Fischer, Lukas, Patricia Scheurer, Raphael Schwitter, Martin Volk. "Machine Translation of 16th Century Letters from Latin to German". Workshop on Language Technologies for Historical and Ancient Languages (2022).
  • Knowles, Rebecca, Patrick Littell. "Translation Memories as Baselines for Low-Resource Machine Translation". Proceedings of LREC (2022)
  • McNamee, Paul, Kevin Duh. "The Multilingual Microblog Translation Corpus: Improving and Evaluating Translation of User-Generated Text". Proceedings of LREC (2022)
  • Nadejde Maria, Anna Currey, Benjamin Hsu, Xing Niu, Marcello Federico, Georgiana Dinu. "CoCoA-MT: A Dataset and Benchmark for Contrastive Controlled MT with Application to Formality". Proceedings of NAACL (2022).
  • Weller-Di Marco, Marion, Matthias Huck, Alexander Fraser. "Modeling Target-Side Morphology in Neural Machine Translation: A Comparison of Strategies." arXiv preprint arXiv:2203.13550 (2022)

2021

  • Bergmanis, Toms, Mārcis Pinnis. "Facilitating Terminology Translation with Target Lemma Annotations". arXiv preprint arXiv:2101.10035 (2021)
  • Briakou, Eleftheria, Marine Carpuat. "Beyond Noise: Mitigating the Impact of Fine-grained Semantic Divergences on Neural Machine Translation". arXiv preprint arXiv:2105.15087 (2021)
  • Hasler, Eva, Tobias Domhan, Sony Trenous, Ke Tran, Bill Byrne, Felix Hieber. "Improving the Quality Trade-Off for Neural Machine Translation Multi-Domain Adaptation". Proceedings of EMNLP (2021)
  • Tang, Gongbo, Philipp Rönchen, Rico Sennrich, Joakim Nivre. "Revisiting Negation in Neural Machine Translation." Transactions of the Association for Computational Linguistics 9 (2021)
  • Vu, Thuy, Alessandro Moschitti. "Machine Translation Customization via Automatic Training Data Selection from the Web". arXiv preprint arXiv:2102.1024 (2021)
  • Xu, Weijia, Marine Carpuat. "EDITOR: An Edit-Based Transformer with Repositioning for Neural Machine Translation with Soft Lexical Constraints." Transactions of the Association for Computational Linguistics 9 (2021)
  • Müller, Mathias, Rico Sennrich. "Understanding the Properties of Minimum Bayes Risk Decoding in Neural Machine Translation". Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers) (2021)
  • Popović, Maja, Alberto Poncelas. "On Machine Translation of User Reviews." Proceedings of RANLP (2021)
  • Popović, Maja. "On nature and causes of observed MT errors." Proceedings of the 18th MT Summit (Volume 1: Research Track) (2021)
  • Jain, Nishtha, Maja Popović, Declan Groves, Eva Vanmassenhove. "Generating Gender Augmented Data for NLP." Proceedings of the 3rd Workshop on Gender Bias in Natural Language Processing (2021)
  • Vilar, David, Marcello Federico. "A Statistical Extension of Byte-Pair Encoding." Proceedings of IWSLT (2021)

2020

  • Dinu, Georgiana, Prashant Mathur, Marcello Federico, Stanislas Lauly, Yaser Al-Onaizan. "Joint translation and unit conversion for end-to-end localization." Proceedings of IWSLT (2020)
  • Exel, Miriam, Bianka Buschbeck, Lauritz Brandt, Simona Doneva. "Terminology-Constrained Neural Machine Translation at SAP". Proceedings of EAMT (2020).
  • Hisamoto, Sorami, Matt Post, Kevin Duh. "Membership Inference Attacks on Sequence-to-Sequence Models: Is My Data In Your Machine Translation System?" Transactions of the Association for Computational Linguistics, Volume 8 (2020)
  • Naradowsky, Jason, Xuan Zhang, Kevin Duh. "Machine Translation System Selection from Bandit Feedback." arXiv preprint arXiv:2002.09646 (2020)
  • Niu, Xing, Prashant Mathur, Georgiana Dinu, Yaser Al-Onaizan. "Evaluating Robustness to Input Perturbations for Neural Machine Translation". arXiv preprint arXiv:2005.00580 (2020)
  • Niu, Xing, Marine Carpuat. "Controlling Neural Machine Translation Formality with Synthetic Supervision." Proceedings of AAAI (2020)
  • Keung, Phillip, Julian Salazar, Yichao Liu, Noah A. Smith. "Unsupervised Bitext Mining and Translation via Self-Trained Contextual Embeddings." arXiv preprint arXiv:2010.07761 (2020).
  • Sokolov, Alex, Tracy Rohlin, Ariya Rastrow. "Neural Machine Translation for Multilingual Grapheme-to-Phoneme Conversion." arXiv preprint arXiv:2006.14194 (2020)
  • Stafanovičs, Artūrs, Toms Bergmanis, Mārcis Pinnis. "Mitigating Gender Bias in Machine Translation with Target Gender Annotations." arXiv preprint arXiv:2010.06203 (2020)
  • Stojanovski, Dario, Alexander Fraser. "Addressing Zero-Resource Domains Using Document-Level Context in Neural Machine Translation." arXiv preprint arXiv:2004.14927 (2020)
  • Stojanovski, Dario, Benno Krojer, Denis Peskov, Alexander Fraser. "ContraCAT: Contrastive Coreference Analytical Templates for Machine Translation". Proceedings of COLING (2020)
  • Zhang, Xuan, Kevin Duh. "Reproducible and Efficient Benchmarks for Hyperparameter Optimization of Neural Machine Translation Systems." Transactions of the Association for Computational Linguistics, Volume 8 (2020)
  • Swe Zin Moe, Ye Kyaw Thu, Hnin Aye Thant, Nandar Win Min, and Thepchai Supnithi, "Unsupervised Neural Machine Translation between Myanmar Sign Language and Myanmar Language", Journal of Intelligent Informatics and Smart Technology, April 1st Issue, 2020, pp. 53-61. (Submitted December 21, 2019; accepted March 6, 2020; revised March 16, 2020; published online April 30, 2020)
  • Thazin Myint Oo, Ye Kyaw Thu, Khin Mar Soe and Thepchai Supnithi, "Neural Machine Translation between Myanmar (Burmese) and Dawei (Tavoyan)", In Proceedings of the 18th International Conference on Computer Applications (ICCA 2020), Feb 27-28, 2020, Yangon, Myanmar, pp. 219-227
  • Müller, Mathias, Annette Rios, Rico Sennrich. "Domain Robustness in Neural Machine Translation." Proceedings of AMTA (2020)
  • Rios, Annette, Mathias Müller, Rico Sennrich. "Subword Segmentation and a Single Bridge Language Affect Zero-Shot Neural Machine Translation." Proceedings of the 5th WMT: Research Papers (2020)
  • Popović, Maja, Alberto Poncelas. "Neural Machine Translation between similar South-Slavic languages." Proceedings of the 5th WMT: Research Papers (2020)
  • Popović, Maja, Alberto Poncelas. "Extracting correctly aligned segments from unclean parallel data using character n-gram matching." Proceedings of Conference on Language Technologies & Digital Humanities (JTDH 2020).
  • Popović, Maja, Alberto Poncelas, Marija Brkic, Andy Way. "Neural Machine Translation for translating into Croatian and Serbian." Proceedings of the 7th Workshop on NLP for Similar Languages, Varieties and Dialects (2020)

2019

  • Agrawal, Sweta, Marine Carpuat. "Controlling Text Complexity in Neural Machine Translation." Proceedings of EMNLP (2019)
  • Beck, Daniel, Trevor Cohn, Gholamreza Haffari. "Neural Speech Translation using Lattice Transformations and Graph Networks." Proceedings of TextGraphs-13 (EMNLP 2019)
  • Currey, Anna, Kenneth Heafield. "Zero-Resource Neural Machine Translation with Monolingual Pivot Data." Proceedings of EMNLP (2019)
  • Gupta, Prabhakar, Mayank Sharma. "Unsupervised Translation Quality Estimation for Digital Entertainment Content Subtitles." IEEE International Journal of Semantic Computing (2019)
  • Hu, J. Edward, Huda Khayrallah, Ryan Culkin, Patrick Xia, Tongfei Chen, Matt Post, and Benjamin Van Durme. "Improved Lexically Constrained Decoding for Translation and Monolingual Rewriting." Proceedings of NAACL-HLT (2019)
  • Rosendahl, Jan, Christian Herold, Yunsu Kim, Miguel Graça, Weiyue Wang, Parnia Bahar, Yingbo Gao and Hermann Ney. "The RWTH Aachen University Machine Translation Systems for WMT 2019." Proceedings of the 4th WMT: Research Papers (2019)
  • Thompson, Brian, Jeremy Gwinnup, Huda Khayrallah, Kevin Duh, and Philipp Koehn. "Overcoming catastrophic forgetting during domain adaptation of neural machine translation." Proceedings of NAACL-HLT 2019 (2019)
  • Tättar, Andre, Elizaveta Korotkova, Mark Fishel. "University of Tartu's Multilingual Multi-domain WMT19 News Translation Shared Task Submission." Proceedings of the 4th WMT: Research Papers (2019)
  • Thazin Myint Oo, Ye Kyaw Thu and Khin Mar Soe, "Neural Machine Translation between Myanmar (Burmese) and Rakhine (Arakanese)", In Proceedings of the Sixth Workshop on NLP for Similar Languages, Varieties and Dialects, NAACL-2019, June 7th 2019, Minneapolis, United States, pp. 80-88

2018

  • Domhan, Tobias. "How Much Attention Do You Need? A Granular Analysis of Neural Machine Translation Architectures". Proceedings of 56th ACL (2018)
  • Kim, Yunsu, Yingbo Gao, and Hermann Ney. "Effective Cross-lingual Transfer of Neural Machine Translation Models without Shared Vocabularies." arXiv preprint arXiv:1905.05475 (2019)
  • Korotkova, Elizaveta, Maksym Del, and Mark Fishel. "Monolingual and Cross-lingual Zero-shot Style Transfer." arXiv preprint arXiv:1808.00179 (2018)
  • Niu, Xing, Michael Denkowski, and Marine Carpuat. "Bi-directional neural machine translation with synthetic parallel data." arXiv preprint arXiv:1805.11213 (2018)
  • Niu, Xing, Sudha Rao, and Marine Carpuat. "Multi-Task Neural Models for Translating Between Styles Within and Across Languages." COLING (2018)
  • Post, Matt and David Vilar. "Fast Lexically Constrained Decoding with Dynamic Beam Allocation for Neural Machine Translation." Proceedings of NAACL-HLT (2018)
  • Schamper, Julian, Jan Rosendahl, Parnia Bahar, Yunsu Kim, Arne Nix, and Hermann Ney. "The RWTH Aachen University Supervised Machine Translation Systems for WMT 2018." Proceedings of the 3rd WMT: Shared Task Papers (2018)
  • Schulz, Philip, Wilker Aziz, and Trevor Cohn. "A stochastic decoder for neural machine translation." arXiv preprint arXiv:1805.10844 (2018)
  • Alkhouli, Tamer, Gabriel Bretschner, and Hermann Ney. "On The Alignment Problem In Multi-Head Attention-Based Neural Machine Translation." Proceedings of the 3rd WMT: Research Papers (2018)
  • Tang, Gongbo, Rico Sennrich, and Joakim Nivre. "An Analysis of Attention Mechanisms: The Case of Word Sense Disambiguation in Neural Machine Translation." Proceedings of 3rd WMT: Research Papers (2018)
  • Thompson, Brian, Huda Khayrallah, Antonios Anastasopoulos, Arya McCarthy, Kevin Duh, Rebecca Marvin, Paul McNamee, Jeremy Gwinnup, Tim Anderson, and Philipp Koehn. "Freezing Subnetworks to Analyze Domain Adaptation in Neural Machine Translation." arXiv preprint arXiv:1809.05218 (2018)
  • Vilar, David. "Learning Hidden Unit Contribution for Adapting Neural Machine Translation Models." Proceedings of NAACL-HLT (2018)
  • Vyas, Yogarshi, Xing Niu and Marine Carpuat. "Identifying Semantic Divergences in Parallel Text without Annotations." Proceedings of NAACL-HLT (2018)
  • Wang, Weiyue, Derui Zhu, Tamer Alkhouli, Zixuan Gan, and Hermann Ney. "Neural Hidden Markov Model for Machine Translation". Proceedings of 56th ACL (2018)
  • Zhang, Xuan, Gaurav Kumar, Huda Khayrallah, Kenton Murray, Jeremy Gwinnup, Marianna J Martindale, Paul McNamee, Kevin Duh, and Marine Carpuat. "An Empirical Exploration of Curriculum Learning for Neural Machine Translation." arXiv preprint arXiv:1811.00739 (2018)
  • Swe Zin Moe, Ye Kyaw Thu, Hnin Aye Thant and Nandar Win Min, "Neural Machine Translation between Myanmar Sign Language and Myanmar Written Text", In the second Regional Conference on Optical character recognition and Natural language processing technologies for ASEAN languages 2018 (ONA 2018), December 13-14, 2018, Phnom Penh, Cambodia.
  • Tang, Gongbo, Mathias Müller, Annette Rios and Rico Sennrich. "Why Self-attention? A Targeted Evaluation of Neural Machine Translation Architectures." Proceedings of EMNLP (2018)

2017

  • Domhan, Tobias and Felix Hieber. "Using target-side monolingual data for neural machine translation through multi-task learning." Proceedings of EMNLP (2017).

Owner

  • Name: Amazon Web Services - Labs
  • Login: awslabs
  • Kind: organization
  • Location: Seattle, WA

AWS Labs

GitHub Events

Total
  • Watch event: 13
  • Delete event: 2
  • Issue comment event: 1
  • Push event: 3
  • Pull request review event: 1
  • Pull request event: 5
  • Fork event: 2
  • Create event: 2
Last Year
  • Watch event: 13
  • Delete event: 2
  • Issue comment event: 1
  • Push event: 3
  • Pull request review event: 1
  • Pull request event: 5
  • Fork event: 2
  • Create event: 2

Committers

Last synced: 10 months ago

All Time
  • Total Commits: 821
  • Total Committers: 60
  • Avg Commits per committer: 13.683
  • Development Distribution Score (DDS): 0.507
Past Year
  • Commits: 3
  • Committers: 1
  • Avg Commits per committer: 3.0
  • Development Distribution Score (DDS): 0.0
Top Committers
Name Email Commits
Felix Hieber f****r 405
Tobias Domhan t****n@g****m 114
Michael Denkowski m****s@a****m 66
Matt Post p****t@c****u 41
David Vilar d****r@a****m 20
Mathias Müller m****r@u****h 19
artemsok 2****k 16
Michael Denkowski m****s@c****u 15
Kellen Sunderland k****d@g****m 11
Xing Niu n****r@g****m 10
Loris Bazzani l****i@g****m 7
Daniel Vidal D****s 7
Pavel Danchenko l****n 7
artemsok a****k@a****e 6
Samuel Larkin S****n 6
Steven Bradtke s****e@g****m 5
lefterav l****v@p****e 4
Hazem Mashlah h****h 4
kpuatamazon 5****n 4
Abdelrahman Badawy a****y@g****m 3
gonzaloiglesiasiglesias g****s 3
Hoang Cuong h****1@g****m 3
Brenton Chu b****u@n****m 2
marismmm 5****m 2
Ke Tran k****m 2
Jonathan Graehl g****l@g****m 2
Henri Yandell h****l@a****m 2
Anna Currey a****y 2
David Vilar d****r@g****m 2
rongzha1 r****g@i****m 1
and 30 more...
Committer Domains (Top 20 + Academic)

Issues and Pull Requests

Last synced: 7 months ago

All Time
  • Total issues: 59
  • Total pull requests: 72
  • Average time to close issues: 3 months
  • Average time to close pull requests: about 1 month
  • Total issue authors: 23
  • Total pull request authors: 18
  • Average comments per issue: 4.02
  • Average comments per pull request: 1.13
  • Merged pull requests: 62
  • Bot issues: 0
  • Bot pull requests: 1
Past Year
  • Issues: 0
  • Pull requests: 3
  • Average time to close issues: N/A
  • Average time to close pull requests: 3 days
  • Issue authors: 0
  • Pull request authors: 3
  • Average comments per issue: 0
  • Average comments per pull request: 0.33
  • Merged pull requests: 1
  • Bot issues: 0
  • Bot pull requests: 1
Top Authors
Issue Authors
  • RamoramaInteractive (10)
  • AmitMY (8)
  • SamuelLarkin (7)
  • bricksdont (4)
  • tomsbergmanis (3)
  • alvations (3)
  • etexaco123 (3)
  • SKT-T1-Thecai (2)
  • vince62s (2)
  • chrkell (2)
  • ejkitchen (2)
  • Verkolle (2)
  • tuglat (1)
  • jingshu-liu (1)
  • nitin-bansal46 (1)
Pull Request Authors
  • fhieber (25)
  • mjdenkowski (23)
  • tdomhan (4)
  • AmitMY (4)
  • hoangcuong2011 (4)
  • Proyag (2)
  • graehl (2)
  • wyattscarpenter (2)
  • iPRET (2)
  • xingniu (2)
  • dependabot[bot] (2)
  • thammegowda (1)
  • surafelml (1)
  • shuoyangd (1)
  • jvamvas (1)
Top Labels
Issue Labels
sockeye_2 (3) question (2) bug (1) sockeye_1 (1)
Pull Request Labels
feature (4) sockeye_3 (4) enhancement (2) bug (2) sockeye_2 (2) dependencies (2)

Packages

  • Total packages: 1
  • Total downloads:
    • pypi 615 last-month
  • Total docker downloads: 8
  • Total dependent packages: 0
  • Total dependent repositories: 4
  • Total versions: 84
  • Total maintainers: 2
  • Total advisories: 1
pypi.org: sockeye

Sequence-to-Sequence framework for Neural Machine Translation

  • Versions: 84
  • Dependent Packages: 0
  • Dependent Repositories: 4
  • Downloads: 615 Last month
  • Docker Downloads: 8
Rankings
Stargazers count: 1.9%
Forks count: 2.9%
Docker downloads count: 4.6%
Average: 5.9%
Dependent repos count: 7.5%
Downloads: 8.7%
Dependent packages count: 10.0%
Maintainers (2)
Last synced: 6 months ago

Dependencies

requirements/requirements.dev.txt pypi
  • check-manifest * development
  • isort <5.0.0 development
  • matplotlib * development
  • mypy ==0.790 development
  • pillow * development
  • pylint ==2.6.0 development
  • pytest * development
  • pytest-cov <2.6.0 development
  • setuptools >=38.6.0 development
  • twine >=1.11.0 development
  • wheel >=0.31.0 development
requirements/requirements.docs.txt pypi
  • pyyaml >=5.1
  • recommonmark *
  • sphinx >=1.7.4
  • sphinx-autodoc-typehints >=1.3.0
  • sphinx_rtd_theme *
requirements/requirements.txt pypi
  • numpy >1.16.0,<2.0.0
  • pyyaml >=5.1
  • sacrebleu ==1.4.14
  • torch >=1.10.0,<1.12.0
sockeye_contrib/vistools/requirements.txt pypi
  • networkx ==2.0
.github/workflows/push_pr.yml actions
  • actions/checkout v2 composite
  • actions/setup-python v1 composite
.github/workflows/python-publish.yml actions
  • actions/checkout v2 composite
  • actions/setup-python v2 composite
.github/workflows/torch_nightly.yml actions
  • actions/checkout v2 composite
  • actions/setup-python v1 composite
sockeye_contrib/fast_align/Dockerfile docker
  • amazonlinux 2017.09.0.20170930 build
requirements/requirements.deepspeed.txt pypi
  • deepspeed *
requirements/requirements.faiss-cpu.txt pypi
  • faiss-cpu >=1.7.2
requirements/requirements.faiss-gpu.txt pypi
  • faiss-gpu >=1.7.2
setup.py pypi