attention-is-all-you-need-pytorch and transformer

These two tools are direct competitors: both are independent PyTorch implementations of the Transformer architecture from the same foundational paper, so choosing between them comes down to factors such as community support, documentation quality, or features not captured in the descriptions below.

transformer: overall score 35 (Emerging)

Score comparison (each category out of 25):

                 attention-is-all-you-need-pytorch    transformer
Maintenance      0/25                                 0/25
Adoption         10/25                                7/25
Maturity         16/25                                16/25
Community        25/25                                12/25
Stars            9,651                                37
Forks            2,094                                5
Downloads        —                                    —
Commits (30d)    0                                    0
Language         Python                               Python
License          MIT                                  MIT
Flags            Stale 6m, No Package, No Dependents  Stale 6m, No Package, No Dependents

About attention-is-all-you-need-pytorch

jadore801120/attention-is-all-you-need-pytorch

A PyTorch implementation of the Transformer model in "Attention is All You Need".

This project helps machine learning engineers build and train custom machine translation systems. It takes parallel text in two languages (like English and German) and outputs a trained model that can translate new text between those languages. It is designed for developers working on natural language processing tasks who need a flexible, state-of-the-art translation framework.

natural-language-processing machine-translation deep-learning neural-networks language-modeling
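At the heart of what this project implements is the scaled dot-product attention from the paper: softmax(QK^T / sqrt(d_k)) V. As a framework-agnostic illustration (a minimal NumPy sketch, not code from the repository; function and variable names are illustrative):

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = q.shape[-1]
    # Similarity of each query to each key, scaled by sqrt(d_k)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_k)   # (batch, len_q, len_k)
    # Numerically stable softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output is a weight-averaged mix of the value vectors
    return weights @ v, weights

rng = np.random.default_rng(0)
q = rng.normal(size=(1, 4, 8))   # (batch, query_len, d_k)
k = rng.normal(size=(1, 6, 8))   # (batch, key_len, d_k)
v = rng.normal(size=(1, 6, 8))
out, w = scaled_dot_product_attention(q, k, v)
print(out.shape)  # (1, 4, 8)
```

In the full model this operation runs in parallel across multiple heads, each with its own learned projections of Q, K, and V.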

About transformer

tranquoctrinh/transformer

A PyTorch implementation of the Transformer model from the paper "Attention Is All You Need".

This project helps machine learning engineers and researchers implement transformer models for sequence-to-sequence tasks like machine translation. It takes text in one language (like English) as input and outputs the translated text in another language (like Vietnamese). This is designed for those building and experimenting with neural machine translation systems.

natural-language-processing machine-translation deep-learning-research text-generation sequence-modeling
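Because the Transformer has no recurrence, both implementations must inject token order via the paper's sinusoidal positional encoding, PE(pos, 2i) = sin(pos / 10000^(2i/d_model)) and PE(pos, 2i+1) = cos(pos / 10000^(2i/d_model)). A minimal sketch of that computation (NumPy for self-containedness; not code from either repository):

```python
import numpy as np

def sinusoidal_positions(max_len, d_model):
    """Sinusoidal positional encodings from "Attention Is All You Need"."""
    pos = np.arange(max_len)[:, None]            # (max_len, 1) positions
    i = np.arange(0, d_model, 2)[None, :]        # even feature indices
    angles = pos / np.power(10000.0, i / d_model)
    pe = np.zeros((max_len, d_model))
    pe[:, 0::2] = np.sin(angles)                 # even dims get sine
    pe[:, 1::2] = np.cos(angles)                 # odd dims get cosine
    return pe

pe = sinusoidal_positions(50, 16)
print(pe.shape)  # (50, 16)
```

These encodings are simply added to the token embeddings before the first encoder or decoder layer, giving the model a notion of position at no extra parameter cost.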

Scores updated daily from GitHub, PyPI, and npm data.