jaketae/fnet

PyTorch implementation of FNet: Mixing Tokens with Fourier transforms

Score: 39 / 100 (Emerging)

This project provides machine learning engineers and researchers with a PyTorch implementation of FNet, an architecture that mixes tokens with Fourier transforms instead of self-attention. It takes in the same sequence data a Transformer would process and produces model predictions with significantly faster computation. If you work with large language models or other sequence-to-sequence tasks, it offers a more efficient alternative to standard Transformer architectures.
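The core idea can be sketched in a few lines of PyTorch. This is a minimal illustration of the Fourier-mixing idea from the FNet paper, not the repo's actual code: the self-attention sublayer is replaced by an unparameterized 2D FFT (over the hidden and sequence dimensions), keeping only the real part. Class names and layer sizes here are illustrative assumptions.

```python
import torch
import torch.nn as nn


class FourierMixing(nn.Module):
    """Token mixing via FFT (illustrative sketch, no learned parameters)."""

    def forward(self, x):
        # FFT over the hidden dim, then the sequence dim; keep the real part.
        return torch.fft.fft(torch.fft.fft(x, dim=-1), dim=-2).real


class FNetBlock(nn.Module):
    """One encoder block: Fourier mixing + feed-forward, each with a residual."""

    def __init__(self, d_model=256, d_ff=1024, dropout=0.1):
        super().__init__()
        self.mixing = FourierMixing()
        self.norm1 = nn.LayerNorm(d_model)
        self.ff = nn.Sequential(
            nn.Linear(d_model, d_ff),
            nn.GELU(),
            nn.Dropout(dropout),
            nn.Linear(d_ff, d_model),
        )
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, x):
        x = self.norm1(x + self.mixing(x))
        return self.norm2(x + self.ff(x))


x = torch.randn(2, 128, 256)  # (batch, seq_len, d_model)
out = FNetBlock()(x)
print(out.shape)  # same shape as the input
```

Because the mixing step has no learned weights and FFTs run in O(n log n), this sublayer is cheaper than the O(n²) attention it replaces.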

No commits in the last 6 months.

Use this if you are a machine learning practitioner looking for a faster, more computationally efficient deep learning model for sequence processing tasks than a standard Transformer.

Not ideal if you need a pre-trained model ready for immediate use without architecture modifications, or if you are not familiar with deep learning frameworks such as PyTorch.

natural-language-processing deep-learning-optimization sequence-modeling large-language-models model-efficiency
Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 7 / 25
Maturity 16 / 25
Community 16 / 25


Stars: 29
Forks: 7
Language: Python
License: MIT
Last pushed: May 17, 2021
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/nlp/jaketae/fnet"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.