ccfco/External-Attention-tensorflow

🍀 TensorFlow implementations of various attention mechanisms, MLPs, re-parameterization techniques, and convolutions, helpful for further understanding the underlying papers. ⭐⭐⭐

Score: 23 / 100 — Experimental

This project offers a collection of attention mechanisms and other model components implemented in TensorFlow. It provides ready-to-use code snippets that researchers and machine learning engineers can integrate into their neural networks to explore different architectural approaches. The inputs are typically tensors (such as image features or sequence embeddings), and the outputs are modified tensors, often with enhanced feature representations.
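To illustrate the kind of component the repository collects, here is a minimal NumPy sketch of external attention, the mechanism the repo is named after. The function and variable names are illustrative, and the double-normalization step follows the "Beyond Self-Attention" paper in spirit; this is not taken from the repo's own code.

```python
import numpy as np

def softmax(z, axis=-1):
    # Numerically stable softmax along the given axis.
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def external_attention(x, m_k, m_v):
    """Sketch of external attention over token features.

    x:   (n_tokens, d)  input features
    m_k: (s, d)         external "key" memory (learnable in practice)
    m_v: (s, d)         external "value" memory (learnable in practice)
    """
    attn = x @ m_k.T                          # (n_tokens, s): similarity to memory slots
    attn = softmax(attn, axis=1)              # normalize over memory slots
    attn = attn / (attn.sum(axis=0, keepdims=True) + 1e-9)  # double normalization
    return attn @ m_v                         # (n_tokens, d): re-weighted features

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))
out = external_attention(x, rng.standard_normal((16, 8)), rng.standard_normal((16, 8)))
print(out.shape)
```

In the repo itself these memories would be trainable TensorFlow weights inside a layer; the sketch only shows the tensor-in, tensor-out shape contract described above.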

No commits in the last 6 months.

Use this if you are a machine learning researcher or practitioner looking to quickly experiment with different attention mechanisms, multi-layer perceptrons, or convolutional blocks in your TensorFlow models to improve performance or understand their impact.

Not ideal if you are looking for a high-level, production-ready solution for a specific application, as this project focuses on providing building blocks rather than end-to-end solutions.

deep-learning-research neural-network-architecture computer-vision natural-language-processing tensorflow-development
No License · Stale (6m) · No Package · No Dependents
Maintenance: 0 / 25
Adoption: 7 / 25
Maturity: 8 / 25
Community: 8 / 25


Stars: 41
Forks: 3
Language: Python
License: none
Last pushed: Apr 12, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/ccfco/External-Attention-tensorflow"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.