takara-ai/SwarmFormer
A PyTorch implementation of SwarmFormer for text classification.
SwarmFormer classifies the sentiment of written English text: you feed it a raw review, comment, or social media post, and it outputs a label (e.g., positive or negative) along with performance metrics. It is aimed at data scientists, machine learning engineers, and researchers building or evaluating text classification systems.
Use this if you need to perform highly accurate sentiment analysis on English text and want to leverage a novel, efficient transformer architecture.
Not ideal if you need to classify text in languages other than English, generate text, perform machine translation, or process very long text sequences exceeding 768 tokens.
Stars
16
Forks
4
Language
Python
License
—
Category
Last pushed
Feb 28, 2026
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/takara-ai/SwarmFormer"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
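The curl command above can also be called from Python. The sketch below builds the endpoint URL from its `category/owner/repo` parts and fetches the JSON; note that the header name used to pass an API key (`X-API-Key`) is an assumption, not documented here — check the service's docs before relying on it.

```python
import json
from typing import Optional
from urllib.request import Request, urlopen

# Base endpoint taken from the curl example above.
BASE_URL = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-data URL, e.g. for transformers/takara-ai/SwarmFormer."""
    return f"{BASE_URL}/{category}/{owner}/{repo}"

def fetch_quality(category: str, owner: str, repo: str,
                  api_key: Optional[str] = None) -> dict:
    """Fetch a repo's quality data as a dict (100 requests/day without a key)."""
    req = Request(quality_url(category, owner, repo))
    if api_key:
        # Assumed header name for the free key; verify against the API docs.
        req.add_header("X-API-Key", api_key)
    with urlopen(req) as resp:
        return json.load(resp)

# Example (performs a network request):
# data = fetch_quality("transformers", "takara-ai", "SwarmFormer")
```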
Higher-rated alternatives
microsoft/LoRA
Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"
jadore801120/attention-is-all-you-need-pytorch
A PyTorch implementation of the Transformer model in "Attention is All You Need".
bhavnicksm/vanilla-transformer-jax
JAX/Flax implementation of 'Attention Is All You Need' by Vaswani et al....
kyegomez/SparseAttention
PyTorch implementation of the sparse attention from the paper: "Generating Long Sequences with...
AbdelStark/attnres
Rust implementation of Attention Residuals from MoonshotAI/Kimi