corl-team/rebased
Official implementation of the paper "Linear Transformers with Learnable Kernel Functions are Better In-Context Models"
This project offers an improved approach to building efficient language models, with a particular focus on in-context learning. It refines the Based linear-attention architecture by making its core kernel function learnable, producing a model that is better at understanding and generating text from provided examples. It is aimed at researchers and practitioners working on large language models.
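As an illustrative sketch only (not the repository's actual implementation), a Based-style linear attention layer pairs a quadratic feature map with a causal running state, which keeps the cost linear in sequence length; the ReBased idea is to make that feature map learnable by normalizing and applying an affine transform before squaring. All names here (`rebased_feature_map`, `linear_attention`, `gamma`, `beta`) are hypothetical:

```python
import numpy as np

def rebased_feature_map(x, gamma, beta, eps=1e-5):
    # Hypothetical learnable kernel: normalize the input,
    # apply a learnable affine map (gamma, beta), then square.
    mu = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    x_norm = (x - mu) / np.sqrt(var + eps)
    return (gamma * x_norm + beta) ** 2  # non-negative features

def linear_attention(q, k, v, gamma, beta):
    # Causal linear attention: instead of an n x n score matrix,
    # accumulate running sums of phi(k)_t v_t^T and phi(k)_t.
    phi_q = rebased_feature_map(q, gamma, beta)  # (n, d)
    phi_k = rebased_feature_map(k, gamma, beta)  # (n, d)
    n, d = phi_q.shape
    dv = v.shape[-1]
    state = np.zeros((d, dv))   # sum_t phi(k)_t v_t^T
    norm = np.zeros(d)          # sum_t phi(k)_t
    out = np.zeros((n, dv))
    for t in range(n):
        state += np.outer(phi_k[t], v[t])
        norm += phi_k[t]
        denom = phi_q[t] @ norm + 1e-6
        out[t] = (phi_q[t] @ state) / denom
    return out
```

Because `gamma` and `beta` would be trained alongside the rest of the network, the kernel shape itself adapts to the data, which is the property the paper credits for the improved in-context behavior.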
169 stars. No commits in the last 6 months.
Use this if you are a machine learning researcher or engineer developing large language models and want to improve their in-context learning capabilities and overall language modeling performance with more efficient architectures.
Not ideal if you are an end-user simply looking to apply an existing language model for tasks like text generation or summarization without needing to develop or deeply modify the underlying model architecture.
Stars
169
Forks
3
Language
Python
License
Apache-2.0
Category
Last pushed
Jan 16, 2025
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/corl-team/rebased"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
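The same endpoint can be called from Python; this minimal sketch assumes only the URL pattern shown in the curl command above (the response schema is not documented here, so the helper just returns the parsed JSON):

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category, owner, repo):
    # Build the endpoint URL following the pattern from the curl example.
    return f"{BASE}/{category}/{owner}/{repo}"

def fetch_quality(category, owner, repo):
    # Perform the GET request and parse the JSON body.
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)
```

For example, `fetch_quality("nlp", "corl-team", "rebased")` issues the same request as the curl command shown above.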
Higher-rated alternatives
xv44586/toolkit4nlp
transformers implement (architecture, task example, serving and more)
luozhouyang/transformers-keras
Transformer-based models implemented in TensorFlow 2.x (using Keras).
ufal/neuralmonkey
An open-source tool for sequence learning in NLP built on TensorFlow.
graykode/xlnet-Pytorch
Simple XLNet implementation with Pytorch Wrapper
uzaymacar/attention-mechanisms
Implementations for a family of attention mechanisms, suitable for all kinds of natural language...