corl-team/rebased

Official implementation of the paper "Linear Transformers with Learnable Kernel Functions are Better In-Context Models"

Quality score: 31 / 100 (Emerging)

This project is the official implementation of ReBased, an approach to building efficient language models with stronger in-context learning. It starts from the linear-attention 'Based' architecture and replaces its fixed kernel function with a learnable one, yielding a model that is better at understanding and generating text from examples supplied in the prompt. It is aimed at researchers and practitioners working on large language models.
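To make the idea concrete: Based-style linear attention replaces softmax attention with a feature map applied to queries and keys, and the paper's contribution is making that feature map learnable. The snippet below is a minimal NumPy sketch of that mechanism, not the repository's actual code; the normalize-affine-square kernel shape, the `gamma`/`beta` parameters, and all function names are illustrative assumptions.

```python
import numpy as np

def learnable_kernel(x, gamma, beta, eps=1e-6):
    # Illustrative learnable feature map: normalize each row,
    # apply a learnable affine transform, then square elementwise.
    # Squaring keeps features nonnegative, so attention weights stay valid.
    mu = x.mean(-1, keepdims=True)
    sigma = x.std(-1, keepdims=True) + eps
    return (gamma * (x - mu) / sigma + beta) ** 2

def linear_attention(q, k, v, gamma, beta, eps=1e-6):
    # phi(q) @ phi(k)^T stands in for softmax(q @ k^T); because the kernel
    # factorizes, the computation can be reordered to run in time linear
    # in sequence length (the quadratic form here is just for clarity).
    phi_q = learnable_kernel(q, gamma, beta)
    phi_k = learnable_kernel(k, gamma, beta)
    scores = phi_q @ phi_k.T                      # (T, T), nonnegative
    T = q.shape[0]
    scores = scores * np.tril(np.ones((T, T)))    # causal mask
    scores = scores / (scores.sum(-1, keepdims=True) + eps)
    return scores @ v
```

In training, `gamma` and `beta` would be learned per head alongside the rest of the model, letting the network shape the kernel instead of relying on a fixed approximation of the exponential.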

169 stars. No commits in the last 6 months.

Use this if you are a machine learning researcher or engineer building large language models and want to improve in-context learning and overall language-modeling performance with a more efficient architecture.

Not ideal if you are an end user who simply wants to apply an existing language model to tasks like text generation or summarization, without modifying the underlying architecture.

natural-language-processing large-language-models deep-learning-research neural-network-architecture in-context-learning
Stale (6 months) · No package · No dependents
Maintenance: 0 / 25
Adoption: 10 / 25
Maturity: 16 / 25
Community: 5 / 25


Stars: 169
Forks: 3
Language: Python
License: Apache-2.0
Last pushed: Jan 16, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/nlp/corl-team/rebased"

Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000/day.