grypesc/AdaGauss

2024 NeurIPS paper on Continual Learning and Class-Incremental Learning

Score: 22 / 100 (Experimental)

AdaGauss helps machine learning researchers mitigate the "task-recency bias" problem when training models incrementally without storing old data. Given a pre-trained feature extractor, an auxiliary neural network, and a stream of new data, it produces a more stable, incrementally trained model that better retains knowledge of older classes. It is aimed at researchers working on advanced machine learning algorithms, specifically continual learning.

No commits in the last 6 months.

Use this if you are an ML researcher developing or experimenting with continual learning models, particularly in exemplar-free class incremental learning scenarios where you cannot store past data and need to mitigate task-recency bias.

Not ideal if you are looking for a plug-and-play solution for a business problem, or if you are not deeply involved in machine learning model development and research.

continual-learning class-incremental-learning machine-learning-research deep-learning-algorithms representation-learning
No License · Stale 6m · No Package · No Dependents
Maintenance 2 / 25
Adoption 5 / 25
Maturity 8 / 25
Community 7 / 25


Stars: 11
Forks: 1
Language: Python
License: none
Last pushed: May 21, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/grypesc/AdaGauss"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
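For scripted access, the same endpoint can be queried from Python using only the standard library. This is a minimal sketch: the URL path comes from the curl example above, but the JSON response schema is not documented here, so the result is treated as an opaque dict.

```python
import json
from urllib.request import urlopen

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, repo: str) -> str:
    # Build the endpoint URL, e.g. category "ml-frameworks",
    # repo "grypesc/AdaGauss" (path taken from the curl example).
    return f"{BASE}/{category}/{repo}"

def fetch_quality(category: str, repo: str) -> dict:
    # Assumes the endpoint returns a JSON object; the exact
    # response fields are not documented on this page.
    with urlopen(quality_url(category, repo)) as resp:
        return json.load(resp)

if __name__ == "__main__":
    print(quality_url("ml-frameworks", "grypesc/AdaGauss"))
```

The anonymous tier allows 100 requests/day; with a free key the limit rises to 1,000/day (how the key is passed is not shown here, so check the API docs before adding authentication).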