vitali-sialedchyk/stability-first-ai
Solving catastrophic forgetting with Recursive Time architecture, Active Sleep (generative replay), and Temporal LoRA. Proving the "Lazarus Effect" in neural networks.
This project offers methods that let AI models learn new tasks without forgetting old knowledge. It takes a model suffering from catastrophic forgetting (where learning something new erases old skills) and helps it retain, or even recover, lost information. The result is a more stable, efficient, and adaptable model, especially useful for work with large language models, image recognition, or complex AI systems.
Use this if your AI models struggle to retain previously learned information when trained on new data, or if you need to recover lost capabilities from a damaged or pruned model without retraining from scratch.
Not ideal if you are looking for a plug-and-play solution for simple, one-off training tasks where catastrophic forgetting is not a significant concern.
Stars
10
Forks
1
Language
Python
License
—
Category
—
Last pushed
Jan 11, 2026
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/vitali-sialedchyk/stability-first-ai"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
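For calling this endpoint from code rather than the shell, here is a minimal Python sketch. Only the URL is taken from the curl example above; the authorization header name and the JSON response shape are assumptions, so check the API docs before relying on them.

```python
import json
import urllib.request

# Base path taken from the curl example above.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-endpoint URL for a repository."""
    return f"{API_BASE}/{category}/{owner}/{repo}"


def fetch_quality(category: str, owner: str, repo: str, api_key=None) -> dict:
    """Fetch repository quality data as a dict.

    Pass an API key to use the higher 1,000 requests/day limit.
    """
    req = urllib.request.Request(quality_url(category, owner, repo))
    if api_key:
        # Header name is an assumption; the real scheme may differ.
        req.add_header("Authorization", f"Bearer {api_key}")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


url = quality_url("ml-frameworks", "vitali-sialedchyk", "stability-first-ai")
```

The `url` built here matches the one in the curl example; swap in other `category`/`owner`/`repo` values to query different repositories.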
Higher-rated alternatives
aimagelab/mammoth
An Extendible (General) Continual Learning Framework based on Pytorch - official codebase of...
LAMDA-CL/PyCIL
PyCIL: A Python Toolbox for Class-Incremental Learning
GMvandeVen/continual-learning
PyTorch implementation of various methods for continual learning (XdG, EWC, SI, LwF, FROMP, DGR,...
LAMDA-CL/LAMDA-PILOT
🎉 PILOT: A Pre-trained Model-Based Continual Learning Toolbox
mmasana/FACIL
Framework for Analysis of Class-Incremental Learning with 12 state-of-the-art methods and 3 baselines.