SimplexLab/TorchJD

Library for Jacobian descent with PyTorch. It enables the optimization of neural networks with multiple losses (e.g. multi-task learning).

Quality score: 55 / 100 (Established)

This library helps machine learning engineers and researchers efficiently train neural networks that pursue multiple goals, or "loss functions", simultaneously, as in multi-task learning. Instead of combining these goals into a single averaged loss and risking poorer performance on some of them, it takes all individual loss functions as input and computes parameter updates that balance each goal. It's designed for anyone building advanced AI models with PyTorch.

306 stars. Available on PyPI.

Use this if you are training neural networks with PyTorch and encounter scenarios where you need to optimize for multiple conflicting objectives or loss functions at the same time, such as in multi-task learning.

Not ideal if your neural network only has a single objective to optimize or if you are not working with PyTorch.
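To see why simply averaging losses can be a problem, here is a small NumPy sketch of the underlying idea: when per-task gradients conflict (negative dot product), plain averaging can cancel useful update directions, while a conflict-aware aggregation preserves more of them. This is a conceptual illustration only, in the spirit of PCGrad-style projection; it does not use TorchJD's actual API.

```python
# Conceptual sketch of gradient conflict in multi-task learning.
# NOT TorchJD's API: a NumPy illustration of one conflict-resolution
# idea (projecting away the component of g1 that opposes g2).
import numpy as np

def project_conflicting(g1, g2):
    """If g1 conflicts with g2 (negative dot product), remove from g1
    the component pointing against g2."""
    dot = g1 @ g2
    if dot < 0:
        g1 = g1 - (dot / (g2 @ g2)) * g2
    return g1

# Two task gradients that partially conflict.
g_a = np.array([1.0, 1.0])
g_b = np.array([-1.0, 0.5])

naive = (g_a + g_b) / 2           # plain averaging
proj_a = project_conflicting(g_a, g_b)
proj_b = project_conflicting(g_b, g_a)
combined = (proj_a + proj_b) / 2  # conflict-aware combination

print(naive)     # averaging cancels the conflicting x-components
print(combined)  # projection keeps more of the shared direction
```

Aggregators in Jacobian-descent libraries generalize this idea: they map the full Jacobian of per-task losses to a single update direction that avoids degrading any individual objective.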

Tags: multi-task learning, neural network training, machine learning research, model optimization, deep learning
Maintenance 10 / 25
Adoption 10 / 25
Maturity 25 / 25
Community 10 / 25


Stars: 306
Forks: 15
Language: Python
License: MIT
Last pushed: Mar 11, 2026
Commits (30d): 0
Dependencies: 4

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/SimplexLab/TorchJD"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
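The endpoint path in the curl example above appears to follow a `/{category}/{owner}/{repo}` pattern. A small helper can build such URLs for other repositories; note that this pattern is inferred from the single example shown here and is an assumption, not documented behavior.

```python
# Hypothetical URL builder for the quality API shown above.
# The /{category}/{owner}/{repo} path pattern is inferred from the
# one curl example on this page; treat it as an assumption.
BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Return the (assumed) API URL for a repository's quality data."""
    return f"{BASE}/{category}/{owner}/{repo}"

print(quality_url("ml-frameworks", "SimplexLab", "TorchJD"))
```

You would then fetch the URL with any HTTP client (e.g. `curl` as shown above), optionally adding an API key once you exceed the free 100 requests/day tier.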