SimplexLab/TorchJD
Library for Jacobian descent with PyTorch. It enables the optimization of neural networks with multiple losses (e.g. multi-task learning).
This library helps machine learning engineers and researchers train neural networks that optimize several loss functions at once, as in multi-task learning. Instead of collapsing the losses into a single average, which risks degrading performance on some objectives, it takes the individual losses as input and aggregates their per-loss gradients into a single, better-balanced parameter update. It is designed for anyone building multi-objective models with PyTorch.
306 stars. Available on PyPI.
Use this if you are training neural networks with PyTorch and need to optimize several, possibly conflicting, loss functions at the same time, such as in multi-task learning.
Not ideal if your neural network only has a single objective to optimize or if you are not working with PyTorch.
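To make the "conflicting objectives" idea concrete, here is a minimal pure-Python sketch of the Jacobian-descent viewpoint: each loss contributes its own gradient row, and an aggregator combines the rows into one update that avoids hurting any single loss. This is a conceptual illustration only, using a PCGrad-style conflict projection as the example aggregator; it is not TorchJD's API, and the gradient values are toy numbers.

```python
def dot(u, v):
    # Inner product of two gradient vectors.
    return sum(a * b for a, b in zip(u, v))

def project_out_conflict(g, h):
    """If g conflicts with h (negative inner product), remove from g its
    component along h, so the update no longer opposes h's objective."""
    d = dot(g, h)
    if d < 0:
        scale = d / dot(h, h)
        return [gi - scale * hi for gi, hi in zip(g, h)]
    return list(g)

def aggregate(jacobian):
    """Combine per-loss gradient rows (the Jacobian) into one update
    direction by projecting out pairwise conflicts, then averaging."""
    adjusted = []
    for i, g in enumerate(jacobian):
        g_adj = list(g)
        for j, h in enumerate(jacobian):
            if i != j:
                g_adj = project_out_conflict(g_adj, h)
        adjusted.append(g_adj)
    n = len(adjusted)
    return [sum(col) / n for col in zip(*adjusted)]

# Two partially conflicting losses on parameters (x, y):
grads = [[2.0, 1.0], [-1.0, 1.0]]  # toy gradient rows; dot < 0 => conflict
update = aggregate(grads)          # -> [0.45, 1.35]
```

Note that the aggregated update has a positive inner product with both original gradients, so a small step along it decreases both losses, whereas a plain average of strongly conflicting gradients can sacrifice one loss for another.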
Stars
306
Forks
15
Language
Python
License
MIT
Category
Last pushed
Mar 11, 2026
Commits (30d)
0
Dependencies
4
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/SimplexLab/TorchJD"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Related frameworks
nschaetti/EchoTorch
A Python toolkit for Reservoir Computing and Echo State Network experimentation based on...
metaopt/torchopt
TorchOpt is an efficient library for differentiable optimization built upon PyTorch.
gpauloski/kfac-pytorch
Distributed K-FAC preconditioner for PyTorch
opthub-org/pytorch-bsf
PyTorch implementation of Bezier simplex fitting
pytorch/xla
Enabling PyTorch on XLA Devices (e.g. Google TPU)