profullstack/infernet-protocol

Infernet: A Peer-to-Peer Distributed GPU Inference Protocol

Overall score: 32 / 100 (Emerging)

This is a foundational development repository for a peer-to-peer distributed GPU inference network. It provides the core application structure and database schema for managing nodes, providers, aggregators, clients, models, and jobs within such a network. Developers working on the Infernet Protocol would use this repository to set up and run the local development environment for the web and desktop applications.

Use this if you are a developer building or extending the Infernet Protocol and need to set up its web or desktop application locally.

Not ideal if you are an end-user looking to simply consume or run GPU inference tasks without developing the underlying protocol.

distributed-systems peer-to-peer-networking gpu-computing protocol-development backend-development
No package. No dependents.
Maintenance: 10 / 25
Adoption: 6 / 25
Maturity: 16 / 25
Community: 0 / 25


Stars: 22
Forks:
Language: JavaScript
License: ISC
Last pushed: Mar 13, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/profullstack/infernet-protocol"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
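The endpoint in the curl example follows a simple path pattern. As a minimal sketch, assuming the `owner/repo` path segments generalize from that single URL (the `quality_url` helper name is illustrative, not part of the API), the request URL can be built like this:

```python
# Base endpoint taken from the curl example above; treating the final
# two path segments as owner/repo is an assumption generalized from it.
BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools"

def quality_url(owner: str, repo: str) -> str:
    """Build the quality-score API URL for a given GitHub owner/repo."""
    return f"{BASE}/{owner}/{repo}"

# Example: the repository shown on this page.
print(quality_url("profullstack", "infernet-protocol"))
```

This reproduces the URL used in the curl command; the response format is not documented on this page, so inspect it (e.g. with `curl` or a JSON pretty-printer) before parsing.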