mithril-security/bastionlab
A simple framework for privacy-friendly data science collaboration
BastionLab helps data owners and data scientists collaborate on sensitive datasets without compromising privacy. Data owners upload their datasets along with specific privacy rules, and data scientists can then run analyses or train AI models remotely. The data scientist receives only anonymized results, enabling secure data exploration and AI development on confidential information.
174 stars. No commits in the last 6 months.
Use this if you need to share a confidential dataset for analysis or AI model training with an external data scientist while maintaining strict control over data privacy and ensuring only anonymized results are revealed.
Not ideal if your data doesn't contain any personally identifiable information or other sensitive details that require strict privacy controls during analysis.
Stars: 174
Forks: 11
Language: Rust
License: Apache-2.0
Category: ml-frameworks
Last pushed: Sep 29, 2023
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/mithril-security/bastionlab"
Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000 requests/day.
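The same endpoint can be queried programmatically. A minimal sketch using only the Python standard library, where the URL structure follows the curl example above; the response schema is not documented here, so the code returns the raw decoded JSON rather than assuming field names:

```python
import json
import urllib.request

# Base path taken from the curl example above.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def build_url(category: str, owner_repo: str) -> str:
    """Build the quality-endpoint URL for a repository,
    e.g. ('ml-frameworks', 'mithril-security/bastionlab')."""
    return f"{API_BASE}/{category}/{owner_repo}"

def fetch_quality(category: str, owner_repo: str) -> dict:
    """Fetch and decode the JSON payload for one repository.
    Inspect the returned dict; field names are not assumed here."""
    with urllib.request.urlopen(build_url(category, owner_repo)) as resp:
        return json.load(resp)

if __name__ == "__main__":
    print(build_url("ml-frameworks", "mithril-security/bastionlab"))
    # Uncomment to make the live request (counts against the 100/day limit):
    # print(fetch_quality("ml-frameworks", "mithril-security/bastionlab"))
```

Keeping URL construction separate from the network call makes the sketch easy to adapt to other repositories in the index.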
Higher-rated alternatives
ICME-Lab/jolt-atlas
Fast zkVM born at a16z Crypto substantially adapted by ICME Labs (NovaNet) for verifiable...
socathie/circomlib-ml
Circom Circuits Library for Machine Learning
mithril-security/blindai
Confidential AI deployment with secure enclaves :lock:
SpeyTech/certifiable-inference
Deterministic, bit-perfect AI inference for safety-critical systems
gizatechxyz/LuminAIR
A zkML framework for ensuring the integrity of computational graphs using Circle STARK proofs