uclnlp/ntp
End-to-End Differentiable Proving
Experimental research code for developers and researchers exploring differentiable proving: techniques that make logical reasoning (deducing new facts from rules) compatible with gradient-based neural network training. You supply facts and logical rules, and the system learns to prove new statements by relaxing symbolic matching into differentiable operations.
No commits in the last 6 months.
Use this if you are a machine learning researcher or AI developer exploring differentiable logic and knowledge graph reasoning.
Not ideal if you need a production-ready tool for general-purpose knowledge representation or logical inference, as it is highly experimental and unmaintained.
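The core idea behind differentiable proving is "soft" unification: instead of requiring two symbols to match exactly, each symbol is an embedding vector, and the unification score is a graded similarity that gradients can flow through. The sketch below is a minimal illustration of that idea, not the repo's actual API; the function name, embedding size, and RBF-style similarity are assumptions for demonstration.

```python
import numpy as np

def soft_unify(a: np.ndarray, b: np.ndarray) -> float:
    """RBF-style similarity in (0, 1]: returns 1.0 for an exact match.
    Illustrative only; not the repo's actual scoring function."""
    return float(np.exp(-np.linalg.norm(a - b)))

# Hypothetical predicate symbols, each represented by a learned embedding.
rng = np.random.default_rng(0)
emb = {sym: rng.normal(size=8) for sym in ["grandpaOf", "fatherOf", "parentOf"]}

# Identical symbols unify with score 1.0; different symbols get a graded
# score, so training can pull related predicates closer in embedding space.
print(soft_unify(emb["fatherOf"], emb["fatherOf"]))  # 1.0
print(soft_unify(emb["fatherOf"], emb["parentOf"]))  # some value in (0, 1)
```

During training, a proof's success score is built from these soft-unification scores, so the same backpropagation that trains any neural network can adjust the symbol embeddings.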
Stars
90
Forks
18
Language
NewLisp
License
Apache-2.0
Category
ml-frameworks
Last pushed
Nov 21, 2018
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/uclnlp/ntp"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
pymc-devs/pytensor
PyTensor allows you to define, optimize, and efficiently evaluate mathematical expressions...
arogozhnikov/einops
Flexible and powerful tensor operations for readable and reliable code (for pytorch, jax, TF and others)
lava-nc/lava-dl
Deep Learning library for Lava
tensorly/tensorly
TensorLy: Tensor Learning in Python.
tensorpack/tensorpack
A Neural Net Training Interface on TensorFlow, with focus on speed + flexibility