tyrell/llm-ollama-llamaindex-bootstrap-ui

This is a LlamaIndex project bootstrapped with create-llama, acting as a full-stack UI to accompany the Retrieval-Augmented Generation (RAG) Bootstrap Application.

Score: 33 / 100 (Emerging)

This is a user interface that works with a "Retrieval-Augmented Generation" (RAG) system to help you get answers from your own documents. You ask questions through this interface, and it retrieves information from your data to give you relevant answers. It's designed for anyone who wants to easily query their proprietary information and receive intelligent, context-aware responses.

No commits in the last 6 months.

Use this if you have a pre-existing RAG system and need a simple, full-stack interface for users to submit queries and view results.

Not ideal if you're looking for a standalone RAG system or a tool to build the backend logic for document ingestion and indexing.

Tags: information-retrieval, document-qa, internal-knowledge-base, enterprise-search
Badges: No License · Stale (6 months) · No Package · No Dependents
Maintenance: 0 / 25
Adoption: 7 / 25
Maturity: 8 / 25
Community: 18 / 25


Stars: 32
Forks: 14
Language: TypeScript
License: none
Last pushed: Feb 23, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/generative-ai/tyrell/llm-ollama-llamaindex-bootstrap-ui"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
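The same endpoint can be called from code. Below is a minimal TypeScript sketch that builds the endpoint URL and fetches it with the global fetch available in Node 18+; the `qualityUrl` helper and the response handling are illustrative assumptions, since only the curl example above is documented.

```typescript
// Base URL taken from the curl example above.
const API_BASE = "https://pt-edge.onrender.com/api/v1/quality";

// Build the endpoint URL for a given category and owner/repo pair.
// (Helper name and parameters are assumptions for illustration.)
function qualityUrl(category: string, owner: string, repo: string): string {
  return `${API_BASE}/${category}/${owner}/${repo}`;
}

// Example usage (uncomment to make a live HTTP request):
// const res = await fetch(
//   qualityUrl("generative-ai", "tyrell", "llm-ollama-llamaindex-bootstrap-ui")
// );
// const data = await res.json(); // response shape is not documented here
// console.log(data);

console.log(qualityUrl("generative-ai", "tyrell", "llm-ollama-llamaindex-bootstrap-ui"));
```

No API key is required at the free tier, so the sketch sends no auth header; a keyed request would presumably add one, but the header name is not documented above.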