ai-dial and ai-dial-core

ai-dial-core is the core backend service and ai-dial is its documentation; the two are **ecosystem siblings**: the documentation provides guidance for deploying and integrating the unified API that the core component exposes.

| | ai-dial | ai-dial-core |
|---|---|---|
| Score | 57 (Established) | 57 (Established) |
| Maintenance | 10/25 | 17/25 |
| Adoption | 10/25 | 10/25 |
| Maturity | 16/25 | 16/25 |
| Community | 21/25 | 14/25 |
| Stars | 123 | 653 |
| Forks | 36 | 39 |
| Downloads | | |
| Commits (30d) | 0 | 18 |
| Language | Jupyter Notebook | Java |
| License | Apache-2.0 | Apache-2.0 |
| Package | No package, no dependents | No package, no dependents |

About ai-dial

epam/ai-dial

Documentation for AI DIAL

This project helps developers and DevOps engineers deploy and manage a unified platform for large language model (LLM) applications. It brings together various LLM services (Azure OpenAI, AWS Bedrock, GCP Vertex AI, or self-hosted models) behind a customizable, integrated chat application with administrative controls. The primary users are developers building LLM-powered applications and operations teams responsible for deploying and maintaining them.

LLM deployment · chatbot development · cloud infrastructure · DevOps · AI platform management

About ai-dial-core

epam/ai-dial-core

The main component of AI DIAL, which provides a unified API to different chat completion and embedding models, assistants, and applications

DIAL Core acts as a central hub connecting various large language models, such as chat completion and embedding models, to different applications. It takes requests from your applications, routes them to the appropriate AI model, and delivers the model's response back to your application. This is ideal for developers or platform engineers building applications that need to interact with multiple AI models.
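To make the hub-and-route idea concrete, here is a minimal sketch of how a client might talk to DIAL Core's unified API. It assumes an OpenAI-compatible chat completions endpoint; the base URL, the `gpt-4` deployment name, and the `Api-Key` header are illustrative placeholders, not values taken from this page.

```python
import json
from urllib import request

# Assumed local DIAL Core deployment and a hypothetical deployment name.
DIAL_CORE_URL = "http://localhost:8080"
DEPLOYMENT = "gpt-4"

def build_chat_request(prompt: str) -> request.Request:
    """Build (but do not send) a chat completion request to DIAL Core.

    DIAL Core routes the request to whichever upstream model backs the
    named deployment and returns that model's response to the caller.
    """
    url = f"{DIAL_CORE_URL}/openai/deployments/{DEPLOYMENT}/chat/completions"
    body = json.dumps(
        {"messages": [{"role": "user", "content": prompt}]}
    ).encode("utf-8")
    return request.Request(
        url,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Api-Key": "YOUR_API_KEY",  # placeholder credential
        },
        method="POST",
    )

req = build_chat_request("Hello!")
print(req.full_url)
# → http://localhost:8080/openai/deployments/gpt-4/chat/completions
```

Because the request shape is the same for every deployment, an application can switch between models hosted on different providers by changing only the deployment name, which is the point of routing everything through the core.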

AI platform engineering · large language model integration · API management · application development · backend services

Scores updated daily from GitHub, PyPI, and npm data.