mobarski/aidapter
Adapter / facade for language models (OpenAI, Anthropic, Cohere, local transformers, etc.)
This tool gives developers a single, consistent interface to multiple language models, both commercial (such as OpenAI and Anthropic) and open-source (such as Hugging Face Transformers). Through one facade they can send text prompts and receive completions, or generate numerical representations (embeddings) of text. It is useful for applications that need consistent access to, and management of, several AI models.
No commits in the last 6 months.
Use this if you are a developer building an application that needs to interact with multiple AI language models and you want a unified, consistent way to send prompts, get completions, or generate text embeddings.
Not ideal if you are a non-developer seeking a user-friendly application to directly interact with AI models for creative writing, data analysis, or other end-user tasks.
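The "unified, consistent way" described above is the classic adapter/facade pattern. As a minimal sketch of that idea (this is illustrative only, not aidapter's actual API; the class and function names here are invented stand-ins), each provider is wrapped in an adapter exposing the same method, and one entry point routes calls to the chosen backend:

```python
# Illustrative facade pattern (NOT aidapter's real interface): one unified
# entry point, multiple interchangeable backend adapters.

class EchoAdapter:
    """Stand-in backend that just echoes the prompt."""
    def complete(self, prompt: str) -> str:
        return f"echo: {prompt}"

class UpperAdapter:
    """Stand-in backend that upper-cases the prompt."""
    def complete(self, prompt: str) -> str:
        return prompt.upper()

# Registry mapping a model name to its adapter instance.
ADAPTERS = {"echo": EchoAdapter(), "upper": UpperAdapter()}

def complete(model: str, prompt: str) -> str:
    """Unified entry point: route the prompt to the named backend."""
    return ADAPTERS[model].complete(prompt)

print(complete("echo", "hello"))   # echo: hello
print(complete("upper", "hello"))  # HELLO
```

Swapping providers then means changing only the model name, which is the consistency benefit the description refers to.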
Stars: 20
Forks: —
Language: Python
License: MIT
Category: —
Last pushed: Sep 21, 2023
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/mobarski/aidapter"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
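The same endpoint can be called from Python with the standard library. A small sketch, assuming only what the curl example above shows (the response schema is not documented here, so it is parsed as generic JSON; the helper names are our own):

```python
import json
import urllib.request

# Base of the public endpoint shown in the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, repo: str) -> str:
    """Build the endpoint URL for a repo, e.g. 'mobarski/aidapter'."""
    return f"{BASE}/{category}/{repo}"

def fetch_quality(category: str, repo: str) -> dict:
    """Fetch the repo's quality data (free tier: 100 requests/day)."""
    with urllib.request.urlopen(quality_url(category, repo)) as resp:
        return json.load(resp)

print(quality_url("transformers", "mobarski/aidapter"))
```

Calling `fetch_quality("transformers", "mobarski/aidapter")` performs the same request as the curl command; with a free key, the daily limit rises to 1,000 requests.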
Higher-rated alternatives
adapter-hub/adapters
A Unified Library for Parameter-Efficient and Modular Transfer Learning
gaussalgo/adaptor
ACL 2022: Adaptor: a library to easily adapt a language model to your own task, domain, or...
ylsung/VL_adapter
PyTorch code for "VL-Adapter: Parameter-Efficient Transfer Learning for Vision-and-Language...
intersun/LightningDOT
source code and pre-trained/fine-tuned checkpoint for NAACL 2021 paper LightningDOT
kyegomez/M2PT
Implementation of M2PT in PyTorch from the paper: "Multimodal Pathway: Improve Transformers with...