adapters and aidapter
About adapters
adapter-hub/adapters
A Unified Library for Parameter-Efficient and Modular Transfer Learning
This library helps machine learning engineers and researchers fine-tune large language models (LLMs) more efficiently. Starting from a pre-trained Transformer model and a task-specific dataset, it lets you add and train small 'adapter' modules while the base model stays frozen. The result is a compact, task-specific model for NLP tasks such as text classification or question answering, without retraining the entire network.
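To make the parameter-efficiency claim concrete, here is a minimal sketch of the bottleneck-adapter idea the library is built around: a small down-project/up-project block with a residual connection, trained while the large pre-trained weights stay frozen. This is a conceptual illustration only, not adapter-hub's actual API; the dimensions and variable names are assumptions.

```python
import numpy as np

# Conceptual bottleneck adapter (not the adapter-hub API):
# down-project -> nonlinearity -> up-project, added after a frozen
# sub-layer, with a residual connection. Only the adapter is trained.

rng = np.random.default_rng(0)

hidden_dim = 768       # e.g. BERT-base hidden size (assumption)
bottleneck_dim = 64    # adapter bottleneck size (assumption)

# Frozen pre-trained weights (stand-in for one feed-forward sub-layer)
W_frozen = rng.standard_normal((hidden_dim, hidden_dim)) * 0.02

# Trainable adapter weights; zero-init up-projection makes the
# adapter an identity function at the start of training.
W_down = rng.standard_normal((hidden_dim, bottleneck_dim)) * 0.02
W_up = np.zeros((bottleneck_dim, hidden_dim))

def adapter_forward(h):
    """Residual bottleneck: h + up(relu(down(h)))."""
    z = np.maximum(h @ W_down, 0.0)
    return h + z @ W_up

x = rng.standard_normal((1, hidden_dim))
h = x @ W_frozen          # frozen sub-layer output
out = adapter_forward(h)  # adapter-adjusted output

frozen_params = W_frozen.size                 # 768 * 768 = 589824
adapter_params = W_down.size + W_up.size      # 2 * 768 * 64 = 98304
print("frozen layer params: ", frozen_params)
print("trainable adapter params:", adapter_params)
```

The point of the sketch: the trainable adapter is roughly 6x smaller than even a single frozen weight matrix, which is why many adapters for different tasks can share one base model.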
About aidapter
mobarski/aidapter
Adapter / facade for language models (OpenAI, Anthropic, Cohere, local transformers, etc)
This tool gives developers a single, consistent interface to multiple language models, both commercial (such as OpenAI and Anthropic) and open-source (such as Hugging Face Transformers). Through that interface they can send text prompts and receive completions, or generate numerical representations (embeddings) of text. It is useful for applications that need to switch between, or manage, several AI model providers.
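The "adapter / facade" pattern behind this kind of tool can be sketched in a few lines: one object exposes a single completion method and routes each call to whichever backend was registered. The class and method names below are hypothetical illustrations of the pattern, not aidapter's actual API; a real setup would register wrappers around the OpenAI, Anthropic, or Transformers clients.

```python
from typing import Callable, Dict

class LLMFacade:
    """Hypothetical facade: route complete() calls to a named backend.
    Illustrates the pattern only; not aidapter's real interface."""

    def __init__(self) -> None:
        self._backends: Dict[str, Callable[[str], str]] = {}

    def register(self, name: str, fn: Callable[[str], str]) -> None:
        """Register a backend as a prompt -> completion callable."""
        self._backends[name] = fn

    def complete(self, backend: str, prompt: str) -> str:
        """Send the prompt to the chosen backend and return its output."""
        if backend not in self._backends:
            raise KeyError(f"unknown backend: {backend}")
        return self._backends[backend](prompt)

# Stand-in backends for demonstration; real ones would call provider SDKs.
facade = LLMFacade()
facade.register("echo", lambda p: f"echo: {p}")
facade.register("shout", lambda p: p.upper())

print(facade.complete("echo", "hello"))   # echo: hello
print(facade.complete("shout", "hello"))  # HELLO
```

The design benefit is that application code depends only on `complete()`, so swapping providers means registering a different callable rather than rewriting call sites.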