adapters and aidapter

                adapters            aidapter
Score           72 (Verified)       22 (Experimental)
Maintenance     13/25               0/25
Adoption        12/25               6/25
Maturity        25/25               16/25
Community       22/25               0/25
Stars           2,802               20
Forks           375                 —
Downloads       —                   —
Commits (30d)   1                   0
Language        Python              Python
License         Apache-2.0          MIT
Risk flags      None                Stale 6m, No Package, No Dependents

About adapters

adapter-hub/adapters

A Unified Library for Parameter-Efficient and Modular Transfer Learning

This library helps machine learning engineers and researchers fine-tune large Transformer language models efficiently. Starting from a pre-trained model and a task-specific dataset, you add and train small, specialized 'adapter' modules while the base model stays frozen. The result is a compact model specialized for NLP tasks such as text classification or question answering, without retraining the entire network.
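
A minimal sketch of that workflow with the adapters library: load a pre-trained model, attach a bottleneck adapter and a task head, then freeze everything except the adapter weights. The model name, adapter name, config string, and label count below are illustrative placeholders, and the exact API may differ between library versions; see the AdapterHub docs.

```python
from adapters import AutoAdapterModel

# Load a pre-trained Transformer with adapter support
# ("bert-base-uncased" is just an example checkpoint).
model = AutoAdapterModel.from_pretrained("bert-base-uncased")

# Attach a small bottleneck adapter and a classification head
# for a hypothetical binary task named "sentiment".
model.add_adapter("sentiment", config="seq_bn")
model.add_classification_head("sentiment", num_labels=2)

# Freeze the base model; only the adapter and head weights train.
model.train_adapter("sentiment")

# ...then train as usual on the task-specific dataset, e.g. with a
# transformers Trainer or the library's AdapterTrainer.
```

Because only the adapter parameters update, checkpoints stay small (typically a few megabytes) and many task adapters can share one frozen base model.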

natural-language-processing machine-learning-engineering deep-learning-research model-optimization transfer-learning

About aidapter

mobarski/aidapter

Adapter / facade for language models (OpenAI, Anthropic, Cohere, local transformers, etc.)

This tool gives developers one consistent interface to many language models, both commercial (e.g. OpenAI, Anthropic) and open-source (e.g. Hugging Face Transformers). Through that interface they can send text prompts and receive completions, or generate embeddings (numerical vector representations of text). Developers building applications on top of several AI models will find it useful for uniform access and easier backend management.
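
The core idea is the classic adapter/facade pattern: application code targets one interface while concrete backends stay swappable. The sketch below is a hypothetical illustration of that pattern, not aidapter's actual API; every name in it (LLMAdapter, EchoAdapter, summarize) is invented for the example.

```python
# Hypothetical facade sketch -- NOT aidapter's actual API.
from abc import ABC, abstractmethod

class LLMAdapter(ABC):
    """Uniform interface over different language-model backends."""

    @abstractmethod
    def complete(self, prompt: str) -> str:
        """Send a prompt and return the model's completion."""

    @abstractmethod
    def embed(self, text: str) -> list[float]:
        """Return a numerical vector representation of the text."""

class EchoAdapter(LLMAdapter):
    """Trivial stand-in backend for tests; a real adapter would call
    OpenAI, Anthropic, Cohere, or a local transformers pipeline here."""

    def complete(self, prompt: str) -> str:
        return f"[echo] {prompt}"

    def embed(self, text: str) -> list[float]:
        return [float(len(text))]

# Application code depends only on LLMAdapter, so backends swap freely.
def summarize(model: LLMAdapter, document: str) -> str:
    return model.complete(f"Summarize in one sentence:\n{document}")

print(summarize(EchoAdapter(), "Facades decouple apps from model vendors."))
```

Swapping between a commercial and a local backend then becomes a one-line change at the call site rather than a rewrite of the integration code.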

AI-application-development language-model-integration natural-language-processing text-embedding API-abstraction

Scores updated daily from GitHub, PyPI, and npm data.