Shark-NLP/OpenICL
OpenICL is an open-source framework to facilitate research, development, and prototyping of in-context learning.
This framework helps AI/ML researchers and developers quickly set up and test ways of making large language models (LLMs) perform specific tasks, such as sentiment analysis, by providing examples directly in the prompt. Given your dataset and an LLM, it lets you experiment with different methods for selecting and formatting those examples, and its output helps you compare how well each approach makes the model complete the task.
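To make the pattern concrete, here is a minimal sketch of the few-shot prompting that in-context learning relies on. This is plain Python illustrating the idea only, not OpenICL's actual API (which wraps example selection and formatting in its own classes); the function and field names are illustrative.

```python
def build_icl_prompt(examples, test_input):
    """Format labeled (text, label) pairs plus a test input into a
    few-shot prompt; the LLM is expected to continue the pattern
    by predicting the label for the final input.
    (Illustrative sketch only, not OpenICL's API.)"""
    parts = [f"Review: {text}\nSentiment: {label}" for text, label in examples]
    parts.append(f"Review: {test_input}\nSentiment:")
    return "\n\n".join(parts)

examples = [
    ("A delightful, moving film.", "positive"),
    ("Dull and far too long.", "negative"),
]
prompt = build_icl_prompt(examples, "An instant classic.")
print(prompt)
```

Frameworks like OpenICL automate the interesting parts of this loop: which examples to put in the prompt (retrieval strategies) and how to score the model's continuation (inference strategies), so different combinations can be compared on the same dataset.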
584 stars. No commits in the last 6 months. Available on PyPI.
Use this if you are a researcher or developer prototyping and comparing various in-context learning techniques for large language models.
Not ideal if you are an end-user looking for a ready-to-use application of in-context learning without needing to write code or experiment with different methods.
Stars
584
Forks
30
Language
Python
License
Apache-2.0
Category
Last pushed
Oct 03, 2023
Commits (30d)
0
Dependencies
15
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/Shark-NLP/OpenICL"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
n-waves/multifit
The code to reproduce results from paper "MultiFiT: Efficient Multi-lingual Language Model...
princeton-nlp/SimCSE
[EMNLP 2021] SimCSE: Simple Contrastive Learning of Sentence Embeddings https://arxiv.org/abs/2104.08821
yxuansu/SimCTG
[NeurIPS'22 Spotlight] A Contrastive Framework for Neural Text Generation
alibaba-edu/simple-effective-text-matching
Source code of the ACL2019 paper "Simple and Effective Text Matching with Richer Alignment Features".
alibaba-edu/simple-effective-text-matching-pytorch
A pytorch implementation of the ACL2019 paper "Simple and Effective Text Matching with Richer...