twitter-research/lmsoc
Code for reproducing our paper: LMSOC: An Approach for Socially Sensitive Pretraining
This project helps natural language processing (NLP) researchers and engineers make language models sensitive to social context. It integrates social context data, such as geographic location or time, into language model pretraining, yielding models whose predictions are more contextually accurate.
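A minimal sketch of the core idea: precompute an embedding of the social context (here, a simple sinusoidal encoding of a post's year — this encoder, the function names, and the dimensions are illustrative assumptions, not the repo's actual API) and prepend it to the token embedding sequence so the model can condition its predictions on it.

```python
import numpy as np

rng = np.random.default_rng(0)
d_model = 16

def embed_tokens(token_ids):
    # Stand-in for a learned token embedding table.
    table = rng.standard_normal((100, d_model))
    return table[token_ids]

def embed_context(year, year_min=2010, year_max=2021):
    # Stand-in for a frozen social-context encoder: a simple
    # positional-style encoding of the post's year.
    t = (year - year_min) / (year_max - year_min)
    freqs = np.arange(d_model // 2)
    angles = t / (10000 ** (2 * freqs / d_model))
    return np.concatenate([np.sin(angles), np.cos(angles)])

def build_input(token_ids, year):
    # Prepend the context vector so self-attention can condition
    # every masked-token prediction on the social context.
    ctx = embed_context(year)[None, :]
    toks = embed_tokens(token_ids)
    return np.vstack([ctx, toks])

seq = build_input([5, 17, 42], year=2020)
print(seq.shape)  # (4, 16): 3 tokens plus 1 context slot
```

In a real pretraining run, the context encoder's output would be projected into the model's embedding space and the masked-language-modeling loss computed only over the token positions.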
No commits in the last 6 months.
Use this if you are an NLP researcher or engineer looking to enhance your language models' ability to understand and generate language that is sensitive to social, geographical, or temporal contexts.
Not ideal if you are looking for a pre-trained, production-ready language model or a tool for general text analysis without a focus on social context integration.
Stars
13
Forks
1
Language
Jupyter Notebook
License
Apache-2.0
Category
Last pushed
Oct 22, 2021
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/twitter-research/lmsoc"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
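The same request can be made from Python. This sketch only builds the endpoint URL shown in the curl example above; the `api_key` query-parameter name for keyed access is an assumption, and the response schema is not documented here.

```python
import urllib.parse

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category, owner, repo, api_key=None):
    """Build the quality-endpoint URL for a repo.

    The optional api_key raises the daily limit; the parameter
    name used here is a guess -- check the API docs.
    """
    url = "/".join([BASE, category,
                    urllib.parse.quote(owner),
                    urllib.parse.quote(repo)])
    if api_key:
        url += "?" + urllib.parse.urlencode({"api_key": api_key})
    return url

print(quality_url("transformers", "twitter-research", "lmsoc"))
# https://pt-edge.onrender.com/api/v1/quality/transformers/twitter-research/lmsoc
```

Pass the resulting URL to any HTTP client (e.g. `urllib.request.urlopen` or `requests.get`) to fetch the data.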
Higher-rated alternatives
PaddlePaddle/PaddleNLP
Easy-to-use and powerful LLM and SLM library with awesome model zoo.
meta-llama/llama-cookbook
Welcome to the Llama Cookbook! This is your go-to guide for Building with Llama: Getting started...
arcee-ai/mergekit
Tools for merging pretrained large language models.
changyeyu/LLM-RL-Visualized
100+ original LLM/RL principle diagrams, from the author of the book Large Model Algorithms (100+ LLM/RL Algorithm Maps)
mindspore-lab/step_into_llm
MindSpore online courses: Step into LLM