ropensci/pangoling
An R package for estimating the log-probabilities of words in a given context using transformer models.
This tool helps psycholinguistic researchers analyze how predictable words are within sentences. You provide your experimental text, and it outputs a numerical log-probability for each word, indicating how expected that word is given its context. It's designed for researchers studying language comprehension or production.
Use this if you are a psycholinguist who needs to quantify word predictability in your linguistic data using advanced language models like GPT-2.
Not ideal if your primary goal is general natural language processing tasks or machine learning applications outside of psycholinguistics.
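The quantity pangoling computes is the log-probability of each word given its preceding context, log P(word | context), under a transformer language model such as GPT-2. The sketch below illustrates that quantity with a toy bigram model in Python rather than a transformer; the counts are invented for the example and nothing here reflects pangoling's actual R API.

```python
import math

# Toy bigram model, illustrative only: pangoling estimates the same
# quantity, log P(word | context), but with transformer models (e.g. GPT-2).
# All counts below are made up for this example.
bigram_counts = {
    ("the", "dog"): 3,
    ("the", "cat"): 6,
    ("the", "idea"): 1,
}
context_totals = {"the": 10}


def log_prob(word: str, context: str) -> float:
    """Return log P(word | context) under the toy bigram model."""
    return math.log(bigram_counts[(context, word)] / context_totals[context])


# Less probable continuations get more negative log-probabilities,
# which is how "surprisal"-style predictability measures behave.
for w in ["dog", "cat", "idea"]:
    print(f"{w}: {log_prob(w, 'the'):.3f}")
```

In psycholinguistic work these values (or their negation, surprisal) are then correlated with behavioral measures such as reading times.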
Stars: 12
Forks: —
Language: R
License: —
Category: —
Last pushed: Feb 23, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/ropensci/pangoling"
Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000/day.
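The same endpoint can be called from a script instead of curl. A minimal Python sketch is below; the response schema is not documented on this page, so the `sample` payload and its field names are assumptions for illustration only, and only the URL itself comes from the listing above.

```python
import json
import urllib.request

# Endpoint shown in the listing above.
API_URL = "https://pt-edge.onrender.com/api/v1/quality/transformers/ropensci/pangoling"


def fetch_quality(url: str = API_URL, timeout: float = 10.0) -> dict:
    """Fetch the quality record and parse it as JSON."""
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return json.load(resp)


# Offline sketch of parsing: these field names are hypothetical,
# not the endpoint's real schema.
sample = '{"repo": "ropensci/pangoling", "stars": 12}'
record = json.loads(sample)
print(record["repo"])
```

Keys (for the higher rate limit) would typically go in a request header; check the API's own documentation for the exact mechanism.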
Higher-rated alternatives
LoicGrobol/zeldarose
Train transformer-based models.
CPJKU/wechsel
Code for WECHSEL: Effective initialization of subword embeddings for cross-lingual transfer of...
yuanzhoulvpi2017/zero_nlp
Chinese NLP solutions (large models, data, models, training, inference)
minggnim/nlp-models
A repository for training transformer-based models
IntelLabs/nlp-architect
A model library for exploring state-of-the-art deep learning topologies and techniques for...