yulang/phrasal-composition-in-transformers
This repo contains the datasets and code for "Assessing Phrasal Representation and Composition in Transformers" by Lang Yu and Allyson Ettinger (EMNLP 2020).
This project helps NLP researchers and computational linguists understand how Transformer models handle the meaning of phrases, not just individual words. Given pre-trained Transformer models and specialized linguistic datasets as input, it outputs analysis metrics showing how well those models represent phrases and compose word meanings, making it useful for evaluating and comparing different Transformer architectures.
No commits in the last 6 months.
Use this if you are an NLP researcher investigating the internal workings of Transformer models, specifically their ability to compose meanings of words into phrases.
Not ideal if you are looking for a tool that directly improves a downstream NLP application's performance rather than one for analyzing model internals.
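As a concrete illustration of the kind of analysis described above, here is a minimal sketch, not the repo's actual pipeline, of probing phrasal representation in a Transformer: it mean-pools the final-layer token vectors of a pre-trained model into a phrase embedding and compares two phrases by cosine similarity. The model choice and the mean-pooling strategy are illustrative assumptions; the paper itself evaluates several representation types and layers.

import torch
from transformers import AutoModel, AutoTokenizer

# Illustrative model choice; any pre-trained Transformer encoder would do.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def phrase_embedding(phrase: str) -> torch.Tensor:
    """Mean-pool the final-layer token vectors into one phrase vector."""
    inputs = tokenizer(phrase, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # (1, seq_len, dim)
    return hidden.mean(dim=1).squeeze(0)           # (dim,)

# A composed phrase and a rough paraphrase should land near each other
# if the model captures phrase meaning beyond individual words.
a = phrase_embedding("law school")
b = phrase_embedding("legal education")
print(torch.cosine_similarity(a, b, dim=0).item())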
Stars: 11
Forks: 1
Language: Python
License: —
Category: —
Last pushed: Jul 19, 2021
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/yulang/phrasal-composition-in-transformers"
Open to everyone: 100 requests/day with no key required; a free key raises the limit to 1,000 requests/day.
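If you are scripting against this endpoint, a sketch like the following fetches the same record in Python. The JSON response shape and the X-API-Key header name are assumptions (the note above does not document how the key is sent), so check the service's docs before relying on them.

import requests

# Quality-data endpoint for this repository (same URL as the curl example).
URL = ("https://pt-edge.onrender.com/api/v1/quality/"
       "transformers/yulang/phrasal-composition-in-transformers")

API_KEY = None  # set to your free key for the 1,000 requests/day tier

# NOTE: sending the key as an "X-API-Key" header is an assumption;
# the actual authentication scheme is not documented here.
headers = {"X-API-Key": API_KEY} if API_KEY else {}

resp = requests.get(URL, headers=headers, timeout=10)
resp.raise_for_status()
print(resp.json())  # assumed to return the fields shown above as JSON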
Higher-rated alternatives
lucidrains/x-transformers
A concise but complete full-attention transformer with a set of promising experimental features...
kanishkamisra/minicons
Utility for behavioral and representational analyses of Language Models
lucidrains/simple-hierarchical-transformer
Experiments around a simple idea for inducing multiple hierarchical predictive models within a GPT
lucidrains/dreamer4
Implementation of Danijar's latest iteration of his Dreamer line of work
Nicolepcx/Transformers-in-Action
The corresponding code for the book Transformers in Action