princeton-nlp/dyck-transformer
[ACL 2021] Self-Attention Networks Can Process Bounded Hierarchical Languages
This project evaluates how neural language models, specifically Transformers and LSTMs, handle hierarchical structure. Given configuration settings that specify a synthetic Dyck language (strings of balanced brackets of several types, with bounded nesting depth) and language-model hyperparameters, it trains models on the generated data and reports metrics on how well they learn and process these structures. Researchers in natural language processing and computational linguistics can use it to probe the inductive biases of these architectures.
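To make the task concrete, here is a minimal sketch of sampling from a bounded Dyck language. This is an illustrative generator, not the repository's actual data-generation code; the parameter names (`k`, `max_depth`, `max_len`) are assumptions chosen for clarity.

```python
import random

def gen_dyck(k: int, max_depth: int, max_len: int, rng: random.Random) -> str:
    """Sample one string from Dyck-(k, max_depth): balanced brackets of k
    distinct types whose nesting depth never exceeds max_depth.
    Illustrative sketch only, not the repo's generator."""
    opens = "([{<"[:k]
    closes = ")]}>"[:k]
    out, stack = [], []
    while len(out) < max_len:
        can_open = len(stack) < max_depth
        can_close = bool(stack)
        if can_open and (not can_close or rng.random() < 0.5):
            i = rng.randrange(k)       # pick a bracket type to open
            stack.append(i)
            out.append(opens[i])
        else:
            out.append(closes[stack.pop()])
        if not stack and rng.random() < 0.3:
            break                      # occasionally stop at a balanced point
    while stack:                       # close anything still open
        out.append(closes[stack.pop()])
    return "".join(out)

rng = random.Random(0)
s = gen_dyck(k=2, max_depth=3, max_len=20, rng=rng)
```

A model trained on such strings is then evaluated on whether it predicts the correct closing bracket at each position, which requires tracking the (bounded) stack of open brackets.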
No commits in the last 6 months.
Use this if you are an NLP researcher or computational linguist investigating the inductive biases and processing capabilities of neural language models on structured, hierarchical data.
Not ideal if you are looking for a tool to apply to real-world natural language tasks, as this focuses on synthetic language structures for theoretical research.
Stars: 13
Forks: 2
Language: Python
License: —
Category: —
Last pushed: Jun 01, 2021
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/princeton-nlp/dyck-transformer"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
lucidrains/x-transformers
A concise but complete full-attention transformer with a set of promising experimental features...
kanishkamisra/minicons
Utility for behavioral and representational analyses of Language Models
lucidrains/simple-hierarchical-transformer
Experiments around a simple idea for inducing multiple hierarchical predictive models within a GPT
lucidrains/dreamer4
Implementation of Danijar's latest iteration for his Dreamer line of work
Nicolepcx/Transformers-in-Action
This is the corresponding code for the book Transformers in Action