JoaoLages/RATransformers

RATransformers 🐭 - Make your transformer (like BERT, RoBERTa, GPT-2 and T5) Relation Aware!

Score: 44 / 100 (Emerging)

This project helps machine learning engineers and NLP researchers improve how transformer models understand and extract information from structured data such as tables. It wraps an existing transformer model and modifies it to encode relations between parts of the input, which helps on tasks like answering questions about tabular data. The result is a more accurate, "relation-aware" transformer model.
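The core idea, relation-aware self-attention (in the style of Shaw et al., 2018), adds learned embeddings for each token-pair relation into the attention computation. The sketch below is a minimal single-head illustration of that mechanism in NumPy; the function and variable names are illustrative and do not reflect the library's actual code.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def relation_aware_attention(q, k, v, relation_ids, rel_k_emb, rel_v_emb):
    """Single-head attention where pairwise relation embeddings modify
    both the attention scores and the aggregated values.

    q, k, v:        (n, d) query/key/value vectors for n tokens
    relation_ids:   (n, n) integer id of the relation between tokens i and j
    rel_k_emb/rel_v_emb: (num_relations, d) learned relation embeddings
    """
    d = q.shape[-1]
    # Standard content-based scores: (n, n)
    scores = q @ k.T
    # Relation-dependent term: q_i . r_{ij}
    rk = rel_k_emb[relation_ids]                     # (n, n, d)
    scores = scores + np.einsum('id,ijd->ij', q, rk)
    weights = softmax(scores / np.sqrt(d))
    # Aggregate values, also injecting relation value embeddings
    rv = rel_v_emb[relation_ids]                     # (n, n, d)
    return weights @ v + np.einsum('ij,ijd->id', weights, rv)
```

RATransformers applies this kind of modification to a pretrained model's attention layers, with the relation information supplied alongside the input.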

No commits in the last 6 months. Available on PyPI.

Use this if you are a machine learning engineer or NLP researcher applying transformer models to structured or semi-structured data, such as tables, and want the model to capture the relationships inherent in that data.

Not ideal if you are looking for a pre-trained, ready-to-use model for general text processing without a focus on structured-data relationships, or if you are unfamiliar with fine-tuning transformer models.

natural-language-processing table-question-answering information-extraction transformer-models structured-data-analysis
Stale (6m) · No dependents
Maintenance 0 / 25
Adoption 8 / 25
Maturity 25 / 25
Community 11 / 25


Stars: 42
Forks: 5
Language: Python
License: MIT
Last pushed: Dec 14, 2022
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/JoaoLages/RATransformers"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
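For scripted access, the endpoint path can be built programmatically. The helper below is a hypothetical sketch, with the path pattern inferred from the curl example above; it is not an official client.

```python
from urllib.parse import quote

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(ecosystem: str, repo: str) -> str:
    # Path pattern inferred from the curl example:
    # {API_BASE}/{ecosystem}/{owner}/{repo}
    return f"{API_BASE}/{quote(ecosystem)}/{quote(repo, safe='/')}"

# Reproduces the URL used in the curl command above
url = quality_url("transformers", "JoaoLages/RATransformers")
```

The result can then be fetched with any HTTP client, subject to the rate limits noted above.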