gdufsnlp/PMAES

Code for the ACL 2023 paper "PMAES: Prompt-mapping Contrastive Learning for Cross-prompt Automated Essay Scoring".

Score: 20 / 100 (Experimental)

This project helps educators, researchers, and administrators automatically score essays written in response to varied prompts. It takes essays for different prompts as input and produces consistent, objective scores, making it useful for evaluating large volumes of student writing across diverse assignments.

No commits in the last 6 months.

Use this if you need to reliably score essays written in response to different prompts without having to retrain a scoring model for each new assignment.

Not ideal if you only need to score essays for a single, consistent prompt or require highly nuanced, qualitative feedback beyond a score.

Tags: automated-essay-scoring, education-technology, writing-assessment, educational-measurement, language-arts-scoring
Badges: No License · Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 5 / 25
Maturity 8 / 25
Community 7 / 25


Stars: 11
Forks: 1
Language: Python
License: None
Last pushed: Oct 06, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/prompt-engineering/gdufsnlp/PMAES"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
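For programmatic access, the curl call above can be sketched in Python with only the standard library. The endpoint URL is taken from this page; the shape of the JSON response is an assumption, so inspect the actual payload before relying on specific fields. The `quality_url` helper name is hypothetical.

```python
import json
from urllib.request import urlopen

# Base endpoint as shown on this page.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the per-repository quality endpoint URL."""
    return f"{API_BASE}/{category}/{owner}/{repo}"


def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch and decode the quality report (performs a network request).

    The response is assumed to be a JSON object; the exact fields
    are not documented on this page.
    """
    with urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Same request as the curl example above.
    print(quality_url("prompt-engineering", "gdufsnlp", "PMAES"))
```

Without an API key this counts against the shared 100 requests/day limit, so cache results rather than calling the endpoint in a loop.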