XingLuxi/Cal-FLOPs-for-PLM

Calculating FLOPs of Pre-trained Models in NLP

Score: 29 / 100 (Experimental)

This tool helps machine learning engineers and researchers quickly estimate the computational cost and memory footprint of their Natural Language Processing (NLP) models. Given a pre-trained NLP model, it reports the floating-point operations (FLOPs) required for a forward pass and the model's parameter count, so you can evaluate efficiency before deployment or large-scale training.
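
The repository's own code is not quoted on this page, so the following is a minimal sketch of the kind of measurement the tool performs, assuming the common pattern of profiling a HuggingFace transformers model with the thop library; the model name and probe sentence are illustrative.

from thop import profile
from transformers import AutoModel, AutoTokenizer

# Load an illustrative pre-trained model and tokenizer.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

# Build input IDs for a single probe sentence.
input_ids = tokenizer("Hello, world!", return_tensors="pt")["input_ids"]

# thop counts multiply-accumulate operations (MACs) and parameters for
# one forward pass; FLOPs are conventionally approximated as 2 * MACs.
macs, params = profile(model, inputs=(input_ids,), verbose=False)
print(f"FLOPs:  {2 * macs / 1e9:.2f} G")
print(f"Params: {params / 1e6:.2f} M")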

No commits in the last 6 months.

Use this if you need to compare the efficiency of different NLP models or optimize existing ones for deployment on resource-constrained devices.
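
For model comparison, the same probing can be repeated over several checkpoints with an identical input, so the counts are directly comparable. A hedged sketch, with illustrative model names:

from thop import profile
from transformers import AutoModel, AutoTokenizer

# Profile each checkpoint with the same probe input and tabulate results.
for name in ["distilbert-base-uncased", "bert-base-uncased"]:
    tokenizer = AutoTokenizer.from_pretrained(name)
    model = AutoModel.from_pretrained(name)
    ids = tokenizer("The same probe sentence.", return_tensors="pt")["input_ids"]
    macs, params = profile(model, inputs=(ids,), verbose=False)
    print(f"{name}: {macs / 1e9:.2f} GMACs, {params / 1e6:.1f} M params")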

Not ideal if you are a business user looking for a no-code solution to optimize your NLP application; this is a developer tool requiring Python and PyTorch knowledge.

Tags: NLP-model-optimization, computational-efficiency, model-profiling, deep-learning-engineering
No License · Stale (6 months) · No Package · No Dependents
Maintenance: 0 / 25
Adoption: 6 / 25
Maturity: 8 / 25
Community: 15 / 25

The overall score is the sum of the four category scores: 0 + 6 + 8 + 15 = 29 out of 100.

Stars: 18
Forks: 4
Language: Python
License: None
Last pushed: Mar 29, 2021
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/XingLuxi/Cal-FLOPs-for-PLM"

Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000 requests/day.
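
The same endpoint can be queried from Python. This sketch only prints the raw JSON, since the response schema is not documented on this page.

import requests

# Same endpoint as the curl example above.
url = ("https://pt-edge.onrender.com/api/v1/quality/"
       "transformers/XingLuxi/Cal-FLOPs-for-PLM")
response = requests.get(url, timeout=10)
response.raise_for_status()
print(response.json())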