tejasvaidhyadev/ALBERT.jl
ALBERT (A Lite BERT for Self-Supervised Learning of Language Representations) implementation in Julia
This project implements ALBERT, a parameter-efficient variant of BERT, in Julia. It takes raw text as input and supports training a model that captures language context and the relationships between sentences. It is aimed at data scientists and AI researchers who work in Julia and need efficient natural language processing.
No commits in the last 6 months.
Use this if you are a data scientist or AI researcher working in Julia and need a resource-efficient language model for tasks like text analysis or natural language understanding.
Not ideal if you work outside the Julia ecosystem, or if you need a production-ready, actively maintained NLP library.
Stars
7
Forks
—
Language
Julia
License
MIT
Category
Last pushed
Aug 24, 2020
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/tejasvaidhyadev/ALBERT.jl"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
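As a sketch of consuming the endpoint above from a script rather than curl, the snippet below fetches the JSON and reads a field. The response schema is not documented here, so the field names in the sample payload (`stars`, `language`, `license`) are illustrative assumptions based on the stats shown on this page, not a confirmed API contract.

```python
import json
import urllib.request

URL = "https://pt-edge.onrender.com/api/v1/quality/transformers/tejasvaidhyadev/ALBERT.jl"

def fetch_quality(url=URL):
    """Fetch the repo-quality JSON from the API (100 requests/day without a key)."""
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

# Offline illustration with a hypothetical payload -- the real schema may differ.
sample = '{"stars": 7, "language": "Julia", "license": "MIT"}'
data = json.loads(sample)
print(data["stars"])  # -> 7
```

Calling `fetch_quality()` performs the live request; the sample payload lets you see the parsing step without hitting the rate limit.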
Higher-rated alternatives
Tongjilibo/bert4torch
An elegant PyTorch implementation of transformers
nyu-mll/jiant
jiant is an nlp toolkit
lonePatient/TorchBlocks
A PyTorch-based toolkit for natural language processing
monologg/JointBERT
Pytorch implementation of JointBERT: "BERT for Joint Intent Classification and Slot Filling"
grammarly/gector
Official implementation of the papers "GECToR – Grammatical Error Correction: Tag, Not Rewrite"...