partarstu/transformers-in-java
Experimental project for AI and NLP based on Transformer Architecture
This experimental project implements the Transformer architecture in Java. It lets developers build, train, and modify Transformer-based models for tasks such as masked language modeling and text generation. Built on the DeepLearning4J framework, it provides configurable layers and blocks for assembling various model architectures.
No commits in the last 6 months.
Use this if you are a Java developer looking to experiment with or implement Transformer models for AI and NLP tasks using DeepLearning4J.
Not ideal if you are an end-user seeking a ready-to-use NLP application or if you prefer frameworks other than DeepLearning4J for your machine learning projects.
Stars: 16
Forks: 5
Language: Java
License: Apache-2.0
Category:
Last pushed: Jan 01, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/partarstu/transformers-in-java"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000/day.
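The same endpoint can be queried from Java using the standard `java.net.http` client. This is a minimal sketch that only assumes what the curl example above shows (a plain GET returning a text body); the class name and the way the repo path is spliced into the URL are illustrative, not part of any documented client.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class QualityLookup {
    public static void main(String[] args) throws Exception {
        // Repo path taken from the curl example above.
        String repo = "partarstu/transformers-in-java";
        URI uri = URI.create(
                "https://pt-edge.onrender.com/api/v1/quality/transformers/" + repo);

        // Plain GET request, mirroring the curl call (no API key header,
        // which the free tier above does not require).
        HttpRequest request = HttpRequest.newBuilder(uri).GET().build();

        HttpClient client = HttpClient.newHttpClient();
        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());

        // Print the raw response body (presumably JSON).
        System.out.println(response.body());
    }
}
```

If you need the higher rate limit, consult the service's documentation for how to pass your key; the mechanism is not specified here.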
Higher-rated alternatives
ThilinaRajapakse/simpletransformers
Transformers for Information Retrieval, Text Classification, NER, QA, Language Modelling,...
jsksxs360/How-to-use-Transformers
A quick-start tutorial for the Transformers library
google/deepconsensus
DeepConsensus uses gap-aware sequence transformers to correct errors in Pacific Biosciences...
Denis2054/Transformers-for-NLP-2nd-Edition
Transformer models from BERT to GPT-4, environments from Hugging Face to OpenAI. Fine-tuning,...
abhimishra91/transformers-tutorials
GitHub repo with tutorials on fine-tuning transformers for different NLP tasks