affjljoo3581/polyglot-jax-inference
A Jax/Flax implementation for Korean LLM inference on TPUs.
This project offers a way to run large language models (LLMs) for the Korean language on Google's Tensor Processing Units (TPUs). It loads an existing Korean LLM, such as Polyglot-ko or KORani, and runs inference to generate Korean text. It is designed for developers and researchers working on large-scale Korean natural language processing tasks who need high-performance inference.
No commits in the last 6 months.
Use this if you are a machine learning engineer or researcher focused on deploying or experimenting with Korean language models based on the GPT-NeoX architecture for efficient text generation on TPUs.
Not ideal if you are looking to deploy LLaMA-based Korean models or if you do not have access to or experience with TPU environments and Jax/Flax.
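The repository's own entry points are not shown in this listing, so as a rough illustration of what TPU text generation with Jax looks like, here is a minimal greedy-decoding sketch. The model function and vocabulary size are placeholders, not this project's actual API; a real run would substitute a Flax GPT-NeoX checkpoint such as Polyglot-ko.

```python
# Minimal sketch of greedy decoding with JAX on a TPU.
# NOTE: `dummy_logits_fn` is a stand-in for a real Flax language model;
# it is NOT this repository's API, which is not shown in the listing.
import jax
import jax.numpy as jnp

VOCAB_SIZE = 30_000  # placeholder; Polyglot-ko defines its own vocabulary


def dummy_logits_fn(token_ids: jnp.ndarray) -> jnp.ndarray:
    """Stand-in for a Flax LM: maps (seq_len,) token ids to (vocab,) logits."""
    # A real implementation would call something like model.apply(params, ids).
    return jnp.ones((VOCAB_SIZE,)).at[token_ids[-1] % VOCAB_SIZE].add(1.0)


@jax.jit
def greedy_step(token_ids: jnp.ndarray) -> jnp.ndarray:
    """One greedy decoding step: pick the argmax token from the logits."""
    logits = dummy_logits_fn(token_ids)
    return jnp.argmax(logits)


def generate(prompt_ids, max_new_tokens=8):
    # Real inference code would use a fixed-length buffer and a KV cache to
    # avoid re-tracing; this loop keeps the sketch short and readable.
    tokens = list(prompt_ids)
    for _ in range(max_new_tokens):
        next_id = int(greedy_step(jnp.asarray(tokens)))
        tokens.append(next_id)
    return tokens


if __name__ == "__main__":
    # On a TPU VM this lists TpuDevice entries; on CPU it falls back cleanly.
    print("devices:", jax.devices())
    print("generated ids:", generate([1, 2, 3]))
```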
Stars: 12
Forks: 2
Language: Python
License: MIT
Category:
Last pushed: Jun 12, 2023
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/affjljoo3581/polyglot-jax-inference"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
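For programmatic access, the same endpoint can be queried from Python. Below is a minimal sketch using the requests library; the response schema is not documented in this listing, so the example simply prints the raw JSON.

```python
# Fetch the quality data for this repository from the pt-edge API.
import requests

URL = ("https://pt-edge.onrender.com/api/v1/quality/"
       "transformers/affjljoo3581/polyglot-jax-inference")

resp = requests.get(URL, timeout=10)
resp.raise_for_status()
# The field names aren't documented here, so just dump the whole payload.
print(resp.json())
```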
Higher-rated alternatives
ThilinaRajapakse/simpletransformers
Transformers for Information Retrieval, Text Classification, NER, QA, Language Modelling,...
jsksxs360/How-to-use-Transformers
A quick-start tutorial for the Transformers library
google/deepconsensus
DeepConsensus uses gap-aware sequence transformers to correct errors in Pacific Biosciences...
Denis2054/Transformers-for-NLP-2nd-Edition
Transformer models from BERT to GPT-4, environments from Hugging Face to OpenAI. Fine-tuning,...
abhimishra91/transformers-tutorials
GitHub repo with tutorials on fine-tuning transformers for different NLP tasks