newfull5/NLLB-200-Distilled-350M-en-ko
NLLB-200 Distilled 350M for English-to-Korean translation
This project offers a compact English-to-Korean translation model: it takes English text as input and produces fluent Korean output. It is aimed at anyone who needs to translate English documents, articles, or messages into Korean efficiently, especially on standard computers with limited resources.
No commits in the last 6 months.
Use this if you need to translate English text into Korean quickly and your computer doesn't have a high-end graphics card or extensive memory.
Not ideal if you require the absolute highest translation accuracy for very complex or nuanced texts, where slightly larger models might offer an advantage.
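For running the model locally, the sketch below assumes the checkpoint follows the standard NLLB-200 interface in Hugging Face `transformers`; the language codes (`eng_Latn`, `kor_Hang`) and the `forced_bos_token_id` step are conventions from the base NLLB-200 checkpoints, not details confirmed by this listing:

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Model ID taken from the listing above; src_lang follows NLLB conventions.
model_id = "newfull5/NLLB-200-Distilled-350M-en-ko"
tokenizer = AutoTokenizer.from_pretrained(model_id, src_lang="eng_Latn")
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

inputs = tokenizer("Machine translation is useful.", return_tensors="pt")
# Force the decoder to start in Korean (Hangul script), as with base NLLB-200.
generated = model.generate(
    **inputs,
    forced_bos_token_id=tokenizer.convert_tokens_to_ids("kor_Hang"),
    max_length=128,
)
korean_text = tokenizer.batch_decode(generated, skip_special_tokens=True)[0]
print(korean_text)
```

If the checkpoint was fine-tuned exclusively for the en→ko direction, the `forced_bos_token_id` argument may be redundant, but passing it is harmless and matches general NLLB usage.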
Stars
28
Forks
1
Language
Jupyter Notebook
License
—
Category
—
Last pushed
Apr 27, 2024
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/newfull5/NLLB-200-Distilled-350M-en-ko"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
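The same endpoint can be queried from Python with the standard library; the response schema is not documented here, so this sketch just fetches and pretty-prints whatever JSON comes back — inspect it before relying on specific field names:

```python
import json
import urllib.request

# Endpoint copied from the curl example above (no API key needed
# within the 100 requests/day free tier).
url = (
    "https://pt-edge.onrender.com/api/v1/quality/transformers/"
    "newfull5/NLLB-200-Distilled-350M-en-ko"
)
with urllib.request.urlopen(url, timeout=30) as resp:
    data = json.load(resp)

print(json.dumps(data, indent=2, ensure_ascii=False))
```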
Higher-rated alternatives
rasbt/LLMs-from-scratch
Implement a ChatGPT-like LLM in PyTorch from scratch, step by step
facebookresearch/LayerSkip
Code for "LayerSkip: Enabling Early Exit Inference and Self-Speculative Decoding", ACL 2024
FareedKhan-dev/train-llm-from-scratch
A straightforward method for training your LLM, from downloading data to generating text.
kmeng01/rome
Locating and editing factual associations in GPT (NeurIPS 2022)
datawhalechina/llms-from-scratch-cn
Build a large language model from scratch with only basic Python; construct GLM4/Llama3/RWKV6 step by step from zero and gain a deep understanding of how large models work