BobMcDear/llaf
LLMs in Futhark
This project offers a way to perform large language model (LLM) inference using Futhark, a functional programming language. It takes pre-trained LLM parameters and an initial text context as input, then generates additional text tokens. This tool is designed for developers who are building high-performance deep learning applications and are interested in exploring alternative languages for GPU-accelerated array processing.
No commits in the last 6 months.
Use this if you have a functional programming background (Haskell or the ML family) and want to implement and experiment with LLM inference in Futhark, which compiles to data-parallel code for GPUs or multi-threaded CPUs.
Not ideal if you need state-of-the-art performance for LLM inference, as dedicated deep learning frameworks like PyTorch will be significantly faster.
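To give a flavor of the data-parallel style Futhark encourages, here is a minimal illustrative sketch (not code from this repository) of a matrix-vector product, the workhorse operation of LLM inference, built from `map` and `reduce`:

```futhark
-- Dot product: elementwise multiply, then a parallel reduction.
def dot [n] (xs: [n]f32) (ys: [n]f32): f32 =
  reduce (+) 0f32 (map2 (*) xs ys)

-- Matrix-vector product: the dot product of each row with x
-- runs as an independent parallel task.
def matvec [m] [n] (a: [m][n]f32) (x: [n]f32): [m]f32 =
  map (\row -> dot row x) a
```

The size parameters in brackets (`[n]`, `[m]`) let the compiler check array shapes statically, and the nested `map`/`reduce` structure is what the Futhark compiler flattens into efficient GPU or multicore code.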
Stars
12
Forks
1
Language
Futhark
License
MIT
Category
Last pushed
Sep 01, 2025
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/BobMcDear/llaf"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
AI-Planning/l2p
Library for LLM-driven action model acquisition via natural language
datawhalechina/self-llm
"Open-Source LLM Cookbook": tutorials tailored for Chinese beginners on quickly fine-tuning (full-parameter/LoRA) and deploying domestic and international open-source large language models (LLMs) and multimodal large models (MLLMs) in a Linux environment
microsoft/LMOps
General technology for enabling AI capabilities w/ LLMs and MLLMs
liguodongiot/llm-action
This project shares the technical principles behind large models along with hands-on experience (LLM engineering and real-world LLM application deployment)
theaniketgiri/create-llm
The fastest way to build and start training your own LLM. CLI tool that scaffolds...