zechenli03/ZARA
Official implementation of "ZARA: Zero-shot Motion Time-Series Analysis via Knowledge and Retrieval Driven LLM Agents"
This project helps researchers and practitioners classify human activities from motion sensor data without training a separate model for each activity. It takes raw motion time-series from wearable or ambient sensors and identifies the activity being performed, which is useful in fields such as health monitoring, sports analytics, and smart environments.
Use this if you need to accurately identify human activities from sensor data, even for activities you haven't explicitly trained a model on.
Not ideal if your primary need is to build or fine-tune task-specific classification models for human activity recognition.
Stars: 14
Forks: 2
Language: Python
License: —
Category: —
Last pushed: Oct 30, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/rag/zechenli03/ZARA"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
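The endpoint above can also be called from code. A minimal Python sketch, based only on the curl example (the shape of the JSON response is not documented here, so the `fetch_quality` helper simply decodes whatever JSON comes back):

```python
import json
import urllib.request

# Base path taken from the curl example above
BASE = "https://pt-edge.onrender.com/api/v1/quality/rag"

def quality_url(owner: str, repo: str) -> str:
    """Build the per-repository quality endpoint URL."""
    return f"{BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch and decode the JSON payload (response fields are not documented here)."""
    with urllib.request.urlopen(quality_url(owner, repo), timeout=10) as resp:
        return json.load(resp)

print(quality_url("zechenli03", "ZARA"))
```

No API key is needed at the free tier; how a key is passed for the higher limit is not specified on this page.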
Higher-rated alternatives
llmware-ai/llmware
Unified framework for building enterprise RAG pipelines with small, specialized models
Sinapsis-AI/sinapsis-chatbots
Monorepo of sinapsis templates supporting LLM-based agents
aimclub/ProtoLLM
Framework for prototyping LLM-based applications
Azure-Samples/azureai-foundry-finetuning-raft
A recipe that will walk you through using either Meta Llama 3.1 405B or OpenAI GPT-4o deployed...
xi029/Qwen3-VL-MoeLORA
Compares several LoRA fine-tuning setups on Qwen3-VL-4B-Instruct, Qwen's latest multimodal image-text model, deployed via langchain + RAG + Multi-Agent