EvolvingLMMs-Lab/EgoLife
[CVPR 2025] EgoLife: Towards Egocentric Life Assistant
EgoLife is an AI assistant that helps individuals manage their daily lives by interpreting their activities from a first-person perspective. It takes egocentric video and audio recordings, typically captured with smart glasses, and generates detailed captions of events and actions. The output is a highly contextualized memory aid that can answer questions about past activities, track habits, and help recall events, making it useful for personal productivity and memory support.
403 stars. No commits in the last 6 months.
Use this if you need an AI to automatically log and interpret your daily experiences from a first-person viewpoint, helping you remember events, track routines, and answer specific questions about your past actions.
Not ideal if you're looking for a general-purpose AI assistant that doesn't rely on continuous, first-person visual and auditory data capture.
Stars: 403
Forks: 19
Language: Python
License: —
Category: computer-vision
Last pushed: Mar 19, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/computer-vision/EvolvingLMMs-Lab/EgoLife"
Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000 requests/day.
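Beyond curl, the endpoint can be queried from Python. The sketch below is a minimal example, assuming the API returns JSON; the authorization header name and the response field names (`stars`, `forks`, etc.) are assumptions, not documented, so it parses a hypothetical sample payload offline rather than making a live request.

```python
import json
from urllib.request import Request, urlopen

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def fetch_repo_quality(category, owner, repo, api_key=None):
    """Fetch the quality record for a repository.

    The 'Authorization: Bearer' header name is an assumption; check the
    service's documentation for the actual key-passing convention.
    """
    url = f"{API_BASE}/{category}/{owner}/{repo}"
    headers = {"Authorization": f"Bearer {api_key}"} if api_key else {}
    with urlopen(Request(url, headers=headers)) as resp:
        return json.load(resp)

# Offline demonstration: parse a hypothetical response payload instead of
# hitting the network (field names are illustrative assumptions).
sample = '{"stars": 403, "forks": 19, "language": "Python", "commits_30d": 0}'
record = json.loads(sample)
print(record["stars"])  # → 403
```

A live call would then be `fetch_repo_quality("computer-vision", "EvolvingLMMs-Lab", "EgoLife")`, matching the curl example above.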
Related tools
visual-haystacks/mirage
🔥 [ICLR 2025] Official PyTorch implementation of "Visual Haystacks: A Vision-Centric Needle-In-A-Haystack Benchmark"
Devanik21/xylia-vision
Vision transformer-powered knowledge extraction. Analyze any image: botanical taxonomy, cultural...
anishalle/YOLO
You Only Look Once, fine-tuned LLM + scene graph reasoning used for navigation by visually...