kilian-kier/Dino-Game-AI
A simple pygame dino game that you can play yourself or that a NEAT AI can be trained to play
This project lets you experience how an AI learns to play the classic Dino Game. You can either play the game yourself or watch a NEAT neural network learn and master the jumping mechanics. The AI takes game-state inputs, such as the dino's position and the distance to upcoming obstacles, and decides when to jump, which makes it a good starting point for anyone curious about how simple AIs learn through trial and error.
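In the repo itself the network topology is evolved by NEAT, but the core decision step can be sketched as a single neuron mapping game state to a jump decision. The input names, weights, and threshold below are illustrative assumptions, not values taken from the project:

```python
import math

def decide_jump(dino_y, obstacle_dist, obstacle_height, weights, bias):
    """Feed the game state through one artificial neuron.

    Returns True when the tanh activation exceeds 0.5, i.e. "jump now".
    During NEAT training, the weights and bias are what evolution tunes.
    """
    activation = math.tanh(
        weights[0] * dino_y
        + weights[1] * obstacle_dist
        + weights[2] * obstacle_height
        + bias
    )
    return activation > 0.5

# A nearby obstacle triggers a jump; a distant one does not.
w, b = [0.0, -0.05, 0.02], 1.0
print(decide_jump(300, 10, 50, w, b))   # → True
print(decide_jump(300, 200, 50, w, b))  # → False
```

During training, each genome's fitness is typically the distance survived with its particular weights, and NEAT keeps and mutates the best performers.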
No commits in the last 6 months.
Use this if you want to understand or demonstrate how a basic neural network can learn to play a simple game through repeated training.
Not ideal if you're looking for an AI that solves complex, real-world problems or if you need a sophisticated game AI for a production environment.
Stars
8
Forks
3
Language
Python
License
GPL-3.0
Category
Last pushed
May 14, 2022
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/kilian-kier/Dino-Game-AI"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
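The same request can be issued from Python's standard library. The page documents the API only through the curl example above, so the response schema is an assumption and the live fetch is left commented out:

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category, owner, repo):
    # Build the endpoint URL shown in the curl example above.
    return f"{API_BASE}/{category}/{owner}/{repo}"

url = quality_url("ml-frameworks", "kilian-kier", "Dino-Game-AI")
print(url)  # → https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/kilian-kier/Dino-Game-AI

# Uncomment to fetch live data (no key needed up to 100 requests/day;
# assuming the endpoint returns JSON):
# with urllib.request.urlopen(url) as resp:
#     data = json.load(resp)
#     print(data)
```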
Higher-rated alternatives
Paperspace/DinoRunTutorial
Accompanying code for Paperspace tutorial "Build an AI to play Dino Run"
utay/dino-ml
🦎 Simple AI to teach Google Chrome's offline dino to jump obstacles
simply-TOOBASED/dino-bot-3000
A simple bot to play Google's dinosaur game using neural networks and genetic algorithms to get smarter.
Dewep/T-Rex-s-neural-network
T-Rex's neural network (AI for the game T-Rex / Dinosaur on Google Chrome)
GuglielmoCerri/pytorch-dino-ai-game
Repository for training an AI model to play the Chrome Dino Game using PyTorch and EfficientNet