3a1b2c3/seeingSpace

A lightfield botanist's guide to neural rendering

23 / 100 · Experimental

This project offers an explanation and practical guide to Neural Radiance Fields (NeRFs), a technique for creating photorealistic 3D scene representations from standard images or videos. It takes multiple 2D images or video frames of a scene and produces an interactive 3D model that can be viewed from any angle. This resource is aimed at computer graphics artists, 3D modelers, virtual reality developers, and anyone interested in cutting-edge 3D scene reconstruction.

No commits in the last 6 months.

Use this if you want to learn neural rendering techniques and apply them to create realistic 3D scenes from ordinary 2D media.

Not ideal if you are looking for a simple drag-and-drop tool without any technical learning curve, as this delves into the underlying concepts and frameworks.
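At the core of the NeRF pipeline described above is volume rendering: a network predicts a density and color for sample points along each camera ray, and those samples are composited into a pixel color. The snippet below is a minimal NumPy sketch of that compositing step only (the network itself is omitted); the sample values are made up for illustration.

```python
import numpy as np

def volume_render(sigmas, colors, deltas):
    """Composite per-sample densities and RGB colors along one camera ray,
    following the standard NeRF volume-rendering equation (illustrative sketch)."""
    alphas = 1.0 - np.exp(-sigmas * deltas)      # opacity contributed by each sample
    trans = np.cumprod(1.0 - alphas + 1e-10)     # light surviving past each sample
    trans = np.concatenate([[1.0], trans[:-1]])  # shift so the first sample sees full light
    weights = trans * alphas                     # final contribution of each sample
    return weights @ colors                      # expected color of the ray

# Toy ray: 4 samples, each with a density, an RGB color, and a segment length.
sigmas = np.array([0.1, 1.0, 5.0, 0.2])
colors = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 1]], dtype=float)
deltas = np.full(4, 0.25)
print(volume_render(sigmas, colors, deltas))
```

Training a NeRF amounts to fitting the network so that rays rendered this way reproduce the pixel colors of the input photographs.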

3D modeling · virtual reality · computer graphics · scene reconstruction · digital twins
No License · Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 8 / 25
Maturity 8 / 25
Community 7 / 25


Stars: 57
Forks: 3
Language: Jupyter Notebook
License: None
Last pushed: Sep 25, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/3a1b2c3/seeingSpace"

Open to everyone: 100 requests/day with no key. A free key raises the limit to 1,000/day.
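The same endpoint can be queried from Python with the standard library. This is a minimal sketch: the URL comes from the curl example above, but the shape of the JSON response is an assumption, so inspect the real payload before relying on specific field names.

```python
import json
import urllib.request

# Endpoint copied from the curl example above.
URL = ("https://pt-edge.onrender.com/api/v1/quality/"
       "ml-frameworks/3a1b2c3/seeingSpace")

def fetch_quality_card(url: str = URL) -> dict:
    """Fetch the quality card for a repository and return the parsed JSON.
    The response schema is not documented here -- treat keys as unknown."""
    req = urllib.request.Request(url, headers={"Accept": "application/json"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.load(resp)

# Example (requires network access):
# card = fetch_quality_card()
# print(json.dumps(card, indent=2))
```

With a key, rate limits rise to 1,000 requests/day; how the key is passed (header or query parameter) is not specified here, so check the API's own documentation.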