yang-su2000/Voice2Action

ALICE and its prior work, Voice2Action: Language Models as Agent for Efficient Real-Time Interaction in Virtual Reality

Quality score: 35 / 100 (Emerging)

This package helps creators and designers rapidly build and modify 3D environments in virtual reality (VR) or game engines like Unity or Unreal. By simply speaking natural language commands, users can generate terrain, place objects, or change properties of existing structures. It takes your spoken instructions and outputs precise, real-time modifications within the virtual space, ideal for anyone working on virtual simulations, urban planning, or interactive game design.

No commits in the last 6 months.

Use this if you need to efficiently create, manipulate, and explore 3D virtual environments using intuitive voice commands rather than complex menus or mouse-and-keyboard inputs.

Not ideal if your primary goal is to generate text, code, or perform analytical tasks outside of direct 3D environment manipulation.

virtual-reality-development game-design urban-planning architectural-visualization 3d-modeling
Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 8 / 25
Maturity 16 / 25
Community 11 / 25


Stars: 44
Forks: 5
Language: C#
License: MIT
Last pushed: Sep 04, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/yang-su2000/Voice2Action"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
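For programmatic access, the curl command above can be wrapped in a small client. The sketch below is a minimal, hedged example: the endpoint URL is taken from the command shown here, but the response schema and the `Authorization: Bearer` header name for keyed access are assumptions, not documented API behavior.

```python
import json
import urllib.request

# Base endpoint taken from the curl example on this page.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools"


def build_url(owner: str, repo: str) -> str:
    """Build the quality-score endpoint URL for a GitHub repository."""
    return f"{API_BASE}/{owner}/{repo}"


def fetch_quality(owner: str, repo: str, api_key: str = "") -> dict:
    """Fetch the quality report as parsed JSON.

    The response schema is not documented on this page, so callers
    should inspect the returned dict. The bearer-token header used
    for keyed (1,000/day) access is an assumption.
    """
    req = urllib.request.Request(build_url(owner, repo))
    if api_key:
        req.add_header("Authorization", f"Bearer {api_key}")
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))


# No network call is made here; we only show the URL construction.
print(build_url("yang-su2000", "Voice2Action"))
```

Without an API key this uses the open 100-requests/day tier; pass a key to `fetch_quality` for the higher limit.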