
We built a Vision Pro eye-gaze arcade—with AI as a true copilot
Gravitas Dark Matter is (as far as we can tell) the first controller-free, eye-gaze space arcade shooter for Apple Vision Pro. You stand inside a play volume of roughly 10×10×100 ft, look to lock, pinch to fire. It runs in Room, Arena, and Immersive Galaxy modes, with a headset-anchored HUD and a custom gravity solver for smooth, readable motion.
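The post doesn't include the solver itself, but for the curious: a real-time gravity step of the kind described can be sketched in a few lines of Swift. This is a minimal illustrative sketch (semi-implicit Euler with softening), not the shipped code; `Body`, `step`, `G`, and `softening` are hypothetical names.

```swift
import simd

// Illustrative N-body gravity step (semi-implicit Euler).
// Names and constants are hypothetical, not from the actual codebase.
struct Body {
    var position: SIMD3<Float>
    var velocity: SIMD3<Float>
    var mass: Float
}

func step(_ bodies: inout [Body], dt: Float, G: Float = 1.0, softening: Float = 0.05) {
    var accel = [SIMD3<Float>](repeating: .zero, count: bodies.count)
    for i in bodies.indices {
        for j in bodies.indices where j != i {
            let d = bodies[j].position - bodies[i].position
            // Softening keeps close encounters from blowing up the frame rate-sized dt.
            let r2 = simd_length_squared(d) + softening * softening
            accel[i] += G * bodies[j].mass * d / (r2 * r2.squareRoot())
        }
    }
    for i in bodies.indices {
        bodies[i].velocity += accel[i] * dt   // velocity first: semi-implicit Euler
        bodies[i].position += bodies[i].velocity * dt
    }
}
```

Semi-implicit Euler is a common choice for games because it stays stable at display-rate timesteps without the cost of a higher-order integrator.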
Why this is relevant to OpenAI folks
We treated AI as a full-stack teammate—not just for code hints:
Prototyping & engine glue: AI-assisted Swift/RealityKit snippets (head-locked HUD, starfield batching, pooled SFX, laser system), then hand-tuned.
Design ops: fast iterations on UI comps, iconography, infographics, and store artwork; copy passes for App Store, TestFlight, and outreach.
Go-to-market checklists: submission flows, feature nominations, and CPP strategy distilled into actionables we could execute same day.
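As an example of the kind of engine-glue snippet mentioned above: on visionOS a head-locked HUD largely reduces to parenting content to a head anchor. A hedged sketch, assuming RealityKit; the reticle entity and offsets are illustrative, not our actual HUD:

```swift
import RealityKit

// Sketch: headset-anchored HUD. Content parented to a .head anchor
// follows the wearer's head pose automatically.
func makeHUD() -> AnchorEntity {
    let hud = AnchorEntity(.head)
    let reticle = ModelEntity(mesh: .generatePlane(width: 0.05, height: 0.05))
    reticle.position = SIMD3<Float>(0, 0, -1.0) // 1 m in front of the eyes
    hud.addChild(reticle)
    return hud
}
```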
What’s novel
Pure eye-gaze targeting (no controller)
Volumetric arena + mixed reality & full immersion
Real-time gravitational motion (not on rails)
60 bite-size levels tuned for “one-more-round” play
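For readers wondering how "look to lock, pinch to fire" maps to the platform: visionOS never exposes raw gaze data to apps; instead, a pinch delivers a tap on whatever entity the eyes are resting on. A minimal sketch of that pattern (the scene setup and `fire` handler are illustrative, not our game code):

```swift
import SwiftUI
import RealityKit

// Sketch: gaze + pinch targeting on visionOS. The system resolves which
// entity the user is looking at; the app only observes the pinch "tap".
struct ArenaView: View {
    var body: some View {
        RealityView { content in
            // Hypothetical target; entities need collision shapes and an
            // InputTargetComponent to receive gaze-targeted gestures.
            let target = ModelEntity(mesh: .generateSphere(radius: 0.1))
            target.components.set(InputTargetComponent())
            target.generateCollisionShapes(recursive: false)
            content.add(target)
        }
        .gesture(
            SpatialTapGesture()
                .targetedToAnyEntity()
                .onEnded { value in
                    fire(at: value.entity) // pinch while gazing at the entity
                }
        )
    }

    func fire(at entity: Entity) {
        // Illustrative: spawn a laser / apply damage here.
    }
}
```

Because gaze never reaches the app, targeting is privacy-preserving by construction; the trade-off is that "lock-on" feedback has to come from system hover effects or game-side heuristics rather than a raw gaze ray.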
Try it / tear it apart
TestFlight (free): https://testflight.apple.com/join/SsmPTnh8
60s/3-min reel: https://youtu.be/tDIaEWzw44o?si=K-ySLbW6EyHOh2NS
App Store page: https://apps.apple.com/us/app/gravitas-dark-matter/id6749031598
We’re looking for blunt feedback on comfort, gaze latency, and HUD clarity—and we’re happy to share what worked (and what didn’t) about using AI day to day to ship a spatial game fast. We completed the game in 3–4 weeks (the core physics was already developed in previous projects with o3-pro).