XRAI v1.0 · MIT-licensed · shipping 2026-04-29

XRAI

X-ray vision. God's-eye view. Infinite zoom.

Voice your world into being. See through. See over. See across.

Open spatial graph format — authored by any LLM, rendered by any engine, queryable by any agent. Shipping today in Portals on iPhone: 60 FPS, 360+ VFX, persistent stateful worlds (CVPR 2026). Free forever. No CLA. No gatekeeper. No take-backs.

Visionary AI. God Glasses for the Masses.
Join jARvis. Jam with friends. Harness holographic intelligence.
Inspire infinite imagination. See more. Be more.
Open the doors of perception. The singularity is near.
A far-out future is here — open, not captured.
Get started → Read the manifesto

The problem XRAI answers

We are blind. We are island minds. Code is opaque. AI is black-boxed. The web is flat. Search is broken. Education is broken. Communication is dated.

Social media brainwashes and isolates. GUIs cripple bodies and minds. Profit and algorithms drive conflict and instant gratification. People feel small and powerless. Civilization is at risk.

XRAI is the substrate for a different medium — one where intent becomes space, memory persists, minds bridge, and the agent's view of the world is legible to you, not hidden behind a prompt.

Infinite zoom

A graph that scales in four directions. Every XRAI scene can be navigated along each axis.

↔ Across time Past, present, future. Replay last Tuesday. Branch into "what if." Memex + time-travel as first-class schema, not a bolted-on feature.
⌥ Across possibilities Counterfactual branches, alternative compositions, multiverse observation. The unrealized is as addressable as the rendered.
∞ Across minds Your graph + my graph + the agent's graph, federated. Island minds bridge. Collective cognition with sovereignty preserved.
⌖ Across scales Molecule → room → city → planet → cosmos. No native zoom ceiling. The graph describes its own structure at every level.
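A minimal sketch of what navigating those four axes could look like in code. The field names (`t`, `branch`, `owner`, `scale`) and the `along` query are illustrative assumptions, not the published schema: the point is that each axis is just another addressable dimension of the same graph.

```python
from dataclasses import dataclass

# Hypothetical in-memory sketch of the four navigation axes.
# Field names (t, branch, owner, scale) are illustrative, not from SPEC.md.

@dataclass
class Node:
    id: str
    t: float          # across time: timestamp (seconds)
    branch: str       # across possibilities: counterfactual branch id
    owner: str        # across minds: which federated graph it came from
    scale: int        # across scales: rough log10 metres (0 = room, 4 = city)

def along(nodes, axis, value):
    """Filter a node set along one axis; the other axes stay free."""
    return [n for n in nodes if getattr(n, axis) == value]

nodes = [
    Node("lamp",  t=0.0, branch="main",    owner="me",    scale=0),
    Node("lamp'", t=0.0, branch="what-if", owner="me",    scale=0),
    Node("city",  t=0.0, branch="main",    owner="agent", scale=4),
]

print([n.id for n in along(nodes, "branch", "what-if")])  # counterfactual slice
print([n.id for n in along(nodes, "owner", "agent")])     # another mind's graph
```

The same `along` call serves replay (time), multiverse observation (branch), federation (owner), and zoom (scale) without four separate APIs.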

Proof it ships

Portals — the reference implementation — shipped on iOS and at CVPR 2026. The numbers below are from the peer-reviewed camera-ready paper.

60 FPS on iPhone 14 Pro — 2.7–4.1× speedup over prior art. LOD-adaptive Gaussian splatting (SPAG) + shared spatial-media compute substrate fusing depth, stencil, audio, and ML-pose channels. Drives 360+ source-agnostic VFX.

Stateful worlds, not stateless scenes. Persistent geospatial scene-state with layered world metadata, reloadable payloads, and anchor-guided re-alignment across sessions, devices, and users.

Voice-driven semantic authoring. On-device intent parsing + cloud fallback for ambiguous utterances. No-code composition pipeline bridging reconstruction and generation.
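The on-device-first, cloud-fallback routing described above can be sketched as a confidence threshold. Everything here is a stand-in: the toy keyword parser, the `0.5` threshold, and the function names are assumptions, not Portals internals.

```python
# Hedged sketch of on-device intent parsing with cloud fallback for
# ambiguous utterances. Parser, threshold, and names are illustrative only.

def parse_on_device(utterance: str) -> tuple[dict, float]:
    """Toy keyword matcher standing in for the on-device intent model."""
    known = {"garden": {"intent": "spawn", "asset": "meditation_garden"}}
    for word, intent in known.items():
        if word in utterance.lower():
            return intent, 0.9
    return {"intent": "unknown"}, 0.2

def parse_cloud(utterance: str) -> dict:
    """Stand-in for the cloud model that handles ambiguous utterances."""
    return {"intent": "clarify", "utterance": utterance}

def route(utterance: str, threshold: float = 0.5) -> dict:
    intent, confidence = parse_on_device(utterance)
    return intent if confidence >= threshold else parse_cloud(utterance)

print(route("make me a garden"))             # resolved on device
print(route("do the thing from yesterday"))  # falls back to cloud
```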

Cross-platform today. iOS + web viewers (Apple Vision Pro compatible) for reconstructed environments, volumetric humans, holographic spatial media.

Paper: Portals: Persistent, Editable 4D Spatial World Models on Edge Devices — Tunick, Brant, Pennock, Kasowski — H3M Inc. + IMC Lab — CVPR 2026 Workshop on 4D World Models, submitted 2026-04-10. Prior clinical deployment of volumetric AR at Memorial Sloan Kettering established the real-time rendering primitives.

What XRAI is

A typed hypergraph of entities, relationships, events, and intentions — anchored in space and time, renderable in any medium, authored by any LLM, queryable by any agent.

One format. Scene graphs for AR / VR / XR today. Episodic memory for voice assistants tomorrow. Living knowledge graphs the year after. Language of thought eventually.
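To make "typed hypergraph, anchored in space and time" concrete, here is an illustrative-only sketch of what a minimal payload could look like. The key names below are assumptions for illustration, not the published SPEC.md schema; note the n-ary `members` list, which is a hyperedge rather than a pair.

```python
import json

# Illustrative-only sketch of a minimal XRAI-style payload.
# All key names are assumptions, not the published SPEC.md schema.

doc = {
    "xrai": "1.0",
    "entities": [
        {"id": "plant-1", "type": "object", "label": "fiddle-leaf fig",
         "anchor": {"space": "room", "pos": [0.4, 0.0, -1.2]}},
        {"id": "note-1", "type": "annotation", "label": "water weekly"},
    ],
    "relations": [
        # typed n-ary relation (hyperedge): one edge, any number of members
        {"type": "attached", "members": ["note-1", "plant-1"],
         "t": "2026-04-29T10:00:00Z"},
    ],
}

payload = json.dumps(doc, indent=2)
print(payload)
```

Anything an LLM can emit as structured text like this, any agent can parse back out, which is the whole "authored by any LLM, queryable by any agent" claim in data form.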

MIT licensed · CC0 schema · No CLA · No gatekeeper · No tracking · Forever open

What XRAI is not

What you can say — 20 use cases

Every example maps one voice command → XRAI payload → existing Portals spec → rendered on any runtime. Not reinvented — curated from shipping work. See full details: USE_CASES.md.

First jARvis demos

Short videos that sell the thesis — spatial AI that gives users superpowers. See the full demo plan in demos/jarvis_demos.md.

Demo 1 — Voice to World (60s)
Say a sentence, see a world. Spatial meditation garden materializes from voice. XRAI graph visible. Saved scene → portable to any device via QR code.
Demo 2 — Time Travel (30s)
"Show me the last 5 conversations I had about Portals." Spatial timeline of past sessions. Tap one. Replay the workspace as it was.
Demo 3 — Ambient Expertise (15s)
Look at a plant. Ask "what's wrong?" Glasses overlay diagnosis grounded in XRAI knowledge graph.

Design principles

  1. Radical simplicity — grokkable in 1 hour
  2. LLM-authorable — any model can emit valid XRAI from natural input
  3. Permissionless adoption — no gatekeeper, no tracking
  4. Forgiving parsing — partial inputs still work (Postel's law)
  5. Extensible without breaking — v1 stays valid as v2 ships
  6. Spatial + temporal + modal — scenes, episodes, alternatives
  7. Typed n-ary relations — hyperedges, not just pairs (v1.1+)
  8. Personal + federated — your graph is yours; publish slices
  9. MCP-native — any agent reads/writes via standard tools
  10. Self-referential — the schema describes itself
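Principles 4 and 5 can be sketched together: a parser that defaults missing fields and preserves unknown ones, so partial v1 documents still work and v2 extensions pass through untouched. The `DEFAULTS` keys are illustrative assumptions, not the real schema.

```python
# Sketch of principle 4 (forgiving parsing, Postel's law) and principle 5
# (extensible without breaking). Default keys are illustrative only.

DEFAULTS = {"xrai": "1.0", "entities": [], "relations": []}

def parse_forgiving(doc: dict) -> dict:
    """Be liberal in what you accept: default the missing, keep the unknown."""
    out = dict(DEFAULTS)
    for key in DEFAULTS:
        if key in doc:
            out[key] = doc[key]
    # unknown keys are passed through rather than rejected, so a v2
    # document still round-trips through a v1 parser
    for key, value in doc.items():
        out.setdefault(key, value)
    return out

partial = {"entities": [{"id": "a"}], "x-vendor-extension": True}
print(parse_forgiving(partial))
```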

The lineage

Every generation tried. Each missed one piece that now exists.

LLMs solve the authoring burden. Cameras ground the symbols. MCP federates the agents. The window just opened.

Get started

Start here — the vision: VISION.md 🌟 — problem + Sight Triad + jARvis goal + CVPR proof. Load-bearing source for everything else.

Read the manifesto: MANIFESTO.md

Read the spec: SPEC.md v1.0 draft

Demos: demos/jarvis_demos.md

Spatial MCP server: mcp-server/ — 11 tools (9 xrai.* + 2 paint.*). Any MCP-compatible agent (Claude Desktop, Cursor, Cline, OpenAI, Gemini-with-shim) can author XRAI scenes + procedural paint strokes. Replaces Open Brush's port-40074 HTTP API with typed, stateless, multi-agent MCP.
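For a sense of the wire format, an agent invokes one of those tools with a standard MCP `tools/call` request. The JSON-RPC envelope below follows the MCP wire format; the tool name `xrai.create_scene` and its arguments are hypothetical, since the repo's actual tool list lives in mcp-server/.

```python
import json

# Shape of an MCP tools/call request an agent might send to the server.
# "xrai.create_scene" and its arguments are hypothetical; the JSON-RPC 2.0
# envelope is the standard MCP tool-invocation format.

request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "xrai.create_scene",
        "arguments": {"prompt": "spawn a meditation garden", "anchor": "room"},
    },
}

print(json.dumps(request, indent=2))
```

Because this is plain typed JSON-RPC rather than a bespoke HTTP endpoint, any MCP-compatible agent can drive the server without XRAI-specific client code.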

Engine runtimes: runtimes/ — Three.js · PlayCanvas · Unity · Unreal · WebXR · visionOS RealityKit. One XRAI doc, every engine.

Repo: github.com/imclab/xra1 (launching 2026-04-29)

Contact: hello@xrai.dev

Governance (Year 1)

BDFL: @jamestunick (IMC Lab + H3M Inc.). Public RFC process. Weekly updates. No corporate sponsor controlling direction.

Year 2+: transfer to the Apache Software Foundation or a W3C Community Group once 1,000+ external adopters are validated. Never to a single vendor's foundation.

Commercial relationship

The spec is free. Forever. Period. Portals (H3M Inc.) builds commercial products on top: hosted XRAI cloud, best-in-class renderer, priority-quality LLM authoring, enterprise spatial intelligence API, vertical apps. The format stays free; services built on top are paid.

The Git → GitHub pattern: an open format with commercial products on top. HTML → Chrome. Markdown → Notion. The community can never be held hostage.