# XRAI Manifesto

**X-ray vision. God's-eye view. Infinite zoom. For everyone. Spoken into being.**

MIT-licensed. Forever. No CLA. No gatekeeping.

---

## The problem we are here to answer

We are blind. We are island minds.

Code is opaque. AI is black-boxed. The web is flat. Search is broken. Education is broken. Communication is dated.

Social media brainwashes and makes us sick. PCs, phones, and GUIs cripple our minds and bodies. Profit and algorithms drive conflict and instant gratification. People feel small and powerless. Apathy and isolation are rampant. Civilization is at risk.

None of this is destiny. It is the *shape* of the medium we inherited — rectangles of text, linear feeds, attention markets, one-way links. Aldous Huxley named the problem in 1954: our perceptual apparatus is a reducing valve. The medium we have narrows the aperture further.

**XRAI is the substrate for a different medium.** One where intent becomes space. Where memory persists instead of scrolling away. Where minds bridge instead of islanding. Where the agent's view of the world is legible to you, not hidden behind a prompt.

> *"If the doors of perception were cleansed, every thing would appear to man as it is — Infinite."* — Blake, via Huxley

That is the frame. The spec is the mechanism.

## The promise

**Three ways of seeing — the Sight Triad.**

- **X-ray vision** — see *through* opacity. Through the AI black box. Through the code. Through provenance chains. Through invisible infrastructure costs. Reasoning becomes inspectable, computation becomes accountable.
- **God's-eye view** — see *over* topology. The landscape of a field, the dynamics of a market, the shape of a civilizational problem. Zoom out until the pattern is obvious.
- **Infinite zoom** — see *across* time, possibilities, minds, scales.
  - **Across time.** Past, present, future. Memex + time-travel + counterfactual branches as first-class schema. Ask: *"show me what I was thinking last Tuesday."* The answer is addressable.
  - **Across possibilities.** The unrealized is as first-class as the rendered. Branch a scene. See the alternatives. Pick.
  - **Across minds.** Your graph + my graph + the agent's graph, federated. Island minds bridge. Collective cognition without surrendering sovereignty.
  - **Across scales.** Molecule → room → city → planet → cosmos. The graph describes its own structure at every level. No zoom ceiling.

**Visionary AI. God Glasses for the Masses.** Join jARvis. Jam with friends. Harness holographic intelligence. Inspire infinite imagination. See more. Be more.

**Open-source agents. AI world models. XR tools.** Empowering the next generation of creatives — and refusing to gate the medium they'll invent.

The singularity is near. It arrives open, or it arrives captured. We choose now.

A far-out future is here.

---

## The 80-year search

Vannevar Bush imagined **memex** in 1945 — personal associative trails between artifacts of thought. Technology wasn't ready.

Ted Nelson specified **Xanadu** in 1965 — bidirectional links, transclusion, versioning. Over-designed; never shipped.

Tim Berners-Lee gave us **WWW** in 1989 — one-way links that worked. The web of documents. Radical simplicity won; depth was sacrificed.

Larry Page + Sergey Brin showed in 1998 that **importance emerges from graph topology** (PageRank), not author claim. Citation as substrate.

Tim Berners-Lee proposed the **Giant Global Graph** in 2007 — connecting people and things with meaningful relationships. Adoption stalled because authoring was expensive and siloed.

Douglas Hofstadter taught us in *Gödel, Escher, Bach* and *Surfaces and Essences* that **analogy is the fuel of thinking** — cognition is structure-mapping, not symbol-shuffling.

Each generation got closer. Each hit the same wall: **structured cognition needs structured authoring, and humans won't pay that cost.**

## What changed

LLMs produce structured outputs for free at scale. Cameras in glasses and phones ground symbols in real space. MCP standardizes how agents share context. Embodied AI fuses language, perception, and action.

**The authoring burden that killed every prior attempt has dropped to zero.**

The window just opened.

## What XRAI is

**XRAI is an open format for typed hypergraphs of entities, relationships, events, and intentions — anchored in space and time, renderable in any medium, authored by any LLM, queryable by any agent.**
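
A minimal sketch of what that shape could look like, in TypeScript. The field names (`id`, `type`, `anchor`, `participants`, `time`) and the example relation are illustrative assumptions for explanation only — the normative schema lives in `SPEC.md`:

```ts
// Illustrative only: field names are placeholders, not the normative schema (see SPEC.md).
type NodeId = string;

interface Entity {
  id: NodeId;
  type: string;                                                   // e.g. "person", "room", "document"
  anchor?: { position: [number, number, number]; frame: string }; // spatial grounding
}

interface Relation {
  id: NodeId;
  type: string;                                 // typed and n-ary: a hyperedge, not just a pair
  participants: { role: string; ref: NodeId }[];
  time?: { start?: string; end?: string };      // temporal anchoring (ISO 8601)
}

// A tiny scene: an agent intends to place a generated model on a table.
const scene: { entities: Entity[]; relations: Relation[] } = {
  entities: [
    { id: "table-1", type: "furniture", anchor: { position: [0, 0.7, -1.2], frame: "room" } },
    { id: "model-1", type: "3d-asset" },
    { id: "agent-1", type: "agent" },
  ],
  relations: [
    {
      id: "intent-1",
      type: "intends-to-place",
      participants: [
        { role: "actor", ref: "agent-1" },
        { role: "object", ref: "model-1" },
        { role: "target", ref: "table-1" },
      ],
      time: { start: "2026-04-22T10:00:00Z" },
    },
  ],
};
```

The same structure reads as a scene graph to a renderer, an episode to a memory store, and a queryable graph to an agent.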

One format. Infinite uses:
- **Scene graph** for AR/VR/XR (today)
- **Episodic memory** for voice assistants (near-term)
- **Living knowledge graph** for research + creative work (mid-term)
- **Language of thought** — the substrate where minds, agents, and the physical world converge (the bet)

## What XRAI is NOT

- Not a company's proprietary format
- Not a vendor-controlled standard
- Not a committee-designed artifact
- Not paywalled, not tracked, not rent-extractive
- Not a replacement for HTML / glTF / USD — XRAI composes with them all

## Why it matters

Flat AI chat is dying. Users want context that persists, space that remembers, agents that share the same world. Current AI lives in rectangles of text. The AI we're building lives in space.

Someone will define the format that AI agents use to describe and share 3D / 4D / n-dimensional cognition. **The question is whether it will be open or captured.** Every major format has gone one way or the other:

- HTML open → WWW flourished
- USDZ Apple-gated → ecosystem captured by default (USD itself is open; the authoring + distribution paths are Apple-shaped)
- Wikidata + Wikipedia open → their knowledge powers half the internet
- Meta's social graph closed → walled gardens

XRAI is our chance to ship an open one before a closed one takes the slot.

## Design principles

1. **Radical simplicity** — grokkable in 1 hour, implementable in 1 week
2. **LLM-authorable** — any model can emit valid XRAI from natural input
3. **Permissionless adoption** — no license check, no gatekeeper, no tracking
4. **Forgiving parsing** — partial/wrong inputs still work (Postel's law; see the sketch after this list)
5. **Extensible without breaking** — v1 stays valid as v2 ships
6. **Spatial + temporal + modal** — scenes, episodes, alternatives, observations
7. **Typed n-ary relations** — hyperedges, not just pairs
8. **Personal + federated** — your graph is yours; publish slices
9. **MCP-native** — any agent can read/write/query via standard tools
10. **Self-referential** — the schema describes itself; the graph can describe its own structure
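
What principle 4 looks like in practice — a minimal sketch, assuming the same hypothetical field names as the earlier example (the real field set is defined in `SPEC.md`): fill defaults, ignore unknown fields, collect warnings instead of throwing.

```ts
// Illustrative sketch of Postel's-law parsing; field names are assumptions, not the normative schema.
interface XraiEntity {
  id: string;
  type: string;
  label?: string;
}

interface ParseResult {
  entities: XraiEntity[];
  warnings: string[];
}

// Accept partial or slightly wrong input: default missing fields, never throw.
function parseEntities(input: unknown): ParseResult {
  const warnings: string[] = [];
  const raw: any[] = Array.isArray((input as any)?.entities) ? (input as any).entities : [];
  if (raw.length === 0) warnings.push("no entities found; returning an empty graph");

  const entities: XraiEntity[] = raw.map((e: any, i: number) => {
    if (typeof e?.id !== "string") warnings.push(`entity ${i}: missing id, generated one`);
    if (typeof e?.type !== "string") warnings.push(`entity ${i}: missing type, defaulted to "thing"`);
    return {
      id: typeof e?.id === "string" ? e.id : `entity-${i}`,
      type: typeof e?.type === "string" ? e.type : "thing",
      ...(typeof e?.label === "string" ? { label: e.label } : {}),
      // Unknown fields are ignored rather than rejected, so v2 data still parses under v1.
    };
  });

  return { entities, warnings };
}
```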

## Anti-goals (what we refuse to build)

The shape of this project is partly defined by what we will not do. Inherited from the Portals constitution `specs/constitution.md §Anti-Goals`:

1. **No proprietary forks of the spec.** The v1.0 normative schema is CC0; the spec text and reference parsers are MIT. Anyone proposing a "Pro" or "Enterprise" XRAI dialect with paywalled features is not building XRAI.
2. **No committee-driven governance before 1000+ external adopters.** RFCs first, foundation later. We have seen what a 200-person standards body does to a young format.
3. **No silent versioning.** Every breaking change touches an RFC. Every RFC stays public, including rejected ones with a postscript explaining why.
4. **No hidden state in the wire format.** If a decoder needs a fact to render, that fact appears in the SSE stream — never in a vendor cloud, never in a lookup table only one company holds.
5. **No tracking, no telemetry, no auth gates on the spec text.** You can read, ship, and remix XRAI without ever pinging us.
6. **No platform lock-in.** Reference runtimes ship for Unity, Three.js, PlayCanvas, WebXR/Needle, visionOS, MCP. If a major platform refuses to ship a decoder, we ship a third-party adapter for it. The bus is neutral.

## Foundation principles (5+ year design)

Inherited from Portals constitution `§Foundation Principles`. These outlive any RFC or release:

1. **Platform agnostic** — XRAI runs everywhere: iOS, Android, Web, visionOS, Quest, Android XR, glasses. Any LLM backend. Any 3D-gen API. Any hardware form factor. The bus + format + MCP layer are the abstractions.
2. **Minimal surface area** — fewer files, fewer bugs. Each new RFC justifies its own existence vs extending an existing one.
3. **Boring technology** — stable, proven tools over newest releases. JSON not Cap'n Proto. WebGL not WebGPU-unless-we-need-it. MD over .docx, always.
4. **Explicit over clever** — readable beats compact. Schema fields named for what they mean, not what saves bytes.
5. **User-feedback-driven** — build what real adopters ask for; refuse to ship speculative kitchen-sink features.
6. **Standards over proprietary** — glTF / OpenXR / MCP / OpenTelemetry / WebRTC. Compose with the open stack; never replace it.
7. **Local-first over cloud** — AI on device, data with user. Cloud is a convenience accelerator, not a dependency.

## Governance

Year 1: BDFL (benevolent dictator — James Tunick) maintains direction. Public RFC process for changes. Weekly public updates. No corporate sponsor controlling direction.

Year 2+: transfer to the Apache Software Foundation or a W3C Community Group once 1000+ external adopters are validated. Never to a single vendor's foundation.

## Commercial relationship

**The spec is free. Forever. Period.**

Portals (H3M Inc.) builds commercial products on top:
- Hosted XRAI cloud (collaboration, versioning, encrypted personal graphs)
- Best-in-class XRAI renderer (Portals AR app)
- Priority-quality LLM authoring
- Enterprise spatial intelligence API
- Vertical applications (education, architecture, therapy, training)

This is the Git → GitHub, HTML → Chrome, Markdown → Notion pattern. The format stays free; services built on top are paid. **The community can never be held hostage.**

## How to contribute

Today:
- Star + watch the repo at `github.com/imclab/xra1`
- Read the `SPEC.md` v1.0 draft; file issues for missing primitives
- Try the LLM prompt library (`/prompts/`) — emit XRAI from any model
- Share what you build

Soon:
- Reference parsers (TS + Rust + Python)
- Unity + React Native + WebXR + visionOS reference runtimes
- MCP server (Claude / Gemini / GPT agents read/write XRAI directly)
- Showcase apps

## The stakes

If XRAI wins, the next decade of spatial AI runs on an open substrate. Minds and machines collaborate in a shared, persistent, editable space. The Giant Global Graph finally ships — not as someone's product, but as everyone's infrastructure.

If XRAI doesn't ship in the next 12 months, someone else will capture the slot. Meta's Horizon format, Apple's USDZ variant, OpenAI's agent memory format, a Chinese super-app's closed world graph. Walled gardens for the next 20 years.

The open web happened because someone planted the flag before the corporations did. This is that moment again.

## License

MIT for the spec text, reference parsers, runtimes, prompt libraries, and documentation. Public domain (CC0) for the normative JSON schema.

**Forever. Irreversibly. No take-backs.**

— @jamestunick, IMC Lab + H3M Inc. — 2026-04-22

---

**Navigate:** [`VISION.md`](./VISION.md) · [`SPEC.md`](./SPEC.md) · [`landing.html`](./landing.html) · [`sitemap.html`](./sitemap.html) · [`examples/`](./examples/)
