[*AIUAR]

AIUAR – AI Universal Address Reference

Part of the AIURM Protocol

AIURM/AIUAR is a set of conventions and abstractions for persistent, governable, and extensible cognitive workflows.

It creates an operational space in which data, logic, and results can be organized, addressed, reused, and executed across any LLM and any substrate, without initially relying on specific infrastructure.

Requires: LLM + CLI + filesystem – everyone already has this.

“I need to define what, with what, and for what. The AI resolves the how.”

The Structure

```
aiuar_root/
└── *****contextspace              ← operational universe
    └── ****entity                 ← owner / team / organization
        ├── governance/            ← pipeline contract (read-only)
        └── ***project             ← a workflow
            └── **session          ← an execution instance
                ├── data/          ← *data_x   raw inputs
                ├── logic/         ← *logic_x  rules
                └── result/        ← *result_x outputs
```
Every artifact has a universal address:
```
*****contextspace****entity***project**session*marker
```
One address. Any substrate. Zero ambiguity.
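The five-level grammar can be checked mechanically. Here is a minimal sketch in Python, assuming segments are word characters and that levels are introduced by descending runs of `*` (five down to one); the segment names follow the hierarchy above:

```python
import re

# Hypothetical AIUAR address parser. The descending star-run grammar
# (5 stars down to 1) mirrors the hierarchy above; restricting each
# segment to word characters is an assumption for illustration.
ADDRESS_RE = re.compile(
    r"\*{5}(?P<contextspace>\w+)"
    r"\*{4}(?P<entity>\w+)"
    r"\*{3}(?P<project>\w+)"
    r"\*{2}(?P<session>\w+)"
    r"\*(?P<marker>\w+)"
)

def parse_address(address: str) -> dict:
    """Split a five-level AIUAR address into its named segments."""
    match = ADDRESS_RE.fullmatch(address)
    if match is None:
        raise ValueError(f"not a valid AIUAR address: {address!r}")
    return match.groupdict()
```

Because each run of stars is one star shorter than the previous, a resolver never needs delimiters or escaping to know which level a segment belongs to.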

How It Works (Claude Code)

The protocol stack

This is everything that makes AIURM/AIUAR work:

```
aiuar_root/
├── CLAUDE.md                      ← instructs the AI to load the skills before any operation
└── skill/
    └── aiurm_protocol/
        ├── aiurm/SKILL.md         ← marker protocol — [*markers], DLR, intention suffixes
        ├── aiuar/SKILL.md         ← addressing system — 5-level hierarchy, execute command
        ├── governance/SKILL.md    ← contract standard — mandatory sections, field rules
        └── changelog/SKILL.md     ← mutation tracking — snapshots, undo, change history
```
**CLAUDE.md** tells Claude Code to read these files before any operation.
**Each SKILL.md** is plain text — readable by any human, executable by any LLM.
*This is the entire protocol stack.*
---
**Governance** is the contract. A plain text file that defines:
- what data the pipeline consumes
- what logic it applies
- what results it must produce
The AI reads the governance and executes. No code. No framework.
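A governance file might look like the following. The layout is purely illustrative; the mandatory sections and field rules are defined in `governance/SKILL.md`, and every marker name here is a hypothetical placeholder:

```text
# governance (pipeline contract, read-only)

data:
  *data_employees       raw HR records to consume

logic:
  *logic_scoring        rules for scoring performance

result:
  *result_performance   scored report, produced in order
```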
**DLR** is the execution atom:
| Node | Marker | Meaning |
|--------|-------------|--------------------------------|
| Data | `*data_x` | the input — the fact |
| Logic | `*logic_x` | the rule — what to do |
| Result | `*result_x` | the output — what was produced |
**AIUAR address** is the handle to any artifact, anywhere:
```
*****contextspace_example****adaoaper***project_hr**session_1*result_performance
```

The Operational Environment

Built-in environment projects handle cross-cutting concerns automatically:
```
*****contextspace_environment
└── ****general
    ├── ***project_audit        ← execution records
    ├── ***project_log          ← step-by-step log
    ├── ***project_exception    ← failure records
    ├── ***project_code         ← generated code artifacts
    └── ***project_changelog    ← governance & logic mutations
```
And extensible by design — add `project_semaphore`, `project_contract`, `project_communication`
when your workflows need coordination, validation, or messaging.
These extensions are still in the ideation phase and have not yet been implemented.

Execute a Workflow

```
execute {governance_path}
```

The AI reads the governance file, loads data and logic markers, executes all result steps in order,
and deposits each result at the correct address. One command. Deterministic. Auditable.
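In Python pseudo-form, the control flow of `execute` looks roughly like this. It is only a sketch: in AIURM the executor is the LLM itself, and the folder layout plus the placeholder "apply" step are assumptions for illustration:

```python
from pathlib import Path

def execute(governance_path: str) -> list[Path]:
    """Sketch of `execute {governance_path}`: read the contract, load the
    data and logic markers, run every result step in order, and deposit
    each output under result/. The real executor is the LLM; the naming
    convention and the placeholder 'apply' step are assumptions."""
    governance = Path(governance_path)
    session = governance.parent
    contract = governance.read_text()              # pipeline contract, interpreted by the model
    data = sorted((session / "data").iterdir())    # *data_x: the facts
    logic = sorted((session / "logic").iterdir())  # *logic_x: the rules
    (session / "result").mkdir(exist_ok=True)
    outputs = []
    for step in logic:
        # Placeholder: here the AI would apply this rule to the data.
        out = session / "result" / step.name.replace("logic", "result")
        out.write_text(f"applied {step.name} to {len(data)} data marker(s)")
        outputs.append(out)
    return outputs
```

Determinism and auditability fall out of the structure: the step order comes from the contract, and every output lands at a fixed, pre-known address.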

Substrate Agnostic

The same workflow runs on:
| Substrate | Form |
|-----------------|------------------------|
| Filesystem | folders and files |
| JSON | a single portable file |
| Key-value store | hierarchical keys |
Every artifact lives within the AIUAR space — addressable by any agent
that can resolve the notation, on any substrate.
The JSON substrate, for example, makes a project portable: an entire workflow, with its data, logic, and results, travels as a single file.
Materialization is limited only by the resolver's ability to map AIUAR addresses onto the substrate's native hierarchy.
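As a sketch of how one address could materialize on three substrates; the path separator, key separator, and nesting conventions below are illustrative assumptions, not part of the spec:

```python
import re

def split_address(address: str) -> list[str]:
    # Levels are introduced by descending runs of '*' (5 down to 1),
    # so each run of stars marks the start of the next segment.
    return re.findall(r"\*+(\w+)", address)

def as_fs_path(address: str) -> str:
    """Filesystem substrate: one folder per level."""
    return "/".join(split_address(address))

def as_kv_key(address: str) -> str:
    """Key-value substrate: one hierarchical key per level."""
    return ":".join(split_address(address))

def as_json_doc(address: str, value) -> dict:
    """JSON substrate: the address becomes nested keys in one document."""
    doc = value
    for segment in reversed(split_address(address)):
        doc = {segment: doc}
    return doc
```

The same address yields the same logical location everywhere; only the physical encoding changes.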

Practical Demo – Step by Step

What you will see:

  • Agent 1 executes a 14-step HR analysis pipeline from a governance file
  • Agent 1 materializes the entire project into a single portable JSON file
  • Agent 2 — fresh session, zero context — receives only the JSON path and reproduces the full pipeline

Experience AIURM in practice:

Use the step-by-step onboarding for a hands-on introduction; the Onboarding page has detailed instructions.

Why This Matters

Current orchestration tools treat the AI as a function inside a software stack.
AIURM/AIUAR inverts this: the AI is the runtime. The protocol is the operating system. The substrate is storage.

No server. No daemon. No vendor lock-in. No framework to install.

The workflow is a text file. The executor is the model. The contract is governance.