
AIUAR – AI Universal Address Reference [∞*]

Part of the AIURM Protocol

AIURM/AIUAR is a set of conventions and abstractions for persistent, governable, and extensible cognitive workflows.


It creates an operational space in which data, logic, and results can be organized, addressed, reused, and executed across any LLM and any substrate, without initially relying on specific infrastructure.

Requires: LLM agent with filesystem access (Claude Code or similar).
No framework. No code. No orchestration layer.

“I need to define what, with what, and for what. The AI resolves the how.”

The Structure

    aiuar_root/aiuar/
    └── *****contextspace            ← operational universe
        └── ****entity               ← owner / team / organization
            └── ***project           ← a workflow
                ├── governance/      ← pipeline contract
                └── **session        ← an execution instance
                    ├── data/        ← *data_x raw inputs
                    ├── logic/       ← *logic_x rules
                    └── result/      ← *result_x outputs
Every artifact has a universal address in ...aiuar/:

    *****contextspace****entity***project**session*marker
One address. Any substrate. Zero ambiguity.
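The five-level notation can be illustrated with a small parser. The regex below is a sketch derived from the address shape shown above; it is not part of any SKILL.md.

```python
import re

# Sketch of an AIUAR address parser. The star counts encode the hierarchy
# level: 5 = contextspace, 4 = entity, 3 = project, 2 = session, 1 = marker.
AIUAR_FULL = re.compile(
    r"^\*{5}(?P<contextspace>[^*]+)"
    r"\*{4}(?P<entity>[^*]+)"
    r"\*{3}(?P<project>[^*]+)"
    r"\*{2}(?P<session>[^*]+)"
    r"\*(?P<marker>[^*]+)$"
)

def parse_aiuar(address: str) -> dict:
    """Split a full AIUAR address into its five hierarchy levels."""
    match = AIUAR_FULL.match(address)
    if match is None:
        raise ValueError(f"not a full AIUAR address: {address!r}")
    return match.groupdict()

parts = parse_aiuar("*****contextspace_example****adaoaper"
                    "***project_hr**session_1*result_performance")
print(parts["project"], parts["marker"])   # project_hr result_performance
```

Because each level is delimited by a distinct star count, the address resolves unambiguously in a single left-to-right pass.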

How It Works

This structure can work with any LLM that has access to the filesystem. In this example, Claude Code acts as the workflow executor.

The protocol stack

This is everything that makes AIURM/AIUAR work:

    aiuar_root/aiuar/
    ├── CLAUDE.md                   ← instructs the AI to load the skills before any operation
    └── skill/
        └── aiurm_protocol/
            ├── aiurm/SKILL.md      ← marker protocol — [*markers], DLR, intention suffixes
            ├── aiuar/SKILL.md      ← addressing system — 5-level hierarchy, execute command
            ├── governance/SKILL.md ← contract standard — mandatory sections, field rules
            └── changelog/SKILL.md  ← mutation tracking — snapshots, undo, change history
**CLAUDE.md** tells Claude Code to read these files before any operation.

**Each SKILL.md** is plain text — readable by any human, executable by any LLM.

---
**Governance** is the contract: a plain-text file that defines

- what data the pipeline consumes
- what logic it applies
- what results it must produce

The AI reads the governance and executes. No code. No framework.
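As a sketch only (the mandatory sections and field rules are defined in governance/SKILL.md, and every field name below is an illustrative assumption), a minimal governance contract might look like:

```text
# aiurm_governance_hr_example.txt  (hypothetical layout, not the normative format)
data:
  *data_headcount     employee roster for the period
  *data_exits         exit records for the period
logic:
  *logic_turnover     turnover = exits / headcount
result:
  *result_turnover    deposit the computed turnover rate
```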
**DLR** is the execution atom:
| Node | Marker | Meaning |
|--------|-------------|--------------------------------|
| Data | `*data_x` | the input — the fact |
| Logic | `*logic_x` | the rule — what to do |
| Result | `*result_x` | the output — what was produced |
**AIUAR address** is the handle to any artifact, anywhere:

    *****contextspace_example****adaoaper***project_hr**session_1*result_performance

The Operational Environment

Environment projects (such as those under tracker) implement predefined functions such as auditing and logging. They are not a separate mechanism: they are ordinary projects, built with the same structure as any other project in the environment and, like any other, open to creation and customization as needed.

    *****contextspace_environment
    ├── ****tracker
    │   ├── ***project_audit             ← execution records
    │   ├── ***project_log               ← step-by-step log
    │   ├── ***project_exception         ← failure records
    │   ├── ***project_code              ← generated code artifacts
    │   └── ***project_changelog         ← governance & logic mutations
    ├── ****organizer
    │   ├── ***project_semaphore         ← concurrency control (ideation)
    │   ├── ***project_contract          ← schema agreements (ideation)
    │   └── ***project_communication     ← inter-project messaging (ideation)
    └── ****standardizer
        └── ***project_materialize_base  ← reusable materialization logic
The environment is extensible by design: add new projects to any entity when your workflows need new capabilities.

Substrate Agnostic

The same workflow runs on:
| Substrate | Form |
|-----------------|-------------------------------------------|
| Filesystem | folders and files on disk |
| JSON | a single portable file |
| Markdown | human-readable and directly executable |
| Key-value store | hierarchical keys |
Every artifact lives within the AIUAR space — addressable by any agent
that can resolve the notation, on any substrate.
Portable substrates (JSON, Markdown) enable a workflow to travel — one file, zero setup,
executable by any agent with no prior context.
Substrate materialization is only limited by the resolver's capability.
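To make substrate agnosticism concrete, the sketch below (all names and values are illustrative assumptions, not part of the spec) materializes the same addressed artifacts both as a portable JSON string and as filesystem paths:

```python
import json
from pathlib import PurePosixPath

# One DLR triple, two substrates, identical AIUAR keys.
PREFIX = "*****contextspace_example****adaoaper***project_hr**session_1"

artifacts = {
    f"{PREFIX}*data_headcount":  "120",
    f"{PREFIX}*logic_turnover":  "turnover = exits / headcount",
    f"{PREFIX}*result_turnover": "0.05",
}

# JSON substrate: the whole workflow travels as one portable file.
portable = json.dumps(artifacts, indent=2)

# Filesystem substrate: the star-delimited levels map onto directories.
# Splitting on "*" works here because level names never contain stars.
def address_to_path(root: str, address: str) -> PurePosixPath:
    levels = [part for part in address.split("*") if part]
    return PurePosixPath(root).joinpath(*levels)

print(address_to_path("aiuar_root/aiuar", f"{PREFIX}*result_turnover"))
# aiuar_root/aiuar/contextspace_example/adaoaper/project_hr/session_1/result_turnover
```

The address never changes; only the resolver's mapping from address to storage does.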

Calling execute

**Filesystem path** — simplest, direct, human-friendly:

    execute contextspace_example/adaoaper/project_rh_analysis_example/governance/aiurm_governance_rh_analysis_example.txt

**AIUAR short** — development and testing:

    execute ***project_rh_analysis_example*aiurm_governance_rh_analysis_example

**AIUAR full** — production, agent-to-agent, zero ambiguity:

    execute *****contextspace_example****adaoaper***project_rh_analysis_example*aiurm_governance_rh_analysis_example

**File substrate** — execute directly from a portable JSON or Markdown file:

    execute in contextspace_example.md ***project_rh_analysis_example*aiurm_governance_rh_analysis_example

*The governance marker is mandatory when executing from a file — a single file may contain multiple projects.*

The AI reads the governance file, loads data and logic markers, executes all result steps in order,
and deposits each result at the correct address. One command. Deterministic. Auditable.
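The read, load, execute, deposit cycle can be mirrored in a few lines. This is a sketch of the control flow only: the LLM agent is the real runtime, and the governance layout used here is an assumed, illustrative format.

```python
def execute(governance: dict, store: dict) -> dict:
    """Mirror the cycle: read the governance, load data and logic markers,
    run each result step in order, deposit results back at their markers."""
    data = {m: store[m] for m in governance["data"]}     # load *data_x inputs
    logic = {m: store[m] for m in governance["logic"]}   # load *logic_x rules
    results = {}
    for step in governance["result"]:                    # run steps in order
        value = step["apply"](data, logic)               # an agent interprets; we call
        store[step["marker"]] = value                    # deposit at its marker
        results[step["marker"]] = value
    return results

store = {
    "*data_headcount": 120,
    "*data_exits": 6,
    "*logic_turnover": "turnover = exits / headcount",
}
governance = {
    "data": ["*data_headcount", "*data_exits"],
    "logic": ["*logic_turnover"],
    "result": [{
        "marker": "*result_turnover",
        "apply": lambda d, l: d["*data_exits"] / d["*data_headcount"],
    }],
}
print(execute(governance, store))   # {'*result_turnover': 0.05}
```

Determinism comes from the ordered result list; auditability comes from every deposit landing at a known address.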

Practical Demo – Step by Step

What you will see:

  • Agent 1 executes a 14-step HR analysis pipeline from the filesystem
  • Agent 2 materializes the project into a portable JSON file and a portable Markdown file
  • Agent 3 — fresh session, zero context — executes the full pipeline in the JSON file
  • Agent 4 — fresh session, zero context — executes the full pipeline in the Markdown file

Same results. Four agents. Three substrates. No briefing.


Experience AIURM in practice:

Use the step-by-step onboarding for a hands-on introduction.
See the Onboarding Page for detailed instructions.

Conceptual Shift

Every paradigm shift changes not just the tool, but the way you think.
| Paradigm | Who defines | Who executes | What persists |
|---------------|---------------------------|----------------------------------|--------------------------|
| Scripting | human (code) | machine | the program |
| Low-code | human (UI) | platform | the configuration |
| Prompting | human (intent) | AI | no durable structure |
| Orchestrators | human (code/config) | AI + framework | the pipeline in code |
| AIURM/AIUAR | human (structured intent) | LLM agent using the filesystem | the structured context |

AIURM introduces structured executable intent. The workflow is defined in text that is simultaneously readable by humans and executable by AI. There is no intermediate layer.

Most approaches treat AI as a component inside a software stack.
AIURM inverts that logic: the model is the runtime, the protocol is the operational layer, the substrate is storage.

No server. No daemon. No framework to install.

What exists: Data
What to do with it: Logic
What to produce: Result
Where it all lives: AIUAR

    The cognitive shift is the signal.
    Frameworks make you think in code. AIURM makes you think in addresses and intentions.
    When that shift happens, the protocol disappears. What remains is clarity about what you want done.

    The protocol is self-referential.
    The governance that defines a pipeline is itself an AIUAR-addressable artifact.
    One LLM can generate projects. Another can read, interpret, and execute them.