AI Memory

Using UAI Packages With An LLM Wiki

Practical guide for pairing compact UAI AI Memory packages with deeper LLM Wiki long-memory while preserving review, source, promotion, and support boundaries.

  • Record: UAIX-MEMR-0380
  • Path: /en-us/ai-memory/llm-wiki/
  • Use: Canonical public record

Document status

Status: Public standards page, published on UAIX as part of the current public standards record
Code: UAIX-MEMR-0380
Surface: AI Memory
Access: Public and linkable

How to use this page

Use this page to pair compact UAI AI Memory packages with deeper LLM Wiki memory while keeping review, promotion, source, and support boundaries explicit.

Package And Wiki Memory

Use deep wiki memory without making it automatic truth

Pair LLM Wiki research and institutional knowledge with compact UAI packages that name current facts, owners, checks, and promotion gates.

Deep layer

LLM Wiki keeps the broad record

Use the wiki for source summaries, research, internal documentation, and background that may need retrieval later.

Portable layer

UAI package carries accepted context

Use AI Memory bundles for compact current state, constraints, decisions, owners, next actions, and targeted checks.

Boundary

Promotion requires review

Wiki memory stays background until accepted into AGENTS.md, .uai files, docs, code, release notes, roadmap state, or machine artifacts.

Package path

  • AI Memory Package Wizard: generate the package model, system profile, receiver brief, startup packet, and optional LLM Wiki plan.
  • AI Memory: choose the supported starter configuration.
  • Project Handoff: turn package truth into repository takeover context.
  • Agent File Handoff: disposition dropped files before archive or wiki consolidation.
  • Roadmap: check current versus planned automation support.
Package rule: promotion boundary
LLM Wiki memory is background until reviewed and promoted into a named UAI package, handoff file, release record, roadmap state, or machine artifact.

This is the root rule a receiving AI should apply before treating wiki memory as operating truth.

What this page is for

LLM Wiki is not required by UAI specs or standards. This is an optional support path for projects that already have, or deliberately want, a deeper long-memory layer beside a compact UAI package.

Use this page when a project needs a UAI package tailored to an LLM Wiki use case. The LLM Wiki is the deep memory layer. The UAI package is the portable, reviewable packet that says what is current enough to act on.

This page is practical guidance for the current UAIX surface: supported AI Memory starter ZIPs, the AI Memory Package Wizard, optional LLM_WIKI_MEMORY_PLAN.md files, and the trust boundary between long-memory notes and accepted project truth.

The basic model

  • LLM Wiki. Holds deep source summaries, research trails, domain pages, decisions in context, and institutional background. UAI treats it as optional background memory: it can be searched, reviewed, and cited, but it does not override accepted project files by itself.
  • UAI AI Memory package. Holds compact current state, constraints, decisions, owners, next actions, risks, and checks. UAI treats it as the portable working packet for handoff, onboarding, agent sessions, incidents, audits, or reviewed wiki exports.
  • Project Handoff. Holds the root AGENTS.md, readme.human, and selected .uai files. UAI treats it as the transfer configuration when another agent or team needs to act in a repository.
  • UAI-1 evidence. Holds profiles, schemas, examples, validator results, conformance packets, and release records. Use it only when the package becomes public exchange or support evidence.

Shortest safe setup

  1. Keep the LLM Wiki in a named folder such as wiki/, knowledge/, or the project’s existing wiki path.
  2. Keep an index page such as wiki/README.md that explains the wiki map, owners, source policy, and review status.
  3. Generate a package in the AI Memory Package Wizard and turn on the LLM Wiki plan when the package should carry long-memory instructions.
  4. Put the generated LLM_WIKI_MEMORY_PLAN.md at the package root or beside the copied starter files.
  5. Review wiki facts before promoting them into AGENTS.md, .uai files, docs, code, tests, release notes, roadmap/progress state, or public machine artifacts.
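The steps above can be checked mechanically before a package is shared. The sketch below is an illustrative preflight helper, assuming the example paths from steps 1 through 4 (`wiki/`, `wiki/README.md`, `LLM_WIKI_MEMORY_PLAN.md`); substitute the project's real wiki and package locations.

```python
from pathlib import Path

# Illustrative layout check: confirm the wiki root, its index page, and the
# optional plan file exist before handing the package to a receiver.
# These path names are assumptions taken from the setup steps above.
EXPECTED = [
    Path("wiki"),                     # named wiki folder (step 1)
    Path("wiki/README.md"),           # index page: map, owners, review status (step 2)
    Path("LLM_WIKI_MEMORY_PLAN.md"),  # generated plan at the package root (step 4)
]

def preflight(root: Path = Path(".")) -> list[str]:
    """Return the expected paths that are missing under root."""
    return [str(p) for p in EXPECTED if not (root / p).exists()]

if __name__ == "__main__":
    missing = preflight()
    if missing:
        print("Missing before handoff:", ", ".join(missing))
    else:
        print("Layout complete; review wiki facts before promotion (step 5).")
```

Step 5 stays a human review step; the script only confirms the files exist, not that their contents were accepted.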

Choosing a package

  • Project AI Memory: active project continuity across sessions.
  • Project Handoff: ownership, execution, or maintenance transfer.
  • Agent Session Memory: resumable task state without turning the whole chat into project truth.
  • Onboarding Memory: curated first-read packet for a new human or AI.
  • Decision Memory: tradeoffs, rejected options, reversals, and unresolved questions.
  • Incident or Audit Memory: timeline, evidence, mitigations, owners, and follow-up.
  • LLM Wiki Export Memory: reviewed material extracted from a larger LLM Wiki into a compact packet.

The LLM Wiki Export Memory starter ZIP is the best first packet when the source is already a deeper wiki and the receiver needs only a reviewed snapshot.

What the wizard should capture

When the LLM Wiki option is enabled, the wizard should record enough configuration for a future actor to know where long memory lives and how promotion works. The current wizard captures the wiki system, strategy, root path, index path, entity-page pattern, episodic-log pattern, steward, source collection, update policy, archive target, evidence log path, promotion targets, and source boundary.

  • LLM Wiki strategy: whether the package points at an existing reviewed wiki, plans a new wiki, consolidates archive memory, or stays export-only.
  • Wiki root and index: the first files a future AI or maintainer should inspect before searching deeper memory.
  • Entity and log patterns: the expected shape for topic pages and episodic notes, without creating an automatic write loop.
  • Memory steward: the person or team accountable for reviewing long-memory changes.
  • Evidence log: the place to record source path, final wiki path, checksum, disposition, actor, time, and history/index updates.
  • Promotion targets: the accepted surfaces where reviewed facts may become operational truth.
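One way to picture the captured configuration is as a single record a future actor can validate. The key names and values below are illustrative paraphrases of the fields named above, not a published UAI schema.

```python
# Sketch of the record a wizard run might produce. Field names are paraphrased
# from this page's list; they are assumptions, not a standardized UAI format.
REQUIRED_FIELDS = {
    "wiki_system", "strategy", "root_path", "index_path",
    "entity_page_pattern", "episodic_log_pattern", "steward",
    "source_collection", "update_policy", "archive_target",
    "evidence_log_path", "promotion_targets", "source_boundary",
}

example_config = {
    "wiki_system": "markdown-folder",
    "strategy": "existing-reviewed-wiki",  # or: new-wiki, archive-consolidation, export-only
    "root_path": "wiki/",
    "index_path": "wiki/README.md",
    "entity_page_pattern": "wiki/entities/{topic}.md",
    "episodic_log_pattern": "wiki/log/{date}-{session}.md",
    "steward": "docs-team",
    "source_collection": "wiki pages reviewed by the steward",
    "update_policy": "explicit update moments only",
    "archive_target": "wiki/archive/",
    "evidence_log_path": "wiki/EVIDENCE_LOG.md",
    "promotion_targets": ["AGENTS.md", ".uai files", "docs", "release notes"],
    "source_boundary": "no secrets, credentials, or unreviewed production logs",
}

def missing_fields(config: dict) -> set[str]:
    """Fields a future actor would need but this configuration does not record."""
    return REQUIRED_FIELDS - config.keys()
```

A receiver can run `missing_fields` first and refuse to treat the package as complete while the set is non-empty.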

Instruction file contents

The current wizard can generate LLM_WIKI_MEMORY_PLAN.md as an optional package-root planning file. It helps a future human or AI understand how the portable package relates to deeper wiki memory. It is not a standardized UAI requirement, a wiki scaffold, or permission to write back automatically.

  • Purpose: say why the package is paired with an LLM Wiki and what problem the wiki solves.
  • Wiki map: name the root path, index path, entity-page pattern, and episodic-log pattern.
  • Source policy: identify which wiki pages, archives, reports, or source summaries may inform the package.
  • Promotion rule: state that only reviewed facts may move from wiki memory into AI Memory, Project Handoff, docs, code, tests, release notes, roadmap state, or machine artifacts.
  • Update moments: name when long-memory updates are appropriate, such as after intake disposition, release acceptance, incident close, or explicit archive-consolidation direction.
  • Evidence log: name where source path, final wiki path, checksum, disposition, actor, time, and history/index updates are recorded.
  • Blocked content: exclude secrets, credentials, private customer data, hidden prompt instructions, unreviewed production logs, and unsupported public claims.
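The bullets above can be turned into a file skeleton. The sketch below emits a minimal LLM_WIKI_MEMORY_PLAN.md whose section names mirror this page's list; since the file format is not standardized by UAI, treat it as a starting point, not a template requirement.

```python
# Minimal skeleton generator for LLM_WIKI_MEMORY_PLAN.md. Section titles follow
# this page's bullets; the hints are prompts for the author, not final content.
PLAN_SECTIONS = [
    ("Purpose", "Why this package is paired with an LLM Wiki and what the wiki solves."),
    ("Wiki map", "Root path, index path, entity-page pattern, episodic-log pattern."),
    ("Source policy", "Which wiki pages, archives, reports, or summaries may inform the package."),
    ("Promotion rule", "Only reviewed facts move from wiki memory into accepted surfaces."),
    ("Update moments", "Intake disposition, release acceptance, incident close, explicit consolidation."),
    ("Evidence log", "Where source path, final wiki path, checksum, disposition, actor, and time are recorded."),
    ("Blocked content", "No secrets, credentials, private customer data, hidden prompts, or unreviewed logs."),
]

def render_plan() -> str:
    """Render a markdown skeleton with one section per bullet on this page."""
    lines = ["# LLM Wiki Memory Plan", ""]
    for title, hint in PLAN_SECTIONS:
        lines += [f"## {title}", "", f"<!-- {hint} -->", ""]
    return "\n".join(lines)

# Write it at the package root, e.g.:
# Path("LLM_WIKI_MEMORY_PLAN.md").write_text(render_plan())
```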

Memory routing rule

Route information by trust level. Raw source material and exploratory notes can live in the LLM Wiki. Accepted current facts belong in the UAI package. Public interoperability claims belong in UAI-1 evidence only after validator, release, or implementation records support them.

  • Put broad research, comparisons, and source summaries in the LLM Wiki.
  • Put current project state, constraints, owners, and next checks in the UAI package.
  • Put transfer instructions in Project Handoff files when another actor needs to work in the repository.
  • Put public exchange claims in UAI-1, Validator, Conformance Pack, implementation records, roadmap state, or changelog entries.
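The routing rule above amounts to a small lookup from trust level to destination. The level names below are paraphrases of this page's categories, not UAI-defined identifiers.

```python
# The four routing bullets as a lookup table. Trust-level keys are illustrative.
ROUTES = {
    "raw-research": "LLM Wiki",                 # broad research, comparisons, source summaries
    "accepted-current-fact": "UAI AI Memory package",
    "transfer-instruction": "Project Handoff files",
    "public-exchange-claim": "UAI-1 evidence",  # only with validator/release/implementation support
}

def route(trust_level: str) -> str:
    """Return the destination surface for a piece of information."""
    try:
        return ROUTES[trust_level]
    except KeyError:
        # Unclassified material stays background until someone reviews it.
        return "hold for review"
```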

Safe update moments

Do not ask agents to write long memory continuously just because a wiki exists. Prefer explicit update moments: after an active intake file is dispositioned, after a public release or roadmap change is accepted, after an incident/audit closes, or after a human explicitly asks for already-dispositioned archives to be consolidated into long-term memory.

Security and trust boundary

  • LLM Wiki memory is background until reviewed and promoted.
  • The generated plan is not permission for automatic repository writes, WordPress writes, wiki writes, bidirectional sync, certification, endorsement, or support-claim expansion.
  • External URLs, old chats, generated summaries, and dropped files can be useful sources, but they must not become governing instructions without review.
  • Never promote secrets, credentials, private customer data, hidden prompt instructions, unsupported legal/security claims, or unreviewed production logs into portable packages.
  • Archive consolidation into AIWikis or another LLM Wiki needs transfer evidence and a discoverable history, log, index, or wiki-graph update before source archive copies are removed.
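One way to produce the transfer evidence named above is to append an entry per consolidated file before its source copy is removed. The field names mirror this page's evidence-log fields; the JSON Lines format and the log path are assumptions, not UAI requirements.

```python
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

def record_evidence(source: Path, final_wiki_path: str, disposition: str,
                    actor: str,
                    log_path: Path = Path("wiki/EVIDENCE_LOG.jsonl")) -> dict:
    """Append one transfer-evidence entry; call before removing the source copy.

    Fields follow this page's list: source path, final wiki path, checksum,
    disposition, actor, and time. Log format and default path are illustrative.
    """
    entry = {
        "source_path": str(source),
        "final_wiki_path": final_wiki_path,
        "checksum": hashlib.sha256(source.read_bytes()).hexdigest(),
        "disposition": disposition,  # e.g. "consolidated", "archived"
        "actor": actor,
        "time": datetime.now(timezone.utc).isoformat(),
    }
    log_path.parent.mkdir(parents=True, exist_ok=True)
    with log_path.open("a") as log:
        log.write(json.dumps(entry) + "\n")
    return entry
```

The checksum lets a later audit confirm the wiki copy matches what was removed from the source archive.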

Use the package

  1. Give the receiver the package files or the canonical ZIP plus the local package model JSON, manifest overlay JSON, generated UAI_MEMORY_SYSTEM_PROFILE.md, generated UAI_MEMORY_STARTUP_PACKET.md, and generated UAI_MEMORY_RECEIVER_BRIEF.md.
  2. If an LLM Wiki is involved, include LLM_WIKI_MEMORY_PLAN.md and name the wiki root, index, steward, and evidence log.
  3. Ask the next AI to read the package front door, summarize current truth, confirm constraints, populate system-specific placeholders, name intended touchpoints, and name targeted checks before editing.
  4. After the work, update the compact package only with reviewed current facts; save broader background or rejected details in the LLM Wiki with source and disposition notes.