a builder's codex

An LLM should maintain a wiki, not re-derive knowledge per query

By Andrej Karpathy · AI researcher; ex-OpenAI, ex-Tesla; founder of Eureka Labs · 2026-04 · essay · LLM Wiki (gist)

Tier A · TL;DR
An LLM should maintain a wiki, not re-derive knowledge per query

Claim

Replace traditional RAG ("retrieve documents, answer once, forget") with a persistent wiki the LLM owns and edits. Knowledge compiles once and stays current, instead of being rediscovered repeatedly across sessions.

Mechanism

RAG treats each query as a cold start: fetch chunks, ground the answer, discard the synthesis. The same insight is rebuilt many times across sessions, and contradictions across sources are never resolved — they're rediscovered each time. A wiki inverts this: the LLM writes a synthesis page once, links it to raw sources, lints for contradictions, and reads the page (cheap) instead of regrounding from chunks (expensive and lossy). Three layers stabilize the pattern: immutable raw sources (append-only), LLM-owned synthesis pages, and a small constitution that defines the operations.

Conditions

Works when:

- Queries recur across sessions, so a synthesis page compiled once is read many times.
- Contradictions across sources can be resolved once and the resolution recorded on the page.

Fails when:

- Queries are genuinely novel, with no precedent in the wiki.
- Ingest discipline lapses, so the wiki decays into a stale snapshot.

Evidence

"Traditional RAG rediscovers knowledge repeatedly rather than accumulating it. The LLM builds and maintains a persistent wiki instead of just retrieving documents. Knowledge compiles once and stays current, not re-derived per query."

— Andrej Karpathy, LLM Wiki gist, 2026-04

The pattern has been adopted across multiple operators' personal knowledge systems within months of publication, including the system this corpus is built on.

Signals

Counter-evidence

For genuinely novel queries with no precedent in the wiki, RAG over a fresh corpus is still required. The wiki accumulates synthesis; it doesn't replace retrieval against new material. A wiki without a steady ingest discipline becomes a stale snapshot.

Cross-references

Adoption note

The codex corpus you are reading is itself an instance of this pattern, applied at the operator-insight scope: raw/ holds source transcripts and articles, insights/ holds the synthesis layer, 00_meta/ holds the constitution.
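The layer-to-directory mapping above can be checked mechanically. A sketch, assuming only the three directory names the note gives (the descriptions paraphrase the Mechanism section):

```python
from pathlib import Path

# Mapping of the three layers onto the corpus layout described above.
LAYERS = {
    "raw":      "immutable source transcripts and articles (append-only)",
    "insights": "LLM-owned synthesis pages",
    "00_meta":  "constitution defining the allowed operations",
}

def check_layout(root: Path) -> list[str]:
    """Report any layer directory missing from a corpus checkout."""
    return [name for name in LAYERS if not (root / name).is_dir()]
```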
