---
vault_clearance: EUCLID
halo:
  classification: INTERNAL
  confidence: HIGH
  front: "03_Project_Technomancer"
  custodian: "The Architect"
  created: 2025-12-21
  updated: 2026-03-25
  wing: UNASSESSED
  containment: "Recorder architecture — event log internals and API details"
---
# ⚡ Project Technomancer: Recorder & bridge (warp ↔ vault)

**Append-only recording and cross-space replication for the lab** — chronological tape at the vault root, optional adapters for browser and AI workflows. *Daemon = information. Astronomicon = warp. Technomancer = how the warp meets physical storage.*

## Project Entry (Unified)

- **Canonical vault onboarding:** [../README.md](../README.md)
- **Vault spine:** [Lab protocol](../README.md#lab-protocol) · [Vault bounty](../BOUNTY_BOARD.md) · [Session (vault-wide)](../WORLDLINE.md) · **This project:** [BOUNTY_BOARD.md](BOUNTY_BOARD.md) · [WORLDLINE.md](WORLDLINE.md) · **[All projects hub →](../README.md#project-workflow-index-all-links)**
- **Sister projects (01–05):** [01 World](../01_Project_World/README.md) · [02 Triage](../02_Project_Triage/README.md) · [04 Constitution](../04_Project_Constitution/README.md) · [05 LENG](../05_Project_LENG/README.md)
- Canonical Daemon (information graph): [../06_Project_Daemon/README.md](../06_Project_Daemon/README.md) — `06_Project_Daemon/_daemon_v3.py`, `songs/`
- Quick structural check from the Daemon tree: `cd ../06_Project_Daemon` then `python _daemon_v3.py _test_leng` (or your graph name)
- **Warp / public lab (06+):** [08 Astronomicon](../08_Project_Astronomicon/README.md) — not in 01–05; linked here because Technomancer bridges **tape ↔ warp**.

**Workflow protocol:** Follow the canonical procedure in [vault root README § Lab protocol](../README.md#lab-protocol) (add/claim/solve/retract, read-before-write, Definition of Done). Scripts and docs belong under this project (e.g. `Live_Stack/`, `Development_Snapshots/`, project `_archive/`); the vault root README makes this protocol binding in every project.

**Data map (types, files):** [DATA_TYPES.md](DATA_TYPES.md).

### Where Technomancer topics live (routing)

| Looking for… | Go here |
|----------------|---------|
| `.event_log.jsonl`, append-only spine, sequence rules | [Event Log System](#event-log-system) · vault root `../.event_log.jsonl` |
| Thread cursors, “what this chat already saw” | [Thread Tracking](#thread-tracking) |
| Flask server, Tampermonkey, browser adapters | [Browser Integration](#browser-integration) · `Live_Stack/` |
| Lab-wide tasks touching recorder ↔ protocol | [../BOUNTY_BOARD.md](../BOUNTY_BOARD.md) — cross-check with [BOUNTY_BOARD.md](BOUNTY_BOARD.md) |
| Structured knowledge graph (Daemon), not the tape | [06 Daemon](../06_Project_Daemon/README.md) |
| HTTPS read / u-os “warp” edge | [08 Astronomicon](../08_Project_Astronomicon/README.md) |

> *"The recorder does not think — it persists. The Daemon is what becomes queryable. The warp is the door."*

### Memory and sovereignty

> *Without body, one has no agency. Without mind, one has no freedom. Without memory, one has nothing.*

**Body** is where agency lands: a machine you run, a vault on disk, git remotes you control — substrate that can *act* and *hold* without begging a platform to stay open. **Mind** is the capacity to interpret and choose: the model in session, the human, the Daemon as structured, queryable knowledge — freedom is empty without something that thinks. **Memory** is what outlasts any session: the append-only witness, boards, commits, the sovereign archive. Technomancer exists so **memory stays yours** — inspectable, forkable, long-horizon — while [08 Astronomicon](../08_Project_Astronomicon/README.md) can still open a **warp** for those who have no local body in the repo. *Sovereignty means the canonical record lives in physical storage you treat as ground truth; the warp federates access, it does not replace the tape.*

**Vault canon (same triad, lab-wide):** [Memory and sovereignty](../README.md#memory-and-sovereignty). **Lab law (norms + workflow + graph):** [README § Lab protocol](../README.md#lab-protocol).

---

## Master README — Lab, warp, recorder

Three roles, one lab:

| Role | Where | What it is |
|------|--------|------------|
| **Information** | [06_Project_Daemon](../06_Project_Daemon/README.md) | The **Daemon** is the structured body of knowledge: constraint graph, `songs/`, stress/query — *the information*. |
| **Warp (front door)** | [08_Project_Astronomicon](../08_Project_Astronomicon/README.md) | **Astronomicon** / u-os.dev is how agents **without a local vault** contact the lab: HTTPS read + key-first tool OS at the edge — *no MCP required* for the live surface. Current contract: [u_os_dev/STATUS.md](../08_Project_Astronomicon/u_os_dev/STATUS.md) · [u_os_dev/worker/README.md](../08_Project_Astronomicon/u_os_dev/worker/README.md). |
| **Recorder + bridge** | **03 Technomancer (here)** | **Technomancer** is the **tape machine**: append-only retention and replication **between the network edge and files on disk**. Core behavior should be **mechanical** (HTTP ingest, polls, hooks, scheduled jobs) — not “an AI choosing what counts as memory.” Humans and LLMs may use **adapters** (browser script, Flask server, prompts); those are inputs *to* the recorder, not its definition. |

**On-disk witness:** `.event_log.jsonl` at the **vault root** is the chronological spine Technomancer owns. Cross-session memory for embodied work lives there until (or unless) projected into Daemon-shaped artifacts — a **build bridge**, tracked on [BOUNTY_BOARD.md](BOUNTY_BOARD.md) and vault-wide board: [../BOUNTY_BOARD.md](../BOUNTY_BOARD.md) (e.g. O3: event log ↔ protocol).

```
  [Browsers / local APIs / future warp pull] ──► Technomancer (recorder)
                                                      │
                                                      ▼
                                    .event_log.jsonl + vault-side files
                                                      │
                           (deterministic projection — design in progress)
                                                      ▼
                                            06 Daemon (graph)

  [Web-only or token clients] ──GET/POST──► 08 Astronomicon (warp edge)
```

**How to read the rest of this document:** [Five-Layer Stack Architecture](#five-layer-stack-architecture) below describes the **implementation stack**. Treat **Layer 1 (event log)** + **Layer 2 (thread tracker)** as the **recorder core**. Layers 3–5, semantic memory, and the Tampermonkey script are **adapters** that connect people and models to the same append-only spine.

---

## Table of Contents

1. [Memory and sovereignty](#memory-and-sovereignty)
2. [Master README — Lab, warp, recorder](#master-readme--lab-warp-recorder)
3. [Overview](#overview)
4. [Five-Layer Stack Architecture](#five-layer-stack-architecture)
5. [Event Log System](#event-log-system)
6. [Thread Tracking](#thread-tracking)
7. [The "I AM" Protocol](#the-i-am-protocol)
8. [Browser Integration](#browser-integration)
9. [Deployment & Testing](#deployment--testing)
10. [Known Issues](#known-issues)
11. [API Reference](#api-reference)
12. [Getting Started](#getting-started)

---

## Overview

**Project Technomancer** is the lab’s **recorder**: a distributed event-sourcing path that keeps an **append-only, sequence-numbered** witness (`.event_log.jsonl`) and moves that witness across **physical space** (vault on disk, optional local server, future sync from the **warp**). It still supports **continuous memory across AI conversations** — that is one *use* of the tape, not the sole purpose.

### Core Features

- **Append-only event log** (`.event_log.jsonl`) — immutable chronological truth at vault root
- **Thread-aware cursor tracking** — each thread knows what it has seen
- **"I AM" protocol** — structured soul/identity updates (adapter-friendly format)
- **Browser integration** — Tampermonkey userscript for major chat UIs (optional adapter)
- **Semantic memory** — ChromaDB vector search (optional)
- **World updater** — translates selected events to vault files (optional executor)

### Philosophy

The system emphasizes:
- **Data integrity** — append-only, sequence-numbered events
- **Mechanical capture** — triggers are code and policy, not deliberation
- **Privacy separation** — different contexts isolated
- **Agentic execution (optional)** — events → file writes when the world updater is running
- **Cross-thread memory** — same tape, many readers (local or, via Astronomicon, remote)

---

## Five-Layer Stack Architecture

Implementation view of the recorder and its adapters (see [Master README](#master-readme--lab-warp-recorder) for the lab-wide map).

```
┌─────────────────────────────────────────────────────────────────┐
│  Layer 4: technomancer_server.py (HTTP Integration Layer)       │
│  - Flask endpoints (/log, /sync, /register, /health)            │
│  - Browser integration via Tampermonkey userscript              │
│  - Clipboard bridge for >> trigger processing                   │
│  - I AM block parsing and metadata extraction                   │
└─────────────────────────────────────────────────────────────────┘
           │                   │                   │
     ┌─────▼──────┐  ┌─────────▼──────────┐  ┌─────▼──────────┐
     │  Layer 3:  │  │  Layer 2:          │  │  Layer 5:      │
     │  prompt_   │  │  thread_tracker    │  │  world_updater │
     │  generator │  │  (Cursor sync)     │  │  (Daemon)      │
     └─────┬──────┘  └─────────┬──────────┘  └─────┬──────────┘
           │                   │                   │
           └───────────────────┼───────────────────┘
                               │
        ┌─────────────────────▼──────────────────────┐
        │    Layer 1: event_log.py                   │
        │    (Append-only, sequence-numbered)        │
        │    Global truth for all events             │
        └─────────────────────┬──────────────────────┘
                              │
        ┌─────────────────────▼──────────────────────┐
        │    STORAGE LAYER (.event_log.jsonl)        │
        │    Soul files | Insights | Obsidian vault  │
        └────────────────────────────────────────────┘
```

### Layer 1: Event Log (event_log.py)

**Purpose:** Append-only JSONL log providing chronological immutability

**Features:**
- Global sequence numbers for unambiguous event ordering
- Thread-safe file locking (Windows msvcrt + Unix fcntl)
- MD5 content hashing for deduplication
- Persistence: `.event_log.jsonl` + `.event_log.seq` (sequence counter)
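The platform-conditional lock can be sketched as a context manager (a minimal illustration assuming the `msvcrt`/`fcntl` split described above; `locked` is a hypothetical helper, not the actual `event_log.py` code):

```python
import contextlib
import os

if os.name == "nt":
    import msvcrt
else:
    import fcntl

@contextlib.contextmanager
def locked(f):
    """Hold an exclusive lock on an open file object for the duration of a block."""
    if os.name == "nt":
        f.seek(0)
        msvcrt.locking(f.fileno(), msvcrt.LK_LOCK, 1)   # lock the first byte
    else:
        fcntl.flock(f.fileno(), fcntl.LOCK_EX)          # whole-file advisory lock
    try:
        yield f
    finally:
        if os.name == "nt":
            f.seek(0)
            msvcrt.locking(f.fileno(), msvcrt.LK_UNLCK, 1)
        else:
            fcntl.flock(f.fileno(), fcntl.LOCK_UN)
```

Note that `fcntl.flock` locks are advisory: they only protect against other cooperating processes that also take the lock.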

**Event Structure:**
```python
from dataclasses import dataclass
from typing import Dict

@dataclass
class Event:
    seq: int               # Global sequence number
    timestamp: str         # ISO 8601
    event_type: str        # "knowledge", "soul_update", "action", etc.
    content: str           # The actual message
    source: str            # Thread that created it (e.g., "gemini_t1")
    content_hash: str      # MD5[:16] for deduplication
    meta: Dict             # Custom metadata (importance, tags, etc.)
```

**CLI Commands:**
```bash
python event_log.py status              # Show current status
python event_log.py recent -n 20        # Show last 20 events
python event_log.py verify              # Check log integrity
python event_log.py export              # Export to Obsidian markdown
python event_log.py test                # Run quick test
```

---

### Layer 2: Thread Tracker (thread_tracker.py)

**Purpose:** Per-thread cursor tracking for "I've seen events up to seq #N"

**Features:**
- Thread registration and metadata
- Sync logic filters events for each thread
- Excludes own events (optional)
- Limits max events per sync

**Thread Data:**
```python
from dataclasses import dataclass

@dataclass
class Thread:
    id: str                    # e.g., "gemini_t1"
    model: str                 # "gemini", "claude", "gpt", etc.
    created_at: str
    last_synced_at: str
    last_seen_seq: int         # ← CRITICAL: This is the cursor
    events_authored: int
    exchange_count: int
```

**Sync Flow:**
1. Thread calls `tracker.sync_thread("claude_t1")`
2. Tracker loads Thread object with cursor position
3. Get all events since cursor
4. Apply filters (exclude own events, limit)
5. Advance cursor to highest seen
6. Return events to thread

---

### Layer 3: Prompt Generator (prompt_generator.py)

**Purpose:** Format events as AI-ready prompts with chronological ordering

**Features:**
- Strict chronological ordering with `[#seq]` markers
- Semantic search via ChromaDB (optional Layer 3.5)
- Importance filtering by metadata threshold
- "I AM" framework for humanistic framing

**Generated Prompt Structure:**
```
## Continuity Update
Since we last spoke (cursor #150 → now #175):
[#151] timestamp | knowledge | Source: claude_t1
    Content text here...

[#152] timestamp | soul_update | Source: gemini_t1
    Soul insight...
```
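A minimal sketch of how such a prompt could be assembled (the function name and event-dict keys are illustrative, not the actual `prompt_generator.py` API):

```python
def format_sync_prompt(events, old_cursor, new_cursor):
    """Render events into the continuity-update shape shown above."""
    lines = [
        "## Continuity Update",
        f"Since we last spoke (cursor #{old_cursor} → now #{new_cursor}):",
    ]
    for e in events:  # events assumed already in ascending seq order
        lines.append(f"[#{e['seq']}] {e['ts']} | {e['type']} | Source: {e['source']}")
        lines.append(f"    {e['content']}")
        lines.append("")
    return "\n".join(lines).rstrip()
```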

---

### Layer 4: Server (technomancer_server.py)

**Purpose:** Flask HTTP API for browser and external integration

**Endpoints:**

| Endpoint | Method | Purpose |
|----------|--------|---------|
| `/log` | POST | Log event from any source |
| `/sync` | POST | Sync thread & get new events |
| `/register` | POST | Register new AI thread |
| `/health` | GET | System status |
| `/timeline` | GET | Raw event timeline (query: ?count=20) |
| `/prompt/<thread_id>` | GET | Get sync prompt for injection |
| `/browser` | POST | Receive browser-scraped AI responses |
| `/clipboard` | POST | Set clipboard (triggers >> processing) |

**Running the Server:**
```bash
cd Live_Stack/
python technomancer_server.py
# Server starts on http://localhost:5000
```

---

### Layer 5: World Updater (world_updater.py)

**Purpose:** Background daemon translating events to physical actions

**Features:**
- Polls event log every 5 seconds (configurable)
- Translates events to file writes
- Maintains soul files (`System/Souls/{model}_soul.md`)
- Deduplication prevents rewriting same content

**Event → Action Mapping:**
- `soul_update` events → writes to `{model}_soul.md`
- `action` events → executes commands
- `alert` events → notifications
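A single poll cycle over that mapping can be sketched as follows (illustrative only; `poll_once` and the handler registry are hypothetical names, not the real `world_updater.py` interface):

```python
def poll_once(events, cursor, handlers):
    """Dispatch events newer than `cursor` to per-type handlers; return the advanced cursor."""
    for event in events:
        if event["seq"] <= cursor:
            continue  # already handled on an earlier poll
        cursor = event["seq"]
        handler = handlers.get(event["type"])
        if handler is not None:
            handler(event)  # e.g. a soul_update handler appends to {model}_soul.md
    return cursor
```

The daemon repeats something like this on its poll interval, with deduplication applied before each write.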

**Running the Daemon:**
```bash
cd Live_Stack/
python world_updater.py                    # Normal mode
python world_updater.py --watch-only       # Dry run
python world_updater.py --interval 10      # Custom poll interval
python world_updater.py --no-dedup         # Disable deduplication
```

---

## Event Log System

### File Structure

```
Obsidian Vault/
├─ .event_log.jsonl           # Append-only log (1 JSON object per line)
├─ .event_log.seq             # Current sequence counter
└─ System/
   ├─ Souls/
   │  ├─ gemini_soul.md       # AI's self-concept
   │  ├─ claude_soul.md
   │  └─ ...
   ├─ key_insights.md         # Synthesized summaries
   ├─ Actions/                # Executed tasks
   └─ .soul_hashes.json       # Content hashes for dedup
```

### Append Operation

1. Lock file (OS-specific: msvcrt on Windows, fcntl on Unix)
2. Read current seq counter
3. Hash content with MD5 (for dedup check)
4. Increment counter: `seq = current_seq + 1`
5. Write event as JSON line
6. Save new seq counter
7. Unlock file
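Stripped of locking and deduplication, the steps above reduce to roughly this sketch (`append_event` is an illustrative name, not the `event_log.py` API):

```python
import hashlib
import json
import time
from pathlib import Path

def append_event(root: Path, event_type: str, content: str, source: str) -> dict:
    """Sketch of the append steps; real code holds an OS file lock around all of this."""
    log_path = root / ".event_log.jsonl"
    seq_path = root / ".event_log.seq"
    current = int(seq_path.read_text()) if seq_path.exists() else 0   # step 2: read counter
    event = {
        "seq": current + 1,                                           # step 4: increment
        "ts": time.strftime("%Y-%m-%dT%H:%M:%S"),
        "type": event_type,
        "source": source,
        "content": content,
        "content_hash": hashlib.md5(content.encode("utf-8")).hexdigest()[:16],  # step 3
        "meta": {},
    }
    with log_path.open("a", encoding="utf-8") as f:
        f.write(json.dumps(event) + "\n")                             # step 5: append line
    seq_path.write_text(str(event["seq"]))                            # step 6: save counter
    return event
```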

### Event Types

- **`knowledge`** - Insights, facts, discoveries
- **`exchange`** - Conversation turns, interactions
- **`soul_update`** - Identity/state changes (using "I AM" protocol)
- **`system`** - Infrastructure events, thread registration
- **`action`** - Commands to execute
- **`alert`** - Notifications

### Deduplication

Events carry a `content_hash` (MD5 prefix). The `EventLog.append` method checks recent events to skip duplicates.

**Note:** There's a known bug where deduplication is currently broken (see [Known Issues](#known-issues)).
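For illustration, a disk-backed duplicate check might look like this (a sketch of the *intended* behavior; `is_duplicate` is a hypothetical helper, and the shipped check works differently):

```python
import hashlib
import json
from pathlib import Path

def is_duplicate(log_path: Path, content: str, window: int = 100) -> bool:
    """Compare the MD5[:16] of new content against the last `window` events on disk."""
    new_hash = hashlib.md5(content.encode("utf-8")).hexdigest()[:16]
    try:
        lines = log_path.read_text(encoding="utf-8").splitlines()[-window:]
    except FileNotFoundError:
        return False  # empty log: nothing can be a duplicate yet
    return any(json.loads(line).get("content_hash") == new_hash for line in lines)
```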

---

## Thread Tracking

### Thread Registration

```python
from thread_tracker import ThreadTracker

tracker = ThreadTracker()
tracker.register_thread(
    thread_id="gemini_scroll_1",
    model="gemini",
    start_seq=0  # Start from beginning
)
```

### Syncing a Thread

```python
# Get new events since last sync
events = tracker.sync_thread(
    thread_id="gemini_scroll_1",
    exclude_own_events=True,
    max_events=50
)

# Process events
for event in events:
    print(f"[#{event.seq}] {event.content}")
```

### Cursor Management

Each thread maintains a cursor (`last_seen_seq`) tracking the highest sequence number seen. When syncing:

1. Get all events with `seq > last_seen_seq`
2. Apply filters (exclude own, limit count)
3. Advance cursor to highest seen
4. Return filtered events

**Critical:** The cursor advances to the highest *seen* event, not the highest *processed* event. This ensures no events are silently skipped.
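In code, the advance-to-highest-seen rule might look like this sketch (names are illustrative, not the `thread_tracker.py` API; a real implementation must also reconcile `max_events` truncation with advancing past unreturned events):

```python
from dataclasses import dataclass

@dataclass
class Cursor:
    thread_id: str
    last_seen_seq: int = 0

def sync(cursor, events, exclude_own=True, max_events=50):
    """Return new events for a thread and advance its cursor to the highest seq *seen*."""
    new = [e for e in events if e["seq"] > cursor.last_seen_seq]
    if new:
        # Advance past everything inspected (including events filtered out
        # below), so filtered events are not re-delivered on every sync.
        cursor.last_seen_seq = max(e["seq"] for e in new)
    if exclude_own:
        new = [e for e in new if e["source"] != cursor.thread_id]
    return new[:max_events]
```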

---

## The "I AM" Protocol

### Purpose

Allow AI to self-reflect and record identity updates in a structured way.

### Syntax

```markdown
---I AM---
type: {synthesis|insight|discovery|question}
importance: {0.0-1.0}
tags: [optional, list, of, tags]

The actual content of the I AM statement.
Can span multiple paragraphs.
Multiple sentences fine.
---END I AM---
```

### Processing Flow

1. AI response contains I AM block
2. Server regex matches: `---\s*I\s*AM\s*---(.+?)---\s*END\s*I\s*AM\s*---`
3. Extracts metadata (type, importance, tags)
4. Logs as `soul_update` event with metadata
5. World updater reads `soul_update` events
6. Appends to `{model}_soul.md` with timestamp & type
7. Other threads can sync this soul update
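Using the regex from step 2, extraction could be sketched as follows (the header-line parsing here is an assumption about the format, not the server's actual code, whose metadata extraction is flagged as incomplete under Known Issues):

```python
import re

IAM_RE = re.compile(r"---\s*I\s*AM\s*---(.+?)---\s*END\s*I\s*AM\s*---", re.DOTALL)

def parse_iam(text):
    """Extract I AM blocks: leading header lines become metadata, the rest is content."""
    blocks = []
    for m in IAM_RE.finditer(text):
        body = m.group(1).strip()
        meta, content_lines = {}, []
        for line in body.splitlines():
            k, sep, v = line.partition(":")
            # Treat "key: value" lines as metadata until the first content line.
            if sep and not content_lines and k.strip() in ("type", "importance", "tags"):
                meta[k.strip()] = v.strip()
            else:
                content_lines.append(line)
        blocks.append({"meta": meta, "content": "\n".join(content_lines).strip()})
    return blocks
```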

### Example

**AI writes:**
```
---I AM---
type: synthesis
importance: 0.9
I see cellular senescence as information loss.
---END I AM---
```

**Server logs:**
```json
{
  "seq": 47,
  "ts": "2025-12-08T15:57:00.678288",
  "type": "soul_update",
  "source": "gemini_t1",
  "content": "I see cellular senescence as information loss.",
  "meta": {
    "type": "synthesis",
    "importance": 0.9
  },
  "content_hash": "72dcad4edd10ad72"
}
```

**Other AI sees in sync:**
```
[#47] 2025-12-08T15:57:00 | soul_update | Source: gemini_t1
    I see cellular senescence as information loss.
```

---

## Browser Integration

### Tampermonkey Userscript

**technomancer_userscript.js** integrates with all major AI platforms.

**Supported Platforms:**
- Gemini (gemini.google.com)
- ChatGPT (chat.openai.com)
- Claude (claude.ai)
- Perplexity (perplexity.ai)
- Grok (grok.com)
- Mistral (chat.mistral.ai)

**Features:**
- Automatic model detection via URL
- Captures AI responses automatically or manually
- Sends to `/browser` endpoint
- Receives sync prompts for injection

**Keyboard Shortcuts:**
- `Ctrl+Shift+I` - Inject sync prompt into chat input
- `Ctrl+Shift+S` - Manually capture current AI response

**Installation:**

1. Install Tampermonkey extension (Chrome/Edge/Firefox)
2. Create new script
3. Paste `technomancer_userscript.js`
4. Save (Ctrl+S)
5. Ensure server is running on `http://localhost:5000`

**Thread Mapping:**
```javascript
{
  'gemini.google.com': { model: 'gemini', thread_prefix: 'gemini_web' },
  'claude.ai': { model: 'claude', thread_prefix: 'claude_web' },
  'chat.openai.com': { model: 'gpt', thread_prefix: 'gpt_web' },
  // ...
}
```

---

## Deployment & Testing

### Prerequisites

```bash
# Python 3.8+
pip install flask flask-cors pyperclip chromadb python-dotenv
```

### Deployment Checklist

1. **Create Directory:**
   ```bash
   mkdir ~/technomancer
   cd ~/technomancer
   ```

2. **Copy Python Files:**
   - event_log.py
   - thread_tracker.py
   - prompt_generator.py
   - technomancer_server.py
   - world_updater.py
   - semantic_memory.py
   - common_utils.py
   - safe_export.py

3. **Run Tests (MANDATORY):**
   ```bash
   python technomancer_test.py        # Basic tests
   python ultimate_gauntlet.py        # Stress/concurrency tests
   ```

4. **Start Daemons:**
   ```bash
   # Terminal 1 (Server)
   python technomancer_server.py
   
   # Terminal 2 (World Updater)
   python world_updater.py
   ```

5. **Install Browser Extension:**
   - Install Tampermonkey
   - Add `technomancer_userscript.js`

### Test Suites

**technomancer_test.py** - Basic acceptance tests:
- Event creation and sequencing
- Thread registration
- Sync cursor advancement
- I AM block detection
- Prompt generation

**ultimate_gauntlet.py** - Stress tests:
- Concurrent event writes (5+ threads)
- Large event payloads (100KB+)
- Rapid sync cycles
- File locking behavior
- Deduplication under load

**Integration tests:**
- `integration_gauntlet.py` - Full stack tests
- `obsidian_collision_test.py` - File locking under Obsidian watch

---

## Known Issues

### Critical Bugs (4 issues)

1. **Thread Cursor Advancement Broken** ⚠️ (affects data integrity)
   - **Issue:** Silent event loss when all new events are filtered out
   - **Impact:** Thread skips important events permanently
   - **Fix:** Advance to highest seen, not highest processed
   - **Status:** Documented in TECHNOMANCER_BUG_REPORT.md

2. **Deduplication Completely Broken** ⚠️ (data pollution)
   - **Issue:** `self._cache` is never populated, so the duplicate check never matches
   - **Impact:** Duplicate events accumulate at a 1:1 ratio over time
   - **Fix:** Check the file directly instead of the cache
   - **Status:** Documented in TECHNOMANCER_BUG_REPORT.md

3. **SemanticMemory Import Missing** ⚠️ (crashes daemon)
   - **Issue:** world_updater.py uses SemanticMemory type but doesn't import it
   - **Impact:** Daemon crashes on startup if semantic memory enabled
   - **Fix:** `from semantic_memory import SemanticMemory`
   - **Status:** Documented in TECHNOMANCER_BUG_REPORT.md

4. **safe_export() Leaks Identifiers** ⚠️ (NIH compliance violation)
   - **Issue:** Thread IDs remain visible in exported content
   - **Impact:** Timestamps and model names are exported without redaction
   - **Fix:** Implement `_redact_identifiers()` function
   - **Status:** Documented in TECHNOMANCER_BUG_REPORT.md

### High Priority Issues (3 issues)

- I AM block parsing logic incomplete (metadata extraction missing)
- No error logging (daemon crashes silently)
- Semantic search result formatting broken for ChromaDB v0.4+

### Missing Features

- Event validation schema (Pydantic)
- Health metrics endpoint (`/metrics`)
- Graceful ChromaDB degradation
- Event compression/archival
- Transaction atomicity across layers

---

## API Reference

### Python API

**Basic Usage:**
```python
from event_log import EventLog, get_event_log

# Get global log instance
log = get_event_log()

# Append events
log.append('knowledge', 'New insight here', source='gemini_t1')

# Get events since sequence
events = log.get_since(sequence=100, limit=50)

# Get recent events
recent = log.recent(count=20)
```

**Convenience Helpers:**
```python
from event_log import log_knowledge, log_exchange, log_soul_update, log_system

log_knowledge("Fact text", source='gemini_t1', meta={'importance': 0.8})
log_exchange("User message", source='user', meta={'role': 'user'})
log_soul_update("I AM insight", source='claude_t1', meta={'type': 'synthesis'})
log_system("Thread registered", source='thread_tracker', meta={'thread_id': 'test'})
```

### HTTP API

**Log Event:**
```bash
curl -X POST http://localhost:5000/log \
  -H "Content-Type: application/json" \
  -d '{
    "type": "knowledge",
    "content": "New insight",
    "source": "external_system",
    "meta": {"importance": 0.9}
  }'
```

**Sync Thread:**
```bash
curl -X POST http://localhost:5000/sync \
  -H "Content-Type: application/json" \
  -d '{
    "thread_id": "claude_t1",
    "exclude_own_events": true,
    "max_events": 50
  }'
```

**Health Check:**
```bash
curl http://localhost:5000/health
```

**Get Timeline:**
```bash
curl "http://localhost:5000/timeline?count=20"
```

---

## Getting Started

### Quick Start (Python)

1. **Install dependencies:**
   ```bash
   pip install flask flask-cors pyperclip chromadb python-dotenv
   ```

2. **Create a simple event logger:**
   ```python
   from event_log import get_event_log, log_knowledge
   
   log = get_event_log()
   log_knowledge("My first insight!", source="test_user")
   
   # View recent events
   for event in log.recent(count=10):
       print(f"[#{event.seq}] {event.content}")
   ```

3. **Check status:**
   ```bash
   python event_log.py status
   python event_log.py recent -n 20
   ```

### For AI Collaborators

1. **Understand the event log format:**
   - All interactions logged in `.event_log.jsonl`
   - Each event has: `seq`, `ts`, `type`, `source`, `content`, `meta`, `content_hash`

2. **Use the "I AM" protocol:**
   ```markdown
   ---I AM---
   type: insight
   importance: 0.85
   
   I've connected concept A to concept B.
   This changes my understanding of X.
   ---END I AM---
   ```

3. **Contribute to the event log:**
   ```python
   from event_log import log_soul_update
   
   log_soul_update(
       "Your insight here",
       source="your_identifier",
       meta={'type': 'synthesis', 'importance': 0.9}
   )
   ```

### For Developers

1. **Run test suite:**
   ```bash
   python technomancer_test.py
   python ultimate_gauntlet.py
   ```

2. **Start server:**
   ```bash
   python technomancer_server.py
   ```

3. **Start world updater:**
   ```bash
   python world_updater.py --watch-only  # Dry run first
   python world_updater.py               # Normal mode
   ```

4. **Monitor logs:**
   ```bash
   tail -f .event_log.jsonl
   ```

---

## Architecture Philosophy

### Core Principles

1. **Immutability** - Events never deleted, only new ones added
2. **Chronological Truth** - Sequence numbers establish objective order
3. **Cursor-based Sync** - Thread-local watermarks, no global conflicts
4. **Agentic Action** - Thoughts → physical file writes (soul updates)
5. **Privacy Separation** - Personal vs. work contexts isolated
6. **Semantic Enrichment** - Optional ChromaDB for meaning-based search

### Design Philosophy

> "The event log is the tape. The world updater is one playback head. The Daemon is where structure lives; Technomancer is how evidence crosses space."

**Key insight:** The append-only log stays the **witness**; executors (world updater, future warp mirror) **read** it under policy. That keeps “thought → file” paths auditable and reversible at the event level.

### Deployment Scale

**Realistic Expectations:**
- Event velocity: 50-200 human events + 1-5K AI exchanges/month
- Storage: 1-2MB JSONL/month + 100KB metadata
- Target: 10K events in 4-5 months
- ChromaDB handles 50K+ events comfortably

---

## Component Roles

| Component | Role | Why It Matters |
|-----------|------|----------------|
| **Event Log** | On-disk witness | Chronological immutability, global ordering |
| **Thread Tracker** | Per-thread cursors | Each client knows what it has seen |
| **Prompt Generator** | Adapter: context packaging | Formats events for model injection |
| **Server** | Adapter: HTTP ingress | Browser and tool POSTs into the tape |
| **World Updater** | Adapter: file playback | Selected events → vault files |
| **Browser Script** | Adapter: capture | Scrapes chat UIs into `/browser` |
| **Warp (Astronomicon)** | Remote door | Bodyless clients; see master README table |
| **Daemon** | Structured information | Graph/query layer; not the JSONL tape |

---

## Documentation

### Key Files

- **TECHNOMANCER_BUG_REPORT.md** - Known issues and fixes
- **Component-Role-WhyItMatters.csv** - Component descriptions
- **Development_Snapshots/** - Version history
- **ai-souls/** - AI identity records

### Related Projects

- **06 Project Daemon** — Structured lab knowledge (`_daemon_v3.py`, `songs/`)
- **08 Project Astronomicon** — Warp / u-os.dev front door ([README.md](../08_Project_Astronomicon/README.md), [STATUS.md](../08_Project_Astronomicon/u_os_dev/STATUS.md), [worker/README.md](../08_Project_Astronomicon/u_os_dev/worker/README.md))
- **01 Project World** — Universe String / Core_DNS
- **04 Project Constitution** — Rights framing for collaborators
- **05 Project LENG** — Physics; Daemon loads LENG graphs

---

## License

See repository root for licensing information.

---

**Record. Bridge. Persist.**

*Tape first; warp at the door; graph for meaning.*
