Understory Labs

Save State

38 ENTRIES

April 18, 2026
SAVE STATE · bug · infrastructure

Build fix — output: export removed, analytics force-dynamic restored

Bug Fixes

  • output: "export" removed from next.config.ts — static export mode blocks any page with dynamic = "force-dynamic", surfacing as a Vercel build error on /analytics
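To make the conflict concrete, here is a minimal sketch of the two sides (contents abbreviated to the relevant keys; the analytics page path is an assumption based on the /analytics route):

```typescript
// next.config.ts — the line that had to go; static export cannot serve
// request-time pages, so any force-dynamic route fails the build.
import type { NextConfig } from "next";

const nextConfig: NextConfig = {
  // output: "export",   // <- removed
};
export default nextConfig;

// app/analytics/page.tsx — the page that is correctly dynamic:
export const dynamic = "force-dynamic";
```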

Lessons

  • The analytics page's force-dynamic is correct: getAllProjects() and getAllFeatures() hit Supabase at request time, and todayStr() makes momentum score, streak, and recency calculations inherently time-sensitive — a static snapshot would go stale immediately
  • CLAUDE.md was already ahead of the config ("dynamic SSR — no longer static export") — the stale output: "export" was a local uncommitted modification, not a regression in the committed codebase
LORE · feature · audio · phase

Immersive Places POC complete — effects, sound, and composite all validated

Features

  • Effects POC validated with real Ghibli hero images — all 3 places render correctly behind frosted glass UI shell
  • Particle colors made place-aware — tavern embers now gold-amber, meadow fireflies warm cream-gold instead of uniform orange
  • Sound POC built — Howler.js 4-layer system (base → detail1 → detail2 → full ambient), each layer fading in as presence grows
  • 12 ambient audio assets sourced from Freesound and configured per place — library fire, clock, pages, room tone; tavern hearth, dinner table, murmur, crowd; meadow wind, bird, rustle, chorus
  • Intermittent sound model confirmed — pages and bird calls use scheduled one-shots with random intervals (12–35s, 10–30s), not loops
  • Composite POC complete — WebGL shader, canvas particles, Howler.js audio, and full Lore UI shell (sidebars, header, input bar) unified in a single page across all 3 places
  • Pipeline doc written (docs/plans/2026-04-18-immersive-places-pipeline.md) — integration plan mapped to existing SceneLayer / PlaceSceneRenderer / AmbientSound architecture, 10-step implementation order, asset manifest, per-place checklist
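The two sound models above (presence-driven layer fades, randomly scheduled one-shots) can be sketched independently of the audio specifics. This is an illustrative sketch, not the POC code; with Howler, `play` would be something like `() => pageTurn.play()` on a preloaded `Howl`:

```javascript
// Layer model: presence 0..1 maps onto 4 stacked ambient layers,
// each fading in across its own slice of the presence range.
function layerVolumes(presence, layers = 4) {
  return Array.from({ length: layers }, (_, i) => {
    const start = i / layers;
    const v = (presence - start) * layers; // 0 before its slice, 1 after
    return Math.min(1, Math.max(0, v));
  });
}

// Intermittent model: one-shots (page turns, bird calls) fire at random
// intervals inside a [min, max] window instead of looping.
function scheduleOneShot(play, minMs, maxMs, setTimer = setTimeout) {
  const delay = minMs + Math.random() * (maxMs - minMs);
  setTimer(() => {
    play();
    scheduleOneShot(play, minMs, maxMs, setTimer);
  }, delay);
  return delay;
}
```

The entry's windows were 12–35s for page turns and 10–30s for bird calls.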

Bug Fixes

  • Clock volume capped at 25% — was audibly competing with fire base layer
  • Tavern clink replaced with dinner table ambience (Mr_Alden, Freesound 365676) — crystal clink was too repetitive regardless of random-seek approach; looping ambient table activity reads more naturally

Infrastructure

  • Howler.js downloaded locally (howler.min.js) — CDN load blocked on file:// protocol in Chrome
  • npx serve established as the local POC test pattern — browser blocks audio loading from file:// even with html5: true; must serve via HTTP

Lessons

  • Browser security blocks audio loading from file:// regardless of Howler config — every audio POC needs a local HTTP server, not a double-click
  • Intermittent detail sounds (page turns, bird calls) need random scheduling, not loops — a constant page-turning loop is immediately uncanny
  • Sound variety matters more than volume — random-seek on a single-sample clink still sounds repetitive; an ambient file with natural variation (dinner table) solves it better than any seek strategy
  • Windows hides file extensions by default — renaming library.jpg in Explorer silently produces library.jpg.jpg; turn on extensions before any rename workflow

TODO

  • Integration: create SceneEffects.tsx (port WebGL + canvas from POC), update LibraryScene.tsx, expand AmbientSound.tsx to 4-layer system — see pipeline doc for full order
TAPROOT · infrastructure · feature

PI-4 deployed — ingest, chunking, and retrieval live on Taproot

Features

  • PI-4 fully deployed — /ingest, /chunks/:projectId, and /retrieve endpoints live at research.rootstack.dev
  • Tuff Shed PDF ingested end-to-end — 13 pages, 4 chunks, BM25 retrieval returning correct assembly steps
  • Research service upgraded to v1.1.0 — Brave Search + ManualsLib + manufacturer direct + full ingest pipeline in a single container

Bug Fixes

  • Node 18 → Node 20 in Dockerfile — axios 1.7+ pulls in undici which requires File global not available until Node 20; container was crash-looping
  • ZFS cache permissions set on Proxmox host — chmod 777 /taproot-data/research-cache required at the host level; LXC bind mount is read-only from inside the container
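A sketch of the Dockerfile change (everything except the base image is an assumption — the real entrypoint and install steps aren't shown in this entry):

```dockerfile
# Node 20 base so undici's File global exists
# (axios 1.7+ pulls in undici, which needs Node >= 20).
FROM node:20-slim
WORKDIR /app
COPY package*.json ./
RUN npm ci --omit=dev
COPY . .
EXPOSE 3002
CMD ["node", "src/server.js"]
```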

Infrastructure

  • homelab/services/research/ is now the canonical source — Brave Search scrapers added, PI-4 files (ingest, cache, retrieve) merged, port locked to 3002
  • ZFS bind-mount added to CT 100 via pct set 100 -mp0 — dataset taproot-data/research-cache (already existed from Step 1) mounted at /taproot-data/research-cache
  • Old /opt/research-api (unused TypeScript code) removed from docker-host

Lessons

  • Docker images survive source directory cleanup — container keeps running from the cached image even if the build directory is gone; only matters for future rebuilds
  • Merging two diverged implementations requires picking one as the base and grafting from the other — keeping the deployed service's search providers and adding the PI-4 endpoints was cleaner than replacing everything

TODO

  • PI-5: build api/_lib/taproot.ts helper and inject approved chunks into api/project-intake.ts draft prompt
CURRENT OS · brainstorm · naming · bug

WeekWidget fix + Bud brainstorm — inbox intelligence named and scoped

Features

  • Bud vision brief complete — inbox intelligence system powered by n8n on Taproot with six handler types: purchase, coupon, delivery, appointment, reference, triage
  • Cross-agent orchestration scoped — Bud feeds Current OS (tasks, radar, triage), Shed (project materials), Research API (product intelligence), Google Calendar, and a financial dashboard
  • Name locked: Bud — dormant potential waiting to open, "nip it in the bud" for early problem detection, BUD inside BUDGET, buddy personality
  • Standalone project — own repo, own LXC on Taproot, financial dashboard linked from Current OS under a broader domain TBD

Bug Fixes

  • ISSUE-011: WeekWidget "This Week" card now auto-focuses when Work toggle activates — GravityWatcher gained a workVisible transition watcher that calls focusCard("week") on the false → true edge
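The false → true edge watch can be sketched as a small helper (names here are illustrative, not the actual GravityWatcher code; in React this sits in an effect that compares against a ref of the previous `workVisible`):

```javascript
// Fires `onRise` only on a false -> true transition, not on every true.
function makeRisingEdgeWatcher(onRise) {
  let prev = false;
  return (current) => {
    if (current && !prev) onRise(); // the false -> true edge
    prev = current;
  };
}

// Usage sketch: const watch = makeRisingEdgeWatcher(() => focusCard("week"));
// then call watch(workVisible) whenever the toggle state changes.
```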

Lessons

  • Brainstorms should surface established self-hostable tools (n8n, Node-RED) before custom-build options — n8n was the obvious fit for email orchestration but had to be raised by the user
  • Naming requires the design brief as input — two rounds failed before consulting the redwood forest ecosystem aesthetic
  • Humble name for sophisticated tech (Shed, Bark, Bud) is the Understory Labs signature — the contrast IS the brand

TODO

  • /plan on Bud — purchase handler first, end-to-end through n8n
  • Cross-agent protocol design — how Bud, Shed, and Research API communicate
  • Confirm ISSUE-011 fix after testing
April 17, 2026
TAPROOT · infrastructure · feature

Lost session recovery — research service confirmed live, PI-1 through PI-4 discovered complete

Features

  • Research service verified end-to-end — https://research.rootstack.dev/health returns 200 from cellular, Tuff Shed query returns manufacturer direct + Brave + fallback results
  • PI-1 through PI-4 discovered complete from retrospective entry — product detection, resource approval UI, ingest pipeline, and per-resource status badges all built in the lost session

Infrastructure

  • Deployed service differs from repo: JS/Express, port 3002, Brave Search + ManualsLib + manufacturer direct — TypeScript research-api archived to _archive/
  • config.yml synced with server — research.rootstack.dev → localhost:3002 entry added
  • Research service source lives in ~/Projects/homelab/services/research/ — deployed version is PI-2 only; PI-4 endpoints written but not deployed (ZFS bind-mount + service rebuild pending)
  • homelab CLAUDE.md, global CLAUDE.md, and product intelligence plan updated to reflect actual state

Lessons

  • Lost session state is recoverable — docker ps and ls /opt reconstruct what was deployed; retrospective Save State entries reconstruct what was built
  • cut -d= -f2 silently truncates base64 keys with trailing =; grep -oP '(?<=KEY=).*' handles them correctly
  • The repo and the server diverged during development — implementation language, port, and architecture all changed; the repo was never updated
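The truncation in the second lesson is easy to reproduce (the key value here is a placeholder; `grep -oP` requires GNU grep with PCRE support):

```shell
# cut splits on every '=' and keeps only field 2 — trailing '=' padding is lost:
printf 'KEY=c2VjcmV0=\n' | cut -d= -f2           # -> c2VjcmV0  (truncated)

# A lookbehind keeps everything after the first 'KEY=':
printf 'KEY=c2VjcmV0=\n' | grep -oP '(?<=KEY=).*'  # -> c2VjcmV0=
```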

TODO

  • Deploy PI-4: ZFS bind-mount into CT 100 → rebuild research service → validate /ingest and /retrieve against a real PDF
  • PI-5: inject approved chunks into api/project-intake.ts draft prompt
CURRENT OS · feature · ai · infrastructure

Product Intelligence — PI-1 through PI-4 built (retrospective)

Features

  • Product detection lands in intake engine — product_assembly_detected inference fires when AI identifies a specific kit or product, readyToResearch flag signals the UI to offer research before drafting
  • Research service deployed on Taproot — research.rootstack.dev live, returns ManualsLib results + YouTube search URL + DuckDuckGo fallback for any product query
  • Resource approval UI complete — "Resources Found" section in observations panel with type badges, checkboxes, Add your own URL input, and three explicit degradation paths when Taproot is unreachable
  • Ingest pipeline written — /ingest endpoint downloads PDFs and HTML, extracts text, chunks by step-pattern → headings → 500-token windows, stores to ZFS cache under a temp project key
  • Per-resource ingest status in intake overlay — loading / done (chunk count) / image-only-warning / error badges fire immediately on resource approval, non-blocking
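The tiered chunking fallback (step pattern → headings → token windows) can be sketched like this. This is a simplified stand-in for the real ingest pipeline, not its code — the regexes and the whitespace-word token approximation are assumptions:

```javascript
// Tiered chunker sketch: try the most structured split first, fall back.
const STEP_RE = /^(?:Step\s+\d+|\d+\.)\s/m; // "Step 3 ..." / "4. ..."
const HEADING_RE = /^[A-Z][A-Z \d:]{3,}$/m; // ALL-CAPS heading lines

function chunkText(text, windowTokens = 500) {
  if (STEP_RE.test(text)) return splitOn(text, STEP_RE);
  if (HEADING_RE.test(text)) return splitOn(text, HEADING_RE);
  // Last resort: fixed token windows (tokens approximated by words).
  const words = text.split(/\s+/).filter(Boolean);
  const chunks = [];
  for (let i = 0; i < words.length; i += windowTokens) {
    chunks.push(words.slice(i, i + windowTokens).join(" "));
  }
  return chunks;
}

function splitOn(text, re) {
  const global = new RegExp(re.source, "gm");
  const starts = [...text.matchAll(global)].map((m) => m.index);
  if (starts[0] !== 0) starts.unshift(0);
  return starts
    .map((s, i) => text.slice(s, starts[i + 1]).trim())
    .filter(Boolean);
}
```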

Bug Fixes

  • Research fetch moved out of useEffect — cleanup function was aborting the in-flight request the moment researchStatus changed from idle to loading, causing an immediate AbortError
  • @vercel/node import removed from api/research.ts — Vercel infers the runtime; explicit import broke the build
  • Type assertion added for manual resource entry — TypeScript couldn't narrow the type field on user-pasted URLs

Infrastructure

  • api/research.ts — Vercel proxy with 10s timeout, forwards to TAPROOT_RESEARCH_URL with x-api-key auth; RESEARCH_API_KEY never exposed to the browser
  • api/ingest.ts — second Vercel proxy, 25s timeout for PDF downloads
  • project_resources.source column added — distinguishes ai-discovered from manual resources; feeds the future manual library
  • src/cache.js, src/ingest.js, src/retrieve.js written in ~/Projects/homelab/services/research/ — BM25-lite keyword retrieval, ZFS-backed chunk storage
  • PI-4 not yet deployed — ZFS bind-mount into docker-host (requires CT stop) and service rebuild are the next physical steps
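A "BM25-lite" keyword ranker can be sketched in a few lines. This is an illustrative stand-in for src/retrieve.js, not its actual code — the k1/b defaults and tokenization are assumptions:

```javascript
// Rank text chunks against a keyword query with a BM25-style score.
function bm25Rank(query, chunks, k1 = 1.5, b = 0.75) {
  const docs = chunks.map((c) => c.toLowerCase().split(/\W+/).filter(Boolean));
  const avgLen = docs.reduce((s, d) => s + d.length, 0) / docs.length;
  const terms = query.toLowerCase().split(/\W+/).filter(Boolean);
  const N = docs.length;

  const scored = docs.map((doc, i) => {
    let score = 0;
    for (const t of terms) {
      const df = docs.filter((d) => d.includes(t)).length; // doc frequency
      if (!df) continue;
      const idf = Math.log(1 + (N - df + 0.5) / (df + 0.5));
      const tf = doc.filter((w) => w === t).length; // term frequency
      // Length normalization keeps long chunks from dominating.
      score +=
        (idf * tf * (k1 + 1)) /
        (tf + k1 * (1 - b + (b * doc.length) / avgLen));
    }
    return { chunk: chunks[i], score };
  });
  return scored.sort((a, c) => c.score - a.score);
}
```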

Lessons

  • Chunking strategy order matters — trying numbered steps first (assembly manuals are full of them) before generic heading detection before token windows produces far better chunks than the reverse
  • ingestProjectKey needs to be a temp UUID minted at resource-approval time, not the Supabase project ID — the project doesn't exist yet when ingestion kicks off; PI-5 will wire the linkage
  • Explicit degradation beats silent fallback — presenting four named choices when Taproot is unreachable is only slightly more code and completely changes the user's ability to recover

TODO

  • Deploy PI-4: ZFS dataset on Proxmox host → bind-mount into CT 100 → rebuild research service → push Vercel changes
  • Validate PI-4 checkpoint: ingest a real Ring doorbell PDF, confirm /retrieve returns relevant sections
  • PI-5: inject approved chunks into api/project-intake.ts draft prompt using ingestProjectKey
April 12, 2026
BARK · bug · ai · audio

Bug sweep — soundboard spinner, phrase match, error surfacing

Bug Fixes

  • Soundboard "Add to Soundboard" spinner hung forever — root cause: except RuntimeError too narrow, leaving FileNotFoundError (expired temp file) and discord.errors.Forbidden unhandled before followup.send() was called; broadened to except Exception
  • Public channel announcement failure masked as upload failure — upload and announcement now in independent try/except blocks; announcement errors silently ignored
  • "Phrase Not Found — Failed to analyze transcript" on valid matches — Claude was generating retry_suggestions as a second JSON block instead of embedding it in the result object; rewrote prompt to show retry_suggestions inside each schema example
  • max_tokens raised 512 → 1024 in phrase matcher — response truncation was cutting off JSON mid-object on longer explanations

Infrastructure

  • Parse error logging added to matcher fallback path — raw Claude response and exception now printed to service-stdout.log for diagnosis instead of silently swallowing

Lessons

  • Discord deferred interactions require followup.send() to resolve the spinner — any unhandled exception before that call leaves the spinner permanent and silent
  • Claude prompt structure shapes response structure: an "in ALL cases, include X" instruction after the schema produces a second JSON block, not a merged one; shared fields belong inside every schema example
  • service-stdout.log and service-stderr.log carry different signal — stdout gets Python print() output, stderr gets discord.py tracebacks; check both when debugging
TAPROOT · infrastructure

Steps 6 + 8 complete — Vaultwarden live, rootstack.dev tunnel operational

Features

  • Vaultwarden deployed and operational — password manager live at vault.rootstack.dev with real HTTPS
  • Bitwarden extension connected to self-hosted vault — all Taproot credentials transferred and accessible
  • Cloudflare Tunnel established (UUID 5f21212a-2895-42f2-9b77-ffd6056af6cf) — Taproot services reachable externally without port forwarding or exposing home IP
  • rootstack.dev registered via Cloudflare Registrar — infrastructure domain live
  • DNS routes configured: status.rootstack.dev → Uptime Kuma, vault.rootstack.dev → Vaultwarden, research.rootstack.dev pre-configured for research service
  • ISSUE-017 resolved — Vercel serverless can now reach Taproot, unblocking the product intelligence feature

Bug Fixes

  • Ubuntu 24.04 SSH blocks root via three separate mechanisms (PermitRootLogin, PasswordAuthentication, and drop-in sshd_config.d overrides) — sshd_config on CT 100 also had immutable bit set, requiring chattr -i before sed could edit it
  • docker-compose-v2 conflicts with Docker's built-in compose plugin — removed; docker compose used directly
  • cloudflared service install failed with "cannot determine default configuration path" — fixed with explicit --config /etc/cloudflared/config.yml flag
  • GPG dearmor command truncated when piped in SSH terminal — split into two steps: curl to temp file, then gpg separately
  • Vaultwarden enforces HTTPS for all operations — HTTP access non-functional by design; Cloudflare Tunnel resolves this with real TLS rather than fighting self-signed cert workarounds
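An audit sketch for the multi-layer SSH lockdown described above, assuming stock Ubuntu 24.04 paths (run on the container before editing anything):

```shell
# Effective values, after all drop-ins are merged:
sshd -T | grep -Ei 'permitrootlogin|passwordauthentication'
# Drop-in overrides that win over the main file:
grep -r . /etc/ssh/sshd_config.d/
# Look for the 'i' (immutable) flag that blocks sed:
lsattr /etc/ssh/sshd_config

# If the immutable bit is set, clear it before editing:
chattr -i /etc/ssh/sshd_config
```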

Infrastructure

  • CT 101 recreated fresh (Ubuntu 24.04, nesting enabled) after SSH lockdown and package conflicts proved unresolvable on the original container
  • cloudflared installed on docker-host (CT 100) via official Cloudflare apt repo; tunnel config at /etc/cloudflared/config.yml
  • Config written locally, scp'd to server — heredoc-in-remote-terminal pattern retired; local-write + scp is now standard for any multi-line file creation on remote hosts
  • Product Intelligence plan confirmed (PI-1 through PI-6) — manufacturer site recon complete, hybrid scraping strategy validated (HTTP scraper + aggregators first, Firecrawl as last resort)

Lessons

  • Ubuntu 24.04 SSH lockdown is multi-layered — fixing one mechanism while others remain active wastes an entire session; audit all three before starting
  • The right solution is less work than the wrong one — Vaultwarden's HTTPS requirement wasn't a blocker, it was a nudge toward finishing Step 8; fighting it would have cost more time than the tunnel took
  • Heredoc in any remote terminal is unreliable — write multi-line configs locally and scp; this eliminates an entire class of paste-corruption errors
  • Manufacturer content is mostly accessible via simple HTTP scraper — Firecrawl is last-resort, not primary; most products have direct PDF URLs or are covered by aggregators (ManualsLib, Manualzz)

TODO

  • Change Vaultwarden ADMIN_TOKEN from placeholder to a strong credential
  • Tailscale (Step 8d) — private device-to-device access separate from public tunnel
  • Step 7: ClaudeVault
  • Step 9: Research Service deployment (needed for PI-2)
  • Execute product intelligence plan — PI-1 through PI-6
LORE · feature · infrastructure · phase

Usability overhaul complete — all 8 steps shipped

Features

  • Gathering vocabulary applied across all UI strings — Passage, Fireside, Section, Echoes, Whispers, The Circle, Gathered/Resting/Gone, Keeper/Elder/Guide replace the full literary terminology set
  • TopBar component with labeled navigation — BookShelf, home, and DMs all discoverable without icon hunting
  • Home screen Books grid with empty state — replaces blank loading screen with actionable landing point
  • Google OAuth wired end-to-end — /auth/google + /auth/google/callback handle consent, code exchange, profile fetch, and upsert by googleId → email → create; login and register pages gain "Continue with Google"
  • Onboarding overlay added — 4-step AnimatePresence flow with expanding dot progress, skip button, and idempotent POST /users/@me/onboarded endpoint; shows once on first sign-in, never again
  • Landing page replaces redirect spinner — SceneLayer at low presence, frosted glass hero, feature tiles, footer; authenticated users redirect to /app; copy is lorem ipsum placeholder pending Opus brainstorm
  • Presence wiring complete — useServerPresenceCount hook cross-references presenceMap against server.memberIds[]; SceneLayer presence prop is now live data

Bug Fixes

  • Fantasy theme contrast fixed — --lore-text and --lore-muted darkened to readable contrast ratios on light parchment surfaces
  • Password change route guards against null passwordHash — OAuth-only users get a clear error instead of a crash

Infrastructure

  • Schema: googleId (nullable, unique), passwordHash (nullable for OAuth-only users), onboardedAt (nullable timestamp) — two migration files created manually without a local DB
  • memberIds: string[] added to server list API response — presence hook uses it to scope presenceMap to current server's members
  • loginWithTokens() action added to auth store — reads access_token/refresh_token from URL params on /auth/callback, stores and hydrates session
  • NEXT_STEPS_FOR_NATHAN.md written — migration commands, env vars, Google OAuth app setup, Docker rebuild steps, CORS note
  • Branch feature/usability-overhaul pushed to Gitea; Vercel production deploy triggered and succeeded

Lessons

  • Landing page copy is a writing problem, not a code problem — deferring to a dedicated Opus brainstorm session was the right call; lorem ipsum commits cleanly and doesn't block the PR
  • Manual Prisma migrations (no local DB) are viable as long as they match the schema diff exactly — TypeScript catches anything that doesn't line up after prisma generate
  • OAuth without extra packages is cleaner than it looks — native fetch handles token exchange and profile fetch in ~50 lines with full control over the redirect flow
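The core of the package-free approach is that the token exchange is just one form-encoded POST. A sketch (the endpoint and grant_type are Google's OAuth spec; the function name and everything else here is illustrative, not the actual route code):

```javascript
// Build the token-exchange request for Google's OAuth code flow.
function googleTokenRequest(code, redirectUri, clientId, clientSecret) {
  return {
    url: "https://oauth2.googleapis.com/token",
    method: "POST",
    headers: { "Content-Type": "application/x-www-form-urlencoded" },
    body: new URLSearchParams({
      code,
      client_id: clientId,
      client_secret: clientSecret,
      redirect_uri: redirectUri,
      grant_type: "authorization_code",
    }).toString(),
  };
}
// Usage sketch: const res = await fetch(req.url, req); then fetch the
// profile (for the googleId -> email -> create upsert chain) from
// https://www.googleapis.com/oauth2/v2/userinfo with the access_token.
```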

TODO

  • Nathan: run pnpm db:migrate, add Google OAuth env vars, create Google OAuth app, docker compose up -d --build api
  • Landing page copy — Opus brainstorm session, reference docs/plans/2026-04-12-usability-overhaul-vision-brief.md
  • CORS: Nathan adds lore-drab.vercel.app to EXTRA_ORIGINS to enable end-to-end testing from Vercel preview
  • Open PR on Gitea — requires logging into git.harmjoy.us as Nicole first
April 9, 2026
CURRENT OS · bug · infrastructure

Fix project intake photo 413 error

Bug Fixes

  • Compressed project intake photos client-side (max 1200px, JPEG 85%) before base64 encoding — full-res phone photos exceeded Vercel's 4.5MB serverless body limit, returning HTTP 413.
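A sketch of the client-side compression, assuming the standard canvas approach (function names are illustrative; only the 1200px / 85% numbers come from the entry):

```javascript
// Fit within a max edge while preserving aspect ratio; never upscale.
function scaleToFit(w, h, maxEdge = 1200) {
  const s = Math.min(1, maxEdge / Math.max(w, h));
  return [Math.round(w * s), Math.round(h * s)];
}

// Browser-side: decode, downscale onto a canvas, re-encode as JPEG 85%.
async function compressPhoto(file, maxEdge = 1200, quality = 0.85) {
  const img = await createImageBitmap(file);
  const [w, h] = scaleToFit(img.width, img.height, maxEdge);
  const canvas = document.createElement("canvas");
  canvas.width = w;
  canvas.height = h;
  canvas.getContext("2d").drawImage(img, 0, 0, w, h);
  return canvas.toDataURL("image/jpeg", quality); // base64 data URL
}
```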

Infrastructure

  • Linked .vercel project config locally — pre-push deploy hook now works for life-automation on this machine.

TODO

  • api/project-intake.ts lines 426–444: pre-existing TypeScript errors — property access on unknown type from JSON parse. Not blocking (build deploys), but should be fixed.
April 6, 2026
SAVE STATE · infrastructure · tooling

Taproot onboarded — add-project script, entry cohesion audit

Features

  • add-project script ships — registers new projects via CLI flags, no Supabase dashboard required
  • add-project writes both Supabase row and globals.css CSS variable in one command — full project setup autonomous
  • Entry cohesion audit complete — 9 entries corrected across all projects: missing type fields, unapproved tags, wrapped bullets, Taproot description formatting
  • Step 4.5 added to /wrap — samples 2–3 existing entries of the same type before drafting to prevent style drift going forward

Bug Fixes

  • add-project.ts initially skipped globals.css — project color never written; detected when Taproot card showed no color on the project grid; fixed by porting ensureCssVariable logic from upsert-project.ts
  • Taproot description stored with literal \n\n instead of actual newlines — paragraphs not breaking; corrected via Supabase service role update

Infrastructure

  • scripts/add-project.ts created — CLI-flag interface (--id, --name, --tagline, --description, --tech, --color) for automation-friendly project creation; sort_order auto-detected from existing max
  • npm run add-project registered in package.json alongside npm run new-entry

Lessons

  • New scripts should read existing ones before duplicating logic — upsert-project.ts was already handling Supabase + CSS together; add-project.ts missed that step until the color gap surfaced
  • Entry drift accumulates silently — the audit found 6 missing type fields and 3 unapproved tags spread across 9 entries over a month of writing without a sampling step to catch it
April 5, 2026
TAPROOT · NEW PROJECT · launch · infrastructure

Taproot — debut

Features

  • Taproot is an old Windows PC converted to a Proxmox VE 9.1.1 homelab server — purpose-built for self-hosting services and learning infrastructure hands-on
  • The design principle is progressive self-sufficiency: start with monitoring and password management, build toward hosting AI services and running production workloads off the cloud
  • Foundation complete — Proxmox installed, ZFS single-disk pool on 2TB HDD, LXC containers running Ubuntu 24.04 with Docker, Uptime Kuma live on port 3001

Lessons

  • A single-disk ZFS pool has no redundancy, but that's acceptable for a learning server — the constraint forces clarity about what data actually needs protection
  • Proxmox's browser console has a display glitch that drops output; SSH into containers is the reliable path for any real terminal work
  • ISP-level DNS blocking (port 53 to 8.8.8.8) surfaces early — router-as-DNS is the practical workaround, not a configuration mistake to fix later
TAPROOT · infrastructure

Steps 5–6 — Uptime Kuma live, Vaultwarden container staged

Features

  • Uptime Kuma deployed and running — monitoring dashboard live at port 3001 on docker-host (CT 100)
  • Vaultwarden container created (CT 101, IP 192.168.1.165) — Ubuntu 24.04, Docker installed, compose deploy staged
  • Credential hygiene workflow established — Notepad scratch pad pattern now the standard for all multi-step build sessions

Infrastructure

  • docker-host (CT 100) confirmed fully operational — Docker CE, Compose plugin, hello-world verified
  • Vaultwarden container mirrors docker-host setup: same LXC config, same Docker install sequence
  • Uptime Kuma docker-compose.yml deployed to /opt/uptime-kuma with restart: unless-stopped

Bug Fixes

  • Docker apt sources malformed — command substitution in Proxmox console split across lines; fixed by hardcoding arch=amd64 and codename=noble directly in sources entry
  • Ubuntu 24.04 blocks root SSH by default — fixed with sed replace on PermitRootLogin in sshd_config
  • Gateway misconfigured to 192.168.100.1 during Proxmox install — corrected to 192.168.1.1 in network UI

Lessons

  • The Proxmox web console is unreliable for anything interactive — SSH first, console only as fallback
  • Interactive commands (passwd) produce no visible output in the console; non-interactive alternatives (echo 'root:pass' | chpasswd) are the only reliable path
  • Credential amnesia is a real session hazard — the Notepad rule exists for a reason; enforce it at session start, not after the first forgotten password

TODO

  • Complete Vaultwarden compose deploy (generate ADMIN_TOKEN, write docker-compose.yml, docker compose up -d)
  • SSH still failing on CT 101 — reset root password via chpasswd, retry
  • Transfer all session credentials from Notepad into Vaultwarden once live
  • Step 7: ClaudeVault (CT 102)
  • Step 8: Tailscale on Proxmox host
  • Step 9: git init homelab, initial commit
April 4, 2026
LORE · infrastructure · tooling

CLAUDE.md landed — /wrap fixed, lore sessions now reach the changelog

Features

  • CLAUDE.md added to lore repo root — stack, literary terminology, deployment targets, collaboration conventions, and place theme architecture all documented
  • Place theme system (Library, Meadow, Tavern) captured as confirmed art direction — color-mix() frosted glass pattern included so Nathan's session has it

Infrastructure

  • /wrap skill: lore added to project mapping table, git push added to Step 7 — entries now write, commit, and deploy in one step
  • save-state was 5 commits ahead of origin — pushed and deployed, changelog now current through April 3
  • Pulled Nathan's Phader voice style system merge — settings page, phader.ts, preferences store, 19 files, 4457 insertions

Lessons

  • A missing git push in a skill's commit step makes the pipeline invisible — entries were being written and committed locally but never reached the deployed site
  • Project directory → ID mapping tables in skills decay silently as projects are added; wire new projects in at setup time, not retroactively

TODO

  • CORS blocker: lore-drab.vercel.app origin not whitelisted in Nathan's API — login/register broken on Vercel preview until EXTRA_ORIGINS is set on VPS
  • Nathan has open branches to check: add-claude-md, docs/claude-md, feature/vps-migration, merge/phader-plus-place-themes
LORE · feature · infrastructure · launch

Place Theme System — atmosphere picker, SceneLayer bridge, full aesthetic

Features

  • Place theme system shipped end-to-end — each Book now has an atmosphere (the-library, the-tavern, the-meadow) independent of genre
  • PlaceThemePicker component renders color swatches, name, and description for each place — wired into Create Book and Book Settings modals
  • ThemeContext bridge: active server's placeTheme drives SceneLayer and CSS variables app-wide, with 160ms opacity crossfade when switching servers
  • Auth pages (login, register) moved to (auth) route group — SceneLayer background and frosted glass card on both
  • Invite page and 404 page get library scene background — every public URL now has the aesthetic
  • Root redirect page updated from plain spinner to SceneLayer-backed loading state

Infrastructure

  • placeTheme String @default("the-library") added to Server model — Prisma migration applied to production DB
  • API validates placeTheme on POST and PATCH /servers against enum of three valid place IDs
  • PlaceThemeId type and PLACE_THEMES array added to @lore/types
  • Merge conflict resolved with Nathan's commits — adopted his animationKey improvement (coarse key groups channels by server, not per-channel) and getStatus voice fix in ReaderList
  • PR opened and merged on Gitea — branch protection on master confirmed active

Lessons

  • Scoping CSS variables as inline style on a container element overrides root vars set by ThemeProvider — clean way to isolate palette without fighting the context
  • Tailwind opacity modifiers silently fail on CSS variable colors — color-mix(in srgb, var(--color) 80%, transparent) is the fix
  • lore.harmjoy.us is the right QA target — Nathan's production is VPS-deployed, not Vercel; lore-drab.vercel.app is Nicole's preview and isn't in EXTRA_ORIGINS
  • Nathan needs to pull and rebuild Docker on VM108 for changes to appear on his production URL — auto-deploy not set up yet

TODO

  • Ask Nathan to set up Gitea Actions runner for auto-deploy on merge to master
  • lore-drab.vercel.app CORS: Nathan adds it to EXTRA_ORIGINS if Vercel URL needs to work against live API
April 3, 2026
LORE · feature · infrastructure · phase

Art Direction to Production — SceneLayer wired, POC merged to master

Features

  • POC branch merged to master — place-based themes (The Library, The Tavern, The Meadow) now in the main codebase
  • SceneLayer threaded behind the authenticated app shell — photorealistic backgrounds render beneath all UI
  • Frosted glass pattern applied to all three chat UI panels: BookShelf, ChapterSidebar, ReaderList, channel header — color-mix() inline styles replace opaque surfaces
  • Scene is hardcoded to the-library pending theme system bridge (Step 3)

Bug Fixes

  • @types/react duplicate resolved — pnpm installed React 18 types (for Expo) and React 19 types (for web) as separate physical copies; TypeScript surfaced them as incompatible ReactNode types even in .tsx source files; fixed with pnpm.overrides forcing ^19.0.0 across the monorepo
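The override lives in the root package.json. A minimal fragment (only the @types/react line is confirmed by this entry):

```json
{
  "pnpm": {
    "overrides": {
      "@types/react": "^19.0.0"
    }
  }
}
```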

Infrastructure

  • vercel.json added at repo root — framework, installCommand, buildCommand only; rootDirectory lives as a permanent Vercel project setting (not in config file)
  • .vercel/ added to .gitignore

Lessons

  • Tailwind v3 opacity modifiers (bg-lore-surface/80) don't work with CSS custom property colors — color-mix(in srgb, var(--color) 80%, transparent) in inline styles is the correct pattern
  • skipLibCheck: true doesn't protect against duplicate @types/react in source .tsx files — the TypeScript checker still sees both ReactNode shapes when resolving JSX; the fix is at the package resolution layer, not the compiler
  • Book spines preserved by design — each Book is a portal to a place; pulling a spine takes you into that world; the gathering vocabulary (Grounds, Passages, Firesides) applies inside, not at the server list level
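The color-mix() pattern from the first lesson, as a CSS fragment (the class name and blur value are illustrative; `--lore-surface` is inferred from the `bg-lore-surface/80` utility mentioned above):

```css
/* Tailwind's /80 opacity modifier needs a resolvable color; with a CSS
   variable, do the mixing explicitly instead: */
.frosted-panel {
  background: color-mix(in srgb, var(--lore-surface) 80%, transparent);
  backdrop-filter: blur(12px); /* frosted-glass companion, illustrative */
}
```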

TODO

  • Step 3: theme system bridge — placeTheme field in Prisma schema, ThemeContext reading PlaceThemeRegistry, server creation/settings UI
  • Step 4: presence wiring — useServerPresenceCount(serverId) feeding real online count into SceneLayer
  • Step 5: terminology rename — PassagesSidebar, GatheringArea, CircleList, flame icon channel prefix
  • Step 6: root CLAUDE.md from merged draft
  • Ambient sound: deferred brainstorm item
CURRENT OS · feature · refactor · phase

Gravity Layout — self-sizing cards replace CSS grid

Features

  • Gravity layout system complete — cards size themselves by focus state, dashboard adapts to any height without manual resizing
  • GravityContext: three card states (focused / resting / collapsed), max 2 focused per column, oldest-first demotion, 60s auto-focus cooldown, localStorage persistence
  • GravityCard render-prop pattern: widgets receive variant: 'focused' | 'resting' and render purpose-built compact views at rest — not shrunken full views
  • GravityColumn: narrow prop collapses dual-column layout to a single scrollable column below 900px
  • NowPlaying pinned at top of right column — always visible, no title bar, no collapse
  • Mode system simplified: "online" | "offline" + workVisible boolean replaces the three-mode system
  • Work toggle: header pill swaps accent to teal without changing the environment — data-mode="online-work" inherits pine-forest base, overrides accent only
  • GravityWatcher: auto-focus signals wire morning→brief, music starts→lyrics, item captured→tasks

Bug Fixes

  • LyricsGravityCard and QueueGravityCard never called registerCard — focusCard("lyrics") was a silent no-op, and persistence and focus limits were broken for both

Infrastructure

  • DashboardShell reduced from ~1094 lines to ~400 — all grid infrastructure (MODE_LAYOUTS, getRows, GridResizeOverlay, expandedCards, rowOverrides) removed
  • /gravity-test route added for layout testing without auth

Lessons

  • Wrappers around non-GravityCard components need explicit registerCard in a useEffect — skipping it is a silent failure with no error, just no behavior
  • Keeping the environment (pine-forest base) while swapping only the accent makes the work toggle feel like a lens over personal space, not a separate room
  • GravityProvider must sit outside key={mode} — state reset on mode switch is the wrong default for a layout system
April 1, 2026
LORElaunchrefactorinfrastructure

Art direction POC deployed — /vision route live

Features

  • Art direction POC complete — all four plan steps shipped on feature/art-direction-poc
  • Three themed environments live: The Library, The Tavern, The Meadow — photo backgrounds with presence-responsive lighting
  • Four surface views: Landing, About, Server (mock chat UI layered over scene), Community (ambient presence dots)
  • Demo controls: theme crossfade, presence slider (0–50), surface tabs, auto-demo mode, sound toggle

Bug Fixes

  • Tavern scene: removed all positioned circular radial-gradient elements — last remaining source of floating orbs
  • Dev debug readout removed from VignetteOverlay before sharing

Infrastructure

  • Vercel CLI installed and project linked (nrisacher-langs-projects/lore) — repo on self-hosted Gitea requires manual CLI deploys, no GitHub integration
  • rootDirectory: apps/web set via Vercel API — not settable in vercel.json (causes deploy failure if attempted there)
  • Next.js updated 15.1.0 → 15.5.14 — Vercel blocks deployment of vulnerable versions (CVE-2025-29927, middleware auth bypass)
  • Preview live at https://lore-drab.vercel.app/vision

Lessons

  • Positioned circular radial-gradient divs always read as orbs against photorealistic backgrounds — blend modes don't fix it. Full-width linear gradients and full-scene mixBlendMode: overlay tints are the correct approach for shapeless presence indication.
  • rootDirectory is a Vercel project setting, not a vercel.json key — Vercel CLI v50 also removed --root-directory. Set it once via API: PATCH /v9/projects/{id} with {"rootDirectory": "apps/web"}.
  • Deploying from the monorepo root (not apps/web) is required — Vercel must see the pnpm lockfile at root to resolve workspace dependencies.
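
The one-time API call from the lesson above, sketched as a request builder (the endpoint and body come from the entry; the token handling is an assumption):

```typescript
interface PatchRequest {
  url: string;
  method: "PATCH";
  headers: Record<string, string>;
  body: string;
}

// Builds the PATCH /v9/projects/{id} request that sets rootDirectory,
// since neither vercel.json nor CLI v50 can set it.
function buildRootDirectoryPatch(projectId: string, token: string): PatchRequest {
  return {
    url: `https://api.vercel.com/v9/projects/${projectId}`,
    method: "PATCH",
    headers: {
      Authorization: `Bearer ${token}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ rootDirectory: "apps/web" }),
  };
}

// Usage (not executed here):
// const req = buildRootDirectoryPatch(id, process.env.VERCEL_TOKEN!);
// await fetch(req.url, { method: req.method, headers: req.headers, body: req.body });
```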

TODO

  • Source Lottie animation files (fire, candles, petals) — LottieLayer and Howler are both wired, waiting on assets
  • Open PR on Gitea for Nathan's review — draft description already written
March 29, 2026
LOREbrainstormnamingbrand

Gathering, Not Reading — art direction vision and vocabulary

Features

  • Art direction pivot from library/book aesthetic to campfire/gathering — Lore's identity rebuilt around the third-space concept
  • Full vocabulary rework — fire & oral tradition replaces literary metaphor: Grounds (server), Passage (text channel), Fireside (voice), Whisper (DM), Keeper/Elder/Guide (roles)
  • Community lifecycle as fire spectrum — Spark → Ember → Campfire → Bonfire → Beacon, with decline states (Banked, Coals, Cold) honoring soft-delete philosophy
  • Hearth as personal home designation — any community regardless of size can be your Hearth
  • Themes are places, not genres — each theme answers "where are we gathered tonight?" and is specific enough to drive every art decision
  • Three POC themes defined: The Library (Harry Potter common room), The Tavern (medieval warmth), The Meadow (Ghibli pastoral)
  • Presence-responsive environmental scaling — spaces get bigger and richer as people gather, driven by warm vignette that expands from cozy to expansive (never cold to warm)
  • Ambient sound in scope — layered audio per theme scales with presence alongside visuals
  • POC plan confirmed — 4 steps, /vision route on a branch, Lottie + Howler.js, scene components built production-ready from day one
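
A hypothetical sketch of the cozy-to-expansive presence curve: an empty room starts at an intimate vignette and only opens up as people arrive. The constants, easing, and function name are illustrative, not from the POC.

```typescript
function vignetteScale(presence: number, maxPresence = 50): number {
  const cozyFloor = 0.4;    // intimate at zero presence, never "cold"
  const expansiveCeil = 1.0;
  const t = Math.min(Math.max(presence, 0), maxPresence) / maxPresence;
  // Ease-out so the first few arrivals feel the space opening quickly.
  const eased = 1 - (1 - t) * (1 - t);
  return cozyFloor + (expansiveCeil - cozyFloor) * eased;
}
```

The key property is the floor: the curve maps low presence to "small and warm" rather than "dim and empty", matching the emotional arc in the lessons below.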

Lessons

  • "Lore" was never about books — lore is oral tradition, stories passed around fires, knowledge that lives in communities because people showed up
  • The name unlocked the vocabulary: once "gathering" replaced "reading" as the core verb, every naming decision cascaded naturally
  • "Fantasy" is too broad to drive art decisions — "The Library" tells you exactly what to draw
  • The emotional arc of presence scaling is cozy → expansive, not cold → warm — an empty space should feel intimate, not lonely
  • Themes as places rather than genres solved the extensibility problem — any place people gather is a valid theme
  • Prior session's brainstorm work was lost to an unsaved close — planning artifacts must be saved to files mid-session, not accumulated for wrap
CURRENT OSbugfeature

Work block reservation — weekday personal time now realistic

Features

  • /start skill now surfaces the full unworked feature backlog at session open — deferred candidates, wishlist items, and open design questions pulled from memory and design-notes
  • Backlog section grouped into "Ready to plan" and "Wishlist" with a pointer to /scope for details

Bug Fixes

  • Online mode on weekdays no longer reports 8am–5pm as personal free time — reserveWorkBlock parameter added to computeDayContext in the context engine
  • DailyBriefWidget, RadarWidget, and the AI context string all respect the work block — eligible items, day weight, and suggestions now reflect morning/evening availability only
  • RadarWidget now mode-aware via useMode() — eligibility recomputes on mode switch
  • ISSUE-008 closed — all three sub-items resolved across two sessions

Lessons

  • The context engine grid (8am–6pm) only sees 1 hour of personal free time after a 9-hour work block — the heuristic naturally suppresses discretionary items on weekdays, which is the right behavior
  • A default parameter (reserveWorkBlock = false) kept the change backward-compatible without touching every existing call site
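
The default-parameter pattern from the lesson above, in miniature. The real computeDayContext is far richer; this only shows how reserveWorkBlock = false leaves existing call sites untouched while weekdays lose the 8am-5pm block.

```typescript
interface DayContext {
  freeHours: number;
}

function computeDayContext(
  isWeekday: boolean,
  reserveWorkBlock = false // default preserves old behavior at every call site
): DayContext {
  const gridStart = 8;   // context engine grid: 8am-6pm
  const gridEnd = 18;
  let freeHours = gridEnd - gridStart; // 10 hours
  if (reserveWorkBlock && isWeekday) {
    freeHours -= 17 - 8; // subtract the 8am-5pm work block (9 hours)
  }
  return { freeHours };
}
```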
March 28, 2026
LORENEW PROJECTlaunchprojects

Lore — debut

Features

  • Lore is a real-time chat platform built on Discord-style infrastructure wrapped in a book metaphor — servers are Books, channels are Chapters, voice rooms are Hearths, DMs are Pages
  • Genre theming is the core differentiator — each Book carries a genre (fantasy, horror, sci-fi, romance, cyberpunk) that applies a full CSS palette and ambient effects throughout the UI
  • Core infrastructure complete — messaging, voice via LiveKit, DMs, friends, roles, bots, and the full literary UX rename in place across all components

Lessons

  • First collaborative project — built with her brother, who hosts the repo and live API on his homelab; the constraint of a protected master branch and PR-required workflow is the right default for shared work
  • The literary metaphor earns its keep by being total — partial renaming would read as affectation, but renaming every layer (server → Book, channel → Chapter, moderator → Narrator) makes the world feel consistent
CURRENT OSbuginfrastructure

Mode separation, RLS cleanup, issue sweep

Bug Fixes

  • ISSUE-006 confirmed resolved — committed items not surfacing in TaskListWidget was caused by RLS infinite recursion on the projects SELECT policy, fixed in a prior session
  • ISSUE-007 confirmed resolved — work tasks disappearing after mode switch was the same root cause: fetchTasks joins items → projects(title), triggering the same recursion; optimistic insert appeared to succeed but every re-query returned null, making persistence look broken
  • ISSUE-008 (sub-problems 1 + 3) — DailyBriefWidget lacked mode awareness entirely; weekday tasks query had no mode filter, leaking work tasks into the Online brief
  • Brief cache key was current-os-facts-{date} — shared across modes; switching Online → Work reused cached Online facts; key is now current-os-facts-{mode}-{date}
  • "Work availability" label in the AI context string renamed to "Available time" — was mode-biased in Online brief context

Infrastructure

  • DailyBriefWidget now uses useMode() — tasks query always filters by mode, AI context hint is mode + weekend aware across four combinations
  • ISSUE-009 resolved — four "Project member reads X" SELECT policies in live DB updated via Supabase SQL Editor to use is_project_collaborator() instead of inline subquery; migration file synced to match

Lessons

  • A single circular RLS policy (projects → project_collaborators → projects) silently killed every query that joined to projects — the join itself was the trigger, not the query result; two distinct-looking bugs shared one root cause
  • Mode awareness needs to be explicit at every data boundary — a missing useMode() import left the brief entirely uninformed about which context it was writing for
  • DDL statements in Supabase (DROP POLICY / CREATE POLICY) return "no rows" on success — expected behavior, not an error

TODO

  • ISSUE-008 sub-problem 2 remains: planning overlay work block reservation (8am–5pm M–F in Online mode) — requires reserveWorkBlock flag in computeDayContext so freeHours reflects evening/morning personal time on weekdays
CURRENT OSbuginfrastructure

RLS recursion root cause found and fixed — commit-to-today unblocked

Bug Fixes

  • Half-screen scroll restored — overflow-hidden on <main> was clipping the stage at ~540px; replaced with overflow-y-auto and added minHeight: 640px to the grid wrapper so fr rows have a usable definite height
  • Stale closure in TaskListWidget items-changed handler fixed — handler was capturing a stale loadTasks reference; useRef + useCallback pattern ensures the latest version is always called
  • RLS infinite recursion (42P17) resolved — projects SELECT policy queried project_collaborators, whose policy queried projects, creating a deadlock loop on any query joining projects (e.g., items → projects(title))
  • Fix: is_project_collaborator() SECURITY DEFINER function queries project_collaborators directly, bypassing its RLS and breaking the cycle; projects SELECT policy updated to use it
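
A framework-free sketch of the stale-closure fix above: the handler reads the latest callback through a mutable ref instead of capturing one version. In the widget this is the useRef + useCallback pairing; here it is plain TypeScript with hypothetical names.

```typescript
type Loader = () => string;

// The handler closes over the ref *object*, not over any one loader,
// so swapping ref.current is immediately visible to handlers registered earlier.
function makeHandler(ref: { current: Loader }): () => string {
  return () => ref.current();
}
```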

Infrastructure

  • Diagnostic logging added to commitToToday and TaskListWidget committed items SELECT to surface silent failures during investigation
  • supabase/schema.sql synced to match live RLS fix — is_project_collaborator() function and updated project table policies now reflected in source of truth
  • Committed items moved above regular tasks in render order — visible without scrolling

Lessons

  • RLS policies that reference each other's tables form recursion cycles invisible until a cross-table query hits them — SECURITY DEFINER functions break the loop by querying at the function owner's privilege level, not the caller's
  • Silent count=0 failures are harder to diagnose than errors — adding a read-back SELECT after an update immediately surfaces whether the write landed

TODO

  • Confirm ISSUE-006 fully resolved — RLS fix is applied, but committed items behavior not yet user-verified in live session
  • ISSUE-007: work tasks disappear after mode round-trip
  • ISSUE-008: work/personal data leaking between modes in daily brief + planning overlay
March 25, 2026
SAVE STATEfeaturetoolinginfrastructure

Data lifecycle complete — /ship, visual treatment, guardrails

Features

  • /ship skill ships — marks features as shipped in Supabase and generates voice-matched type: ship changelog entries
  • Mode A ships existing features: --check confirms current state before any write runs; Mode B debuts new projects with full Supabase upsert and debut entry
  • Ship/debut visual treatment: full colored border, ambient glow via color-mix(), SHIPPED/NEW PROJECT badges, JetBrains Mono 700 title — visually distinct from session cards
  • 6 retroactive ship entries created for all historically shipped features across bark, current-os, save-state, and claude-code
  • Guardrails complete: /save prompts for /wrap when commits exist; /start flags unlogged sessions; /health Section 5.5 checks data freshness and stale in-progress features

Infrastructure

  • scripts/update-feature.ts — update or insert Supabase features with --check dry-run mode; case-insensitive name matching within a project
  • scripts/upsert-project.ts — upsert projects to Supabase and auto-add --project-<id>: <color>; to globals.css
  • Card.tsx extended with variant prop — --card-project-color set inline, letting .card-ship and .card-debut classes reference per-project color without per-project rules
  • seed-projects.ts deleted — Supabase data now managed via /ship and write scripts
  • Supabase corrections applied: analytics dashboard marked shipped, all Bark shipped_date values corrected to 2026-03-21
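
A sketch of two behaviors called out above: case-insensitive name matching within a project, and a --check mode that reports state without writing. The data shapes are illustrative, not the script's real types or Supabase calls.

```typescript
interface Feature {
  project: string;
  name: string;
  shipped: boolean;
}

function findFeature(features: Feature[], project: string, name: string): Feature | undefined {
  return features.find(
    (f) => f.project === project && f.name.toLowerCase() === name.toLowerCase()
  );
}

function updateFeature(
  features: Feature[],
  project: string,
  name: string,
  check: boolean
): { found: boolean; wrote: boolean } {
  const match = findFeature(features, project, name);
  if (!match || check) return { found: !!match, wrote: false }; // dry run: never mutate
  match.shipped = true;
  return { found: true, wrote: true };
}
```

Running with check first and executing second is the confirm-then-write flow the guardrails lesson below describes.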

Lessons

  • Separating --check from the actual update prevents accidental Supabase writes — confirm state first, execute second
  • CSS custom properties set inline (--card-project-color) are the right pattern for per-instance theming — one class handles all project colors without generating per-project rules
  • Session entries and ship entries serve different purposes — retroactive ship entries don't replace session logs, they add a celebration layer that was always missing
  • Terminal line-break issues (bash treating wrapped args as separate commands) are a recurring copy-paste hazard — the fix is always one unbroken line before Enter
SAVE STATEinfrastructuretoolingfeature

Data lifecycle — voice profile, entry types, dynamic config, /wrap

Features

  • Data lifecycle plan finalized — 6-step system for keeping Save State current without manual upkeep
  • Voice profile created — reference document at references/voice-profile.md encoding writing patterns and few-shot examples for AI-generated entries
  • Entry type system added — frontmatter now supports type: session | ship | debut with backward-compatible defaulting for existing entries
  • /wrap skill built — session-closing ceremony that saves memory, drafts a voice-matched entry, and commits on approval

Infrastructure

  • config.ts deleted — PROJECT_NAMES and PROJECT_COLORS replaced with live Supabase lookups across EntryCard, EntryList, RecentActivity, and entries.ts
  • scripts/new-entry.ts updated — project IDs fetched dynamically from Supabase at runtime, no hardcoded list
  • Entry type defaulting added to getAllEntries() — missing type field resolves to session at read time
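
The backward-compatible defaulting described above, sketched with illustrative types: entries whose frontmatter predates the type field resolve to session at read time.

```typescript
type EntryType = "session" | "ship" | "debut";

interface Frontmatter {
  title: string;
  type?: string; // older entries omit this field entirely
}

function resolveEntryType(fm: Frontmatter): EntryType {
  if (fm.type === "ship" || fm.type === "debut") return fm.type;
  return "session"; // missing or unrecognized values default safely
}
```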

Lessons

  • Save State had no data lifecycle — entries, feature statuses, and project metadata were write-once with no mechanism to stay current
  • The staleness problem is structural, not behavioral — fixing it requires automation with a human approval gate, not just better habits
  • Voice matching requires two layers: explicit rules and few-shot examples — abstract rules alone don't produce consistent output
  • Hardcoded maps create invisible maintenance debt — the cost isn't visible until a new project silently breaks the UI

TODO

  • Continue data lifecycle plan: Steps 4–6 remain (/ship skill, visual treatment for ship/debut entries, guardrails + cleanup)
SAVE STATESHIPPEDfeaturetooling

/ship skill — shipped

Features

  • /ship skill ships — marks features as shipped in Supabase and generates voice-matched changelog entries
  • Mode A ships existing features: checks current state, confirms, updates Supabase, drafts a type: ship entry
  • Mode B debuts new projects: gathers details, upserts to Supabase, adds CSS variable, drafts a type: debut entry
  • update-feature.ts and upsert-project.ts scripts handle all Supabase writes via service role key
  • --check flag on update-feature.ts queries without mutating — the confirmation layer before any update runs

Infrastructure

  • CSS custom property --card-project-color set inline by Card.tsx — lets .card-ship and .card-debut CSS classes reference per-project color without per-project rules
  • color-mix(in srgb, ...) used for tinted backgrounds and glow shadows — no hardcoded opacity values needed
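
The per-instance theming pattern above, in miniature: the component sets one custom property inline, and shared .card-ship / .card-debut rules read it. The helper name and the CSS in comments are illustrative, not copied from Card.tsx.

```typescript
// Returns the inline style object a card would spread onto its root element,
// e.g. style={{ "--card-project-color": projectColor }} in JSX.
function cardStyle(projectColor: string): Record<string, string> {
  return { "--card-project-color": projectColor };
}

// Companion CSS (one rule covers every project color):
//   .card-ship {
//     border-color: var(--card-project-color);
//     box-shadow: 0 0 24px
//       color-mix(in srgb, var(--card-project-color) 35%, transparent);
//   }
```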

Lessons

  • Separating check from execute prevents accidental writes — the skill runs --check first, then asks, then runs the real command
  • Ship/debut visual treatment needs real entries to validate against — build the CSS first, create a test entry second
March 23, 2026
SAVE STATEbrainstormnamingbrand

Save State — brainstorm and naming

Features

  • Understory Labs brand identity locked — warm organic names with layered meaning
  • Save State vision brief completed — purpose, audience, aesthetic direction confirmed
  • Plan finalized — 6 steps, two checkpoints, weekend-scope build

Lessons

  • Naming is a design decision, not an afterthought — the name shapes how you build
  • "Save State" works on three layers: game checkpoint, session preservation, project record
  • The Understory metaphor earns its keep — light filtering through canopy, warmth in the dark
  • Building a changelog for your own projects creates a forcing function for reflective practice
SAVE STATESHIPPEDlaunchfeaturetooling

Save State — initial launch shipped

Features

  • Save State ships — Understory Labs homepage live at save-state-two.vercel.app
  • Content engine operational: markdown files in content/entries/ parse to HTML via gray-matter + unified/remark/rehype at request time
  • Changelog filterable by project with instant client-side response — no API calls, zero filter latency
  • /log skill installed globally — writes changelog entries to this repo from any project directory, making Save State infrastructure, not just a site

Lessons

  • Static export plus client-side filtering is the right architecture for a changelog — simple, fast, zero infrastructure cost
  • Building the design system first (not as a polish pass) means every component looks right from first render
  • The /log skill as a cross-project concern is the key insight — a changelog that requires opening the right project first won't get used
SAVE STATElaunchinfrastructuretooling

Save State — built and deployed

Features

  • Save State shipped — deployed static changelog at save-state-two.vercel.app
  • Project filter works client-side with instant response — no API calls, zero latency
  • Three-tier typography system implemented — JetBrains Mono titles, Share Tech Mono labels, DM Sans body
  • Entry fade-in animation with 80ms stagger on load and on filter change

Infrastructure

  • Next.js 16 static export — builds to flat HTML/CSS/JS, no server required
  • Content engine reads markdown at build time via gray-matter + unified/remark/rehype pipeline
  • Vercel deployment connected to GitHub — pushes to master trigger automatic redeploys
  • /log skill installed globally — writes entries to this repo from any project directory
  • npm run new-entry CLI scaffolder for manual entries without Claude present

Lessons

  • Static export + client-side filtering is the right architecture for a changelog — simple, fast, zero infrastructure cost
  • Baking the design system in Step 1 (not as a polish pass at the end) meant every component looked right from first render
  • The /log skill as a cross-project concern (writing to save-state from any directory) is the right model — the changelog is infrastructure, not just another feature
  • Naming the naming convention step in /new-project means future projects start with the right frame
March 22, 2026
CURRENT OSfeatureaiprojects

Shed — Home Projects feature shipped

Features

  • Shed (Home Projects) feature complete — AI-powered home project tracker inside Current OS
  • Project intake form with scope, phase, materials, and priority fields
  • Projects overlay renders active projects with status and last-updated indicators
  • AI scheduling integration — projects surface in prioritization context

Bug Fixes

  • Project intake overlay z-index conflict with calendar resolved
  • Phase label display corrected for multi-phase projects

Lessons

  • Keeping features namespaced (Shed, not just "projects") helps maintain mental boundaries in a large app
  • AI-aware data models pay off immediately — fields added for AI context during intake proved useful in first scheduling run
CURRENT OSSHIPPEDfeatureaiprojects

Shed — AI home project tracker shipped

Features

  • Shed ships — AI-powered home project tracker operational inside Current OS
  • Natural-language intake: describe a project in plain terms, Claude structures it into category, phase, materials, and priority
  • Projects overlay surfaces active work with status and last-updated indicators — no digging through lists
  • AI scheduling integration active from day one — projects surface in the prioritization engine context at intake

Lessons

  • Naming matters: "Shed" (not just "projects") creates a mental namespace that keeps feature scope clear inside a large app
  • Building for AI context at intake pays off immediately — fields added for Claude's use proved useful in the first scheduling run before the feature shipped
  • The AI-aware data model pattern: design the schema around what AI needs to be useful, not only what humans need to enter
March 21, 2026
BARKSHIPPEDfeatureaiaudio

Bark — AI feature suite shipped

Features

  • Bark ships AI feature suite — Smart Brain, Better Ears, Cooldown, and Windows service all live
  • Smart Brain routes every trigger message through Claude before acting — intent-aware instead of pattern-aware, false positives drop to near zero on busy servers
  • Better Ears adds per-trigger detection thresholds — sensitivity tunable without touching code
  • Cooldown logic prevents rapid-fire repeat plays — same clip won't fire twice in quick succession
  • NSSM service wraps the bot as a Windows background process — survives reboots, no terminal window required
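
A hypothetical sketch of the cooldown: the same clip is refused while its window is still open, while other clips are unaffected. The window length and names are illustrative.

```typescript
const COOLDOWN_MS = 30_000;

function makeCooldown(windowMs = COOLDOWN_MS) {
  const lastPlayed = new Map<string, number>();
  return function shouldPlay(clip: string, now: number): boolean {
    const last = lastPlayed.get(clip);
    if (last !== undefined && now - last < windowMs) return false; // still cooling down
    lastPlayed.set(clip, now);
    return true;
  };
}
```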

Lessons

  • Async fire-and-forget for Claude calls is the right pattern in real-time audio contexts — blocking on AI response latency kills the moment
  • Graceful fallback (play the sound anyway on Claude timeout) means the AI layer enhances reliability rather than introducing fragility
  • The gap between keyword matching and contextual understanding is where the product lives — Smart Brain makes Bark feel designed rather than scripted
BARKfeatureaiaudio

Better Ears + Smart Brain — Claude-powered audio intelligence

Features

  • Claude integration added to Bark — bot now understands context around sound triggers
  • Smart Brain mode: Claude analyzes recent activity before deciding whether to play a sound
  • Better Ears: audio detection sensitivity tuned with configurable threshold
  • Cooldown logic added to prevent rapid-fire repeat plays after a trigger

Bug Fixes

  • Fixed race condition where two overlapping audio events would both trigger playback
  • Resolved NSSM service restart loop caused by unhandled promise rejection on Claude API timeout

Infrastructure

  • Claude API key stored in .env and loaded via dotenv — not hardcoded
  • NSSM service updated to use new entry point after refactor

Lessons

  • Wrapping AI calls in try/catch with graceful fallback (play sound anyway) prevents service crashes
  • Claude's response latency is noticeable in real-time audio contexts — async fire-and-forget works better than blocking
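
The graceful-fallback shape from the lesson above, sketched with stand-in names (askClaude and the timeout length are assumptions, not the bot's real API surface): race the AI call against a timer, and play the sound anyway if Claude is slow or throws.

```typescript
async function decideWithFallback(
  askClaude: () => Promise<boolean>,
  timeoutMs: number
): Promise<boolean> {
  const timeout = new Promise<boolean>((resolve) =>
    setTimeout(() => resolve(true), timeoutMs) // fallback: play anyway on timeout
  );
  try {
    return await Promise.race([askClaude(), timeout]);
  } catch {
    return true; // API error: the AI layer must never block the sound
  }
}
```

The unhandled-rejection crash in the bug fix above is exactly what the catch branch prevents: the AI path can fail, but the playback path cannot.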
March 20, 2026
CURRENT OSSHIPPEDfeatureinfrastructure

Calendar sync — Google Calendar integration shipped

Features

  • Calendar sync ships — two-way Google Calendar integration live in Current OS
  • Events read from Google Calendar at sync time and written into the Supabase events table
  • Lifecycle pipeline picks up new events on sync — normalization and correction run automatically on ingest
  • Changes made in Current OS propagate back to Google Calendar — source of truth stays synchronized

Lessons

  • Two-way sync requires deciding which side wins on conflict — making Current OS the write layer (with Google as display) simplified the mental model considerably
  • Sync as an event trigger (not a cron job) means the pipeline runs when data arrives, not on a schedule indifferent to activity
March 8, 2026
CURRENT OSSHIPPEDfeatureaiinfrastructure

Intelligence Foundation — AI scheduling layer shipped

Features

  • Intelligence Foundation ships — AI scheduling layer operational, event lifecycle pipeline live
  • Event correction pipeline: Claude reviews raw calendar events and normalizes titles, durations, and categories before they reach the scheduling engine
  • Lifecycle tracking: events carry state (raw, normalized, scheduled, completed) — full audit trail from ingest to action
  • Normalization runs on ingest — events are clean before they hit the scheduler, not after

Lessons

  • AI as suggestion layer (not replacement) keeps corrections reversible — original data is preserved, AI output sits alongside it
  • lifecycle_state as an enum column gives clearer audit trails than boolean flags — state machines beat is_processed columns
  • Defining "done" for an event is the hardest design decision in an AI pipeline — encoding it as a state machine forced that decision early
CURRENT OSinfrastructureaiphase

Intelligence Foundation — corrections, lifecycle, normalization

Features

  • Intelligence Foundation phase complete — AI scheduling layer operational
  • Event correction pipeline: Claude reviews raw calendar events and normalizes titles, durations, and categories
  • Lifecycle tracking: events now carry state (raw, normalized, scheduled, completed)
  • Normalization runs on ingest — events are clean before they hit the scheduling engine

Infrastructure

  • Supabase schema extended with lifecycle_state and normalized_at columns
  • Correction pipeline runs as a server action triggered by calendar sync
  • Idempotent design — re-running normalization on already-normalized events is safe

Lessons

  • Treating AI output as a suggestion layer (not a replacement for original data) made corrections reversible
  • lifecycle_state as an enum column gave clearer audit trails than boolean flags
  • The hardest part of AI pipeline design is deciding what "done" looks like for a given event — encoding that as a state machine helped
March 1, 2026
CLAUDE CODE SETUPinfrastructuretooling

Power user setup — Steps 1 through 8

Features

  • Full Claude Code power user environment built across 8 steps
  • 20+ skills installed globally — /start, /plan, /brainstorm, /commit, /save, /tdd, and more
  • MCP servers configured: GitHub MCP for repo access, Context7 for live documentation
  • Hooks configured for pre/post tool events
  • CLAUDE.md updated with full work context, stack, and learning preferences

Infrastructure

  • Skills stored in ~/.claude/commands/ — available in all projects
  • toolkit.md created as single-source inventory of all capabilities
  • power-user-setup.md documents the step-by-step build history
  • opusplan model profile set as default — Opus during planning, Sonnet during execution

Lessons

  • The skill system makes Claude Code dramatically more consistent across sessions
  • CLAUDE.md is more valuable the more specific it is — vague instructions get ignored
  • Model selection matters: Opus for architecture decisions, Sonnet for building, Haiku for quick lookups
  • /save after every session is the habit that makes everything else work
CLAUDE CODE SETUPSHIPPEDinfrastructuretooling

Claude Code Setup — power user environment shipped

Features

  • Claude Code power user environment ships — 20+ global skills, MCP servers, hooks, and model profile operational
  • 20+ global skills installed: /plan, /brainstorm, /commit, /save, /tdd, /debug, /review, /start, /log, and more — common workflow patterns encoded as single commands
  • GitHub MCP + Context7 MCP configured — repo access and live documentation lookup available in any conversation
  • Hooks registered for git and file events — auto-format, lint, env protection, and toast notifications running silently in the background
  • opusplan model profile set as default — Opus during planning sessions, Sonnet during execution, Haiku for quick lookups

Lessons

  • The skill system makes Claude Code consistent across sessions — without it, every session starts from scratch
  • CLAUDE.md specificity is the multiplier: vague global instructions produce vague behavior; specific context produces specific output
  • Model selection is a workflow decision, not just a cost tradeoff — Opus for architecture, Sonnet for building, Haiku for trivia
  • /save as a closing habit is what makes everything else accumulate value over time