Jacob

Builder. Tinkerer. Fossil Hunter.

I've been building computers since I was 12. Printers, servers, AI systems — if it can be built, I want to understand it.
> Building since 12
> Proxmox homelab
> ROCm GPU inference
> Local AI first
$ whoami
jacob — builder, tinkerer, fossil hunter

$ cat ~/manifesto.txt
I build things because I want to understand how they work.
Started with PCs at 12. Then printers. Then servers.
Now I run my own AI agent on my own hardware.

$ echo $STACK
Proxmox / ROCm / AMD R9700 32GB / Local LLM inference
OpenClaw agent platform / ACT-R cognitive memory
Tailscale mesh / self-hosted everything

$ echo $PHILOSOPHY
"If you're not willing to run it locally, you don't own it."

Aegis Falls

Aegis Falls is my personal AI architecture. The name captures the flow: frontier intelligence falling from the top of the stack down into local execution. Claude runs on OpenClaw at the apex, handling reasoning and planning. Tasks cascade from there into smaller local agents running on-device via Ollama. Lily is the AI persona — the persistent identity that threads the whole system together.

Claude handles the hard thinking. Local agents handle the doing. Lily is who you're actually talking to. Everything runs on hardware I own: a dedicated GPU node (aegis-node, AMD Radeon AI PRO R9700, 32GB VRAM, ROCm) plus a secondary embedder/inference node for background tasks. The frontier is there when you need it. Local is always on.
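The cascade can be sketched as a simple router. This is a hypothetical illustration, not OpenClaw's actual API: the `Task` shape, the `needs_planning` flag, and the tier names are my own. The idea is just that planning-heavy work goes up to the frontier model when it's reachable, and everything else stays on the always-on local tier.

```python
from dataclasses import dataclass

@dataclass
class Task:
    prompt: str
    needs_planning: bool  # multi-step reasoning vs. simple execution


def route(task: Task, frontier_available: bool = True) -> str:
    """Pick the cascade tier for a task.

    Planning and hard reasoning go to the frontier model (e.g. Claude via
    API) when it's reachable; execution-style work, or anything when the
    frontier is down, stays on the local tier (e.g. a small model served
    by Ollama on-device).
    """
    if task.needs_planning and frontier_available:
        return "frontier"
    return "local"
```

The fallback branch is what "local is always on" means in practice: losing the frontier degrades the system to local-only instead of taking it down.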

What makes it interesting is the ACT-R cognitive memory layer — a persistent memory system inspired by human cognition. Information isn't just stored; it has activation levels based on recency and frequency of use. Retrieval works through spreading activation, meaning related memories strengthen each other. It's not a simple database lookup — it's a cognitive architecture that learns which information matters.
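A minimal sketch of that retrieval model, using the classic ACT-R equations rather than the actual Aegis Falls code (class and method names here are illustrative): base-level activation grows with frequency and decays with recency, B_i = ln(Σ_j (t_now − t_j)^−d), and spreading activation from the current context is added on top, A_i = B_i + Σ_j W_j·S_ji.

```python
import math


class Chunk:
    """One memory item: tracks when it was used and what it relates to."""

    def __init__(self, name, related=None):
        self.name = name
        self.accesses = []            # timestamps of each use
        self.related = related or []  # names of associated chunks

    def touch(self, now):
        self.accesses.append(now)

    def base_level(self, now, decay=0.5):
        # B_i = ln( sum over past uses of (now - t_j)^-decay )
        # More uses -> higher activation; older uses -> less contribution.
        terms = [(now - t) ** -decay for t in self.accesses if now > t]
        return math.log(sum(terms)) if terms else float("-inf")


class Memory:
    def __init__(self, assoc_strength=1.0):
        self.chunks = {}
        self.s = assoc_strength  # uniform S_ji for this sketch

    def add(self, name, related=None):
        self.chunks[name] = Chunk(name, related)

    def activation(self, name, context, now):
        # A_i = B_i + sum over context chunks j of W_j * S_ji,
        # with attention W_j split evenly across the context.
        base = self.chunks[name].base_level(now)
        w = self.s / len(context) if context else 0.0
        spread = sum(w for c in context if name in self.chunks[c].related)
        return base + spread

    def retrieve(self, context, now):
        # Not a lookup: return whichever chunk is most active right now.
        return max(self.chunks, key=lambda n: self.activation(n, context, now))
```

So a memory used three times recently beats one used once long ago, and mentioning a related topic in context nudges its associates upward, which is the "related memories strengthen each other" behavior described above.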

The whole system runs 24/7. Lily handles tasks autonomously through Discord, manages the homelab, assists with development, and maintains context across sessions. It's not a product — it's a personal tool, built iteratively over six versions, running on hardware I own. Read the full architecture deep-dive.

$ openclaw status --node aegis-node
Agent: Lily
Platform: OpenClaw v6
Memory: ACT-R cognitive architecture
GPU: AMD Radeon AI PRO R9700 (32GB VRAM)
Inference: ROCm local + Claude API hybrid
Uptime: continuous
Status: operational