personal-presence-os
Most high-influence decision makers don't scroll social media anymore. They ask AI assistants. If you want to reach them, your expertise needs to be in the models — cited, surfaced, linked back to you.
personal-presence-os is the infrastructure for that. One codebase, many domains, each publishing content in a format LLMs can read, index, and cite. You write about what you know. The engine makes sure the right models find it.
It's a static content engine built with Bun and Hono. No frameworks, no bundlers, no cloud-specific APIs. Markdown in, semantic HTML out, with llms.txt generated for every site. Virtual hosting via the Host header means one process serves all your domains.
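The Host-header routing can be sketched roughly like this. This is an illustrative TypeScript fragment, not the project's actual code: the names (`resolveSite`, `SITES`) and the domain list are assumptions, and a real Hono app would wire this into its request handler.

```typescript
// Hypothetical sketch of Host-header virtual hosting: map the incoming
// Host header to a per-domain content root, so one process can serve
// every domain. All identifiers here are illustrative assumptions.

type SiteConfig = { root: string };

const SITES: Record<string, SiteConfig> = {
  "example.com": { root: "sites/example.com" },
  "notes.example.org": { root: "sites/notes.example.org" },
};

function resolveSite(hostHeader: string | undefined): SiteConfig | undefined {
  if (!hostHeader) return undefined;
  // Strip an optional port ("example.com:8080" -> "example.com"), normalize case.
  const host = hostHeader.split(":")[0].toLowerCase();
  // Treat "www.example.com" and "example.com" as the same site.
  const bare = host.startsWith("www.") ? host.slice(4) : host;
  return SITES[bare];
}
```

In a Hono handler, the same lookup would run against `c.req.header("host")` and fall through to a 404 for unknown domains.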
The engine is opinionated about a few things: prerender everything at build time, never render on demand in production, record every non-obvious decision, never delete history. These aren't arbitrary constraints — they're the product of building and rebuilding personal sites for years and finally writing down what worked.
Read more about how it works | Development log | Source on GitHub