Local workspace for AI continuity

Fob

Keep project context, decisions, handoffs, and AI conversations together on your machine so Claude, Codex, ChatGPT, Gemini, and future-you can pick up the thread.

The context tax

AI tools are fast. Keeping them aligned is the hard part.

Fob is not another model and not an IDE. It is the local continuity layer between your project, your AI accounts, and your decisions.

How it works

Ask, decide, save, continue.

01

Import or ask

Paste a long AI answer you already paid tokens for, or query Claude, Codex, or both, or run a structured debate, all from one local dashboard.

02

Get a verdict

Fob turns multi-model noise into practical next actions, saved decisions, conflict resolutions, and reusable context.

03

Keep the useful part

Save the decision, pin the answer, create a handoff, or build a context packet for another AI without rerunning the same prompt.

04

Ship with guardrails

Review diffs, create file-edit approvals, attach proposed patches, and commit or push only after explicit confirmation.

One local control surface

Your project memory should live with the project.

Fob stores durable memory in a local project folder. It can run as a terminal room, browser dashboard, headless local API, or MCP server for compatible AI clients.

Memory: Project facts, preferences, decisions, pins, and handoffs.
Routing: Claude, Codex, both, debate, conflict resolution, and context packets.
Safety: Approval-gated patches, git nudges, typed commit and push confirmation.
Integrations: Local API and MCP tools for pulling saved context into other AI workflows.
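The memory layer above can be pictured as a simple per-project store that other tools read and write. The sketch below is purely illustrative: the `.fob/memory.json` path, the `save_decision` helper, and the schema are assumptions for this example, not Fob's actual on-disk format.

```python
import json
from pathlib import Path

def save_decision(project_dir: str, decision: str, context: str) -> dict:
    """Append a decision to a hypothetical per-project memory file.

    The .fob/memory.json layout here is an illustrative assumption.
    """
    memory_path = Path(project_dir) / ".fob" / "memory.json"
    memory_path.parent.mkdir(parents=True, exist_ok=True)
    memory = {"decisions": [], "pins": [], "handoffs": []}
    if memory_path.exists():
        memory = json.loads(memory_path.read_text())
    memory["decisions"].append({"decision": decision, "context": context})
    memory_path.write_text(json.dumps(memory, indent=2))
    return memory

# Example: record a debate verdict inside a demo project folder
record = save_decision("demo-project", "Use SQLite for v1",
                       "Debate verdict: simpler ops story")
```

Because the store is plain files inside the project, it travels with the repo and stays readable without any server running.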

Why Fob is different

Not just an archive. A live room for what happens next.

Session history is useful. Fob goes further: it keeps the current decision, approved context, agent disagreement, and next action visible while you are still working.

Active routing: Ask Claude, Codex, Both, Debate, or Resolve from one project-aware surface.
Durable decisions: Save the verdict, conflict, handoff, or preference that should shape the next answer.
Approval path: Attach files, review diffs, create approvals, and commit or push only after explicit confirmation.
Low-friction local start: No database ceremony for the core flow. Start in a project, ask, save, and continue.
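The approval path amounts to a typed-confirmation gate: a risky action only runs after the user types an exact phrase. The sketch below shows the idea under stated assumptions; the `approval_gate` function, its confirmation phrase, and the callback interface are hypothetical, not Fob's implementation.

```python
from typing import Callable

def approval_gate(action_name: str, run: Callable[[], str],
                  confirm: Callable[[str], str] = input) -> str:
    """Run a risky action only after an exact typed confirmation.

    `confirm` is injectable so the gate can be exercised without a TTY.
    This is an illustrative sketch, not Fob's real approval flow.
    """
    phrase = f"yes: {action_name}"
    typed = confirm(f"Type '{phrase}' to proceed: ")
    if typed.strip() != phrase:
        return "aborted: confirmation did not match"
    return run()

# A push happens only when the typed phrase matches exactly
result = approval_gate("git push", lambda: "pushed",
                       confirm=lambda prompt: "yes: git push")
```

Requiring the action name inside the phrase means a stray "y" or a confirmation meant for a different action can never trigger a push.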

Public walkthrough

See the local dashboard without installing first.

The real Fob dashboard runs on your computer. The website shows a safe preview of the flow so a tester can understand the product before touching their own files.

Real Fob local dashboard showing project context, worker routing, and saved memory controls.
Actual local dashboard running on a safe demo project.

Founder release

Install the local app, then activate your license.

Founder users get the Mac ZIP, a Fob license key from Polar, and a local dashboard that opens inside their own projects.

unzip fob-0.1.0-mac-arm64.zip
cd fob-0.1.0-mac-arm64
./install.sh
fob login FOB_your_license_key
cd /path/to/your-project
fob try
# opens http://127.0.0.1:8787

Local first

The website is not the engine.

Vercel hosts this public site. The actual Fob app runs on your computer because it needs your local files, your local Claude/Codex/Gemini tools, and your private project memory.

Your machine: Project memory stays in your local workspace.
Your accounts: Fob calls AI CLIs only when you ask it to.
Your control: No push, patch apply, package install, or risky action without approval.

Companion CLI · open source

And then there's the code that shouldn't leave your laptop.

Fob handles continuity. CLOAK handles redaction. CLOAK is a local CLI we built alongside Fob to solve the other half of the problem: developers pasting proprietary code into ChatGPT against policy because the alternative is missing a deadline. CLOAK is open source under Apache 2.0 and free forever — independent of Fob's Founder License.

Terminal recording: cloak scan, diff-context, context, then obfuscate --verify pytest passing.
cloak scan → diff-context → context → obfuscate --verify "pytest"

What it is

A local continuity layer for serious AI work.

Fob is for builders who use more than one AI tool and hate losing context between sessions. It is practical, local-first, and built around the idea that the project should remember what happened and why.