VEKTOR · MAGMA · MCP · Local-First AI

What we built:
34 tools, 6 apps, 60 seconds

The dragons are slain. The wizard runs. Now the payoff — what VEKTOR actually is, why local-first memory is the right architecture for AI agents, and what you get when an MCP server has real depth.

8 MIN READ PART 3 OF 3 VEKTOR · MAGMA · MCP · ARCHITECTURE
THE_RESULT

One command.
Six apps configured.
Zero manual edits.

After everything in Parts 1 and 2 — the path hell, the ABI nightmare, the popup dragon, the Groq schema bug — what you get is this: run vektor activate <key> and a wizard walks you through everything. By the end, every AI app on your machine has persistent memory.

Terminal — the final result
✓ Claude Desktop — found
✓ Cursor — found
✓ Windsurf — found
✓ VS Code — found
✓ Continue — found
✓ Groq Desktop — found
✓ Groq Desktop schema fix applied automatically
✓ 6 apps configured with VEKTOR memory

To reconfigure any app, run: vektor setup
34 MCP tools available · 6 AI apps configured · 60s setup time, start to finish
App             Profile   Tools   Auto-fix
Claude Desktop  Full      34
Cursor          Dev       15
Windsurf        Dev       15
VS Code         Dev       15
Continue        Dev       15
Groq Desktop    Dev       15      ✓ + schema patch
THE_ARCHITECTURE

MAGMA: the memory graph that never forgets

VEKTOR's memory engine is built on MAGMA — a four-layer associative graph stored entirely in local SQLite with vector extensions. No cloud. No subscriptions. No data leaving your machine. Everything lives in a single .db file.

The four layers each capture a different kind of knowledge:

SEMANTIC LAYER
Vector similarity between memories. "TypeScript preferences" connects to "project setup" connects to "build tooling."
384-dim embeddings · cosine similarity
CAUSAL LAYER
Cause-and-effect chains. "Decided to use Postgres" links to "because Redis hit memory limits" links to "because traffic grew 10x."
directed edges · reasoning chains
TEMPORAL LAYER
Time-ordered sequences. What happened before, what happened after, how decisions evolved over weeks and months.
before → after · recency decay
ENTITY LAYER
Co-occurrence of named things. People, projects, tools, concepts that appear together frequently build stronger connections.
co-occurrence · entity resolution
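To make the four layers concrete, here is a minimal sketch of how they could be modelled in a single SQLite file, using Python's stdlib sqlite3. Every table and column name here is illustrative; VEKTOR's actual schema is not published in this series.

```python
import sqlite3

# Illustrative schema: one memories table plus edge tables per layer.
# All names are hypothetical -- this is not VEKTOR's real schema.
SCHEMA = """
CREATE TABLE memories (
    id        INTEGER PRIMARY KEY,
    text      TEXT NOT NULL,
    embedding BLOB,              -- 384-dim vector (semantic layer)
    created   REAL               -- unix timestamp (temporal layer)
);
CREATE TABLE causal_edges (      -- directed cause -> effect links (causal layer)
    cause_id  INTEGER REFERENCES memories(id),
    effect_id INTEGER REFERENCES memories(id)
);
CREATE TABLE entity_edges (      -- co-occurrence counts (entity layer)
    entity_a  TEXT,
    entity_b  TEXT,
    weight    INTEGER DEFAULT 1
);
"""

def open_memory_db(path=":memory:"):
    db = sqlite3.connect(path)
    db.executescript(SCHEMA)
    return db

db = open_memory_db()
db.execute("INSERT INTO memories (text, created) VALUES (?, ?)",
           ("Decided to use Postgres", 1700000000.0))
db.execute("INSERT INTO memories (text, created) VALUES (?, ?)",
           ("Redis hit memory limits", 1699000000.0))
db.execute("INSERT INTO causal_edges VALUES (2, 1)")  # memory limits -> Postgres decision
```

The point of the single-file layout is that all four layers live behind one connection, so a recall query can join across them without any network hop.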

When you call vektor_recall, it searches across all four layers simultaneously. You don't just get semantically similar text — you get memories that are causally related, temporally connected, and entity-linked to your query. It's the difference between a search engine and actual memory.
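One way to picture "searching all four layers simultaneously" is score fusion: each layer contributes a signal, and a weighted blend produces the final ranking. The sketch below is an assumption about the general technique, not VEKTOR's actual scoring function; the weights, decay constant, and memory fields are all invented for illustration.

```python
import math

def cosine(a, b):
    """Cosine similarity between two dense vectors (semantic layer)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def recall_score(query_vec, memory, now, weights=(0.5, 0.2, 0.2, 0.1)):
    """Blend four layer signals into one ranking score.

    `memory` is a dict with illustrative keys; the weights and the
    30-day recency decay are arbitrary choices, not VEKTOR's values.
    """
    w_sem, w_caus, w_temp, w_ent = weights
    semantic = cosine(query_vec, memory["embedding"])
    causal = 1.0 if memory["causally_linked"] else 0.0
    age_days = (now - memory["created"]) / 86400
    temporal = math.exp(-age_days / 30)           # recency decay
    entity = min(memory["shared_entities"] / 5, 1.0)  # crude overlap normalisation
    return w_sem * semantic + w_caus * causal + w_temp * temporal + w_ent * entity
```

A memory that is semantically similar, causally linked, recent, and entity-overlapping scores high on all four terms at once, which is what separates this from plain vector search.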

"The goal isn't to store everything. The goal is to surface the right thing at the right moment — across every AI app you use, seamlessly."

THE_TOOLS

34 tools.
Four categories.
One MCP server.

Most MCP servers expose 3-5 tools. VEKTOR exposes 34 — because persistent memory for AI agents is actually a deep problem that touches many different capabilities. Here's what's included:

Memory tools (5)

The core: vektor_store, vektor_recall, vektor_graph, vektor_delta, vektor_briefing. Store facts, recall by semantic similarity, traverse the graph, see what changed, generate morning briefings. These are the tools that make every AI app actually remember you.
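Under the MCP protocol, a client invokes tools like these with a JSON-RPC `tools/call` request. Here is a sketch of what a `vektor_store` call might look like on the wire; the `tools/call` envelope follows the MCP spec, but the argument names inside it are assumptions, not VEKTOR's documented schema.

```python
import json

def make_tool_call(tool_name, arguments, request_id=1):
    """Build an MCP tools/call JSON-RPC request envelope."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

# Hypothetical argument shape -- check VEKTOR's tool schema for the real one.
req = make_tool_call("vektor_store", {
    "text": "User prefers TypeScript strict mode",
    "tags": ["preferences", "typescript"],
})
payload = json.dumps(req)
```

The host app (Claude Desktop, Cursor, and so on) builds and sends this envelope for you; the payload shape only matters if you are wiring up your own MCP client.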

CLOAK stealth browser tools (14)

A full stealth browser layer built on Playwright: cloak_fetch for compressed page content, cloak_render for full CSS/DOM layout sensing, cloak_diff for semantic change detection, cloak_detect_captcha, cloak_solve_captcha, identity management, and a self-improving behaviour pattern system that learns which patterns get past bot detection.

SSH tools (5)

cloak_ssh_exec for running commands on remote servers, with automatic read/write classification — read operations auto-execute, write operations require explicit approval. Plus cloak_ssh_plan for multi-step transactions, cloak_ssh_backup, cloak_ssh_rollback, and cloak_ssh_session_store for end-of-session handover notes.
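The read/write split is the interesting design decision here. A minimal sketch of how such a classifier could work, assuming a default-deny allowlist approach (the real cloak_ssh_exec logic is not published; command lists and rules below are illustrative):

```python
# Illustrative read/write classifier for SSH commands.
READ_ONLY = {"ls", "cat", "head", "tail", "grep", "df", "ps", "uptime", "whoami"}
WRITE_HINTS = {"rm", "mv", "cp", "chmod", "chown", "systemctl", "kill",
               "apt", "yum", "dd", "truncate", "tee"}

def classify_ssh_command(command: str) -> str:
    """Return 'read' (auto-execute) or 'write' (requires explicit approval)."""
    tokens = command.strip().split()
    if not tokens:
        return "write"                      # empty/unknown: fail safe
    binary = tokens[0].rsplit("/", 1)[-1]   # strip any path prefix
    if binary == "sudo":
        return "write"                      # privilege escalation is always gated
    if ">" in command or ">>" in command:
        return "write"                      # shell redirection mutates files
    if binary in WRITE_HINTS:
        return "write"
    if binary in READ_ONLY:
        return "read"
    return "write"                          # default-deny anything unrecognised
```

The key property is the final fallback: anything the classifier does not recognise is treated as a write and routed through approval, so a gap in the allowlist fails closed rather than open.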

Utility tools (10)

cloak_passport for the AES-256 machine-bound credential vault, tokens_saved for ROI tracking, turbo_quant_compress for 87% embedding compression, cloak_cortex for project directory scanning, and the full pattern store management suite.

THE_PHILOSOPHY

Why local-first memory always wins

Every cloud memory service for AI agents has the same fundamental problem: your data lives on someone else's server. Your preferences, your decisions, your project context, your confidential work — all of it flowing through a third party's infrastructure on every query.

VEKTOR's memory never leaves your machine. The embedding model runs locally (ONNX WASM). The vector index is SQLite. The vault is AES-256 encrypted and machine-bound — physically unreadable on any other hardware. You pay once. There's no usage meter, no rate limits, no monthly bill that scales with how much your agents think.

"One-time purchase. Local SQLite. Zero cloud. Your memory, on your machine, forever."

This isn't just a privacy argument — it's a performance argument. Local reads are microseconds, not milliseconds. No network latency on every recall query. No cold starts. No API timeouts. The memory is always there, always fast, regardless of your internet connection.

THE_LESSON

DXT is the right path for AI app distribution

The biggest lesson from this entire journey isn't technical — it's about distribution. The old way of shipping MCP servers was a documentation problem: write a long README explaining how to edit a JSON config file, hope users follow it exactly, then handle a flood of support tickets from people who got it wrong.

DXT changes the distribution model entirely. Ship a 4KB manifest file alongside your npm package. Users drag it onto Claude Desktop. Done. The config is written correctly, the paths are resolved dynamically, the licence key is entered in a proper UI. There's nothing to get wrong.
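For a sense of scale, a DXT manifest is a small JSON file. The sketch below follows the general shape of public DXT examples; the field values, entry point, and VEKTOR's actual manifest contents are assumptions for illustration only.

```json
{
  "dxt_version": "0.1",
  "name": "vektor",
  "version": "1.4.9",
  "description": "Persistent local-first memory for AI agents",
  "server": {
    "type": "node",
    "entry_point": "server/index.js",
    "mcp_config": {
      "command": "node",
      "args": ["${__dirname}/server/index.js"]
    }
  }
}
```

Because the host app resolves paths like `${__dirname}` at install time, the manifest works regardless of where the package lands on the user's disk, which is exactly the class of error the old edit-a-JSON-file workflow kept producing.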

The setup wizard extends this philosophy to every other AI app. One command, auto-detection, per-app configuration, zero manual editing. The developer experience matches what users expect from mature software — not what they expect from beta tooling held together with documentation.

"The era of 'edit this JSON file manually' for AI tooling is over. DXT, wizards, and auto-configuration are the new baseline."

TRY_IT

Install VEKTOR.
60 seconds to first memory.

Everything described in this series is in VEKTOR Slipstream v1.4.9. One-time purchase, local-first, 34 tools, auto-configures across Claude Desktop, Cursor, Windsurf, VS Code, Continue, and Groq Desktop.

VEKTOR Slipstream

34 MCP tools · Local SQLite · One-time purchase · No cloud

Get VEKTOR →

Back to Part 1  ·  Back to Part 2