BETA Hydrate is in beta. Register during beta to lock $5/mo Pro forever: free throughout the beta, plus one month after v1 launches. Join the waitlist →
Hydrate for Mistral Vibe · v1 launch

Vibe coding with memory that persists.

Hydrate connects to Mistral Vibe via its MCP server and gives it the same persistent memory layer already shipping for Claude Code. Prior decisions, session summaries, and compressed project docs inject automatically into the context before each prompt. On-prem, auditable, EU AI Act aligned.

Ships with Hydrate v1 · on-prem by architecture · EU AI Act aligned
86–95%
token reduction measured across ten realistic scenarios on two independent codebases. The same compress path Mistral Vibe uses via MCP saved 10,903–36,423 tokens per ten-scenario run, measured directly with both input and output tokens observed.
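To make the headline figure concrete, here is the arithmetic behind a percentage reduction. The token counts below are hypothetical placeholders, not Hydrate's measured data:

```python
def token_reduction(tokens_before: int, tokens_after: int) -> float:
    """Percent of tokens saved: (before - after) / before * 100."""
    return 100 * (tokens_before - tokens_after) / tokens_before

# Hypothetical run: 40,000 tokens uncompressed, 4,000 after compression.
print(f"{token_reduction(40_000, 4_000):.0f}% reduction")  # 90% reduction
```

A 90% reduction on a 40,000-token context saves 36,000 tokens in a single run, which is the scale of the per-run savings quoted above.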
How it works

MCP tools Mistral Vibe invokes automatically.

Hydrate exposes an MCP server at localhost:24312/mcp. Add it to Mistral Vibe once and it gains three tools. Mistral decides when to call them based on the conversation context, via the same mechanism it uses for every other MCP tool.
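Under the hood, an MCP tool invocation is a JSON-RPC 2.0 request. A sketch of the message a client like Mistral Vibe would send to call one of Hydrate's tools; the "query" argument name is an assumption for illustration, not Hydrate's documented tool schema:

```python
import json

# JSON-RPC 2.0 envelope for an MCP tool call. The method name "tools/call"
# and the params shape come from the MCP specification; the tool argument
# ("query") is hypothetical.
payload = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "hydrate_recall",
        "arguments": {"query": "auth refactor decisions"},
    },
}

print(json.dumps(payload, indent=2))
```

The model never sees this envelope; the client builds it whenever the model decides a tool call would help.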

hydrate_recall

Returns prior decisions, session summaries, and pinned canon relevant to the current task. Mistral invokes it automatically when context from past sessions would help.

hydrate_compress

Compresses long prose through the local summariser before it reaches the model. Measured compression ratios of 15x to 100x on real project docs.

hydrate_read_file

Reads a workspace file, auto-compressing it when it is larger than 4 KB. This prevents large files from dominating the context window during vibe coding sessions.
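The branching behaviour of hydrate_read_file can be sketched in a few lines. This is an illustration of the 4 KB threshold described above, not Hydrate's actual implementation; `compress` stands in for the local summariser:

```python
THRESHOLD = 4 * 1024  # the 4 KB cutoff described on this page

def read_file(path: str, compress) -> str:
    """Sketch: small files pass through untouched, larger ones are routed
    through the summariser (here, any callable taking and returning str)."""
    with open(path, "r", encoding="utf-8") as f:
        text = f.read()
    if len(text.encode("utf-8")) > THRESHOLD:
        return compress(text)
    return text
```

The point of the threshold is that a 200 KB source file costs a summary's worth of tokens instead of the whole file.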

Setup: two lines

Run hydrate init to start the local server, then add http://localhost:24312/mcp as an MCP endpoint in Mistral Vibe's settings. That is the entire integration. No daemon, no cloud account, no new UI to learn.

The server handles tool registration, session capture, and recall automatically. Every Mistral-originated request carries X-Hydrate-Source: mistral, so savings are attributed cleanly per client in the audit log.
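Per-client attribution is just a group-by over the source tag. A sketch of that aggregation; the row shape and field names here are illustrative, not Hydrate's audit-log schema (only the "mistral" source value comes from the page):

```python
from collections import defaultdict

# Hypothetical audit-log rows, each tagged with the value of the
# X-Hydrate-Source header the originating request carried.
rows = [
    {"source": "mistral", "tokens_saved": 1200},
    {"source": "claude-code", "tokens_saved": 800},
    {"source": "mistral", "tokens_saved": 300},
]

def savings_by_client(rows):
    """Total tokens saved, grouped by originating client."""
    totals = defaultdict(int)
    for row in rows:
        totals[row["source"]] += row["tokens_saved"]
    return dict(totals)

print(savings_by_client(rows))  # {'mistral': 1500, 'claude-code': 800}
```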

Integration guide →
Cross-tool memory

Switch tools. Keep context.

Hydrate does not partition memory by client. A decision captured from a Claude Code session injects into the next Mistral Vibe prompt. A file compressed by the Copilot extension is available to Mistral without re-uploading. One store, all your tools.

Claude Code (captures decisions via Stop hook) → hydrate-server (local SQLite store, encrypted at rest) → Mistral Vibe (recalls via the hydrate_recall MCP tool)

The same flow works in reverse: Mistral captures, Claude Code recalls. All three surfaces share one store.
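The "one store, all your tools" claim boils down to every client reading and writing the same table. A minimal sketch with an in-memory SQLite database; the table and column names are assumptions for illustration, not Hydrate's schema:

```python
import sqlite3

# One shared store: decisions written by different clients land in the
# same table, so there is no per-client partition to sync.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE decisions (client TEXT, decision TEXT)")
db.execute("INSERT INTO decisions VALUES ('claude-code', 'use Postgres for sessions')")
db.execute("INSERT INTO decisions VALUES ('mistral', 'pin Node 20 in CI')")

# A recall from any client queries the whole table.
recalled = [row[1] for row in db.execute("SELECT client, decision FROM decisions")]
print(recalled)
```

A decision captured from Claude Code is one SELECT away from the next Mistral Vibe prompt, and vice versa.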

Claude Code

shipping

Native hooks: UserPromptSubmit, Stop, PreToolUse, PostToolUse. Zero configuration. Works on day one with your existing Claude subscription.

Claude Code page →

VS Code + Copilot

v1 launch

VS Code extension registering an @hydrate chat participant plus three Language Model Tools. Same memory store: decisions from Copilot flow to Mistral and vice versa.

Copilot page →

Mistral Vibe

v1 launch

MCP server integration. Mistral Vibe connects to the Hydrate MCP endpoint and gains the same recall, compress, and file tools the other clients use.

Details ↓
Local-first by architecture

Your sessions stay on your machine.

01

On-prem binary

hydrate-server runs locally. The MCP endpoint is bound to localhost: Mistral Vibe never routes your session data through a Hydrate cloud service.

02

Encrypted at rest

The local SQLite store uses AES-GCM encryption. Keys are derived from your machine identity. The file is unreadable if copied to another machine.
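One way to see why a copied database file is useless on another machine: if the key is derived from a stable machine identifier, a different machine derives a different key and AES-GCM decryption fails authentication. A sketch of such a derivation; Hydrate's actual scheme is not documented here, and the salt string below is hypothetical:

```python
import hashlib
import uuid

def derive_key(machine_id: bytes) -> bytes:
    """Derive a 32-byte key (AES-256-GCM sized) from a machine identifier.
    The domain-separation prefix is an illustrative assumption."""
    return hashlib.sha256(b"hydrate-store-key:" + machine_id).digest()

# uuid.getnode() returns the hardware (MAC) address as a 48-bit int,
# one common stand-in for "machine identity".
key = derive_key(uuid.getnode().to_bytes(6, "big"))
print(len(key))  # 32 bytes
```

Different machine identifier in, different key out, so the ciphertext in the copied file never decrypts.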

03

Secrets scrubbed

The internal/scrubber package strips common secret patterns before any storage or LLM processing. Credentials visible in a Mistral session are never persisted or transmitted.
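Pattern-based scrubbing of this kind is typically a pass of regex substitutions before anything is written or sent. A sketch of the approach; the two patterns below are common illustrative examples, not internal/scrubber's actual (non-public) pattern set:

```python
import re

# Illustrative secret patterns: key=value style credentials and the
# AWS access key ID shape. A real scrubber would carry many more.
PATTERNS = [
    re.compile(r"(?i)(api[_-]?key|token|secret)\s*[:=]\s*\S+"),
    re.compile(r"AKIA[0-9A-Z]{16}"),
]

def scrub(text: str) -> str:
    """Replace anything matching a secret pattern before storage or LLM use."""
    for pattern in PATTERNS:
        text = pattern.sub("[REDACTED]", text)
    return text

print(scrub("api_key=sk-abc123 and plain prose"))  # [REDACTED] and plain prose
```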

04

EU AI Act aligned

hydrate compliance report maps to each applicable Article. Every figure is backed by a row in hydrate_retrievals and reproducible with a single sqlite3 query.
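"Reproducible with a single sqlite3 query" means any reported figure is a SUM or COUNT over the retrievals table. A sketch of what reproducing one number could look like; the table name hydrate_retrievals comes from the page, while the column names and values are assumptions for illustration:

```python
import sqlite3

# Stand-in store with an assumed schema for hydrate_retrievals.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE hydrate_retrievals (client TEXT, tokens_saved INTEGER)")
db.executemany(
    "INSERT INTO hydrate_retrievals VALUES (?, ?)",
    [("mistral", 1200), ("mistral", 300), ("claude-code", 800)],
)

# The single query that backs a per-client savings figure.
(total,) = db.execute(
    "SELECT SUM(tokens_saved) FROM hydrate_retrievals WHERE client = 'mistral'"
).fetchone()
print(total)  # 1500
```

Because the figure is a query over stored rows rather than a claim in a PDF, an auditor can rerun it against the live store.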

Ships with Hydrate v1

Join the waitlist.

Hydrate for Mistral Vibe launches alongside Hydrate's public v1 release. Waitlist opens Wednesday 6 May. Beta invites roll out in groups of 50 on a first-come-first-served basis.

Already invited? Install instructions →

Press, YouTuber, newsletter, or researcher covering the launch? Request early access →