Context Lattice
By Private Memory Corp
Guide 3

Integration Guide

Connect the ChatGPT apps and Claude chat apps (desktop and web), plus Claude Code, Codex, and OpenClaw/ZeroClaw/IronClaw to Context Lattice once your stack is running locally.

Prerequisite

Bring stack up first

Use the launch mode you need, then validate core health before integrating any client.

BOOTSTRAP=1 scripts/first_run.sh
ORCH_KEY="$(awk -F= '/^MEMMCP_ORCHESTRATOR_API_KEY=/{print substr($0,index($0,"=")+1)}' .env)"

# choose mode if needed
gmake mem-up
# gmake mem-up-lite
# gmake mem-up-full

curl -fsS http://127.0.0.1:8075/health | jq
curl -fsS -H "x-api-key: ${ORCH_KEY}" http://127.0.0.1:8075/status | jq
  • Orchestrator API: http://127.0.0.1:8075
  • MCP hub memory endpoint: http://127.0.0.1:53130/memorymcp/mcp
  • MCP hub qdrant endpoint: http://127.0.0.1:53130/qdrant/mcp
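The health checks above can be scripted as a readiness wait before wiring any client; a minimal Python sketch, where the exact shape of the /health JSON payload is an assumption (adjust the predicate to your build):

```python
import json
import time
import urllib.request

ORCH_URL = "http://127.0.0.1:8075"  # Orchestrator API from the list above

def looks_healthy(payload: dict) -> bool:
    """Heuristic check; the /health payload shape is an assumption."""
    return bool(payload) and payload.get("status", "ok") != "error"

def wait_for_health(url: str = ORCH_URL, attempts: int = 10, delay: float = 1.0) -> bool:
    """Poll GET /health until it responds with a healthy-looking payload."""
    for _ in range(attempts):
        try:
            with urllib.request.urlopen(f"{url}/health", timeout=2) as resp:
                if looks_healthy(json.load(resp)):
                    return True
        except OSError:
            pass  # stack still coming up
        time.sleep(delay)
    return False
```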
Quick wiring

5-minute integration smoke test

Run this once to confirm your app can write and retrieve memory through Context Lattice before wiring UI-specific settings.

ORCH_KEY="$(awk -F= '/^MEMMCP_ORCHESTRATOR_API_KEY=/{print substr($0,index($0,"=")+1)}' .env)"

curl -fsS -H "content-type: application/json" -H "x-api-key: ${ORCH_KEY}" \
  -d '{"projectName":"_global","fileName":"smoke/integration_check.md","content":"integration smoke ok"}' \
  http://127.0.0.1:8075/memory/write | jq

curl -fsS -H "content-type: application/json" -H "x-api-key: ${ORCH_KEY}" \
  -d '{"query":"integration smoke ok","limit":3}' \
  http://127.0.0.1:8075/memory/search | jq
  • If both calls return ok: true, your app can safely integrate.
  • If you get a 401, verify MEMMCP_ORCHESTRATOR_API_KEY in .env and restart the caller process.
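The same smoke test can be driven from Python instead of curl; a sketch where the payload fields come from the examples above and the helper names (build_write, build_search, check_ok) are illustrative:

```python
import json
import urllib.request

ORCH_URL = "http://127.0.0.1:8075"

def build_write(project: str, file_name: str, content: str) -> dict:
    # Matches the /memory/write payload used in the curl example above.
    return {"projectName": project, "fileName": file_name, "content": content}

def build_search(query: str, limit: int = 3) -> dict:
    # Matches the /memory/search payload used in the curl example above.
    return {"query": query, "limit": limit}

def check_ok(resp: dict) -> bool:
    # Both endpoints report success as ok: true.
    return resp.get("ok") is True

def post(path: str, payload: dict, api_key: str) -> dict:
    """POST a JSON payload to the local orchestrator with the x-api-key header."""
    req = urllib.request.Request(
        f"{ORCH_URL}{path}",
        data=json.dumps(payload).encode(),
        headers={"content-type": "application/json", "x-api-key": api_key},
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        return json.load(resp)
```

Run `post("/memory/write", build_write(...), key)` then `post("/memory/search", build_search(...), key)` and gate integration on `check_ok` for both responses.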
Messaging Surface

OpenClaw/ZeroClaw/IronClaw + Telegram + Slack command bridge

Context Lattice now supports channel command intake via orchestrator-native endpoints. The default handle is @ContextLattice.

POST /integrations/messaging/command
POST /integrations/messaging/openclaw
POST /integrations/messaging/ironclaw
POST /integrations/telegram/webhook
POST /integrations/slack/events

@ContextLattice remember deployment complete
@ContextLattice recall deployment
@ContextLattice status
# local direct test (secure default requires x-api-key)
curl -fsS -H "content-type: application/json" -H "x-api-key: ${ORCH_KEY}" \
  -d '{"channel":"openclaw","source_id":"chat-1","text":"@ContextLattice status"}' \
  http://127.0.0.1:8075/integrations/messaging/command | jq
  • BYO accounts: Telegram/Slack credentials stay in your own account.
  • Project routing: commands can include project=<name> and topic=<path> directives.
  • Default behavior: OpenClaw/ZeroClaw route directly; IronClaw is optional and feature-flagged.
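The command payload from the curl example above can be built programmatically; a sketch where the field names mirror that example, and where appending the project=/topic= directives to the message text is an assumption about directive placement:

```python
from typing import Optional

def build_command(channel: str, source_id: str, text: str,
                  project: Optional[str] = None,
                  topic: Optional[str] = None) -> dict:
    """Build a payload for POST /integrations/messaging/command.

    project/topic become the project=<name> / topic=<path> routing
    directives mentioned above; where they sit in the text is an assumption.
    """
    if project:
        text = f"{text} project={project}"
    if topic:
        text = f"{text} topic={topic}"
    return {"channel": channel, "source_id": source_id, "text": text}
```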
Client Integrations

ChatGPT app, Claude chat apps, Claude Code, Codex

ChatGPT apps (desktop + web)

For normal ChatGPT user apps or API-driven GPT clients, use Context Lattice as the memory sidecar and call orchestrator endpoints around message processing.

  • Persist memory on key state changes: POST /memory/write
  • Retrieve context before response generation: POST /memory/search
  • Check runtime health: GET /health and GET /status with x-api-key
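The sidecar pattern above can be sketched as a thin wrapper around message processing. The endpoint paths and payload fields come from the smoke test; generate_reply is a placeholder for your model call, and the injected post callable is a design choice that keeps the wiring testable without a running stack:

```python
import json
import urllib.request

ORCH_URL = "http://127.0.0.1:8075"

def make_http_post(api_key: str):
    """Return a post(path, payload) callable against the local orchestrator."""
    def post(path: str, payload: dict) -> dict:
        req = urllib.request.Request(
            f"{ORCH_URL}{path}",
            data=json.dumps(payload).encode(),
            headers={"content-type": "application/json", "x-api-key": api_key},
        )
        with urllib.request.urlopen(req, timeout=5) as resp:
            return json.load(resp)
    return post

def generate_reply(user_msg: str, hits: dict) -> str:
    """Placeholder: swap in your app's actual generation step."""
    return f"(reply to: {user_msg})"

def handle_message(user_msg: str, post, project: str = "_global") -> str:
    """Sidecar pattern: search before generation, write after key state changes."""
    hits = post("/memory/search", {"query": user_msg, "limit": 3})
    reply = generate_reply(user_msg, hits)
    post("/memory/write", {
        "projectName": project,
        "fileName": "chat/last_turn.md",
        "content": f"user: {user_msg}\nassistant: {reply}",
    })
    return reply
```

In production, pass `make_http_post(ORCH_KEY)` as the post argument.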

Claude chat apps (desktop + web)

Use desktop or browser Claude chat apps through MCP-compatible clients against the local MCP hub endpoint, then route high-value summaries through orchestrator writes.

  • MCP server URL: http://127.0.0.1:53130/memorymcp/mcp
  • Keep write payloads compact; avoid dumping full transcripts.
  • Use topic paths so retrieval stays scoped and fast.
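The two bullets above (compact payloads, topic-scoped paths) can be combined in one helper; a sketch where the truncation limit and the summary.md path convention are illustrative assumptions, not Context Lattice requirements:

```python
def summary_write_payload(project: str, topic_path: str, summary: str,
                          max_chars: int = 2000) -> dict:
    """Build a compact /memory/write payload scoped under a topic path.

    Truncating at max_chars is an illustrative guard against dumping
    full transcripts; tune the limit to your retention needs.
    """
    return {
        "projectName": project,
        "fileName": f"{topic_path.strip('/')}/summary.md",
        "content": summary[:max_chars],
    }
```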

Claude Code + Codex

Point your coding agent runtime at the same local stack and treat memory writes as explicit checkpoints.

export MEMMCP_ORCHESTRATOR_URL=http://127.0.0.1:8075
export MEMMCP_HTTP_URL=http://127.0.0.1:59081/mcp
export MCP_HUB_URL=http://127.0.0.1:53130/memorymcp/mcp
export MEMMCP_ORCHESTRATOR_API_KEY="$(awk -F= '/^MEMMCP_ORCHESTRATOR_API_KEY=/{print substr($0,index($0,"=")+1)}' .env)"

Pattern: write summaries after meaningful edits, fetch retrieval context before planning or review actions.
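Agent code can consume the exported variables like this; a minimal sketch whose fallback defaults match the export lines above:

```python
import os

def orchestrator_config() -> dict:
    """Read the Context Lattice endpoints and key from the environment,
    falling back to the defaults shown in the export lines above."""
    return {
        "orchestrator_url": os.environ.get("MEMMCP_ORCHESTRATOR_URL",
                                           "http://127.0.0.1:8075"),
        "mcp_hub_url": os.environ.get("MCP_HUB_URL",
                                      "http://127.0.0.1:53130/memorymcp/mcp"),
        "api_key": os.environ.get("MEMMCP_ORCHESTRATOR_API_KEY", ""),
    }
```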

OpenClaw / ZeroClaw / IronClaw

Trait mapping and wiring

Map OpenClaw/ZeroClaw/IronClaw memory traits directly to Context Lattice endpoints. Keep orchestrator as the single memory control plane.

Recommended mapping

  • memory_recall_ctx → POST /memory/search
  • memory_save_store → POST /memory/write
  • messenger command hook → POST /integrations/messaging/openclaw
  • ironclaw command hook → POST /integrations/messaging/ironclaw
  • healthbeat → GET /health and GET /status with x-api-key
  • tools_exec → MCP hub /memorymcp/mcp endpoint
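The mapping above can live in one dispatch table so the orchestrator stays the single control plane; a sketch in which the underscored trait identifiers are an assumed spelling of the names listed:

```python
# Trait name -> (HTTP method, orchestrator path), per the recommended mapping.
TRAIT_ROUTES = {
    "memory_recall_ctx": ("POST", "/memory/search"),
    "memory_save_store": ("POST", "/memory/write"),
    "messenger_command_hook": ("POST", "/integrations/messaging/openclaw"),
    "ironclaw_command_hook": ("POST", "/integrations/messaging/ironclaw"),
    "healthbeat": ("GET", "/health"),  # pair with GET /status + x-api-key
}

def route_for(trait: str):
    """Resolve a claw trait to its orchestrator endpoint; raise on unknown traits."""
    try:
        return TRAIT_ROUTES[trait]
    except KeyError:
        raise ValueError(f"unmapped trait: {trait}") from None
```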

Operational defaults

  • Keep Qdrant local-first with gRPC preferred.
  • Use BYO cloud keys only when explicitly enabled.
  • Preserve orchestrator fanout/backpressure defaults before aggressive tuning.
  • Strict security mode redacts and blocks suspected secrets on OpenClaw/ZeroClaw/IronClaw routes.
Web 3 ready

IronClaw compatibility mode

Context Lattice can expose an IronClaw-compatible command surface while keeping local-first orchestration and your existing sink stack unchanged.

# enable IronClaw bridge
IRONCLAW_INTEGRATION_ENABLED=true
IRONCLAW_DEFAULT_PROJECT=messaging

# keep strict secret protections on claw surfaces
MESSAGING_OPENCLAW_STRICT_SECURITY=true
  • Endpoint: POST /integrations/messaging/ironclaw
  • Security: suspected credentials are blocked on write and redacted in returned text/results.
  • Documentation fit: IronClaw's deep docs and ASCII architecture style map cleanly to this mode.
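The strict-security behavior (block suspected secrets on write, redact them in returned text) could look roughly like this; the patterns and the [REDACTED] marker are purely illustrative, not the actual Context Lattice implementation:

```python
import re

# Illustrative secret patterns only; a real detector is more thorough.
_SECRET_PATTERNS = [
    re.compile(r"(?i)\b(api[_-]?key|token|secret|password)\s*[=:]\s*\S+"),
    re.compile(r"\bsk-[A-Za-z0-9]{16,}\b"),  # common vendor key shape
]

def contains_secret(text: str) -> bool:
    """Write path: block the payload outright if a suspected secret appears."""
    return any(p.search(text) for p in _SECRET_PATTERNS)

def redact(text: str) -> str:
    """Read path: redact suspected secrets in returned text/results."""
    for p in _SECRET_PATTERNS:
        text = p.sub("[REDACTED]", text)
    return text
```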