mcp-memory-graph
by rlgjr1971 · ai-ml · mcp-server, smithery
A context-aware memory MCP server for Claude Code and any MCP-compatible AI agent. Goes beyond basic vector search by adding authority weighting, conflict detection, and typed relationship edges between memories — so your agent always retrieves the right answer when sources disagree. Inspired by the context engine architecture described in Unblocked's "How a Context Engine Actually Works".
Why this exists
Standard memory MCP servers store and retrieve memories by semantic similarity. That wor…
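The repo's description names authority weighting and conflict detection but does not publish its scoring formula, so the following is an illustrative sketch only: it blends a memory's semantic similarity with a source-authority weight and flags topics where retrieved memories disagree. The `Memory` fields, the 0.6/0.4 blend, and the sample data are all assumptions, not the server's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class Memory:
    text: str
    topic: str
    similarity: float  # semantic match to the query, 0..1
    authority: float   # trust weight of the source, 0..1

def rank(memories, sim_weight=0.6, auth_weight=0.4):
    """Order memories by a blend of similarity and source authority."""
    return sorted(
        memories,
        key=lambda m: sim_weight * m.similarity + auth_weight * m.authority,
        reverse=True,
    )

def conflicts(memories):
    """Return topics where retrieved memories give differing answers."""
    by_topic = {}
    for m in memories:
        by_topic.setdefault(m.topic, set()).add(m.text)
    return {topic for topic, texts in by_topic.items() if len(texts) > 1}

# Hypothetical example: a stale chat log is a closer semantic match,
# but the current runbook carries more authority and wins the ranking.
mems = [
    Memory("deploys run on Fridays", "deploy-day", similarity=0.90, authority=0.30),
    Memory("deploys run on Tuesdays", "deploy-day", similarity=0.85, authority=0.95),
]
best = rank(mems)[0]
disputed = conflicts(mems)
```

With pure vector search the stale memory (similarity 0.90) would rank first; the authority term flips the order, and the conflict set tells the agent the topic is disputed rather than silently returning one side.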
Source: https://github.com/RetroRobAI/mcp-memory-graph
Install
git clone https://github.com/RetroRobAI/mcp-memory-graph
Source: smithery
About ai-ml MCP servers and Claude skills
ai-ml MCP servers extend what AI agents can do inside Claude Code, Cursor, Copilot, Codex, and Windsurf. The Skiln directory indexes 16,000+ such integrations across 22 categories.
mcp-memory-graph is one of hundreds of ai-ml entries indexed on Skiln. Browse the full ai-ml category or the complete directory of Claude skills, MCP servers, agents, commands, and hooks.
Related ai-ml MCPs and skills
- ProxVanta by proxvanta
ProxVanta helps teams stop rebuilding the same AI setup in every tool. Discover public Agent Contexts and workflows, install a private team version, add org-only context and guardrails, and reuse it across ChatGPT, Codex, Claude, Figma, Cursor, and other MCP-capable tools. The result is less prompt sprawl, more consistency, and a shared operating layer your team can actually maintain.
- toolstem-sec-mcp-server by toolstem
SEC EDGAR signal intelligence for AI agents. Five tools that pre-compute the signals that matter:
  - get_company_filings_summary — filing velocity (ACCELERATING/NORMAL/SLOWING vs 365-day average), material event count, disclosure trend
  - get_insider_signal — Form 3/4/4A insider activity probe with derived signals
  - get_institutional_signal — SC 13D activist risk flag (live), recent filings, 13F flow
  - get_material_events_digest — 8-K item-level severity digest (RED/YELLOW/GREEN), red-flag count…
- Overboard Studio by alona-assouline
Create and manage collaborative whiteboards on Overboard Studio directly from your AI assistant. Generate boards, add sticky notes/shapes/text/connectors, invite collaborators, and pull live board content — all via natural language. 17 tools across boards, elements, collaborators, and activity. OAuth 2.0 with PKCE; sign in with your Overboard account at https://overboard.studio.
- agentrails by sommerdhussain
MCP server on Smithery
- UniRate MCP by rob-brown96cc
Currency conversion and exchange rates for AI assistants. 170+ currencies (fiat + major crypto), historical data back to 1999, free tier, MIT-licensed. Tools:
  - `convert` — convert an amount between currencies at the latest rate
  - `latest_rate` — current rate (single pair) or full table for a base
  - `historical_rate` — rate on a specific date (Pro plan, back to 1999-01-04)
  - `list_currencies` — supported currency codes
  Free tier covers convert / latest_rate / list_currencies. Pro plan unlo…
- AI Research Assistant by ai-research
MCP server on Smithery
- Yapi MCP Server by lzsheng
A Model Context Protocol server for Yapi integration. Requires external configuration via a .env file in the project root to set environment variables such as YAPI_TOKEN and YAPI_BASE_URL.
- OHMS by crashzero9
Exposes Shopify order and inventory management tools via MCP, allowing agents to fetch, update, and print orders without exposing raw Shopify credentials.
Frequently asked questions
How do I install mcp-memory-graph?
Add the install command above to your Claude Code, Cursor, or Windsurf MCP configuration. Most servers register via npx, a local command, or a Docker image. Refer to the source repository for environment variables and credential requirements.
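As a concrete illustration of such a registration, here is what a local-command entry in a Claude Code/Cursor-style `mcpServers` config might look like. This is a hypothetical sketch: the entry point path and the `MEMORY_DB_PATH` variable are placeholders, not documented by the repo — consult its README for the real command and environment variables.

```json
{
  "mcpServers": {
    "memory-graph": {
      "command": "node",
      "args": ["/path/to/mcp-memory-graph/dist/index.js"],
      "env": {
        "MEMORY_DB_PATH": "~/.mcp-memory-graph/db"
      }
    }
  }
}
```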
Which clients support mcp-memory-graph?
Any MCP-compatible client works: Claude Desktop, Claude Code CLI, Cursor, Windsurf, Zed, and VS Code with the official MCP extension. OpenAI Codex and GitHub Copilot increasingly support MCP via adapter bridges.
Is mcp-memory-graph free?
The server itself is typically open source. Any upstream service (API keys, paid tiers, hosted infrastructure) may have its own pricing. Check the source repository for details.