🌟 A powerful knowledge graph server for AI agents, built with Neo4j and integrated with Model Context Protocol (MCP).
- 🔄 Dynamic knowledge graph management with Neo4j
- 🤖 Seamless integration with OpenAI models
- 🔌 MCP (Model Context Protocol) support
- 🐳 Docker-ready deployment
- 🎯 Custom entity extraction capabilities
- 🔍 Advanced semantic search functionality
- Docker and Docker Compose
- Python 3.10 or higher
- OpenAI API key
- Clone the repository:

  ```bash
  git clone https://github.com/gifflet/graphiti-mcp-server.git
  cd graphiti-mcp-server
  ```
- Set up environment variables:

  ```bash
  cp .env.sample .env
  ```

- Edit `.env` with your configuration:

  ```bash
  # Required for LLM operations
  OPENAI_API_KEY=your_openai_api_key_here
  MODEL_NAME=gpt-4o
  ```
- Start the services:

  ```bash
  docker compose up
  ```
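Before starting the stack, it can help to confirm that the variables from `.env` are actually set. The sketch below is a hypothetical pre-flight helper, not part of this repository; the variable names and the `gpt-4o` default are taken from the example configuration above.

```python
# Hypothetical pre-flight check (illustrative only): confirm the .env
# variables before running `docker compose up`. OPENAI_API_KEY is required;
# MODEL_NAME falls back to the default shown in this README.
REQUIRED = ["OPENAI_API_KEY"]
DEFAULTS = {"MODEL_NAME": "gpt-4o"}

def check_env(env: dict) -> dict:
    """Return resolved settings, raising if a required key is missing or empty."""
    missing = [key for key in REQUIRED if not env.get(key)]
    if missing:
        raise RuntimeError(f"missing required variables: {', '.join(missing)}")
    resolved = {key: env[key] for key in REQUIRED}
    for key, default in DEFAULTS.items():
        resolved[key] = env.get(key, default)
    return resolved

settings = check_env({"OPENAI_API_KEY": "sk-example"})
# settings == {"OPENAI_API_KEY": "sk-example", "MODEL_NAME": "gpt-4o"}
```

In practice you would pass `dict(os.environ)` to `check_env` before invoking Docker Compose.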
Default configuration for Neo4j:

- Username: `neo4j`
- Password: `demodemo`
- URI: `bolt://neo4j:7687` (within the Docker network)
- Memory settings optimized for development
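Note that the hostname `neo4j` in that URI is the Compose service name, so it only resolves inside the Docker network; from the host machine the published port is typically reached via `localhost` instead (an assumption — check the port mappings in `docker-compose.yml`). A small stdlib sketch for splitting the bolt URI into connection parameters:

```python
from urllib.parse import urlsplit

def bolt_target(uri: str) -> tuple[str, int]:
    """Split a bolt:// URI into (host, port); 7687 is the bolt default."""
    parts = urlsplit(uri)
    return parts.hostname, parts.port or 7687

# Inside the Docker network, other services resolve the Compose service name:
host, port = bolt_target("bolt://neo4j:7687")
# host == "neo4j", port == 7687
```

The resulting `(host, port)` pair can be handed to a Neo4j driver or to a simple reachability check.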
Alternatively, you can pass environment variables directly on the command line:

```bash
OPENAI_API_KEY=your_key MODEL_NAME=gpt-4o docker compose up
```
- Configure Cursor to connect to Graphiti:

  ```json
  {
    "mcpServers": {
      "Graphiti": {
        "url": "http://localhost:8000/sse"
      }
    }
  }
  ```
- Add the Graphiti rules to Cursor's User Rules (see `graphiti_cursor_rules.md`)
- Start an agent session in Cursor
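If Cursor rejects the configuration, a quick way to rule out a syntax problem is to parse the snippet with the stdlib `json` module. This only validates the JSON shown above; it is not an API of the Graphiti server.

```python
import json

# The Cursor MCP configuration from this README: an "mcpServers" object
# keyed by server name, each entry carrying the SSE endpoint URL.
RAW = """
{
  "mcpServers": {
    "Graphiti": {
      "url": "http://localhost:8000/sse"
    }
  }
}
"""

config = json.loads(RAW)  # raises json.JSONDecodeError on malformed input
url = config["mcpServers"]["Graphiti"]["url"]
# url == "http://localhost:8000/sse"
```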
The server consists of two main components:
- Neo4j database for graph storage
- Graphiti MCP server for API and LLM operations
Contributions are welcome! Please feel free to submit a Pull Request.
This project is licensed under the MIT License - see the LICENSE file for details.
- Neo4j team for the amazing graph database
- OpenAI for their powerful LLM models
- MCP community for the protocol specification