https://github.com/redplanethq/core
Your personal plug and play memory layer for LLMs
- Host: GitHub
- URL: https://github.com/redplanethq/core
- Owner: RedPlanetHQ
- License: other
- Created: 2025-05-27T05:47:44.000Z (4 months ago)
- Default Branch: main
- Last Pushed: 2025-07-15T06:34:47.000Z (3 months ago)
- Last Synced: 2025-07-15T12:23:32.865Z (3 months ago)
- Topics: neo4j-graph, nodejs, postgresql, prisma, redis, remix, tailwind-css, typescript
- Language: TypeScript
- Homepage: https://core.heysol.ai/
- Size: 1.34 MB
- Stars: 420
- Watchers: 2
- Forks: 27
- Open Issues: 6
Metadata Files:
- Readme: README.md
- License: LICENSE
README
## 🧠 C.O.R.E.
**Contextual Observation & Recall Engine**
C.O.R.E. is a portable memory graph built from your LLM interactions and personal data, making all your context and workflow history accessible to any AI tool, like a digital brain. This eliminates the need for repeated context sharing. The aim is to provide:
- **Unified, Portable Memory**: Add and recall context seamlessly, and connect your memory across apps like Claude, Cursor, Windsurf, and more.
- **Relational, Not Just Flat Facts**: CORE organizes your knowledge by storing both facts and the relationships between them, for a deeper, richer memory, like a real brain.
- **User-Owned**: You decide what to keep, update, or delete, share your memory across the tools you want, and stay free from vendor lock-in.

## 🎥 Demo Video
[Check C.O.R.E Demo](https://youtu.be/iANZ32dnK60)
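The "relational, not just flat facts" idea above can be pictured as a tiny labeled graph: facts become nodes, and relationships become edges between them. The sketch below is a toy illustration of that model, not CORE's actual schema; all type and method names are invented for the example.

```typescript
// Toy relational memory: facts are nodes, relationships are labeled edges.
// Illustrative only – not CORE's real data model.
type NodeId = string;

interface Edge {
  from: NodeId;
  label: string;
  to: NodeId;
}

class MemoryGraph {
  private nodes = new Set<NodeId>();
  private edges: Edge[] = [];

  // Store a fact together with the relationship it expresses.
  addFact(from: NodeId, label: string, to: NodeId): void {
    this.nodes.add(from);
    this.nodes.add(to);
    this.edges.push({ from, label, to });
  }

  // Recall everything directly related to a node, in either direction.
  recall(id: NodeId): Edge[] {
    return this.edges.filter((e) => e.from === id || e.to === id);
  }
}

const memory = new MemoryGraph();
memory.addFact("user", "plays", "badminton");
memory.addFact("user", "prefers", "TypeScript");
memory.addFact("badminton", "is_a", "sport");

console.log(memory.recall("badminton").length); // 2
```

Because relationships are first-class, recalling "badminton" surfaces both the user's preference and the fact that it is a sport, which a flat key-value store would keep disconnected.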
## 🧩 Key Features
- **Memory Graph**: Visualise how your facts and preferences link together
- **Chat with Memory**: Ask questions about your memory for instant insights and understanding
- **Plug and Play**: Instantly use CORE memory in apps like Cursor and Claude

## ☁️ C.O.R.E. Cloud Setup
1. Sign up at [Core Cloud](https://core.heysol.ai) and start building your memory graph.
2. Add the text you want to save to memory. Once you click the `+ Add` button, your memory graph will be generated.
3. [Connect Core Memory MCP with Cursor](#connecting-core-mcp-with-cursor)

## 💻 C.O.R.E. Local Setup
#### Prerequisites
1. Docker
2. OpenAI API key

> **Note:** We are actively working to improve support for Llama models. At the moment, C.O.R.E. does not produce optimal results with Llama-based models, but we are making progress toward better compatibility and output in the near future.
#### Run C.O.R.E. locally

1. **Copy Environment Variables**
Copy the example environment file to `.env`:
```bash
cp .env.example .env
```

2. **Start the Application**
Use Docker Compose to start all required services:
```bash
docker-compose up
```

3. **Access the App**
Once the containers are running, open your browser and go to [http://localhost:3000](http://localhost:3000).
4. **Create Account with Magic Link**
- To create an account, click the `Continue with email` button
- Enter your email and click the `Send a Magic Link` button
- Copy the magic link from the terminal logs and open it in your browser
5. **Create Your Private Space & Add Data**
- In the dashboard, go to the top-right section and type a message, e.g. `I love playing badminton`, then click `+ Add`.
- Your memory is queued for processing; you can monitor its status in the `Logs` section.
- Once processing is complete, nodes will be added to your private knowledge graph and visible in the dashboard.
- You can later choose to connect this memory to other tools or keep it private.

6. **Search Your Memory**
- Use the dashboard's search feature to query your ingested data within your private space.
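The add → queue → process → search flow in steps 5 and 6 can be sketched as a small in-memory model. This is a toy illustration of the lifecycle described above; the class, statuses, and method names are invented for the example and are not CORE's implementation.

```typescript
// Toy model of the ingestion lifecycle: entries start queued,
// become searchable only after processing. Illustrative only.
type Status = "queued" | "processed";

interface MemoryEntry {
  text: string;
  status: Status;
}

class MemorySpace {
  private entries: MemoryEntry[] = [];

  // Step 5: adding text queues it for processing.
  add(text: string): MemoryEntry {
    const entry: MemoryEntry = { text, status: "queued" };
    this.entries.push(entry);
    return entry;
  }

  // Processing promotes queued entries into the searchable graph.
  processQueue(): void {
    for (const e of this.entries) e.status = "processed";
  }

  // Step 6: search only returns processed entries.
  search(term: string): MemoryEntry[] {
    return this.entries.filter(
      (e) =>
        e.status === "processed" &&
        e.text.toLowerCase().includes(term.toLowerCase())
    );
  }
}

const space = new MemorySpace();
space.add("I love playing badminton");
console.log(space.search("badminton").length); // 0 – still queued
space.processQueue();
console.log(space.search("badminton").length); // 1
```

This mirrors why a freshly added memory may not show up in search immediately: it sits in the `Logs` queue until processing completes.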
## Connecting CORE MCP with Cursor
1. Open the CORE dashboard and navigate to the API section to generate a new API token.
2. In Cursor, go to: Settings → Tools & Integrations → New MCP Server.
3. Add the CORE MCP server using the configuration format below. Be sure to replace the `API_TOKEN` value with the token you generated in step 1.

MCP configuration to add in Cursor:
```json
{
"mcpServers": {
"memory": {
"command": "npx",
"args": ["-y", "@redplanethq/core-mcp"],
"env": {
"API_TOKEN": "YOUR_API_TOKEN_HERE",
"API_BASE_URL": "https://core.heysol.ai",
"SOURCE": "cursor"
}
}
}
}
```

4. Go to Settings → User Rules → New Rule and add the rule below to ensure all your chat interactions are stored in CORE memory:
```
After every interaction, update the memory with the user's query and the assistant's
response to core-memory mcp. sessionId should be the uuid of the conversation
```

## Documentation
Explore our documentation to get the most out of CORE:
- [Basic Concepts](https://docs.heysol.ai/core/overview)
- [API Reference](https://docs.heysol.ai/core/local-setup)
- [Connect Core Memory MCP with Cursor](#connecting-core-mcp-with-cursor)

## 🧑‍💻 Support
Have questions or feedback? We're here to help:
- Discord: [Join core-support channel](https://discord.gg/YGUZcvDjUa)
- Documentation: [docs.heysol.ai/core](https://docs.heysol.ai/core/overview)
- Email: manik@poozle.dev

## Usage Guidelines
**Store:**
- Conversation history
- User preferences
- Task context
- Reference materials

**Don't Store:**
- Sensitive data (PII)
- Credentials
- System logs
- Temporary data

## 👥 Contributors