{"id":29019683,"url":"https://github.com/redis-developer/langgraph-redis","last_synced_at":"2026-02-03T05:26:08.415Z","repository":{"id":276873308,"uuid":"911722284","full_name":"redis-developer/langgraph-redis","owner":"redis-developer","description":"Redis integrations for LangGraph","archived":false,"fork":false,"pushed_at":"2025-06-18T20:58:10.000Z","size":1004,"stargazers_count":74,"open_issues_count":1,"forks_count":12,"subscribers_count":7,"default_branch":"main","last_synced_at":"2025-06-18T21:38:47.827Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":null,"language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/redis-developer.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null}},"created_at":"2025-01-03T17:31:03.000Z","updated_at":"2025-06-18T20:58:14.000Z","dependencies_parsed_at":"2025-03-05T01:25:44.208Z","dependency_job_id":"344ff2f9-2323-421e-abf6-4f1f882c8708","html_url":"https://github.com/redis-developer/langgraph-redis","commit_stats":null,"previous_names":["redis-developer/langgraph-redis"],"tags_count":6,"template":false,"template_full_name":null,"purl":"pkg:github/redis-developer/langgraph-redis","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/redis-developer%2Flanggraph-redis","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/redis-developer%2Flanggraph-redis/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/redis-developer%2Flanggraph-redis/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHu
b/repositories/redis-developer%2Flanggraph-redis/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/redis-developer","download_url":"https://codeload.github.com/redis-developer/langgraph-redis/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/redis-developer%2Flanggraph-redis/sbom","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":261903046,"owners_count":23227932,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2025-06-26T01:00:49.089Z","updated_at":"2026-02-03T05:26:08.409Z","avatar_url":"https://github.com/redis-developer.png","language":"Python","readme":"# LangGraph Redis\n\n[![PyPI version](https://badge.fury.io/py/langgraph-checkpoint-redis.svg)](https://badge.fury.io/py/langgraph-checkpoint-redis)\n[![Python versions](https://img.shields.io/pypi/pyversions/langgraph-checkpoint-redis.svg)](https://pypi.org/project/langgraph-checkpoint-redis/)\n[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)\n[![Tests](https://github.com/redis-developer/langgraph-redis/actions/workflows/test.yml/badge.svg)](https://github.com/redis-developer/langgraph-redis/actions/workflows/test.yml)\n[![Coverage](https://img.shields.io/endpoint?url=https://gist.githubusercontent.com/bsbodden/4b5aae70fef2c9606648bce5d010e129/raw/langgraph-redis-coverage.json)](https://github.com/redis-developer/langgraph-redis/actions/workflows/coverage-gist.yml)\n[![Code style: 
black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)\n[![Imports: isort](https://img.shields.io/badge/%20imports-isort-%231674b1?style=flat\u0026labelColor=ef8336)](https://pycqa.github.io/isort/)\n[![Checked with mypy](https://img.shields.io/badge/mypy-checked-blue)](http://mypy-lang.org/)\n[![Ruff](https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/astral-sh/ruff/main/assets/badge/v2.json)](https://github.com/astral-sh/ruff)\n[![Downloads](https://static.pepy.tech/badge/langgraph-checkpoint-redis)](https://pepy.tech/project/langgraph-checkpoint-redis)\n[![Redis](https://img.shields.io/badge/Redis-8.0%2B-DC382D?logo=redis\u0026logoColor=white)](https://redis.io)\n\nThis repository contains Redis implementations for LangGraph, providing both Checkpoint Savers and Stores functionality.\n\n## Overview\n\nThe project consists of three main components:\n\n1. **Redis Checkpoint Savers**: Implementations for storing and managing checkpoints using Redis\n2. **Redis Stores**: Redis-backed key-value stores with optional vector search capabilities\n3. **Redis Middleware**: LangChain agent middleware for semantic caching, tool caching, and conversation memory\n\n## Dependencies\n\n### Python Dependencies\n\nThe project requires the following main Python dependencies:\n\n- `redis\u003e=5.2.1`\n- `redisvl\u003e=0.5.1`\n- `langgraph-checkpoint\u003e=2.0.24`\n\n### Redis Modules Requirements\n\n**IMPORTANT:** This library requires Redis with the following modules:\n\n- **RedisJSON** - For storing and manipulating JSON data\n- **RediSearch** - For search and indexing capabilities\n\n#### Redis 8.0+\n\nIf you're using Redis 8.0 or higher, both RedisJSON and RediSearch modules are included by default as part of the core\nRedis distribution. 
No additional installation is required.\n\n#### Redis \u003c 8.0\n\nIf you're using a Redis version lower than 8.0, you'll need to ensure these modules are installed:\n\n- Use [Redis Stack](https://redis.io/docs/stack/), which bundles Redis with these modules\n- Or install the modules separately in your Redis instance\n\nFailure to have these modules available will result in errors during index creation and checkpoint operations.\n\n### Azure Managed Redis / Azure Cache for Redis / Redis Enterprise Configuration\n\nIf you're using **Azure Managed Redis**, **Azure Cache for Redis** (especially Enterprise tier) or **Redis Enterprise**, there are important configuration considerations:\n\n#### Client Configuration\n\nAzure Managed Redis, Azure Cache for Redis and Redis Enterprise use a **proxy layer** that makes the cluster appear as a single endpoint. This requires using a **standard Redis client**, not a cluster-aware client:\n\n```python\nfrom redis import Redis\nfrom langgraph.checkpoint.redis import RedisSaver\n\n# ✅ CORRECT: Use standard Redis client for Azure/Enterprise\nclient = Redis(\n    host=\"your-cache.redis.cache.windows.net\",  # or your Redis Enterprise endpoint\n    port=6379,  # or 10000 for Azure Managed Redis / Azure Enterprise with TLS\n    password=\"your-access-key\",\n    ssl=True,  # Azure/Enterprise typically requires SSL\n    ssl_cert_reqs=\"required\",  # or \"none\" for self-signed certs\n    decode_responses=False  # RedisSaver expects bytes\n)\n\n# Pass the configured client to RedisSaver\nsaver = RedisSaver(redis_client=client)\nsaver.setup()\n\n# ❌ WRONG: Don't use RedisCluster client with Azure/Enterprise\n# from redis.cluster import RedisCluster\n# cluster_client = RedisCluster(...)  
# This will fail with proxy-based deployments\n```\n\n#### Why This Matters\n\n- **Proxy Architecture**: Azure Managed Redis, Azure Cache for Redis and Redis Enterprise use a proxy layer that handles cluster operations internally\n- **Automatic Detection**: RedisSaver will correctly detect this as non-cluster mode when using the standard client\n- **No Cross-Slot Errors**: The proxy handles key distribution, avoiding cross-slot errors\n\n#### Azure Specific Settings\n\nFor Azure Managed Redis \u0026 Azure Cache for Redis Enterprise tier:\n\n- **Port**: Use port `10000` with TLS, or `6379` for standard\n- **Modules**: RediSearch and RedisJSON need to be selected at creation\n- **SSL/TLS**: Always enabled, minimum TLS 1.2\n\nExample for Azure Managed Redis, Azure Cache for Redis Enterprise:\n\n```python\nclient = Redis(\n    host=\"your-host-endpoint\",\n    port=10000,  # Enterprise TLS port\n    password=\"your-access-key\",\n    ssl=True,\n    ssl_cert_reqs=\"required\",\n    decode_responses=False\n)\n```\n\n## Installation\n\nInstall the library using pip:\n\n```bash\npip install langgraph-checkpoint-redis\n```\n\n## Redis Checkpoint Savers\n\n### Important Notes\n\n\u003e [!IMPORTANT]\n\u003e When using Redis checkpointers for the first time, make sure to call `.setup()` method on them to create required\n\u003e indices. 
See examples below.\n\n### Standard Implementation\n\n```python\nfrom langgraph.checkpoint.redis import RedisSaver\n\nwrite_config = {\"configurable\": {\"thread_id\": \"1\", \"checkpoint_ns\": \"\"}}\nread_config = {\"configurable\": {\"thread_id\": \"1\"}}\n\nwith RedisSaver.from_conn_string(\"redis://localhost:6379\") as checkpointer:\n    # Call setup to initialize indices\n    checkpointer.setup()\n    checkpoint = {\n        \"v\": 1,\n        \"ts\": \"2024-07-31T20:14:19.804150+00:00\",\n        \"id\": \"1ef4f797-8335-6428-8001-8a1503f9b875\",\n        \"channel_values\": {\n            \"my_key\": \"meow\",\n            \"node\": \"node\"\n        },\n        \"channel_versions\": {\n            \"__start__\": 2,\n            \"my_key\": 3,\n            \"start:node\": 3,\n            \"node\": 3\n        },\n        \"versions_seen\": {\n            \"__input__\": {},\n            \"__start__\": {\n                \"__start__\": 1\n            },\n            \"node\": {\n                \"start:node\": 2\n            }\n        },\n        \"pending_sends\": [],\n    }\n\n    # Store checkpoint\n    checkpointer.put(write_config, checkpoint, {}, {})\n\n    # Retrieve checkpoint\n    loaded_checkpoint = checkpointer.get(read_config)\n\n    # List all checkpoints\n    checkpoints = list(checkpointer.list(read_config))\n```\n\n### Async Implementation\n\n```python\nfrom langgraph.checkpoint.redis.aio import AsyncRedisSaver\n\n\nasync def main():\n    write_config = {\"configurable\": {\"thread_id\": \"1\", \"checkpoint_ns\": \"\"}}\n    read_config = {\"configurable\": {\"thread_id\": \"1\"}}\n\n    async with AsyncRedisSaver.from_conn_string(\"redis://localhost:6379\") as checkpointer:\n        # Call setup to initialize indices\n        await checkpointer.asetup()\n        checkpoint = {\n            \"v\": 1,\n            \"ts\": \"2024-07-31T20:14:19.804150+00:00\",\n            \"id\": \"1ef4f797-8335-6428-8001-8a1503f9b875\",\n            
\"channel_values\": {\n                \"my_key\": \"meow\",\n                \"node\": \"node\"\n            },\n            \"channel_versions\": {\n                \"__start__\": 2,\n                \"my_key\": 3,\n                \"start:node\": 3,\n                \"node\": 3\n            },\n            \"versions_seen\": {\n                \"__input__\": {},\n                \"__start__\": {\n                    \"__start__\": 1\n                },\n                \"node\": {\n                    \"start:node\": 2\n                }\n            },\n            \"pending_sends\": [],\n        }\n\n        # Store checkpoint\n        await checkpointer.aput(write_config, checkpoint, {}, {})\n\n        # Retrieve checkpoint\n        loaded_checkpoint = await checkpointer.aget(read_config)\n\n        # List all checkpoints\n        checkpoints = [c async for c in checkpointer.alist(read_config)]\n\n\n# Run the async main function\nimport asyncio\n\nasyncio.run(main())\n```\n\n### Shallow Implementations\n\nShallow Redis checkpoint savers store only the latest checkpoint in Redis. These implementations are useful when\nretaining a complete checkpoint history is unnecessary.\n\n```python\nfrom langgraph.checkpoint.redis.shallow import ShallowRedisSaver\n\n# For async version: from langgraph.checkpoint.redis.ashallow import AsyncShallowRedisSaver\n\nwrite_config = {\"configurable\": {\"thread_id\": \"1\", \"checkpoint_ns\": \"\"}}\nread_config = {\"configurable\": {\"thread_id\": \"1\"}}\n\nwith ShallowRedisSaver.from_conn_string(\"redis://localhost:6379\") as checkpointer:\n    checkpointer.setup()\n    # ... 
rest of the implementation follows similar pattern\n```\n\n## Redis Checkpoint TTL Support\n\nBoth Redis checkpoint savers and stores support automatic expiration using Redis TTL:\n\n```python\n# Configure automatic expiration\nttl_config = {\n    \"default_ttl\": 60,  # Expire checkpoints after 60 minutes\n    \"refresh_on_read\": True,  # Reset expiration time when reading checkpoints\n}\n\nwith RedisSaver.from_conn_string(\"redis://localhost:6379\", ttl=ttl_config) as saver:\n    saver.setup()\n    # Checkpoints will expire after 60 minutes of inactivity\n```\n\nWhen no TTL is configured, checkpoints are persistent (never expire automatically).\n\n### Removing TTL (Pinning Threads)\n\nYou can make specific checkpoints persistent by removing their TTL. This is useful for \"pinning\" important threads that should never expire:\n\n```python\nfrom langgraph.checkpoint.redis import RedisSaver\n\n# Create saver with default TTL\nwith RedisSaver.from_conn_string(\"redis://localhost:6379\", ttl={\"default_ttl\": 60}) as saver:\n    saver.setup()\n\n    # Save a checkpoint (checkpoint and metadata as in the examples above)\n    config = {\"configurable\": {\"thread_id\": \"important-thread\", \"checkpoint_ns\": \"\"}}\n    saved_config = saver.put(config, checkpoint, metadata, {})\n\n    # Remove TTL from the checkpoint to make it persistent\n    checkpoint_id = saved_config[\"configurable\"][\"checkpoint_id\"]\n    checkpoint_key = f\"checkpoint:important-thread:__empty__:{checkpoint_id}\"\n    saver._apply_ttl_to_keys(checkpoint_key, ttl_minutes=-1)\n\n    # The checkpoint is now persistent and won't expire\n```\n\nThis makes it easy to manage storage and ensure ephemeral data is automatically cleaned up while keeping important data persistent.\n\n## Redis Stores\n\nRedis Stores provide a persistent key-value store with optional vector search capabilities.\n\n### Synchronous Implementation\n\n```python\nfrom langgraph.store.redis import RedisStore\n\n# Basic 
usage\nwith RedisStore.from_conn_string(\"redis://localhost:6379\") as store:\n    store.setup()\n    # Use the store...\n\n# With vector search configuration\nindex_config = {\n    \"dims\": 1536,  # Vector dimensions\n    \"distance_type\": \"cosine\",  # Distance metric\n    \"fields\": [\"text\"],  # Fields to index\n}\n\n# With TTL configuration\nttl_config = {\n    \"default_ttl\": 60,  # Default TTL in minutes\n    \"refresh_on_read\": True,  # Refresh TTL when store entries are read\n}\n\nwith RedisStore.from_conn_string(\n        \"redis://localhost:6379\",\n        index=index_config,\n        ttl=ttl_config\n) as store:\n    store.setup()\n    # Use the store with vector search and TTL capabilities...\n```\n\n### Async Implementation\n\n```python\nimport asyncio\n\nfrom langgraph.store.redis.aio import AsyncRedisStore\n\n\nasync def main():\n    # TTL also works with async implementations\n    ttl_config = {\n        \"default_ttl\": 60,  # Default TTL in minutes\n        \"refresh_on_read\": True,  # Refresh TTL when store entries are read\n    }\n\n    async with AsyncRedisStore.from_conn_string(\n            \"redis://localhost:6379\",\n            ttl=ttl_config\n    ) as store:\n        await store.setup()\n        # Use the store asynchronously...\n\n\nasyncio.run(main())\n```\n\n## Redis Middleware for LangChain Agents\n\nRedis middleware provides semantic caching, tool result caching, conversation memory, and semantic routing for LangChain agents. 
These middleware components integrate directly with `langchain.agents.create_agent()`.\n\n### Key Features\n\n- **SemanticCacheMiddleware**: Cache LLM responses by semantic similarity, reducing costs and latency\n- **ToolResultCacheMiddleware**: Cache expensive tool executions (API calls, computations)\n- **ConversationMemoryMiddleware**: Inject semantically relevant past messages into context\n- **SemanticRouterMiddleware**: Route requests based on semantic matching\n\n### Quick Start\n\n```python\nimport ast\nimport operator as op\n\nfrom langchain.agents import create_agent\nfrom langchain_core.messages import HumanMessage\nfrom langchain_core.tools import tool\nfrom langgraph.middleware.redis import (\n    SemanticCacheMiddleware,\n    SemanticCacheConfig,\n    ToolResultCacheMiddleware,\n    ToolCacheConfig,\n)\n\n# Safe math expression evaluator (no arbitrary code execution)\nSAFE_OPERATORS = {\n    ast.Add: op.add, ast.Sub: op.sub, ast.Mult: op.mul,\n    ast.Div: op.truediv, ast.Pow: op.pow, ast.USub: op.neg,\n}\n\ndef _eval_expr(node):\n    if isinstance(node, ast.Constant):\n        return node.value\n    elif isinstance(node, ast.BinOp) and type(node.op) in SAFE_OPERATORS:\n        return SAFE_OPERATORS[type(node.op)](_eval_expr(node.left), _eval_expr(node.right))\n    elif isinstance(node, ast.UnaryOp) and type(node.op) in SAFE_OPERATORS:\n        return SAFE_OPERATORS[type(node.op)](_eval_expr(node.operand))\n    raise ValueError(f\"Unsupported expression\")\n\ndef safe_eval(expr: str) -\u003e float:\n    return _eval_expr(ast.parse(expr, mode='eval').body)\n\n# Define tools with cacheability metadata\n@tool\ndef calculate(expression: str) -\u003e str:\n    \"\"\"Evaluate a math expression.\"\"\"\n    return str(safe_eval(expression))\n\ncalculate.metadata = {\"cacheable\": True}  # Deterministic - safe to cache\n\n@tool\ndef get_stock_price(symbol: str) -\u003e str:\n    \"\"\"Get current stock price.\"\"\"\n    return 
fetch_price(symbol)\n\nget_stock_price.metadata = {\"cacheable\": False}  # Temporal - don't cache\n\n# Create middleware\nsemantic_cache = SemanticCacheMiddleware(\n    SemanticCacheConfig(\n        redis_url=\"redis://localhost:6379\",\n        name=\"llm_cache\",\n        distance_threshold=0.15,\n        ttl_seconds=3600,\n        deterministic_tools=[\"calculate\"],  # Safe to cache after these tools\n    )\n)\n\ntool_cache = ToolResultCacheMiddleware(\n    ToolCacheConfig(\n        redis_url=\"redis://localhost:6379\",\n        name=\"tool_cache\",\n        ttl_seconds=1800,\n    )\n)\n\n# Create agent with middleware\nagent = create_agent(\n    model=\"gpt-4o-mini\",\n    tools=[calculate, get_stock_price],\n    middleware=[semantic_cache, tool_cache],\n)\n\n# Use async invocation (middleware is async-first; run this inside an async function)\nresult = await agent.ainvoke({\"messages\": [HumanMessage(content=\"Calculate 25 * 4\")]})\n```\n\n### Tool Cacheability\n\nControl which tools are cached using LangChain's native metadata:\n\n```python\n# Option 1: Set metadata after @tool decoration\n@tool\ndef search(query: str) -\u003e str:\n    \"\"\"Search the web.\"\"\"\n    return web_search(query)\n\nsearch.metadata = {\"cacheable\": True}\n\n# Option 2: Use StructuredTool with metadata\nfrom langchain_core.tools import StructuredTool\n\nget_weather = StructuredTool.from_function(\n    func=fetch_weather,\n    name=\"get_weather\",\n    description=\"Get current weather\",\n    metadata={\"cacheable\": False},  # Real-time data\n)\n```\n\n### Middleware Composition\n\nCombine multiple middleware using `MiddlewareStack` or factory functions:\n\n```python\nfrom langgraph.middleware.redis import MiddlewareStack, from_configs\n\n# Option 1: Create stack directly\nstack = MiddlewareStack([\n    SemanticCacheMiddleware(SemanticCacheConfig(redis_url=\"redis://localhost:6379\", name=\"llm_cache\")),\n    ToolResultCacheMiddleware(ToolCacheConfig(redis_url=\"redis://localhost:6379\", 
name=\"tool_cache\")),\n])\n\n# Option 2: Use from_configs factory (shares Redis connection)\nstack = from_configs(\n    configs=[\n        SemanticCacheConfig(name=\"llm_cache\", ttl_seconds=3600),\n        ToolCacheConfig(name=\"tool_cache\", ttl_seconds=1800),\n    ],\n    redis_url=\"redis://localhost:6379\",\n)\n\nagent = create_agent(model=\"gpt-4o-mini\", tools=tools, middleware=[stack])\n```\n\n### Connection Sharing with Checkpointer\n\nShare Redis connections between middleware and checkpointer:\n\n```python\nfrom langgraph.checkpoint.redis.aio import AsyncRedisSaver\nfrom langgraph.middleware.redis import IntegratedRedisMiddleware\n\n# Create checkpointer\ncheckpointer = AsyncRedisSaver(redis_url=\"redis://localhost:6379\")\nawait checkpointer.asetup()\n\n# Create middleware that shares the connection\nmiddleware = IntegratedRedisMiddleware.from_saver(\n    checkpointer,\n    configs=[\n        SemanticCacheConfig(name=\"llm_cache\"),\n        ToolCacheConfig(name=\"tool_cache\"),\n    ],\n)\n\nagent = create_agent(\n    model=\"gpt-4o-mini\",\n    tools=tools,\n    checkpointer=checkpointer,\n    middleware=[middleware],\n)\n```\n\n### Example Notebooks\n\nSee the `examples/middleware/` directory for detailed notebooks:\n\n- `middleware_semantic_cache.ipynb`: LLM response caching with semantic matching\n- `middleware_tool_caching.ipynb`: Tool result caching with metadata-based control\n- `middleware_conversation_memory.ipynb`: Semantic conversation history retrieval\n- `middleware_composition.ipynb`: Combining middleware with checkpointers\n\n## Examples\n\nThe `examples` directory contains Jupyter notebooks demonstrating the usage of Redis with LangGraph:\n\n### Checkpoint and Store Examples\n\n- `persistence_redis.ipynb`: Demonstrates the usage of Redis checkpoint savers with LangGraph\n- `create-react-agent-memory.ipynb`: Shows how to create an agent with persistent memory using Redis\n- `cross-thread-persistence.ipynb`: Demonstrates cross-thread 
persistence capabilities\n- `persistence-functional.ipynb`: Shows functional persistence patterns with Redis\n\n### Middleware Examples (`examples/middleware/`)\n\n- `middleware_semantic_cache.ipynb`: LLM response caching with semantic similarity matching\n- `middleware_tool_caching.ipynb`: Tool result caching with metadata-based cacheability control\n- `middleware_conversation_memory.ipynb`: Semantic conversation history and context injection\n- `middleware_composition.ipynb`: Combining multiple middleware with shared Redis connections\n\n### Running Example Notebooks\n\nTo run the example notebooks with Docker:\n\n1. Navigate to the examples directory:\n\n   ```bash\n   cd examples\n   ```\n\n2. Start the Docker containers:\n\n   ```bash\n   docker compose up\n   ```\n\n3. Open the URL shown in the console (typically \u003chttp://127.0.0.1:8888/tree\u003e) in your browser to access Jupyter.\n\n4. When finished, stop the containers:\n\n   ```bash\n   docker compose down\n   ```\n\n## Implementation Details\n\n### Redis Module Usage\n\nThis implementation relies on specific Redis modules:\n\n- **RedisJSON**: Used for storing structured JSON data as native Redis objects\n- **RediSearch**: Used for creating and querying indices on JSON data\n\n### Indexing\n\nThe Redis implementation creates these main indices using RediSearch:\n\n1. **Checkpoints Index**: Stores checkpoint metadata and versioning\n2. **Channel Values Index**: Stores channel-specific data\n3. **Writes Index**: Tracks pending writes and intermediate states\n\nFor Redis Stores with vector search:\n\n1. **Store Index**: Main key-value store\n2. 
**Vector Index**: Optional vector embeddings for similarity search\n\n### TTL Implementation\n\nBoth Redis checkpoint savers and stores leverage Redis's native key expiration:\n\n- **Native Redis TTL**: Uses Redis's built-in `EXPIRE` command for setting TTL\n- **TTL Removal**: Uses Redis's `PERSIST` command to remove TTL (with `ttl_minutes=-1`)\n- **Automatic Cleanup**: Redis automatically removes expired keys\n- **Configurable Default TTL**: Set a default TTL for all keys in minutes\n- **TTL Refresh on Read**: Optionally refresh TTL when keys are accessed\n- **Applied to All Related Keys**: TTL is applied to all related keys (checkpoint, blobs, writes)\n- **Persistent by Default**: When no TTL is configured, keys are persistent (no expiration)\n\n## Contributing\n\nWe welcome contributions! Here's how you can help:\n\n### Development Setup\n\n1. Clone the repository:\n\n   ```bash\n   git clone https://github.com/redis-developer/langgraph-redis\n   cd langgraph-redis\n   ```\n\n2. Install dependencies:\n\n   ```bash\n   poetry install --all-extras\n   ```\n\n### Available Commands\n\nThe project includes several make commands for development:\n\n- **Testing**:\n\n  ```bash\n  make test           # Run all tests\n  make test-all       # Run all tests including API tests\n  ```\n\n- **Linting and Formatting**:\n\n  ```bash\n  make format        # Format all files with Black and isort\n  make lint          # Run formatting, type checking, and other linters\n  make check-types   # Run mypy type checking\n  ```\n\n- **Code Quality**:\n\n  ```bash\n  make test-coverage    # Run tests with coverage reporting\n  make coverage-report  # Generate coverage report without running tests\n  make coverage-html    # Generate HTML coverage report (opens in htmlcov/)\n  make find-dead-code   # Find unused code with vulture\n  ```\n\n- **Redis for Development/Testing**:\n\n  ```bash\n  make redis-start   # Start Redis Stack in Docker (includes RedisJSON and RediSearch modules)\n  
make redis-stop    # Stop Redis container\n  ```\n\n### Contribution Guidelines\n\n1. Create a new branch for your changes\n2. Write tests for new functionality\n3. Ensure all tests pass: `make test`\n4. Format your code: `make format`\n5. Run linting checks: `make lint`\n6. Submit a pull request with a clear description of your changes\n7. Follow [Conventional Commits](https://www.conventionalcommits.org/en/v1.0.0/) for commit messages\n\n## License\n\nThis project is licensed under the MIT License.\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fredis-developer%2Flanggraph-redis","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fredis-developer%2Flanggraph-redis","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fredis-developer%2Flanggraph-redis/lists"}