{"id":35416123,"url":"https://github.com/vadikko2/python-cqrs","last_synced_at":"2026-02-20T13:03:15.429Z","repository":{"id":254454839,"uuid":"835891783","full_name":"vadikko2/python-cqrs","owner":"vadikko2","description":"Event-Driven Architecture Framework","archived":false,"fork":false,"pushed_at":"2026-02-18T12:34:50.000Z","size":793,"stargazers_count":42,"open_issues_count":8,"forks_count":5,"subscribers_count":2,"default_branch":"master","last_synced_at":"2026-02-18T15:55:00.531Z","etag":null,"topics":["cqrs-pattern","eda","event-driven","event-driven-architecture","event-sourcing","fastapi","faststream","mediator","outbox","outbox-pattern","python-saga-orchestration","python3","saga","saga-pattern"],"latest_commit_sha":null,"homepage":"https://mkdocs.python-cqrs.dev/","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/vadikko2.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null,"notice":null,"maintainers":null,"copyright":null,"agents":null,"dco":null,"cla":null}},"created_at":"2024-07-30T18:23:27.000Z","updated_at":"2026-02-18T05:01:26.000Z","dependencies_parsed_at":"2025-01-10T10:32:39.379Z","dependency_job_id":"862806ab-9330-4f92-af64-d76dcac0a150","html_url":"https://github.com/vadikko2/python-cqrs","commit_stats":null,"previous_names":["vadikko2/cqrs"],"tags_count":70,"template":false,"template_full_name":null,"purl":"pkg:github/vadikko2/python-cqrs","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/vadikko2%2Fpython-cqrs","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/G
itHub/repositories/vadikko2%2Fpython-cqrs/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/vadikko2%2Fpython-cqrs/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/vadikko2%2Fpython-cqrs/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/vadikko2","download_url":"https://codeload.github.com/vadikko2/python-cqrs/tar.gz/refs/heads/master","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/vadikko2%2Fpython-cqrs/sbom","scorecard":null,"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":286080680,"owners_count":29651975,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2026-02-20T09:27:29.698Z","status":"ssl_error","status_checked_at":"2026-02-20T09:26:12.373Z","response_time":59,"last_error":"SSL_connect returned=1 errno=0 peeraddr=140.82.121.6:443 state=error: unexpected eof while reading","robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":false,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["cqrs-pattern","eda","event-driven","event-driven-architecture","event-sourcing","fastapi","faststream","mediator","outbox","outbox-pattern","python-saga-orchestration","python3","saga","saga-pattern"],"created_at":"2026-01-02T15:25:00.411Z","updated_at":"2026-02-20T13:03:15.423Z","avatar_url":"https://github.com/vadikko2.png","language":"Python","readme":"\u003cdiv align=\"center\"\u003e\n\u003cdiv align=\"center\"\u003e\n  \u003cimg\n    
src=\"https://raw.githubusercontent.com/vadikko2/python-cqrs-mkdocs/master/docs/img.png\"\n    alt=\"Python CQRS\"\n    style=\"max-width: 80%; width: 800px; border-radius: 16px; box-shadow: 0 8px 32px rgba(0, 102, 204, 0.2); display: block; margin: 2rem auto;\"\n  \u003e\n\u003c/div\u003e\n  \u003ch1\u003ePython CQRS\u003c/h1\u003e\n  \u003ch3\u003eEvent-Driven Architecture Framework for Distributed Systems\u003c/h3\u003e\n  \u003cp\u003e\n    \u003ca href=\"https://pypi.org/project/python-cqrs/\"\u003e\n      \u003cimg src=\"https://img.shields.io/pypi/pyversions/python-cqrs?logo=python\u0026logoColor=white\" alt=\"Python Versions\"\u003e\n    \u003c/a\u003e\n    \u003ca href=\"https://pypi.org/project/python-cqrs/\"\u003e\n      \u003cimg src=\"https://img.shields.io/pypi/v/python-cqrs?label=pypi\u0026logo=pypi\" alt=\"PyPI version\"\u003e\n    \u003c/a\u003e\n    \u003ca href=\"https://pepy.tech/projects/python-cqrs\"\u003e\n      \u003cimg src=\"https://pepy.tech/badge/python-cqrs\" alt=\"Total downloads\"\u003e\n    \u003c/a\u003e\n    \u003ca href=\"https://pepy.tech/projects/python-cqrs\"\u003e\n      \u003cimg src=\"https://pepy.tech/badge/python-cqrs/month\" alt=\"Downloads per month\"\u003e\n    \u003c/a\u003e\n    \u003ca href=\"https://codecov.io/gh/vadikko2/python-cqrs\"\u003e\n      \u003cimg src=\"https://img.shields.io/codecov/c/github/vadikko2/python-cqrs?logo=codecov\u0026logoColor=white\" alt=\"Coverage\"\u003e\n    \u003c/a\u003e\n    \u003ca href=\"https://codspeed.io/vadikko2/python-cqrs?utm_source=badge\"\u003e\n      \u003cimg src=\"https://img.shields.io/endpoint?url=https://codspeed.io/badge.json\" alt=\"CodSpeed\"\u003e\n    \u003c/a\u003e\n    \u003ca href=\"https://mkdocs.python-cqrs.dev/\"\u003e\n      \u003cimg src=\"https://img.shields.io/badge/docs-mkdocs-blue?logo=readthedocs\" alt=\"Documentation\"\u003e\n    \u003c/a\u003e\n    \u003ca href=\"https://deepwiki.com/vadikko2/python-cqrs\"\u003e\n      \u003cimg 
src=\"https://deepwiki.com/badge.svg\" alt=\"Ask DeepWiki\"\u003e\n    \u003c/a\u003e\n  \u003c/p\u003e\n\u003c/div\u003e\n\n\u003e [!WARNING]\n\u003e **Breaking Changes in v5.0.0**\n\u003e\n\u003e Starting with version 5.0.0, Pydantic support will become optional. The default implementations of `Request`, `Response`, `DomainEvent`, and `NotificationEvent` will be migrated to dataclasses-based implementations.\n\n## Overview\n\nAn event-driven framework for building distributed systems in Python. It centers on CQRS (Command Query Responsibility Segregation) and extends into messaging, sagas, and reliable event delivery — so you can separate read and write flows, react to events from the bus, run distributed transactions with compensation, and publish events via Transaction Outbox. The result is clearer structure, better scalability, and easier evolution of the application.\n\nThis package is a fork of the [diator](https://github.com/akhundMurad/diator)\nproject ([documentation](https://akhundmurad.github.io/diator/)) with several enhancements, ordered by importance:\n\n**Core framework**\n\n1. Redesigned the event and request mapping mechanism to handlers;\n2. `EventMediator` for handling `Notification` and `ECST` events coming from the bus;\n3. `bootstrap` for easy setup;\n4. **Transaction Outbox**, ensuring that `Notification` and `ECST` events are sent to the broker;\n5. **Orchestrated Saga** pattern for distributed transactions with automatic compensation and recovery;\n6. `StreamingRequestMediator` and `StreamingRequestHandler` for streaming requests with real-time progress updates;\n7. **Chain of Responsibility** with `CORRequestHandler` for processing requests through multiple handlers in sequence;\n8. 
**Parallel event processing** with configurable concurrency limits.\n\n**Also**\n\n- **Typing:** Pydantic [v2.*](https://docs.pydantic.dev/2.8/) and `IRequest`/`IResponse` interfaces — use Pydantic-based, dataclass-based, or custom Request/Response implementations.\n- **Broker:** Kafka via [aiokafka](https://github.com/aio-libs/aiokafka).\n- **Integration:** Ready for integration with FastAPI and FastStream.\n- **Documentation:** Built-in Mermaid diagram generation (Sequence and Class diagrams).\n- **Protobuf:** Interface-level support for converting Notification events to Protobuf and back.\n\n## Request Handlers\n\nRequest handlers can be divided into two main types:\n\n### Command Handler\n\nA Command Handler executes the received command. Its logic may include, for example, modifying the state of\nthe domain model.\nAs a result of executing the command, an event may be produced to the broker.\n\u003e [!TIP]\n\u003e By default, a command handler does not return a result, though returning one is allowed.\n\n```python\nfrom cqrs.requests.request_handler import RequestHandler\nfrom cqrs.events.event import Event\n\nclass JoinMeetingCommandHandler(RequestHandler[JoinMeetingCommand, None]):\n\n    def __init__(self, meetings_api: MeetingAPIProtocol) -\u003e None:\n        self._meetings_api = meetings_api\n        self._events: list[Event] = []\n\n    @property\n    def events(self) -\u003e list[Event]:\n        return self._events\n\n    async def handle(self, request: JoinMeetingCommand) -\u003e None:\n        await self._meetings_api.join_user(request.user_id, request.meeting_id)\n```\n\nA complete example can be found in\nthe [documentation](https://github.com/vadikko2/cqrs/blob/master/examples/request_handler.py)\n\n### Query Handler\n\nA Query Handler returns a representation of the requested data, for example, from\nthe [read 
model](https://radekmaziarka.pl/2018/01/08/cqrs-third-step-simple-read-model/#simple-read-model---to-the-rescue).\n\u003e [!TIP]\n\u003e The read model can be constructed based on domain events produced by the `Command Handler`.\n\n```python\nfrom cqrs.requests.request_handler import RequestHandler\nfrom cqrs.events.event import Event\n\nclass ReadMeetingQueryHandler(RequestHandler[ReadMeetingQuery, ReadMeetingQueryResult]):\n\n    def __init__(self, meetings_api: MeetingAPIProtocol) -\u003e None:\n        self._meetings_api = meetings_api\n        self._events: list[Event] = []\n\n    @property\n    def events(self) -\u003e list[Event]:\n        return self._events\n\n    async def handle(self, request: ReadMeetingQuery) -\u003e ReadMeetingQueryResult:\n        link = await self._meetings_api.get_link(request.meeting_id)\n        return ReadMeetingQueryResult(link=link, meeting_id=request.meeting_id)\n```\n\nA complete example can be found in\nthe [documentation](https://github.com/vadikko2/cqrs/blob/master/examples/request_handler.py)\n\n### Streaming Request Handler\n\nA Streaming Request Handler processes requests incrementally and yields results as they become available.\nThis is particularly useful for processing large batches of items, file uploads, or any operation that benefits from\nreal-time progress updates.\n\n`StreamingRequestHandler` works with `StreamingRequestMediator`, which streams results to clients in real time.\n\n```python\nimport typing\nfrom cqrs.requests.request_handler import StreamingRequestHandler\nfrom cqrs.events.event import Event\n\nclass ProcessFilesCommandHandler(StreamingRequestHandler[ProcessFilesCommand, FileProcessedResult]):\n    def __init__(self):\n        self._events: list[Event] = []\n\n    @property\n    def events(self) -\u003e list[Event]:\n        return self._events.copy()\n\n    def clear_events(self) -\u003e None:\n        self._events.clear()\n\n    async def handle(self, request: 
ProcessFilesCommand) -\u003e typing.AsyncIterator[FileProcessedResult]:\n        for file_id in request.file_ids:\n            # Process file\n            result = FileProcessedResult(file_id=file_id, status=\"completed\", ...)\n            # Emit events\n            self._events.append(FileProcessedEvent(file_id=file_id, ...))\n            yield result\n```\n\nA complete example can be found in\nthe [documentation](https://github.com/vadikko2/cqrs/blob/master/examples/streaming_handler_parallel_events.py)\n\n### Chain of Responsibility Request Handler\n\nChain of Responsibility Request Handler implements the chain of responsibility pattern, allowing multiple handlers\nto process a request in sequence until one successfully handles it. This pattern is particularly useful when you have\nmultiple processing strategies or need to implement fallback mechanisms.\n\nEach handler in the chain decides whether to process the request or pass it to the next handler. The chain stops\nwhen a handler successfully processes the request or when all handlers have been exhausted.\n\n```python\nimport typing\nfrom cqrs.requests.cor_request_handler import CORRequestHandler\nfrom cqrs.events.event import Event\n\nclass CreditCardPaymentHandler(CORRequestHandler[ProcessPaymentCommand, PaymentResult]):\n    def __init__(self, payment_service: PaymentServiceProtocol) -\u003e None:\n        self._payment_service = payment_service\n        self._events: typing.List[Event] = []\n\n    @property\n    def events(self) -\u003e typing.List[Event]:\n        return self._events\n\n    async def handle(self, request: ProcessPaymentCommand) -\u003e PaymentResult | None:\n        if request.payment_method == \"credit_card\":\n            # Process credit card payment\n            result = await self._payment_service.process_credit_card(request)\n            self._events.append(PaymentProcessedEvent(...))\n            return PaymentResult(success=True, transaction_id=result.id)\n\n        # Pass to 
next handler\n        return await self.next(request)\n\nclass PayPalPaymentHandler(CORRequestHandler[ProcessPaymentCommand, PaymentResult]):\n    def __init__(self, paypal_service: PayPalServiceProtocol) -\u003e None:\n        self._paypal_service = paypal_service\n        self._events: typing.List[Event] = []\n\n    @property\n    def events(self) -\u003e typing.List[Event]:\n        return self._events\n\n    async def handle(self, request: ProcessPaymentCommand) -\u003e PaymentResult | None:\n        if request.payment_method == \"paypal\":\n            # Process PayPal payment\n            result = await self._paypal_service.process_payment(request)\n            return PaymentResult(success=True, transaction_id=result.id)\n\n        # Pass to next handler\n        return await self.next(request)\n\n# Chain registration\ndef payment_mapper(mapper: cqrs.RequestMap) -\u003e None:\n    mapper.bind(ProcessPaymentCommand, [\n        CreditCardPaymentHandler,\n        PayPalPaymentHandler,\n        DefaultPaymentHandler  # Fallback handler\n    ])\n```\n\nA complete example can be found in\nthe [documentation](https://github.com/vadikko2/cqrs/blob/master/examples/cor_request_handler.py)\n\n#### Mermaid Diagram Generation\n\nThe package includes built-in support for generating Mermaid diagrams from Chain of Responsibility handler chains.\n\n```python\nfrom cqrs.requests.mermaid import CoRMermaid\n\n# Create Mermaid generator from handler chain\nhandlers = [CreditCardHandler, PayPalHandler, DefaultHandler]\ngenerator = CoRMermaid(handlers)\n\n# Generate Sequence diagram showing execution flow\nsequence_diagram = generator.sequence()\n\n# Generate Class diagram showing type structure\nclass_diagram = generator.class_diagram()\n```\n\nComplete example: [CoR Mermaid Diagrams](https://github.com/vadikko2/cqrs/blob/master/examples/cor_mermaid.py)\n\n## Request and Response Types\n\nThe library supports both Pydantic-based (`PydanticRequest`/`PydanticResponse`, aliased as 
`Request`/`Response`) and Dataclass-based (`DCRequest`/`DCResponse`) implementations. You can also implement custom classes by implementing the `IRequest`/`IResponse` interfaces directly.\n\n```python\nimport dataclasses\n\n# Pydantic-based (default)\nclass CreateUserCommand(cqrs.Request):\n    username: str\n    email: str\n\nclass UserResponse(cqrs.Response):\n    user_id: str\n    username: str\n\n# Dataclass-based\n@dataclasses.dataclass\nclass CreateProductCommand(cqrs.DCRequest):\n    name: str\n    price: float\n\n@dataclasses.dataclass\nclass ProductResponse(cqrs.DCResponse):\n    product_id: str\n    name: str\n\n# Custom implementation\nclass CustomRequest(cqrs.IRequest):\n    def __init__(self, user_id: str, action: str):\n        self.user_id = user_id\n        self.action = action\n\n    def to_dict(self) -\u003e dict:\n        return {\"user_id\": self.user_id, \"action\": self.action}\n\n    @classmethod\n    def from_dict(cls, **kwargs) -\u003e \"CustomRequest\":\n        return cls(user_id=kwargs[\"user_id\"], action=kwargs[\"action\"])\n\nclass CustomResponse(cqrs.IResponse):\n    def __init__(self, result: str, status: int):\n        self.result = result\n        self.status = status\n\n    def to_dict(self) -\u003e dict:\n        return {\"result\": self.result, \"status\": self.status}\n\n    @classmethod\n    def from_dict(cls, **kwargs) -\u003e \"CustomResponse\":\n        return cls(result=kwargs[\"result\"], status=kwargs[\"status\"])\n```\n\nA complete example can be found in [request_response_types.py](https://github.com/vadikko2/cqrs/blob/master/examples/request_response_types.py)\n\n## Mapping\n\nTo bind commands, queries and events with specific handlers, you can use the registries `EventMap` and `RequestMap`.\n\n```python\nfrom cqrs import requests, events\n\nfrom app import commands, command_handlers\nfrom app import queries, query_handlers\nfrom app import events as event_models, event_handlers\n\n\ndef init_commands(mapper: 
requests.RequestMap) -\u003e None:\n    mapper.bind(commands.JoinMeetingCommand, command_handlers.JoinMeetingCommandHandler)\n\ndef init_queries(mapper: requests.RequestMap) -\u003e None:\n    mapper.bind(queries.ReadMeetingQuery, query_handlers.ReadMeetingQueryHandler)\n\ndef init_events(mapper: events.EventMap) -\u003e None:\n    mapper.bind(events.NotificationEvent[event_models.NotificationMeetingRoomClosed], event_handlers.MeetingRoomClosedNotificationHandler)\n    mapper.bind(events.NotificationEvent[event_models.ECSTMeetingRoomClosed], event_handlers.UpdateMeetingRoomReadModelHandler)\n```\n\n## Bootstrap\n\nThe `python-cqrs` package implements a set of bootstrap utilities designed to simplify the initial configuration of an\napplication.\n\n```python\nimport functools\n\nfrom cqrs.events import bootstrap as event_bootstrap\nfrom cqrs.requests import bootstrap as request_bootstrap\n\nfrom app import dependencies, mapping, orm\n\n\n@functools.lru_cache\ndef mediator_factory():\n    return request_bootstrap.bootstrap(\n        di_container=dependencies.setup_di(),\n        commands_mapper=mapping.init_commands,\n        queries_mapper=mapping.init_queries,\n        domain_events_mapper=mapping.init_events,\n        on_startup=[orm.init_store_event_mapper],\n    )\n\n\n@functools.lru_cache\ndef event_mediator_factory():\n    return event_bootstrap.bootstrap(\n        di_container=dependencies.setup_di(),\n        events_mapper=mapping.init_events,\n        on_startup=[orm.init_store_event_mapper],\n    )\n```\n\n## Saga Pattern\n\nThe package implements the Orchestrated Saga pattern for managing distributed transactions across multiple services or operations.\nSagas enable eventual consistency by executing a series of steps where each step can be compensated if a subsequent step fails.\n\n### Key Features\n\n- **SagaStorage**: Persists saga state and execution history, enabling recovery of interrupted sagas\n- **SagaLog**: Tracks all step executions 
(act/compensate) with status and timestamps\n- **Recovery Mechanism**: Automatically recovers interrupted sagas from storage, ensuring eventual consistency\n- **Automatic Compensation**: If any step fails, all previously completed steps are automatically compensated in reverse order\n- **Fallback Pattern**: Define alternative steps to execute when primary steps fail, with optional Circuit Breaker protection\n- **Mermaid Diagram Generation**: Generate Sequence and Class diagrams for documentation and visualization\n\n### Example\n\n```python\nimport dataclasses\nimport uuid\nfrom cqrs.saga.models import SagaContext\nfrom cqrs.saga.saga import Saga\nfrom cqrs.saga.step import SagaStepHandler\n\n@dataclasses.dataclass\nclass OrderContext(SagaContext):\n    order_id: str\n    user_id: str\n    items: list[str]\n    total_amount: float\n    inventory_reservation_id: str | None = None\n    payment_id: str | None = None\n\n# Define saga class with steps\nclass OrderSaga(Saga[OrderContext]):\n    steps = [\n        ReserveInventoryStep,\n        ProcessPaymentStep,\n    ]\n\n# Execute saga via mediator\ncontext = OrderContext(order_id=\"123\", user_id=\"user_1\", items=[\"item_1\"], total_amount=100.0)\nsaga_id = uuid.uuid4()\n\nasync for step_result in mediator.stream(context, saga_id=saga_id):\n    print(f\"Step completed: {step_result.step_type.__name__}\")\n    # If any step fails, compensation happens automatically\n```\n\n### Fallback Pattern with Circuit Breaker\n\nThe saga pattern supports fallback steps that execute automatically when primary steps fail. 
You can also integrate Circuit Breaker protection to prevent cascading failures:\n\n```python\nfrom cqrs.saga.fallback import Fallback\nfrom cqrs.adapters.circuit_breaker import AioBreakerAdapter\nfrom cqrs.response import Response\nfrom cqrs.saga.step import SagaStepHandler, SagaStepResult\n\nclass ReserveInventoryResponse(Response):\n    reservation_id: str\n\nclass PrimaryStep(SagaStepHandler[OrderContext, ReserveInventoryResponse]):\n    async def act(self, context: OrderContext) -\u003e SagaStepResult[OrderContext, ReserveInventoryResponse]:\n        # Primary step that may fail\n        raise RuntimeError(\"Service unavailable\")\n\nclass FallbackStep(SagaStepHandler[OrderContext, ReserveInventoryResponse]):\n    async def act(self, context: OrderContext) -\u003e SagaStepResult[OrderContext, ReserveInventoryResponse]:\n        # Alternative step that executes when primary fails\n        reservation_id = f\"fallback_reservation_{context.order_id}\"\n        context.inventory_reservation_id = reservation_id\n        return self._generate_step_result(ReserveInventoryResponse(reservation_id=reservation_id))\n\n# Define saga with fallback and circuit breaker\nclass OrderSagaWithFallback(Saga[OrderContext]):\n    steps = [\n        Fallback(\n            step=PrimaryStep,\n            fallback=FallbackStep,\n            circuit_breaker=AioBreakerAdapter(\n                fail_max=2,  # Circuit opens after 2 failures\n                timeout_duration=60,  # Wait 60 seconds before retry\n            ),\n        ),\n    ]\n\n# Optional: Using Redis for distributed circuit breaker state\n# import redis\n# from aiobreaker.storage.redis import CircuitRedisStorage\n#\n# def redis_storage_factory(name: str):\n#     client = redis.from_url(\"redis://localhost:6379\", decode_responses=False)\n#     return CircuitRedisStorage(state=\"closed\", redis_object=client, namespace=name)\n#\n# AioBreakerAdapter(..., storage_factory=redis_storage_factory)\n```\n\nWhen the primary step fails, 
the fallback step executes automatically. The Circuit Breaker opens after the configured failure threshold, preventing unnecessary load on failing services by failing fast.\n\nThe saga state and step history are persisted to `SagaStorage`. The `SagaLog` maintains a complete audit trail\nof all step executions (both `act` and `compensate` operations) with timestamps and status information.\nThis enables the recovery mechanism to restore saga state and ensure eventual consistency even after system failures.\n\nIf a saga is interrupted (e.g., due to a crash), you can recover it using the recovery mechanism:\n\n```python\nfrom cqrs.saga.recovery import recover_saga\n\n# Get saga instance from mediator's saga map (or keep reference to saga class)\nsaga = OrderSaga()\n\n# Recover interrupted saga - will resume from last completed step\n# or continue compensation if saga was in compensating state\nawait recover_saga(\n    saga=saga,\n    saga_id=saga_id,\n    context_builder=OrderContext,\n    container=di_container,  # Same container used in bootstrap\n    storage=storage,\n)\n\n# Access execution history (SagaLog) for monitoring and debugging\nhistory = await storage.get_step_history(saga_id)\nfor entry in history:\n    print(f\"{entry.timestamp}: {entry.step_name} - {entry.action} - {entry.status}\")\n```\n\nThe recovery mechanism ensures eventual consistency by:\n- Loading the last known saga state from `SagaStorage`\n- Checking the `SagaLog` to determine which steps were completed\n- Resuming execution from the last completed step, or continuing compensation if the saga was in a compensating state\n- Preventing duplicate execution of already completed steps\n\n#### Mermaid Diagram Generation\n\nThe package includes built-in support for generating Mermaid diagrams from Saga instances.\n\n```python\nfrom cqrs.saga.mermaid import SagaMermaid\n\n# Create Mermaid generator from saga class\nsaga = OrderSaga()\ngenerator = SagaMermaid(saga)\n\n# Generate Sequence diagram 
showing execution flow\nsequence_diagram = generator.sequence()\n\n# Generate Class diagram showing type structure\nclass_diagram = generator.class_diagram()\n```\n\nComplete example: [Saga Mermaid Diagrams](https://github.com/vadikko2/cqrs/blob/master/examples/saga_mermaid.py)\n\n## Event Handlers\n\nEvent handlers are designed to process `Notification` and `ECST` events that are consumed from the broker.\nTo configure event handling, you need to implement a broker consumer on the side of your application.\nBelow is an example of handlers that can be used in the Presentation Layer.\n\n```python\nclass JoinMeetingCommandHandler(cqrs.RequestHandler[JoinMeetingCommand, None]):\n    def __init__(self):\n        self._events = []\n\n    @property\n    def events(self):\n        return self._events\n\n    async def handle(self, request: JoinMeetingCommand) -\u003e None:\n        STORAGE[request.meeting_id].append(request.user_id)\n        self._events.append(\n            UserJoined(user_id=request.user_id, meeting_id=request.meeting_id),\n        )\n        print(f\"User {request.user_id} joined meeting {request.meeting_id}\")\n\n\nclass UserJoinedEventHandler(cqrs.EventHandler[UserJoined]):\n    async def handle(self, event: UserJoined) -\u003e None:\n        print(f\"Handle user {event.user_id} joined meeting {event.meeting_id} event\")\n```\n\nA complete example can be found in\nthe [documentation](https://github.com/vadikko2/cqrs/blob/master/examples/domain_event_handler.py)\n\n### Parallel Event Processing\n\nBoth `RequestMediator` and `StreamingRequestMediator` support parallel processing of domain events. 
You can control\nthe number of event handlers that run simultaneously using the `max_concurrent_event_handlers` parameter.\n\nThis feature is especially useful when:\n- Multiple event handlers need to process events independently\n- You want to improve performance by processing events concurrently\n- You need to limit resource consumption by controlling concurrency\n\n**Configuration:**\n\n```python\nfrom cqrs.requests import bootstrap\n\nmediator = bootstrap.bootstrap_streaming(\n    di_container=container,\n    commands_mapper=commands_mapper,\n    domain_events_mapper=domain_events_mapper,\n    message_broker=broker,\n    max_concurrent_event_handlers=3,  # Process up to 3 events in parallel\n    concurrent_event_handle_enable=True,  # Enable parallel processing\n)\n```\n\n\u003e [!TIP]\n\u003e - Set `max_concurrent_event_handlers` to limit the number of simultaneously running event handlers\n\u003e - Set `concurrent_event_handle_enable=False` to disable parallel processing and process events sequentially\n\u003e - The default value for `max_concurrent_event_handlers` is `10` for `StreamingRequestMediator` and `1` for `RequestMediator`\n\n## Producing Notification Events\n\nDuring the handling of a command, `cqrs.NotificationEvent` events may be generated and then sent to the broker.\n\n```python\nclass JoinMeetingCommandHandler(cqrs.RequestHandler[JoinMeetingCommand, None]):\n    def __init__(self):\n        self._events = []\n\n    @property\n    def events(self):\n        return self._events\n\n    async def handle(self, request: JoinMeetingCommand) -\u003e None:\n        print(f\"User {request.user_id} joined meeting {request.meeting_id}\")\n        self._events.append(\n            cqrs.NotificationEvent[UserJoinedNotificationPayload](\n                event_name=\"UserJoined\",\n                topic=\"user_notification_events\",\n                payload=UserJoinedNotificationPayload(\n                    user_id=request.user_id,\n                    
meeting_id=request.meeting_id,\n                ),\n            )\n        )\n        self._events.append(\n            cqrs.NotificationEvent[UserJoinedECSTPayload](\n                event_name=\"UserJoined\",\n                topic=\"user_ecst_events\",\n                payload=UserJoinedECSTPayload(\n                    user_id=request.user_id,\n                    meeting_id=request.meeting_id,\n                ),\n            )\n        )\n```\n\nA complete example can be found in\nthe [documentation](https://github.com/vadikko2/cqrs/blob/master/examples/event_producing.py)\n\nAfter processing the command/request, if there are any Notification/ECST events,\nthe EventEmitter is invoked to produce the events via the message broker.\n\n\u003e [!WARNING]\n\u003e It is important to note that producing events using the events property parameter does not guarantee message delivery\n\u003e to the broker.\n\u003e In the event of broker unavailability or an exception occurring during message formation or sending, the message may\n\u003e be lost.\n\u003e This issue can potentially be addressed by configuring retry attempts for sending messages to the broker, but we\n\u003e recommend using the [Transaction Outbox](https://microservices.io/patterns/data/transactional-outbox.html) pattern,\n\u003e which is implemented in the current version of the python-cqrs package for this purpose.\n\n## Kafka broker\n\n```python\nfrom cqrs.adapters import kafka as kafka_adapter\nfrom cqrs.message_brokers import kafka as kafka_broker\n\n\nproducer = kafka_adapter.kafka_producer_factory(\n    dsn=\"localhost:9092\",\n    topics=[\"test.topic1\", \"test.topic2\"],\n)\nbroker = kafka_broker.KafkaMessageBroker(producer)\nawait broker.send_message(...)\n```\n\n## Transactional Outbox\n\nThe package implements the [Transactional Outbox](https://microservices.io/patterns/data/transactional-outbox.html)\npattern, which ensures that messages are produced to the broker according to the 
at-least-once semantics.\n\n```python\ndef do_some_logic(meeting_room_id: int, session: sql_session.AsyncSession):\n    \"\"\"\n    Make changes to the database\n    \"\"\"\n    session.add(...)\n\n\nclass JoinMeetingCommandHandler(cqrs.RequestHandler[JoinMeetingCommand, None]):\n    def __init__(self, outbox: cqrs.OutboxedEventRepository):\n        self.outbox = outbox\n\n    @property\n    def events(self):\n        return []\n\n    async def handle(self, request: JoinMeetingCommand) -\u003e None:\n        print(f\"User {request.user_id} joined meeting {request.meeting_id}\")\n        async with self.outbox as session:\n            do_some_logic(request.meeting_id, session) # business logic\n            self.outbox.add(\n                session,\n                cqrs.NotificationEvent[UserJoinedNotificationPayload](\n                    event_name=\"UserJoined\",\n                    topic=\"user_notification_events\",\n                    payload=UserJoinedNotificationPayload(\n                        user_id=request.user_id,\n                        meeting_id=request.meeting_id,\n                    ),\n                ),\n            )\n            self.outbox.add(\n                session,\n                cqrs.NotificationEvent[UserJoinedECSTPayload](\n                    event_name=\"UserJoined\",\n                    topic=\"user_ecst_events\",\n                    payload=UserJoinedECSTPayload(\n                        user_id=request.user_id,\n                        meeting_id=request.meeting_id,\n                    ),\n                ),\n            )\n            await self.outbox.commit(session)\n```\n\nA complete example can be found in\nthe [documentation](https://github.com/vadikko2/cqrs/blob/master/examples/save_events_into_outbox.py)\n\n\u003e [!TIP]\n\u003e You can specify the name of the Outbox table using the environment variable `OUTBOX_SQLA_TABLE`.\n\u003e By default, it is set to `outbox`.\n\n\u003e [!TIP]\n\u003e If you use the 
protobuf events, you should configure the `OutboxedEventRepository` with the
> [protobuf serializer](https://github.com/vadikko2/cqrs/blob/master/src/cqrs/serializers/protobuf.py).
> A complete example can be found in
> the [documentation](https://github.com/vadikko2/cqrs/blob/master/examples/save_proto_events_into_outbox.py)

## Producing Events from Outbox to Kafka

As an implementation of the Transactional Outbox pattern, the `SqlAlchemyOutboxedEventRepository` provides
access to the Outbox storage. It can be used together with the `KafkaMessageBroker`.

```python
import asyncio

import cqrs
from cqrs.adapters import kafka as kafka_adapters
from cqrs.compressors import zlib
from cqrs.message_brokers import kafka
from sqlalchemy.ext.asyncio import async_sessionmaker, create_async_engine

session_factory = async_sessionmaker(
    create_async_engine(
        f"mysql+asyncmy://{USER}:{PASSWORD}@{HOSTNAME}:{PORT}/{DATABASE}",
        isolation_level="REPEATABLE READ",
    )
)

broker = kafka.KafkaMessageBroker(
    producer=kafka_adapters.kafka_producer_factory(dsn="localhost:9092"),
)

producer = cqrs.EventProducer(
    broker,
    cqrs.SqlAlchemyOutboxedEventRepository(session_factory, zlib.ZlibCompressor()),
)


async def periodically_task():
    async for messages in producer.event_batch_generator():
        for message in messages:
            await producer.send_message(message)
        await producer.repository.commit()
        await asyncio.sleep(10)


asyncio.run(periodically_task())
```

A complete example can be found in
the [documentation](https://github.com/vadikko2/cqrs/blob/master/examples/kafka_outboxed_event_producing.py)

**Transaction log tailing.** If Outbox polling does not suit you, consider
[Transaction Log Tailing](https://microservices.io/patterns/data/transaction-log-tailing.html).
The package does not implement it; you can use
[Debezium + Kafka Connect](https://debezium.io/documentation/reference/stable/architecture.html)
to tail the Outbox and produce events to Kafka.

## DI container

Use the following examples to set up dependency injection in your command, query, and event handlers;
this simplifies dependency management.

The package supports two DI container libraries:

### di library

```python
import di
...

def setup_di() -> di.Container:
    """
    Binds implementations to dependencies
    """
    container = di.Container()
    container.bind(
        di.bind_by_type(
            dependent.Dependent(cqrs.SqlAlchemyOutboxedEventRepository, scope="request"),
            cqrs.OutboxedEventRepository,
        )
    )
    container.bind(
        di.bind_by_type(
            dependent.Dependent(MeetingAPIImplementation, scope="request"),
            MeetingAPIProtocol,
        )
    )
    return container
```

A complete example can be found in
the [documentation](https://github.com/vadikko2/cqrs/blob/master/examples/dependency_injection.py)

### dependency-injector library

The package also supports the [dependency-injector](https://github.com/ets-labs/python-dependency-injector) library.
You can use the `DependencyInjectorCQRSContainer` adapter to integrate dependency-injector containers with python-cqrs.

```python
from dependency_injector import containers, providers

from cqrs.container.dependency_injector import DependencyInjectorCQRSContainer
from cqrs.requests import bootstrap


class ApplicationContainer(containers.DeclarativeContainer):
    # Define your providers
    service = providers.Factory(ServiceImplementation)


# Create CQRS container adapter
cqrs_container = DependencyInjectorCQRSContainer(ApplicationContainer())

# Use with bootstrap
mediator = bootstrap.bootstrap(
    di_container=cqrs_container,
    commands_mapper=commands_mapper,
    ...
)
```

Complete examples can be found in:
- [Simple example](https://github.com/vadikko2/cqrs/blob/master/examples/dependency_injector_integration_simple_example.py)
- [Practical example with FastAPI](https://github.com/vadikko2/cqrs/blob/master/examples/dependency_injector_integration_practical_example.py)

## Integration with presentation layers

The framework is ready for integration with **FastAPI** and **FastStream**.

> [!TIP]
> I recommend reading the paper
> [Onion Architecture Used in Software Development](https://www.researchgate.net/publication/371006360_Onion_Architecture_Used_in_Software_Development).
> Separating user interaction and use cases into Presentation and Application layers is good practice:
> it improves the testability, maintainability, and scalability of the application and enforces a clean
> separation of concerns.

### FastAPI requests handling

If your application uses FastAPI (or any other asynchronous API framework), you can use python-cqrs
to route requests to the handlers implementing specific use cases.

```python
import typing

import fastapi
import pydantic

import cqrs
from app import dependencies, commands

router = fastapi.APIRouter(prefix="/meetings")


@router.put("/{meeting_id}/{user_id}", status_code=fastapi.status.HTTP_200_OK)
async def join_meeting(
    meeting_id: pydantic.PositiveInt,
    user_id: typing.Text,
    mediator: cqrs.RequestMediator = fastapi.Depends(dependencies.mediator_factory),
):
    await mediator.send(commands.JoinMeetingCommand(meeting_id=meeting_id, user_id=user_id))
    return {"result": "ok"}
```

A complete example can be found in
the [documentation](https://github.com/vadikko2/cqrs/blob/master/examples/fastapi_integration.py)

### Kafka events consuming

If you build interaction on events over a broker like `Kafka`, you can implement an event consumer
on your application's side, which will call the appropriate handler
for each event.
An example of handling events from `Kafka` is provided below.

```python
import cqrs

import pydantic
import faststream
from faststream import kafka

broker = kafka.KafkaBroker(bootstrap_servers=["localhost:9092"])
app = faststream.FastStream(broker)


class HelloWorldPayload(pydantic.BaseModel):
    hello: str = pydantic.Field(default="Hello")
    world: str = pydantic.Field(default="World")


class HelloWorldECSTEventHandler(cqrs.EventHandler[cqrs.NotificationEvent[HelloWorldPayload]]):
    async def handle(self, event: cqrs.NotificationEvent[HelloWorldPayload]) -> None:
        print(f"{event.payload.hello} {event.payload.world}")  # type: ignore


# value_deserializer, decoder and mediator_factory are defined in the complete example linked below
@broker.subscriber(
    "hello_world",
    group_id="examples",
    auto_commit=False,
    value_deserializer=value_deserializer,
    decoder=decoder,
)
async def hello_world_event_handler(
    body: cqrs.NotificationEvent[HelloWorldPayload] | None,
    msg: kafka.KafkaMessage,
    mediator: cqrs.EventMediator = faststream.Depends(mediator_factory),
):
    if body is not None:
        await mediator.send(body)
    await msg.ack()
```

A complete example can be found in
the [documentation](https://github.com/vadikko2/python-cqrs/blob/master/examples/kafka_event_consuming.py)

### FastAPI SSE Streaming

`StreamingRequestMediator` is designed for use with Server-Sent Events (SSE) in FastAPI applications,
allowing you to stream results to clients in real time as they are produced.

**Example FastAPI endpoint with SSE:**

```python
import json

import fastapi

import cqrs
from cqrs.requests import bootstrap

app = fastapi.FastAPI()

# container, commands_mapper, domain_events_mapper and broker
# come from your application setup (see the complete example)


def streaming_mediator_factory() -> cqrs.StreamingRequestMediator:
    return bootstrap.bootstrap_streaming(
        di_container=container,
        commands_mapper=commands_mapper,
        domain_events_mapper=domain_events_mapper,
        message_broker=broker,
        max_concurrent_event_handlers=3,
        concurrent_event_handle_enable=True,
    )


@app.post("/process-files")
async def process_files_stream(
    command: ProcessFilesCommand,
    mediator: cqrs.StreamingRequestMediator = fastapi.Depends(streaming_mediator_factory),
) -> fastapi.responses.StreamingResponse:
    async def generate_sse():
        yield f"data: {json.dumps({'type': 'start', 'message': 'Processing...'})}\n\n"

        async for result in mediator.stream(command):
            sse_data = {
                "type": "progress",
                "data": result.to_dict(),
            }
            yield f"data: {json.dumps(sse_data)}\n\n"

        yield f"data: {json.dumps({'type': 'complete'})}\n\n"

    return fastapi.responses.StreamingResponse(
        generate_sse(),
        media_type="text/event-stream",
    )
```

A complete example can be found in
the [documentation](https://github.com/vadikko2/cqrs/blob/master/examples/fastapi_sse_streaming.py)

## Protobuf messaging

The `python-cqrs` package supports integration with [protobuf](https://developers.google.com/protocol-buffers/).
There is interface-level support for converting Notification events to Protobuf and back.
Protocol buffers are Google's language-neutral, platform-neutral, extensible mechanism for serializing
structured data: think XML, but smaller, faster, and simpler.
You define how you want your data to be structured once, and then generated source code lets you easily
write and read that structured data to and from a variety of data streams, using a variety of languages.
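The conversion described above goes through a serializer that turns an event payload into bytes and back. The snippet below is a stdlib-only sketch of that round-trip contract, not the package's actual API: the `EventSerializer` protocol, the `UserJoinedPayload` class, and the JSON stand-in wire format are all illustrative assumptions; the real implementation uses generated protobuf messages (see `src/cqrs/serializers/protobuf.py`).

```python
import dataclasses
import json
import typing

PayloadT = typing.TypeVar("PayloadT")


class EventSerializer(typing.Protocol[PayloadT]):
    """Hypothetical contract mirrored by the package's protobuf serializer."""

    def serialize(self, payload: PayloadT) -> bytes: ...

    def deserialize(self, data: bytes) -> PayloadT: ...


@dataclasses.dataclass(frozen=True)
class UserJoinedPayload:
    user_id: str
    meeting_id: int


class JsonUserJoinedSerializer:
    """JSON stand-in for a protobuf message's SerializeToString/FromString pair."""

    def serialize(self, payload: UserJoinedPayload) -> bytes:
        return json.dumps(dataclasses.asdict(payload)).encode("utf-8")

    def deserialize(self, data: bytes) -> UserJoinedPayload:
        return UserJoinedPayload(**json.loads(data.decode("utf-8")))


# Round trip: payload -> bytes on the wire -> payload again
serializer = JsonUserJoinedSerializer()
wire = serializer.serialize(UserJoinedPayload(user_id="user-1", meeting_id=42))
restored = serializer.deserialize(wire)
assert restored == UserJoinedPayload(user_id="user-1", meeting_id=42)
```

With protobuf the shape is the same; only the wire format changes, which is why the repository can accept the serializer as a pluggable dependency.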