{"id":26489737,"url":"https://github.com/dotflow-io/dotflow","last_synced_at":"2026-04-11T23:16:01.715Z","repository":{"id":270617682,"uuid":"908417376","full_name":"dotflow-io/dotflow","owner":"dotflow-io","description":"🎲 Business Logic Code in a flow!","archived":false,"fork":false,"pushed_at":"2026-03-31T02:29:42.000Z","size":4587,"stargazers_count":5,"open_issues_count":17,"forks_count":6,"subscribers_count":1,"default_branch":"master","last_synced_at":"2026-03-31T05:05:13.377Z","etag":null,"topics":["data","data-structures","database","dataflow","dataflow-programming","etl","etl-framework","etl-pipeline","flow","python","python3","workflow","workflow-engine"],"latest_commit_sha":null,"homepage":"https://dotflow-io.github.io/dotflow/","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/dotflow-io.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null,"notice":null,"maintainers":null,"copyright":null,"agents":null,"dco":null,"cla":null}},"created_at":"2024-12-26T03:05:48.000Z","updated_at":"2026-03-31T02:39:54.000Z","dependencies_parsed_at":"2025-01-01T21:34:10.057Z","dependency_job_id":"68ebb828-7ac1-4c20-bdbe-c00e3035db12","html_url":"https://github.com/dotflow-io/dotflow","commit_stats":null,"previous_names":["linux-profile/dotflow","fernandocelmer/dotflow","dotflow-io/dotflow"],"tags_count":38,"template":false,"template_full_name":null,"purl":"pkg:github/dotflow-io/dotflow","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/dotflow-io%2Fdotflow","tags_url":"https://repos.ecosyste.ms/api/v1/h
osts/GitHub/repositories/dotflow-io%2Fdotflow/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/dotflow-io%2Fdotflow/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/dotflow-io%2Fdotflow/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/dotflow-io","download_url":"https://codeload.github.com/dotflow-io/dotflow/tar.gz/refs/heads/master","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/dotflow-io%2Fdotflow/sbom","scorecard":null,"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":286080680,"owners_count":31308268,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2026-04-02T12:59:32.332Z","status":"ssl_error","status_checked_at":"2026-04-02T12:54:48.875Z","response_time":89,"last_error":"SSL_connect returned=1 errno=0 peeraddr=140.82.121.5:443 state=error: unexpected eof while reading","robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":false,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["data","data-structures","database","dataflow","dataflow-programming","etl","etl-framework","etl-pipeline","flow","python","python3","workflow","workflow-engine"],"created_at":"2025-03-20T07:48:36.740Z","updated_at":"2026-04-08T05:01:57.498Z","avatar_url":"https://github.com/dotflow-io.png","language":"Python","readme":"\u003cdiv align=\"center\"\u003e\n  \u003ca aria-label=\"Dotflow Website\" href=\"https://dotflow.io\"\u003eWebsite\u003c/a\u003e\n  \u0026nbsp;•\u0026nbsp;\n  \u003ca aria-label=\"Dotflow 
Documentation\" href=\"https://dotflow-io.github.io/dotflow/\"\u003eDocumentation\u003c/a\u003e\n  \u0026nbsp;•\u0026nbsp;\n  \u003ca aria-label=\"Pypi\" href=\"https://pypi.org/project/dotflow/\"\u003ePypi\u003c/a\u003e\n\u003c/div\u003e\n\n\u003cbr/\u003e\n\n\u003cdiv align=\"center\"\u003e\n\n![](https://raw.githubusercontent.com/FernandoCelmer/dotflow/master/docs/assets/dotflow.gif)\n\n![GitHub Org's stars](https://img.shields.io/github/stars/dotflow-io?label=Dotflow\u0026style=flat-square)\n![GitHub last commit](https://img.shields.io/github/last-commit/dotflow-io/dotflow?style=flat-square)\n![PyPI](https://img.shields.io/pypi/v/dotflow?style=flat-square)\n![PyPI - Python Version](https://img.shields.io/pypi/pyversions/dotflow?style=flat-square)\n![PyPI - Downloads](https://img.shields.io/pypi/dm/dotflow?style=flat-square)\n\n\u003c/div\u003e\n\n# Welcome to Dotflow\n\nDotflow is a lightweight Python library for building execution pipelines. Define tasks with decorators, chain them together, and run workflows in sequential, parallel, or background mode — with built-in retry, timeout, storage, notifications, and more.\n\n\u003e **[Read the full documentation](https://dotflow-io.github.io/dotflow/)**\n\n## Table of Contents\n\n\u003cdetails\u003e\n\u003csummary\u003eClick to expand\u003c/summary\u003e\n\n- [Getting Help](#getting-help)\n- [Installation](#installation)\n- [Quick Start](#quick-start)\n- [Features](#features)\n  - [Execution Modes](#execution-modes)\n  - [Retry, Timeout \u0026 Backoff](#retry-timeout--backoff)\n  - [Context System](#context-system)\n  - [Checkpoint \u0026 Resume](#checkpoint--resume)\n  - [Storage Providers](#storage-providers)\n  - [Notifications](#notifications)\n  - [Class-Based Steps](#class-based-steps)\n  - [Task Groups](#task-groups)\n  - [Callbacks](#callbacks)\n  - [Error Handling](#error-handling)\n  - [Async Support](#async-support)\n  - [Scheduler / Cron](#scheduler--cron)\n  - [CLI](#cli)\n  - [Dependency Injection via 
Config](#dependency-injection-via-config)\n  - [Results \u0026 Inspection](#results--inspection)\n  - [Dynamic Module Import](#dynamic-module-import)\n- [More Examples](#more-examples)\n- [Commit Style](#commit-style)\n- [License](#license)\n\n\u003c/details\u003e\n\n## Getting Help\n\nWe use GitHub issues for tracking bugs and feature requests.\n\n- [Bug Report](https://github.com/dotflow-io/dotflow/issues/new/choose)\n- [Documentation](https://github.com/dotflow-io/dotflow/issues/new/choose)\n- [Feature Request](https://github.com/dotflow-io/dotflow/issues/new/choose)\n- [Security Issue](https://github.com/dotflow-io/dotflow/issues/new/choose)\n- [General Question](https://github.com/dotflow-io/dotflow/issues/new/choose)\n\n## Installation\n\n```bash\npip install dotflow\n```\n\n**Optional extras:**\n\n```bash\npip install dotflow[aws]        # AWS S3 storage\npip install dotflow[gcp]        # Google Cloud Storage\npip install dotflow[scheduler]  # Cron-based scheduler\n```\n\n## Quick Start\n\n```python\nfrom dotflow import DotFlow, action\n\n@action\ndef extract():\n    return {\"users\": 150}\n\n@action\ndef transform(previous_context):\n    total = previous_context.storage[\"users\"]\n    return {\"users\": total, \"active\": int(total * 0.8)}\n\n@action\ndef load(previous_context):\n    print(f\"Loaded {previous_context.storage['active']} active users\")\n\nworkflow = DotFlow()\nworkflow.task.add(step=extract)\nworkflow.task.add(step=transform)\nworkflow.task.add(step=load)\n\nworkflow.start()\n```\n\n## Features\n\n### Execution Modes\n\n\u003e [Process Mode docs](https://dotflow-io.github.io/dotflow/nav/concepts/process-mode-sequential/)\n\nDotflow supports 4 execution strategies out of the box:\n\n#### Sequential (default)\n\nTasks run one after another. 
The context from each task flows to the next.\n\n```python\nworkflow.task.add(step=task_a)\nworkflow.task.add(step=task_b)\n\nworkflow.start()  # or mode=\"sequential\"\n```\n\n```mermaid\nflowchart LR\n    A[task_a] --\u003e B[task_b] --\u003e C[Finish]\n```\n\n#### Background\n\nSame as sequential, but runs in a background thread — non-blocking.\n\n```python\nworkflow.start(mode=\"background\")\n```\n\n#### Parallel\n\nEvery task runs simultaneously in its own process.\n\n```python\nworkflow.task.add(step=task_a)\nworkflow.task.add(step=task_b)\nworkflow.task.add(step=task_c)\n\nworkflow.start(mode=\"parallel\")\n```\n\n```mermaid\nflowchart TD\n    S[Start] --\u003e A[task_a] \u0026 B[task_b] \u0026 C[task_c]\n    A \u0026 B \u0026 C --\u003e F[Finish]\n```\n\n#### Parallel Groups\n\nAssign tasks to named groups. Groups run in parallel, but tasks within each group run sequentially.\n\n```python\nworkflow.task.add(step=fetch_users, group_name=\"users\")\nworkflow.task.add(step=save_users, group_name=\"users\")\nworkflow.task.add(step=fetch_orders, group_name=\"orders\")\nworkflow.task.add(step=save_orders, group_name=\"orders\")\n\nworkflow.start()\n```\n\n```mermaid\nflowchart TD\n    S[Start] --\u003e G1[Group: users] \u0026 G2[Group: orders]\n    G1 --\u003e A[fetch_users] --\u003e B[save_users]\n    G2 --\u003e C[fetch_orders] --\u003e D[save_orders]\n    B \u0026 D --\u003e F[Finish]\n```\n\n---\n\n### Retry, Timeout \u0026 Backoff\n\n\u003e [Retry docs](https://dotflow-io.github.io/dotflow/nav/tutorial/task-retry/) | [Backoff docs](https://dotflow-io.github.io/dotflow/nav/tutorial/task-backoff/) | [Timeout docs](https://dotflow-io.github.io/dotflow/nav/tutorial/task-timeout/)\n\nThe `@action` decorator supports built-in resilience options:\n\n```python\n@action(retry=3, timeout=10, retry_delay=2, backoff=True)\ndef unreliable_api_call():\n    response = requests.get(\"https://api.example.com/data\")\n    response.raise_for_status()\n    return 
response.json()\n```\n\n| Parameter | Type | Default | Description |\n|-----------|------|---------|-------------|\n| `retry` | `int` | `1` | Number of attempts before failing |\n| `timeout` | `int` | `0` | Max seconds per attempt (0 = no limit) |\n| `retry_delay` | `int` | `1` | Seconds to wait between retries |\n| `backoff` | `bool` | `False` | Exponential backoff (delay doubles each retry) |\n\n---\n\n### Context System\n\n\u003e [Context docs](https://dotflow-io.github.io/dotflow/nav/tutorial/initial-context/) | [Previous Context](https://dotflow-io.github.io/dotflow/nav/tutorial/previous-context/) | [Many Contexts](https://dotflow-io.github.io/dotflow/nav/tutorial/many-contexts/)\n\nTasks communicate through a context chain. Each task receives the previous task's output and can access its own initial context.\n\n```python\n@action\ndef step_one():\n    return \"Hello\"\n\n@action\ndef step_two(previous_context, initial_context):\n    greeting = previous_context.storage   # \"Hello\"\n    name = initial_context.storage        # \"World\"\n    return f\"{greeting}, {name}!\"\n\nworkflow = DotFlow()\nworkflow.task.add(step=step_one)\nworkflow.task.add(step=step_two, initial_context=\"World\")\nworkflow.start()\n```\n\nEach `Context` object contains:\n- **`storage`** — the return value from the task\n- **`task_id`** — the task identifier\n- **`workflow_id`** — the workflow identifier\n- **`time`** — timestamp of execution\n\n---\n\n### Checkpoint \u0026 Resume\n\n\u003e [Checkpoint docs](https://dotflow-io.github.io/dotflow/nav/tutorial/checkpoint/)\n\nResume a workflow from where it left off. 
Requires a persistent storage provider and a fixed `workflow_id`.\n\n```python\nfrom dotflow import DotFlow, Config, action\nfrom dotflow.providers import StorageFile\n\nconfig = Config(storage=StorageFile())\n\nworkflow = DotFlow(config=config, workflow_id=\"my-pipeline-v1\")\nworkflow.task.add(step=step_a)\nworkflow.task.add(step=step_b)\nworkflow.task.add(step=step_c)\n\n# First run — executes all tasks and saves checkpoints\nworkflow.start()\n\n# If step_c failed, fix and re-run — skips step_a and step_b\nworkflow.start(resume=True)\n```\n\n---\n\n### Storage Providers\n\n\u003e [Storage docs](https://dotflow-io.github.io/dotflow/nav/tutorial/storage-default/)\n\nChoose where task results are persisted:\n\n#### In-Memory (default)\n\n```python\nfrom dotflow import DotFlow\n\nworkflow = DotFlow()  # uses StorageDefault (in-memory)\n```\n\n#### File System\n\n```python\nfrom dotflow import DotFlow, Config\nfrom dotflow.providers import StorageFile\n\nconfig = Config(storage=StorageFile(path=\".output\"))\nworkflow = DotFlow(config=config)\n```\n\n#### AWS S3\n\n```bash\npip install dotflow[aws]\n```\n\n```python\nfrom dotflow import DotFlow, Config\nfrom dotflow.providers import StorageS3\n\nconfig = Config(storage=StorageS3(bucket=\"my-bucket\", prefix=\"pipelines/\", region=\"us-east-1\"))\nworkflow = DotFlow(config=config)\n```\n\n#### Google Cloud Storage\n\n```bash\npip install dotflow[gcp]\n```\n\n```python\nfrom dotflow import DotFlow, Config\nfrom dotflow.providers import StorageGCS\n\nconfig = Config(storage=StorageGCS(bucket=\"my-bucket\", prefix=\"pipelines/\", project=\"my-project\"))\nworkflow = DotFlow(config=config)\n```\n\n---\n\n### Notifications\n\n\u003e [Telegram docs](https://dotflow-io.github.io/dotflow/nav/tutorial/notify-telegram/)\n\nGet notified about task status changes via Telegram.\n\n```python\nfrom dotflow import DotFlow, Config\nfrom dotflow.providers import NotifyTelegram\nfrom dotflow.core.types.status import TypeStatus\n\nnotify 
= NotifyTelegram(\n    token=\"YOUR_BOT_TOKEN\",\n    chat_id=123456789,\n    notification_type=TypeStatus.FAILED,  # only notify on failures (optional)\n)\n\nconfig = Config(notify=notify)\nworkflow = DotFlow(config=config)\n```\n\nStatus types: `NOT_STARTED`, `IN_PROGRESS`, `COMPLETED`, `PAUSED`, `RETRY`, `FAILED`\n\n---\n\n### Class-Based Steps\n\nReturn a class instance from a task, and Dotflow will automatically discover and execute all `@action`-decorated methods in source order.\n\n```python\nfrom dotflow import action\n\nclass ETLPipeline:\n    @action\n    def extract(self):\n        return {\"raw\": [1, 2, 3]}\n\n    @action\n    def transform(self, previous_context):\n        data = previous_context.storage[\"raw\"]\n        return {\"processed\": [x * 2 for x in data]}\n\n    @action\n    def load(self, previous_context):\n        print(f\"Loaded: {previous_context.storage['processed']}\")\n\n@action\ndef run_pipeline():\n    return ETLPipeline()\n\nworkflow = DotFlow()\nworkflow.task.add(step=run_pipeline)\nworkflow.start()\n```\n\n---\n\n### Task Groups\n\n\u003e [Groups docs](https://dotflow-io.github.io/dotflow/nav/tutorial/groups/)\n\nOrganize tasks into named groups for parallel group execution.\n\n```python\nworkflow.task.add(step=scrape_site_a, group_name=\"scraping\")\nworkflow.task.add(step=scrape_site_b, group_name=\"scraping\")\nworkflow.task.add(step=process_data, group_name=\"processing\")\nworkflow.task.add(step=save_results, group_name=\"processing\")\n\nworkflow.start()  # groups run in parallel, tasks within each group run sequentially\n```\n\n---\n\n### Callbacks\n\n\u003e [Task Callback docs](https://dotflow-io.github.io/dotflow/nav/tutorial/task-callback/) | [Workflow Callback docs](https://dotflow-io.github.io/dotflow/nav/tutorial/workflow-callback/)\n\nExecute a function after each task completes — useful for logging, alerting, or side effects.\n\n```python\ndef on_task_done(task):\n    print(f\"Task {task.task_id} finished with 
status: {task.status}\")\n\nworkflow.task.add(step=my_step, callback=on_task_done)\n```\n\nWorkflow-level callbacks for success and failure:\n\n```python\ndef on_success(*args, **kwargs):\n    print(\"All tasks completed!\")\n\ndef on_failure(*args, **kwargs):\n    print(\"Something went wrong.\")\n\nworkflow.start(on_success=on_success, on_failure=on_failure)\n```\n\n---\n\n### Error Handling\n\n\u003e [Error Handling docs](https://dotflow-io.github.io/dotflow/nav/tutorial/error-handling/) | [Keep Going docs](https://dotflow-io.github.io/dotflow/nav/tutorial/keep-going/)\n\nControl whether the workflow stops or continues when a task fails:\n\n```python\n# Stop on first failure (default)\nworkflow.start(keep_going=False)\n\n# Continue executing remaining tasks even if one fails\nworkflow.start(keep_going=True)\n```\n\nEach task tracks its errors with full detail:\n- Attempt number\n- Exception type and message\n- Traceback\n\nAccess results after execution:\n\n```python\nfor task in workflow.result_task():\n    print(f\"Task {task.task_id}: {task.status}\")\n    if task.errors:\n        print(f\"  Errors: {task.errors}\")\n```\n\n---\n\n### Async Support\n\n\u003e [Async docs](https://dotflow-io.github.io/dotflow/nav/tutorial/async-actions/)\n\n`@action` automatically detects and handles async functions:\n\n```python\nimport httpx\nfrom dotflow import DotFlow, action\n\n@action(timeout=30)\nasync def fetch_data():\n    async with httpx.AsyncClient() as client:\n        response = await client.get(\"https://api.example.com/data\")\n        return response.json()\n\nworkflow = DotFlow()\nworkflow.task.add(step=fetch_data)\nworkflow.start()\n```\n\n---\n\n### Scheduler / Cron\n\n\u003e [Cron scheduler docs](https://dotflow-io.github.io/dotflow/nav/tutorial/scheduler-cron/) | [Default scheduler](https://dotflow-io.github.io/dotflow/nav/tutorial/scheduler-default/) | [Cron overlap 
(concepts)](https://dotflow-io.github.io/dotflow/nav/concepts/concept-cron-overlap/)\n\nSchedule workflows to run automatically using cron expressions.\n\n```bash\npip install dotflow[scheduler]\n```\n\n```python\nfrom dotflow import DotFlow, Config, action\nfrom dotflow.providers import SchedulerCron\n\n@action\ndef sync_data():\n    return {\"synced\": True}\n\nconfig = Config(scheduler=SchedulerCron(cron=\"*/5 * * * *\"))\n\nworkflow = DotFlow(config=config)\nworkflow.task.add(step=sync_data)\nworkflow.schedule()\n```\n\n#### Overlap Strategies\n\nControl what happens when a new execution triggers while the previous one is still running:\n\n| Strategy | Description |\n|----------|-------------|\n| `skip` | Drops the new run if the previous is still active (default) |\n| `queue` | Buffers one pending run, executes when the current finishes |\n| `parallel` | Runs up to 10 concurrent executions via semaphore |\n\n```python\nfrom dotflow.providers import SchedulerCron\n\n# Queue overlapping executions\nscheduler = SchedulerCron(cron=\"*/5 * * * *\", overlap=\"queue\")\n\n# Allow parallel executions\nscheduler = SchedulerCron(cron=\"*/5 * * * *\", overlap=\"parallel\")\n```\n\nThe scheduler handles graceful shutdown via `SIGINT`/`SIGTERM` signals automatically.\n\n---\n\n### CLI\n\n\u003e [CLI docs](https://dotflow-io.github.io/dotflow/nav/how-to/cli/simple-start/)\n\nRun workflows directly from the command line:\n\n```bash\n# Simple execution\ndotflow start --step my_module.my_task\n\n# With initial context\ndotflow start --step my_module.my_task --initial-context '{\"key\": \"value\"}'\n\n# With callback\ndotflow start --step my_module.my_task --callback my_module.on_done\n\n# With execution mode\ndotflow start --step my_module.my_task --mode parallel\n\n# With file storage\ndotflow start --step my_module.my_task --storage file --path .output\n\n# With S3 storage\ndotflow start --step my_module.my_task --storage s3\n\n# With GCS storage\ndotflow start --step 
my_module.my_task --storage gcs\n\n# Schedule with cron\ndotflow schedule --step my_module.my_task --cron \"*/5 * * * *\"\n\n# Schedule with overlap strategy\ndotflow schedule --step my_module.my_task --cron \"0 * * * *\" --overlap queue\n\n# Schedule with resume\ndotflow schedule --step my_module.my_task --cron \"0 */6 * * *\" --storage file --resume\n```\n\nAvailable CLI commands:\n\n| Command | Description |\n|---------|-------------|\n| `dotflow init` | Initialize a new Dotflow project |\n| `dotflow start` | Run a workflow |\n| `dotflow schedule` | Run a workflow on a cron schedule |\n| `dotflow log` | View execution logs |\n\n---\n\n### Dependency Injection via Config\n\nThe `Config` class lets you swap providers for storage, notifications, logging, and scheduling:\n\n```python\nfrom dotflow import DotFlow, Config\nfrom dotflow.providers import StorageFile, NotifyTelegram, LogDefault, SchedulerCron\n\nconfig = Config(\n    storage=StorageFile(path=\".output\"),\n    notify=NotifyTelegram(token=\"...\", chat_id=123),\n    log=LogDefault(),\n    scheduler=SchedulerCron(cron=\"0 * * * *\"),\n)\n\nworkflow = DotFlow(config=config)\n```\n\nExtend Dotflow by implementing the abstract base classes:\n\n| ABC | Methods | Purpose |\n|-----|---------|---------|\n| `Storage` | `post`, `get`, `key` | Custom storage backends |\n| `Notify` | `hook_status_task` | Custom notification channels |\n| `Log` | `info`, `error` | Custom logging |\n| `Scheduler` | `start`, `stop` | Custom scheduling strategies |\n\n---\n\n### Results \u0026 Inspection\n\nAfter execution, inspect results directly from the workflow object:\n\n```python\nworkflow.start()\n\n# List of Task objects\ntasks = workflow.result_task()\n\n# List of Context objects (one per task)\ncontexts = workflow.result_context()\n\n# List of storage values (raw return values)\nstorages = workflow.result_storage()\n\n# Serialized result (Pydantic model)\nresult = workflow.result()\n```\n\nTask builder 
utilities:\n\n```python\nworkflow.task.count()     # Number of tasks\nworkflow.task.clear()     # Remove all tasks\nworkflow.task.reverse()   # Reverse execution order\nworkflow.task.schema()    # Pydantic schema of the workflow\n```\n\n---\n\n### Dynamic Module Import\n\nReference tasks and callbacks by their module path string instead of importing them directly:\n\n```python\nworkflow.task.add(step=\"my_package.tasks.process_data\")\nworkflow.task.add(step=\"my_package.tasks.save_results\", callback=\"my_package.callbacks.notify\")\n```\n\n---\n\n## More Examples\n\nAll examples are available in the [`docs_src/`](https://github.com/dotflow-io/dotflow/tree/develop/docs_src) directory.\n\n#### Basic\n\n| Example | Description | Command |\n|---------|-------------|---------|\n| [first_steps](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/first_steps/first_steps.py) | Minimal workflow with callback | `python docs_src/first_steps/first_steps.py` |\n| [simple_function_workflow](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/basic/simple_function_workflow.py) | Simple function-based workflow | `python docs_src/basic/simple_function_workflow.py` |\n| [simple_class_workflow](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/basic/simple_class_workflow.py) | Class-based step with retry | `python docs_src/basic/simple_class_workflow.py` |\n| [simple_function_workflow_with_error](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/basic/simple_function_workflow_with_error.py) | Error inspection after failure | `python docs_src/basic/simple_function_workflow_with_error.py` |\n\n#### Async\n\n| Example | Description | Command |\n|---------|-------------|---------|\n| [async_action](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/async/async_action.py) | Async task functions | `python docs_src/async/async_action.py` |\n\n#### Context\n\n| Example | Description | Command |\n|---------|-------------|---------|\n| 
[context](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/context/context.py) | Creating and inspecting a Context | `python docs_src/context/context.py` |\n| [initial_context](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/initial_context/initial_context.py) | Passing initial context per task | `python docs_src/initial_context/initial_context.py` |\n| [previous_context](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/previous_context/previous_context.py) | Chaining context between tasks | `python docs_src/previous_context/previous_context.py` |\n| [many_contexts](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/context/many_contexts.py) | Using both initial and previous context | `python docs_src/context/many_contexts.py` |\n\n#### Process Modes\n\n| Example | Description | Command |\n|---------|-------------|---------|\n| [sequential](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/process_mode/sequential.py) | Sequential execution | `python docs_src/process_mode/sequential.py` |\n| [background](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/process_mode/background.py) | Background (non-blocking) execution | `python docs_src/process_mode/background.py` |\n| [parallel](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/process_mode/parallel.py) | Parallel execution | `python docs_src/process_mode/parallel.py` |\n| [parallel_group](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/process_mode/parallel_group.py) | Parallel groups execution | `python docs_src/process_mode/parallel_group.py` |\n| [sequential_group_mode](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/workflow/sequential_group_mode.py) | Sequential with named groups | `python docs_src/workflow/sequential_group_mode.py` |\n\n#### Resilience (Retry, Backoff, Timeout)\n\n| Example | Description | Command |\n|---------|-------------|---------|\n| 
[retry](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/retry/retry.py) | Retry on function and class steps | `python docs_src/retry/retry.py` |\n| [retry_delay](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/retry/retry_delay.py) | Retry with delay between attempts | `python docs_src/retry/retry_delay.py` |\n| [backoff](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/backoff/backoff.py) | Exponential backoff on retries | `python docs_src/backoff/backoff.py` |\n| [timeout](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/timeout/timeout.py) | Timeout per task execution | `python docs_src/timeout/timeout.py` |\n\n#### Callbacks\n\n| Example | Description | Command |\n|---------|-------------|---------|\n| [task_callback](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/callback/task_callback.py) | Per-task callback on completion | `python docs_src/callback/task_callback.py` |\n| [workflow_callback_success](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/callback/workflow_callback_success.py) | Workflow-level success callback | `python docs_src/callback/workflow_callback_success.py` |\n| [workflow_callback_failure](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/callback/workflow_callback_failure.py) | Workflow-level failure callback | `python docs_src/callback/workflow_callback_failure.py` |\n\n#### Error Handling\n\n| Example | Description | Command |\n|---------|-------------|---------|\n| [errors](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/errors/errors.py) | Inspecting task errors and retry count | `python docs_src/errors/errors.py` |\n| [keep_going_true](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/workflow/keep_going_true.py) | Continue workflow after task failure | `python docs_src/workflow/keep_going_true.py` |\n\n#### Groups\n\n| Example | Description | Command |\n|---------|-------------|---------|\n| 
[step_with_groups](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/group/step_with_groups.py) | Tasks in named parallel groups | `python docs_src/group/step_with_groups.py` |\n\n#### Storage\n\n| Example | Description | Command |\n|---------|-------------|---------|\n| [storage_file](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/storage/storage_file.py) | File-based JSON storage | `python docs_src/storage/storage_file.py` |\n| [storage_s3](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/storage/storage_s3.py) | AWS S3 storage | `python docs_src/storage/storage_s3.py` |\n| [storage_gcs](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/storage/storage_gcs.py) | Google Cloud Storage | `python docs_src/storage/storage_gcs.py` |\n\n#### Checkpoint \u0026 Resume\n\n| Example | Description | Command |\n|---------|-------------|---------|\n| [checkpoint](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/checkpoint/checkpoint.py) | Resume workflow from last checkpoint | `python docs_src/checkpoint/checkpoint.py` |\n\n#### Notifications\n\n| Example | Description | Command |\n|---------|-------------|---------|\n| [notify_telegram](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/notify/notify_telegram.py) | Telegram notifications on failure | `python docs_src/notify/notify_telegram.py` |\n\n#### Config \u0026 Providers\n\n| Example | Description | Command |\n|---------|-------------|---------|\n| [config](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/config/config.py) | Full Config with storage, notify, log | `python docs_src/config/config.py` |\n| [storage_provider](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/config/storage_provider.py) | Swapping storage providers | `python docs_src/config/storage_provider.py` |\n| [notify_provider](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/config/notify_provider.py) | Swapping notification providers | `python 
docs_src/config/notify_provider.py` |\n| [log_provider](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/config/log_provider.py) | Custom log provider | `python docs_src/config/log_provider.py` |\n\n#### Results \u0026 Output\n\n| Example | Description | Command |\n|---------|-------------|---------|\n| [step_function_result_task](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/output/step_function_result_task.py) | Inspect task results (function) | `python docs_src/output/step_function_result_task.py` |\n| [step_function_result_context](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/output/step_function_result_context.py) | Inspect context results (function) | `python docs_src/output/step_function_result_context.py` |\n| [step_function_result_storage](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/output/step_function_result_storage.py) | Inspect storage results (function) | `python docs_src/output/step_function_result_storage.py` |\n| [step_class_result_task](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/output/step_class_result_task.py) | Inspect task results (class) | `python docs_src/output/step_class_result_task.py` |\n| [step_class_result_context](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/output/step_class_result_context.py) | Inspect context results (class) | `python docs_src/output/step_class_result_context.py` |\n| [step_class_result_storage](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/output/step_class_result_storage.py) | Inspect storage results (class) | `python docs_src/output/step_class_result_storage.py` |\n\n#### CLI\n\n| Example | Description | Command |\n|---------|-------------|---------|\n| [simple_cli](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/basic/simple_cli.py) | Basic CLI execution | `dotflow start --step docs_src.basic.simple_cli.simple_step` |\n| 
[cli_with_callback](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/cli/cli_with_callback.py) | CLI with callback function | `dotflow start --step docs_src.cli.cli_with_callback.simple_step --callback docs_src.cli.cli_with_callback.callback` |\n| [cli_with_initial_context](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/cli/cli_with_initial_context.py) | CLI with initial context | `dotflow start --step docs_src.cli.cli_with_initial_context.simple_step --initial-context abc` |\n| [cli_with_mode](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/cli/cli_with_mode.py) | CLI with execution mode | `dotflow start --step docs_src.cli.cli_with_mode.simple_step --mode sequential` |\n| [cli_with_output_context](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/cli/cli_with_output_context.py) | CLI with file storage output | `dotflow start --step docs_src.cli.cli_with_output_context.simple_step --storage file` |\n| [cli_with_path](https://github.com/dotflow-io/dotflow/blob/develop/docs_src/cli/cli_with_path.py) | CLI with custom storage path | `dotflow start --step docs_src.cli.cli_with_path.simple_step --path .storage --storage file` |\n\n## Commit Style\n\n| Icon | Type      | Description                                |\n|------|-----------|--------------------------------------------|\n| ⚙️   | FEATURE   | New feature                                |\n| 📝   | PEP8      | Formatting fixes following PEP8            |\n| 📌   | ISSUE     | Reference to issue                         |\n| 🪲   | BUG       | Bug fix                                    |\n| 📘   | DOCS      | Documentation changes                      |\n| 📦   | PyPI      | PyPI releases                              |\n| ❤️️   | TEST      | Automated tests                            |\n| ⬆️   | CI/CD     | Changes in continuous integration/delivery |\n| ⚠️   | SECURITY  | Security improvements                      |\n\n## License\n\n![GitHub 
License](https://img.shields.io/github/license/dotflow-io/dotflow)\n\nThis project is licensed under the terms of the MIT License.\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fdotflow-io%2Fdotflow","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fdotflow-io%2Fdotflow","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fdotflow-io%2Fdotflow/lists"}