{"id":14069227,"url":"https://github.com/virtualstaticvoid/taskinator","last_synced_at":"2025-07-30T05:31:48.420Z","repository":{"id":19626848,"uuid":"22878645","full_name":"virtualstaticvoid/taskinator","owner":"virtualstaticvoid","description":"A simple orchestration library for running complex processes or workflows in Ruby","archived":false,"fork":false,"pushed_at":"2024-10-04T16:26:21.000Z","size":681,"stargazers_count":27,"open_issues_count":3,"forks_count":14,"subscribers_count":1,"default_branch":"master","last_synced_at":"2025-07-05T06:24:59.841Z","etag":null,"topics":["activejob","cloud","delayed-jobs","distributed-computing","rails","resque","ruby","sidekiq","workers","workflow"],"latest_commit_sha":null,"homepage":"","language":"Ruby","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"other","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/virtualstaticvoid.png","metadata":{"files":{"readme":"README.md","changelog":"CHANGELOG.md","contributing":"CONTRIBUTING.md","funding":null,"license":"LICENSE.txt","code_of_conduct":"CODE_OF_CONDUCT.md","threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null}},"created_at":"2014-08-12T13:40:34.000Z","updated_at":"2024-12-29T13:30:05.000Z","dependencies_parsed_at":"2023-01-13T20:29:42.406Z","dependency_job_id":null,"html_url":"https://github.com/virtualstaticvoid/taskinator","commit_stats":null,"previous_names":[],"tags_count":45,"template":false,"template_full_name":null,"purl":"pkg:github/virtualstaticvoid/taskinator","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/virtualstaticvoid%2Ftaskinator","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/virtualstaticvoid%2Ftaskinator/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/virtualstaticvoid%2Ftaskinator/releases","manifests_url":"https://repos.ecosyste.ms/api/v1
/hosts/GitHub/repositories/virtualstaticvoid%2Ftaskinator/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/virtualstaticvoid","download_url":"https://codeload.github.com/virtualstaticvoid/taskinator/tar.gz/refs/heads/master","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/virtualstaticvoid%2Ftaskinator/sbom","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":267560669,"owners_count":24107526,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","status":"online","status_checked_at":"2025-07-28T02:00:09.689Z","response_time":68,"last_error":null,"robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":true,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["activejob","cloud","delayed-jobs","distributed-computing","rails","resque","ruby","sidekiq","workers","workflow"],"created_at":"2024-08-13T07:06:43.873Z","updated_at":"2025-07-30T05:31:48.142Z","avatar_url":"https://github.com/virtualstaticvoid.png","language":"Ruby","readme":"# Taskinator\n\n[![Gem Version](https://badge.fury.io/rb/taskinator.svg)](http://badge.fury.io/rb/taskinator)\n[![Build Status](https://github.com/virtualstaticvoid/taskinator/actions/workflows/build.yml/badge.svg)](https://github.com/virtualstaticvoid/taskinator/actions/workflows/build.yml)\n[![Code Climate](https://codeclimate.com/github/virtualstaticvoid/taskinator.png)](https://codeclimate.com/github/virtualstaticvoid/taskinator)\n\nA simple orchestration library for running complex processes or workflows in 
Ruby.\nProcesses are defined using a simple DSL, which specifies the sequences and tasks that make them up.\nProcesses can then be queued for execution. Sequences can be synchronous or asynchronous,\nand the overall process can be monitored for completion or failure.\n\nProcesses and tasks are executed by background workers and you can use any one of the\nfollowing gems:\n\n* [active_job](https://github.com/rails/rails/tree/main/activejob)\n* [resque](https://github.com/resque/resque)\n* [sidekiq](https://github.com/mperham/sidekiq)\n* [delayed_job](https://github.com/collectiveidea/delayed_job)\n\nThe configuration and state of each process and its respective tasks are stored using\nRedis key/values.\n\n## Requirements\n\nThe latest MRI 2.x or 3.x version. Other versions/VMs are untested, but might work fine.\nMRI 1.x is not supported.\n\nRedis 2.4 or greater is required.\n\nOne of the following background worker queue gems: `resque`, `sidekiq` or `delayed_job`.\n\n_NOTE:_ `resque` or `sidekiq` is recommended, since they use Redis as a backing store as well.\n\n## Installation\n\nAdd this line to your application's Gemfile:\n\n    gem 'taskinator'\n\nAnd then execute:\n\n    $ bundle install\n\nOr install it yourself as:\n\n    $ gem install taskinator\n\nIf you are using Taskinator within a Rails application, then add an initializer, such as\n`config/initializers/taskinator.rb`, with the following configuration content:\n\n```ruby\n# config/initializers/taskinator.rb\nTaskinator.configure do |config|\n\n  # configure the queue adapter to use\n  # can be :active_job, :delayed_job, :resque or :sidekiq\n  config.queue_adapter = :resque\n\n  # configure redis\n  config.redis = {\n    :url =\u003e 'redis://redis.example.com:7372/12',\n    :namespace =\u003e 'mynamespace'\n  }\n\nend\n```\n\nSee the configuration section below for more configuration details.\n\n## Usage\n\n### Definition\n\nStart by creating a \"process\" module and extending 
`Taskinator::Definition`.\n\n```ruby\nrequire 'taskinator'\n\nmodule MyProcess\n  extend Taskinator::Definition\n\nend\n```\n\nDefine the process using the `define_process` method.\n\n```ruby\nmodule MyProcess\n  extend Taskinator::Definition\n\n  # defines a process\n  define_process do\n\n  end\nend\n```\n\nThe `define_process` method optionally takes the list of expected arguments which are used\nto validate the arguments supplied when creating a new process.\nThese should be specified with symbols.\n\n```ruby\nmodule MyProcess\n  extend Taskinator::Definition\n\n  # defines a process\n  define_process :date, :options do\n    # ...\n  end\nend\n\n# when creating a process, 2 arguments are expected\nprocess = MyProcess.create_process Date.today, :option_1 =\u003e true\n```\n\n_NOTE:_ The current implementation performs a naive check on the count of arguments.\n\nNext, specify the tasks that make up the process, along with their corresponding\nimplementation methods, using the `task` method and providing the method to execute for each task.\n\n```ruby\nmodule MyProcess\n  extend Taskinator::Definition\n\n  define_process do\n    task :first_work_step\n    task :second_work_step\n  end\n\n  def first_work_step\n    # TODO: supply implementation\n  end\n\n  def second_work_step\n    # TODO: supply implementation\n  end\nend\n```\n\nMore complex processes may define sequential or concurrent steps, using the `sequential`\nand `concurrent` methods respectively.\n\n```ruby\nmodule MyProcess\n  extend Taskinator::Definition\n\n  define_process do\n    concurrent do\n      # these tasks will be executed concurrently\n      task :work_step_1\n      task :work_step_2\n    end\n\n    sequential do\n      # these tasks will be executed sequentially\n      task :work_step_3\n      task :work_step_4\n    end\n  end\n\n  def work_step_1\n    # TODO: supply implementation\n  end\n\n  ...\n\n  def work_step_N\n    # TODO: supply implementation\n  end\n\nend\n```\n\n#### Data Driven 
Process Definitions\n\nYou can also define data driven tasks using the `for_each` method, which takes an iterator method\nname as an argument.\n\nThe iterator method yields the parameters necessary for the task or job. Notice that the task\nmethod takes a parameter in this case, which will be the values yielded by the iterator.\n\n```ruby\nmodule MyProcess\n  extend Taskinator::Definition\n\n  define_process do\n    for_each :yield_data_elements do\n      task :work_step\n    end\n  end\n\n  def yield_data_elements\n    # TODO: supply implementation to yield elements\n    yield 1\n  end\n\n  def work_step(data_element)\n    # TODO: supply implementation\n  end\nend\n```\n\n#### Branching\n\nIt is possible to branch the process logic based on the options hash passed in when creating\na process. The `option?` method takes the options key as an argument and calls the supplied\nblock if the option is present and its value is _truthy_.\n\n```ruby\nmodule MyProcess\n  extend Taskinator::Definition\n\n  define_process do\n\n    option?(:some_setting) do\n      task :prerequisite_step\n    end\n\n    task :work_step\n\n  end\n\n  def prerequisite_step\n    # ...\n  end\n\n  def work_step\n    # ...\n  end\n\nend\n\n# now when creating the process, the `:some_setting` option can be used to branch the logic\nprocess1 = MyProcess.create_process :some_setting =\u003e true\nprocess1.tasks.count #=\u003e 2\n\nprocess2 = MyProcess.create_process\nprocess2.tasks.count #=\u003e 1\n```\n\n#### Argument Transformations\n\nIn addition, it is possible to transform the arguments used by a task or job, by including\na `transform` step in the definition.\n\nAs with the `for_each` method, `transform` takes a method name as an argument.\nThe transformer method must yield the new arguments as required.\n\n```ruby\nmodule MyProcess\n  extend Taskinator::Definition\n\n  # this process is created with a hash argument\n\n  define_process do\n    transform :convert_args do\n      
task :work_step\n    end\n  end\n\n  def convert_args(options)\n    yield options[:date_from], options[:date_to]\n  end\n\n  def work_step(date_from, date_to)\n    # TODO: supply implementation\n  end\nend\n```\n\n#### Subprocesses\n\nProcesses can be composed of other processes too:\n\n```ruby\nmodule MySubProcessA\n  ...\nend\n\nmodule MySubProcessB\n  ...\nend\n\nmodule MyProcess\n  extend Taskinator::Definition\n\n  define_process do\n    sub_process MySubProcessA\n    sub_process MySubProcessB\n  end\nend\n```\n\n#### Complex Process Definitions\n\nAny combination or nesting of `task`, `sequential`, `concurrent` and `for_each` steps is\npossible. E.g.\n\n```ruby\nmodule MyProcess\n  extend Taskinator::Definition\n\n  define_process do\n    for_each :data_elements do\n      task :work_step_begin\n\n      concurrent do\n        for_each :sub_data_elements do\n          task :work_step_all_at_once\n        end\n      end\n\n      sub_process MySubProcess\n\n      sequential do\n        for_each :sub_data_elements do\n          task :work_step_one_by_one\n        end\n      end\n\n      task :work_step_end\n    end\n  end\n\n  # \"task\" and \"iterator\" methods omitted for brevity\n\nend\n```\n\nIn this example, the `work_step_begin` is executed, followed by the `work_step_all_at_once`\nsteps, which are executed concurrently; then the sub process `MySubProcess` is created and\nexecuted, followed by the `work_step_one_by_one` tasks, which are executed sequentially; and\nfinally the `work_step_end` is executed.\n\nIt is also possible to embed conditional logic within the process definition in order to\nproduce steps based on the required logic.\n\nAll builder methods are available within the scope of the `define_process` block. 
These\nmethods include `args` and `options` which are passed into the `create_process` method\nof the definition.\n\nE.g.\n\n```ruby\nmodule MyProcess\n  extend Taskinator::Definition\n\n  define_process do\n    task :task_1\n    task :task_2\n    task :task_3 if args[2] == 3\n    task :send_notification if options[:send_notification]\n  end\n\n  # \"task\" methods are omitted for brevity\n\nend\n\n# when creating this process, you supply the arguments and options when calling `create_process`\n# in this example, 'args' will be an array [1,2,3]\n# and options will be a Hash {:send_notification =\u003e true}\nMyProcess.create_process(1, 2, 3, :send_notification =\u003e true)\n\n```\n\n#### Reusing ActiveJob jobs\n\nIt is likely that you already have one or more [jobs](https://guides.rubyonrails.org/active_job_basics.html)\nand want to reuse them within the process definition.\n\nDefine a `job` step, providing the class of the Active Job to run, and Taskinator will\ninvoke that job as part of the process.\n\nThe `job` step will be queued and executed on the same queue as\n[configured by the job](https://guides.rubyonrails.org/active_job_basics.html#queues).\n\n```ruby\n# E.g. 
A resque worker\nclass DoSomeWork\n  @queue = :high_priority\n\n  def self.perform(arg1, arg2)\n    # code to do the work\n  end\nend\n\nmodule MyProcess\n  extend Taskinator::Definition\n\n  # when creating the process, supply the same arguments\n  # that the DoSomeWork worker expects\n\n  define_process do\n    job DoSomeWork\n  end\nend\n```\n\n### Execution\n\nA process is created by calling the generated `create_process` method on your \"process\" module.\n\n```ruby\nprocess = MyProcess.create_process\n```\n\nAnd then enqueued for execution by calling the `enqueue!` method of the process.\n\n```ruby\nprocess.enqueue!\n```\n\nOr, started immediately by calling the `start!` method of the process.\n\n```ruby\nprocess = MyProcess.create_process\nprocess.start!\n```\n\n#### Arguments\n\nArgument handling for defining and executing process definitions is where things can get tricky.\n_This may be something that gets refactored down the line_.\n\nTo best understand how arguments are handled, you need to break it down into 3 phases. 
Namely:\n\n  * Definition,\n  * Creation and\n  * Execution\n\nFirstly, a process definition is declarative: the `define_process` block, together with a mix of\n`sequential`, `concurrent`, `for_each`, `task` and `job` directives, specifies the sequencing\nof the steps for the process.\n\nTaskinator will interpret this definition and execute each step with the desired sequencing\nor concurrency.\n\nConsider the following process definition:\n\n```ruby\nmodule MySimpleProcess\n  extend Taskinator::Definition\n\n  # definition\n\n  define_process do\n    task :work_step_1\n    task :work_step_2\n\n    for_each :additional_step do\n      task :work_step_3\n    end\n  end\n\n  # creation\n\n  def additional_step(options)\n    options[:steps].each do |k, v|\n      yield k, v\n    end\n  end\n\n  # execution\n\n  def work_step_1(options)\n    # ...\n  end\n\n  def work_step_2(options)\n    # ...\n  end\n\n  def work_step_3(k, v)\n    # ...\n  end\n\nend\n```\n\nThere are three tasks; namely `:work_step_1`, `:work_step_2` and `:work_step_3`.\n\nThe third task, `:work_step_3`, is built up using the `for_each` iterator, which means that\nthe number of `:work_step_3` tasks will depend on how many times the `additional_step`\niterator method yields to the definition.\n\nThis brings us to the creation part. 
When `create_process` is called on the given module,\nyou provide arguments to it, which will get passed on to the respective `task` and\n`for_each` iterator methods.\n\nSo, considering the `MySimpleProcess` module shown above, the `work_step_1`, `work_step_2`\nand `work_step_3` methods each expect arguments.\n\nThese will ultimately come from the arguments passed into the `create_process` method.\n\nE.g.\n\n```ruby\n\n# Given an options hash\noptions = {\n  :opt1 =\u003e true,\n  :opt2 =\u003e false,\n  :steps =\u003e {\n    :a =\u003e 1,\n    :b =\u003e 2,\n    :c =\u003e 3,\n  }\n}\n\n# You create the process, passing in the options hash\nprocess = MySimpleProcess.create_process(options)\n\n```\n\nTo best understand how the process is created, consider the following \"procedural\" code\nfor how it could work.\n\n```ruby\n# A process, which maps the target and a list of steps\nclass Process\n  attr_reader :target\n  attr_reader :tasks\n\n  def initialize(target)\n    @target = target\n    @tasks = []\n  end\nend\n\n# A task, which maps the method to call and its arguments\nclass Task\n  attr_reader :method\n  attr_reader :args\n\n  def initialize(method, args)\n    @method, @args = method, args\n  end\nend\n\n# Your module, with the methods which do the actual work\nmodule MySimpleProcess\n\n  def self.work_step_1(options) ...\n  def self.work_step_2(options) ...\n  def self.work_step_3(k, v) ...\n\nend\n\n# Now, the creation phase of the definition\n# create a process, providing the module\n\nprocess = Process.new(MySimpleProcess)\n\n# create the first and second tasks, providing the method\n# for the task and its arguments, which are the options defined above\n\nprocess.tasks \u003c\u003c Task.new(:work_step_1, options)\nprocess.tasks \u003c\u003c Task.new(:work_step_2, options)\n\n# iterate over the steps hash in the options, and add the third step\n# this time specify the key and value as the\n# arguments for the work_step_3 method\n\noptions[:steps].each do |k, 
v|\n  process.tasks \u003c\u003c Task.new(:work_step_3, [k, v])\nend\n\n# we now have a process with the tasks defined\n\nprocess.tasks  #=\u003e [\u003cTask :method=\u003ework_step_1, :args=\u003eoptions, ...\u003e,\n               #    \u003cTask :method=\u003ework_step_2, :args=\u003eoptions, ...\u003e,\n               #    \u003cTask :method=\u003ework_step_3, :args=\u003e[:a, 1], ...\u003e,\n               #    \u003cTask :method=\u003ework_step_3, :args=\u003e[:b, 2], ...\u003e,\n               #    \u003cTask :method=\u003ework_step_3, :args=\u003e[:c, 3], ...\u003e]\n\n```\n\nFinally, for the execution phase, the process and tasks will act on the supplied module.\n\n```ruby\n# building out the \"Process\" class\nclass Process\n  #...\n\n  def execute\n    tasks.each {|task| task.execute(target) }\n  end\nend\n\n# and the \"Task\" class\nclass Task\n  #...\n\n  def execute(target)\n    puts \"Calling '#{method}' on '#{target.name}' with #{args.inspect}...\"\n    target.send(method, *args)\n  end\nend\n\n# executing the process iterates over each task and\n# the target module's method is called with the arguments\n\nprocess.execute\n\n# Calling 'work_step_1' on 'MySimpleProcess' with {:opt1 =\u003e true, :opt2 =\u003e false, ...}\n# Calling 'work_step_2' on 'MySimpleProcess' with {:opt1 =\u003e true, :opt2 =\u003e false, ...}\n# Calling 'work_step_3' on 'MySimpleProcess' with [:a, 1]\n# Calling 'work_step_3' on 'MySimpleProcess' with [:b, 2]\n# Calling 'work_step_3' on 'MySimpleProcess' with [:c, 3]\n\n```\n\nIn reality, each task is executed by a worker process, possibly on another host, so the\nexecution process isn't as simple, but this example should help you to understand\nconceptually how the process is executed, and how the arguments are propagated through.\n\n### Monitoring\n\nNOTE: This aspect of the library is still a work in progress.\n\n#### Processes\n\nTo monitor the state of the processes, use the `Taskinator::Api::Processes` 
class.\n\n```ruby\nprocesses = Taskinator::Api::Processes.new\nprocesses.each do |process|\n  # =\u003e output the unique process identifier and current state\n  puts [:process, process.uuid, process.current_state]\nend\n```\n\n#### Web UI\n\nYou can also install a web interface for your Rails application.\nCheck https://github.com/bguban/taskinator_ui for details.\n\n#### Debugging\n\nTo aid debugging specific processes and tasks, where the process or task identifier is\nknown, it is possible to retrieve the specific task or process using `Taskinator::Api`.\n\nTo retrieve a specific process, given the process identifier:\n\n```ruby\nprocess_id = \"SUPPLY-PROCESS-IDENTIFIER\"\nprocess = Taskinator::Api.find_process(process_id)\n\nputs process.inspect\nputs process.definition\nputs process.current_state\nputs process.tasks\n# etc...\n```\n\nThe type of process may be one of the following:\n\n* `Taskinator::Process::Sequential`\n* `Taskinator::Process::Concurrent`\n\nThen, to retrieve a specific task, given the task identifier:\n\n```ruby\ntask_id = \"SUPPLY-TASK-IDENTIFIER\"\ntask = Taskinator::Api.find_task(task_id)\n\nputs task.inspect\nputs task.class\nputs task.definition\nputs task.args                # for Step and Job types\nputs task.sub_process.tasks   # for SubProcess type\n# etc...\n```\n\nDepending on the type of task, different attributes will be available for inspection.\n\nThe types include:\n\n* `Taskinator::Task::Step`\n* `Taskinator::Task::Job`\n* `Taskinator::Task::SubProcess`\n\n## Configuration\n\n### Redis\n\nBy default Taskinator assumes Redis is located at `localhost:6379`. 
This is fine for development,\nbut for many production environments you will need to point to an external Redis server.\nYou may also want to use a namespace for the Redis keys.\n\n_NOTE:_ The configuration hash _must_ have symbolized keys.\n\n```ruby\nTaskinator.configure do |config|\n\n  # redis configuration\n  config.redis = {\n    :url =\u003e 'redis://redis.example.com:7372/12',\n    :namespace =\u003e 'mynamespace'\n  }\n\nend\n```\n\nOr, alternatively, via an `ENV` variable:\n\nSet the `REDIS_PROVIDER` environment variable to the name of the environment variable which holds\nthe Redis server URL.\nE.g. On Heroku, with RedisGreen: set `REDIS_PROVIDER=REDISGREEN_URL` and Taskinator will use the\nvalue of the `REDISGREEN_URL` environment variable when connecting to Redis.\n\nYou may also use the generic `REDIS_URL`, which may be set to your own private Redis server.\n\nThe Redis configuration leverages the same setup as `sidekiq`. For advanced options, check out the\n[Sidekiq Advanced Options](https://github.com/mperham/sidekiq/wiki/Advanced-Options#complete-control)\nwiki page for more information.\n\n### Queues\n\nTo configure the queue adapter to use, set `config.queue_adapter` to one of the following values:\n\n* `:active_job`\n* `:delayed_job`\n* `:resque`\n* `:sidekiq`\n\nAs follows:\n\n```ruby\nTaskinator.configure do |config|\n\n  # configure the queue adapter to use\n  # can be :active_job, :delayed_job, :resque or :sidekiq\n  config.queue_adapter = :resque\n\nend\n```\n\nBy default the queue names for process and task workers are `default`, however, you can specify\nthe queue names as follows:\n\n```ruby\nTaskinator.configure do |config|\n\n  # queue configuration\n  config.queue_config = {\n    :process_queue =\u003e :default,\n    :task_queue =\u003e :default\n  }\n\nend\n```\n\n### Instrumentation\n\nIt is possible to instrument processes, tasks and jobs by providing an instrumenter such\nas `ActiveSupport::Notifications`.\n\n```ruby\nTaskinator.configure do |config|\n\n  # configure instrumenter to 
use\n  config.instrumenter = ActiveSupport::Notifications\n\nend\n```\n\nAlternatively, you can use the built-in instrumenter for logging to the console for debugging:\n\n```ruby\nTaskinator.configure do |config|\n\n  # configure instrumenter to use\n  config.instrumenter = Taskinator::ConsoleInstrumenter.new\n\nend\n```\n\nThe following instrumentation events are issued:\n\n| Event                           | When                                                     |\n|---------------------------------|----------------------------------------------------------|\n| `taskinator.process.created`    | After a root process gets created                        |\n| `taskinator.process.saved`      | After a root process has been persisted to Redis         |\n| `taskinator.process.enqueued`   | After a process or subprocess is enqueued for processing |\n| `taskinator.process.processing` | When a process or subprocess is processing               |\n| `taskinator.process.paused`     | When a process or subprocess is paused                   |\n| `taskinator.process.resumed`    | When a process or subprocess is resumed                  |\n| `taskinator.process.completed`  | After a process or subprocess has completed processing   |\n| `taskinator.process.cancelled`  | After a process or subprocess has been cancelled         |\n| `taskinator.process.failed`     | After a process or subprocess has failed                 |\n| `taskinator.task.enqueued`      | After a task has been enqueued                           |\n| `taskinator.task.processing`    | When a task is processing                                |\n| `taskinator.task.completed`     | After a task has completed                               |\n| `taskinator.task.cancelled`     | After a task has been cancelled                          |\n| `taskinator.task.failed`        | After a task has failed                                  |\n\nFor all events, the data included contains the following information:\n\n| Key      
                       | Value                                                    |\n|---------------------------------|----------------------------------------------------------|\n| `:type`                         | The type name of the component reporting the event       |\n| `:definition`                   | The type name of the process definition                  |\n| `:process_uuid`                 | The UUID of the root process                             |\n| `:process_options`              | Options hash of the root process                         |\n| `:uuid`                         | The UUID of the respective task, job or sub process      |\n| `:options`                      | Options hash of the component                            |\n| `:state`                        | State of the component                                   |\n| `:percentage_completed`         | The percentage of completed tasks                        |\n| `:percentage_failed`            | The percentage of failed tasks                           |\n| `:percentage_cancelled`         | The percentage of cancelled tasks                        |\n\n## Notes\n\nThe persistence logic is decoupled from the implementation, so it is possible to implement\nanother backing store if required.\n\n## Contributing\n\n1. Fork it\n2. Create your feature branch (`git checkout -b my-new-feature`)\n3. Commit your changes (`git commit -am 'Add some feature'`)\n4. Push to the branch (`git push origin my-new-feature`)\n5. 
Create a new Pull Request\n\n## License\n\nMIT Copyright (c) 2014 Chris Stefano\n\nPortions of code are from the Sidekiq project, Copyright (c) Contributed Systems LLC.\n\n## Inspiration\n\nInspired by the [sidekiq](https://github.com/mperham/sidekiq) and\n[workflow](https://github.com/geekq/workflow) gems.\n\nFor other workflow solutions, check out [Stonepath](https://github.com/bokmann/stonepath),\nthe now deprecated [ruote](https://github.com/jmettraux/ruote) gem and\n[workflow](https://github.com/geekq/workflow).\n\nAlternatively, for a robust, enterprise-ready solution, check out the\n[AWS Flow Framework for Ruby](http://docs.aws.amazon.com/amazonswf/latest/awsrbflowguide/welcome.html).\n","funding_links":[],"categories":["Ruby"],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fvirtualstaticvoid%2Ftaskinator","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fvirtualstaticvoid%2Ftaskinator","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fvirtualstaticvoid%2Ftaskinator/lists"}