{"id":13682601,"url":"https://github.com/exAspArk/batch-loader","last_synced_at":"2025-04-30T09:33:16.882Z","repository":{"id":23418090,"uuid":"98901089","full_name":"exAspArk/batch-loader","owner":"exAspArk","description":":zap: Powerful tool for avoiding N+1 DB or HTTP queries","archived":false,"fork":false,"pushed_at":"2024-04-24T15:23:08.000Z","size":282,"stargazers_count":1041,"open_issues_count":11,"forks_count":52,"subscribers_count":13,"default_branch":"main","last_synced_at":"2024-10-29T11:22:28.137Z","etag":null,"topics":["batching","dataloader","gem","graphql","graphql-ruby","n-plus-1","nplus1","ruby"],"latest_commit_sha":null,"homepage":"https://engineering.universe.com/batching-a-powerful-way-to-solve-n-1-queries-every-rubyist-should-know-24e20c6e7b94","language":"Ruby","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/exAspArk.png","metadata":{"files":{"readme":"README.md","changelog":"CHANGELOG.md","contributing":null,"funding":null,"license":"LICENSE.txt","code_of_conduct":"CODE_OF_CONDUCT.md","threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2017-07-31T14:59:54.000Z","updated_at":"2024-10-23T00:42:48.000Z","dependencies_parsed_at":"2024-06-18T12:36:15.721Z","dependency_job_id":"9e532c8e-c009-411a-bb16-e9823585fc1e","html_url":"https://github.com/exAspArk/batch-loader","commit_stats":null,"previous_names":[],"tags_count":23,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/exAspArk%2Fbatch-loader","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/exAspArk%2Fbatch-loader/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/exAspArk%2Fbatch-loa
der/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/exAspArk%2Fbatch-loader/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/exAspArk","download_url":"https://codeload.github.com/exAspArk/batch-loader/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":222383973,"owners_count":16975395,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["batching","dataloader","gem","graphql","graphql-ruby","n-plus-1","nplus1","ruby"],"created_at":"2024-08-02T13:01:49.520Z","updated_at":"2024-11-12T02:31:06.937Z","avatar_url":"https://github.com/exAspArk.png","language":"Ruby","readme":"# BatchLoader\n\n[![Coverage Status](https://coveralls.io/repos/github/exAspArk/batch-loader/badge.svg)](https://coveralls.io/github/exAspArk/batch-loader)\n[![Code Climate](https://img.shields.io/codeclimate/maintainability/exAspArk/batch-loader.svg)](https://codeclimate.com/github/exAspArk/batch-loader/maintainability)\n[![Downloads](https://img.shields.io/gem/dt/batch-loader.svg)](https://rubygems.org/gems/batch-loader)\n[![Latest Version](https://img.shields.io/gem/v/batch-loader.svg)](https://rubygems.org/gems/batch-loader)\n\nThis gem provides a generic lazy batching mechanism to avoid N+1 DB queries, HTTP queries, etc.\n\nDevelopers from these companies use `BatchLoader`:\n\n\u003ca href=\"https://about.gitlab.com/\"\u003e\u003cimg src=\"images/gitlab.png\" height=\"35\" width=\"114\" alt=\"GitLab\" style=\"max-width:100%;\"\u003e\u003c/a\u003e\n\u003cimg src=\"images/space.png\" 
height=\"35\" width=\"10\" alt=\"\" style=\"max-width:100%;\"\u003e\n\u003ca href=\"https://www.netflix.com/\"\u003e\u003cimg src=\"images/netflix.png\" height=\"35\" width=\"110\" alt=\"Netflix\" style=\"max-width:100%;\"\u003e\u003c/a\u003e\n\u003cimg src=\"images/space.png\" height=\"35\" width=\"10\" alt=\"\" style=\"max-width:100%;\"\u003e\n\u003ca href=\"https://www.alibaba.com/\"\u003e\u003cimg src=\"images/alibaba.png\" height=\"35\" width=\"86\" alt=\"Alibaba\" style=\"max-width:100%;\"\u003e\u003c/a\u003e\n\u003cimg src=\"images/space.png\" height=\"35\" width=\"10\" alt=\"\" style=\"max-width:100%;\"\u003e\n\u003ca href=\"https://www.universe.com/\"\u003e\u003cimg src=\"images/universe.png\" height=\"35\" width=\"137\" alt=\"Universe\" style=\"max-width:100%;\"\u003e\u003c/a\u003e\n\u003cimg src=\"images/space.png\" height=\"35\" width=\"10\" alt=\"\" style=\"max-width:100%;\"\u003e\n\u003ca href=\"https://www.wealthsimple.com/\"\u003e\u003cimg src=\"images/wealthsimple.png\" height=\"35\" width=\"150\" alt=\"Wealthsimple\" style=\"max-width:100%;\"\u003e\u003c/a\u003e\n\u003cimg src=\"images/space.png\" height=\"35\" width=\"10\" alt=\"\" style=\"max-width:100%;\"\u003e\n\u003ca href=\"https://decidim.org/\"\u003e\u003cimg src=\"images/decidim.png\" height=\"35\" width=\"94\" alt=\"Decidim\" style=\"max-width:100%;\"\u003e\u003c/a\u003e\n\n## Contents\n\n* [Highlights](#highlights)\n* [Usage](#usage)\n  * [Why?](#why)\n  * [Basic example](#basic-example)\n  * [How it works](#how-it-works)\n  * [RESTful API example](#restful-api-example)\n  * [GraphQL example](#graphql-example)\n  * [Loading multiple items](#loading-multiple-items)\n  * [Batch key](#batch-key)\n  * [Caching](#caching)\n  * [Replacing methods](#replacing-methods)\n* [Installation](#installation)\n* [API](#api)\n* [Related tools](#related-tools)\n* [Implementation details](#implementation-details)\n* [Development](#development)\n* [Contributing](#contributing)\n* 
[Alternatives](#alternatives)\n* [License](#license)\n* [Code of Conduct](#code-of-conduct)\n\n## Highlights\n\n* Generic utility to avoid N+1 DB queries, HTTP requests, etc.\n* Adapted Ruby implementation of battle-tested tools like [Haskell Haxl](https://github.com/facebook/Haxl), [JS DataLoader](https://github.com/facebook/dataloader), etc.\n* Batching is isolated and lazy, load data in batch where and when it's needed.\n* Automatically caches previous queries (identity map).\n* Thread-safe (`loader`).\n* No need to share batching through variables or custom defined classes.\n* No dependencies, no monkey-patches, no extra primitives such as Promises.\n\n## Usage\n\n### Why?\n\nLet's have a look at the code with N+1 queries:\n\n```ruby\ndef load_posts(ids)\n  Post.where(id: ids)\nend\n\nposts = load_posts([1, 2, 3])  #      Posts      SELECT * FROM posts WHERE id IN (1, 2, 3)\n                               #      _ ↓ _\n                               #    ↙   ↓   ↘\nusers = posts.map do |post|    #   U    ↓    ↓   SELECT * FROM users WHERE id = 1\n  post.user                    #   ↓    U    ↓   SELECT * FROM users WHERE id = 2\nend                            #   ↓    ↓    U   SELECT * FROM users WHERE id = 3\n                               #    ↘   ↓   ↙\n                               #      ¯ ↓ ¯\nputs users                     #      Users\n```\n\nThe naive approach would be to preload dependent objects on the top level:\n\n```ruby\n# With ORM in basic cases\ndef load_posts(ids)\n  Post.where(id: ids).includes(:user)\nend\n\n# But without ORM or in more complicated cases you will have to do something like:\ndef load_posts(ids)\n  # load posts\n  posts = Post.where(id: ids)\n  user_ids = posts.map(\u0026:user_id)\n\n  # load users\n  users = User.where(id: user_ids)\n  user_by_id = users.each_with_object({}) { |user, memo| memo[user.id] = user }\n\n  # map user to post\n  posts.each { |post| post.user = user_by_id[post.user_id] }\nend\n\nposts = 
load_posts([1, 2, 3])  #      Posts      SELECT * FROM posts WHERE id IN (1, 2, 3)\n                               #      _ ↓ _      SELECT * FROM users WHERE id IN (1, 2, 3)\n                               #    ↙   ↓   ↘\nusers = posts.map do |post|    #   U    ↓    ↓\n  post.user                    #   ↓    U    ↓\nend                            #   ↓    ↓    U\n                               #    ↘   ↓   ↙\n                               #      ¯ ↓ ¯\nputs users                     #      Users\n```\n\nBut the problem here is that `load_posts` now depends on the child association and knows that it has to preload data for future use. And it'll do it every time, even if it's not necessary. Can we do better? Sure!\n\n### Basic example\n\nWith `BatchLoader` we can rewrite the code above:\n\n```ruby\ndef load_posts(ids)\n  Post.where(id: ids)\nend\n\ndef load_user(post)\n  BatchLoader.for(post.user_id).batch do |user_ids, loader|\n    User.where(id: user_ids).each { |user| loader.call(user.id, user) }\n  end\nend\n\nposts = load_posts([1, 2, 3])  #      Posts      SELECT * FROM posts WHERE id IN (1, 2, 3)\n                               #      _ ↓ _\n                               #    ↙   ↓   ↘\nusers = posts.map do |post|    #   BL   ↓    ↓\n  load_user(post)              #   ↓    BL   ↓\nend                            #   ↓    ↓    BL\n                               #    ↘   ↓   ↙\n                               #      ¯ ↓ ¯\nputs users                     #      Users      SELECT * FROM users WHERE id IN (1, 2, 3)\n```\n\nAs we can see, batching is isolated and described right in a place where it's needed.\n\n### How it works\n\nIn general, `BatchLoader` returns a lazy object. Each lazy object knows which data it needs to load and how to batch the query. 
As soon as you need to use the lazy objects, they will be automatically loaded once without N+1 queries.\n\nSo, when we call `BatchLoader.for` we pass an item (`user_id`) which should be collected and used for batching later. For the `batch` method, we pass a block which will use all the collected items (`user_ids`):\n\n\u003cpre\u003e\nBatchLoader.for(post.\u003cb\u003euser_id\u003c/b\u003e).batch do |\u003cb\u003euser_ids\u003c/b\u003e, loader|\n  ...\nend\n\u003c/pre\u003e\n\nInside the block we execute a batch query for our items (`User.where`). After that, all we have to do is to call `loader` by passing an item which was used in `BatchLoader.for` method (`user_id`) and the loaded object itself (`user`):\n\n\u003cpre\u003e\nBatchLoader.for(post.\u003cb\u003euser_id\u003c/b\u003e).batch do |user_ids, loader|\n  User.where(id: user_ids).each { |user| loader.call(\u003cb\u003euser.id\u003c/b\u003e, \u003cb\u003euser\u003c/b\u003e) }\nend\n\u003c/pre\u003e\n\nWhen we call any method on the lazy object, it'll be automatically loaded through batching for all instantiated `BatchLoader`s:\n\n\u003cpre\u003e\nputs users # =\u003e SELECT * FROM users WHERE id IN (1, 2, 3)\n\u003c/pre\u003e\n\nFor more information, see the [Implementation details](#implementation-details) section.\n\n### RESTful API example\n\nNow imagine we have a regular Rails app with N+1 HTTP requests:\n\n```ruby\n# app/models/post.rb\nclass Post \u003c ApplicationRecord\n  def rating\n    HttpClient.request(:get, \"https://example.com/ratings/#{id}\")\n  end\nend\n\n# app/controllers/posts_controller.rb\nclass PostsController \u003c ApplicationController\n  def index\n    posts = Post.limit(10)\n    serialized_posts = posts.map { |post| {id: post.id, rating: post.rating} } # N+1 HTTP requests for each post.rating\n\n    render json: serialized_posts\n  end\nend\n```\n\nAs we can see, the code above will make N+1 HTTP requests, one for each post. 
Let's batch the requests with a gem called [parallel](https://github.com/grosser/parallel):\n\n```ruby\nclass Post \u003c ApplicationRecord\n  def rating_lazy\n    BatchLoader.for(self).batch do |posts, loader|\n      Parallel.each(posts, in_threads: 10) { |post| loader.call(post, post.rating) }\n    end\n  end\n\n  # ...\nend\n```\n\n`loader` is thread-safe. So, if `HttpClient` is also thread-safe, then with the `parallel` gem we can execute all HTTP requests concurrently in threads (there are some benchmarks for [concurrent HTTP requests](https://github.com/exAspArk/concurrent_http_requests) in Ruby). Thanks to Matz, MRI releases the GIL when a thread hits blocking I/O, an HTTP request in our case.\n\nIn the controller, all we have to do is replace `post.rating` with the lazy `post.rating_lazy`:\n\n```ruby\nclass PostsController \u003c ApplicationController\n  def index\n    posts = Post.limit(10)\n    serialized_posts = posts.map { |post| {id: post.id, rating: post.rating_lazy} }\n\n    render json: serialized_posts\n  end\nend\n```\n\n`BatchLoader` caches the loaded values. To ensure that the cache is purged between requests in the app, add the following middleware to your `config/application.rb`:\n\n```ruby\nconfig.middleware.use BatchLoader::Middleware\n```\n\nSee the [Caching](#caching) section for more information.\n\n### GraphQL example\n\nBatching is particularly useful with GraphQL. 
Techniques such as preloading data in advance to avoid N+1 queries can be very complicated to apply, since a user can ask for any available fields in a query.\n\nLet's take a look at a simple [graphql-ruby](https://github.com/rmosolgo/graphql-ruby) schema example:\n\n```ruby\nclass MyProjectSchema \u003c GraphQL::Schema\n  query Types::QueryType\nend\n\nmodule Types\n  class QueryType \u003c Types::BaseObject\n    field :posts, [PostType], null: false\n\n    def posts\n      Post.all\n    end\n  end\nend\n\nmodule Types\n  class PostType \u003c Types::BaseObject\n    name \"Post\"\n\n    field :user, UserType, null: false\n\n    def user\n      object.user # N+1 queries\n    end\n  end\nend\n\nmodule Types\n  class UserType \u003c Types::BaseObject\n    name \"User\"\n\n    field :name, String, null: false\n  end\nend\n```\n\nIf we want to execute a simple query like the following, we will get N+1 queries for each `post.user`:\n\n```ruby\nquery = \"\n{\n  posts {\n    user {\n      name\n    }\n  }\n}\n\"\nMyProjectSchema.execute(query)\n```\n\nTo avoid this problem, all we have to do is change the resolver to return `BatchLoader::GraphQL` ([#32](https://github.com/exAspArk/batch-loader/pull/32) explains why not just `BatchLoader`):\n\n```ruby\nmodule Types\n  class PostType \u003c Types::BaseObject\n    name \"Post\"\n\n    field :user, UserType, null: false\n\n    def user\n      BatchLoader::GraphQL.for(object.user_id).batch do |user_ids, loader|\n        User.where(id: user_ids).each { |user| loader.call(user.id, user) }\n      end\n    end\n  end\nend\n```\n\nAnd set up GraphQL to use the built-in `lazy_resolve` method:\n\n```ruby\nclass MyProjectSchema \u003c GraphQL::Schema\n  query Types::QueryType\n  use BatchLoader::GraphQL\nend\n```\n\n---\n\nIf you need to use BatchLoader with ActiveRecord in multiple places, you can use this `preload:` helper shared by [Aha!](https://www.aha.io/engineering/articles/automatically-avoiding-graphql-n-1s):\n\n```rb\nfield 
:user, UserType, null: false, preload: :user\n#                                   ^^^^^^^^^^^^^^\n# Simply add this instead of defining custom `user` method with BatchLoader\n```\n\nAnd add this custom field resolver that uses ActiveRecord's preload functionality with BatchLoader:\n\n```rb\n# app/graphql/types/base_object.rb\nfield_class Types::PreloadableField\n\n# app/graphql/types/preloadable_field.rb\nclass Types::PreloadableField \u003c Types::BaseField\n  def initialize(*args, preload: nil, **kwargs, \u0026block)\n    @preloads = preload\n    super(*args, **kwargs, \u0026block)\n  end\n\n  def resolve(type, args, ctx)\n    return super unless @preloads\n\n    BatchLoader::GraphQL.for(type).batch(key: self) do |records, loader|\n      ActiveRecord::Associations::Preloader.new(records: records.map(\u0026:object), associations: @preloads).call\n      records.each { |r| loader.call(r, super(r, args, ctx)) }\n    end\n  end\nend\n```\n\n### Loading multiple items\n\nFor batches where there is no item in response to a call, we normally return `nil`. However, you can use `:default_value` to return something else instead:\n\n```ruby\nBatchLoader.for(post.user_id).batch(default_value: NullUser.new) do |user_ids, loader|\n  User.where(id: user_ids).each { |user| loader.call(user.id, user) }\nend\n```\n\nFor batches where the value is some kind of collection, such as an Array or Hash, `loader` also supports being called with a block, which yields the _current_ value, and returns the _next_ value. 
This is extremely useful for 1:Many (`has_many`) relationships:\n\n```ruby\nBatchLoader.for(user.id).batch(default_value: []) do |user_ids, loader|\n  Comment.where(user_id: user_ids).each do |comment|\n    loader.call(comment.user_id) { |memo| memo \u003c\u003c comment }\n  end\nend\n```\n\n### Batch key\n\nIt's possible to reuse the same `BatchLoader#batch` block for loading different types of data by specifying a unique `key`.\nFor example, with polymorphic associations:\n\n```ruby\ndef lazy_association(post)\n  id = post.association_id\n  key = post.association_type\n\n  BatchLoader.for(id).batch(key: key) do |ids, loader, args|\n    model = Object.const_get(args[:key])\n    model.where(id: ids).each { |record| loader.call(record.id, record) }\n  end\nend\npost1 = Post.create(association_id: 1, association_type: 'Tag')\npost2 = Post.create(association_id: 1, association_type: 'Category')\n\nlazy_association(post1) # SELECT * FROM tags WHERE id IN (1)\nlazy_association(post2) # SELECT * FROM categories WHERE id IN (1)\n```\n\nIt's also required to pass a custom `key` when using `BatchLoader` with metaprogramming (e.g. `eval`).\n\n### Caching\n\nBy default, `BatchLoader` caches the loaded values. You can test it by running something like:\n\n```ruby\ndef user_lazy(id)\n  BatchLoader.for(id).batch do |ids, loader|\n    User.where(id: ids).each { |user| loader.call(user.id, user) }\n  end\nend\n\nputs user_lazy(1) # SELECT * FROM users WHERE id IN (1)\n# =\u003e #\u003cUser:...\u003e\n\nputs user_lazy(1) # no request\n# =\u003e #\u003cUser:...\u003e\n```\n\nUsually, it's enough to clear the cache between HTTP requests in the app. 
To do so, simply add the middleware:\n\n```ruby\nuse BatchLoader::Middleware\n```\n\nTo drop the cache manually, you can run:\n\n```ruby\nputs user_lazy(1) # SELECT * FROM users WHERE id IN (1)\nputs user_lazy(1) # no request\n\nBatchLoader::Executor.clear_current\n\nputs user_lazy(1) # SELECT * FROM users WHERE id IN (1)\n```\n\nIn some rare cases, it's useful to disable caching for `BatchLoader`. For example, in tests or after data mutations:\n\n```ruby\ndef user_lazy(id)\n  BatchLoader.for(id).batch(cache: false) do |ids, loader|\n    # ...\n  end\nend\n\nputs user_lazy(1) # SELECT * FROM users WHERE id IN (1)\nputs user_lazy(1) # SELECT * FROM users WHERE id IN (1)\n```\n\nIf you set `cache: false`, it's likely you also want `replace_methods: false` (see the section below).\n\n### Replacing methods\n\nBy default, `BatchLoader` replaces methods on its instance by calling `#define_method` after batching to copy methods from the loaded value.\nThis takes some time up front but speeds up all future method calls on the instance.\nIn some cases, when there are a lot of instances with a huge number of defined methods, this initial process of replacing the methods can be slow.\nYou may prefer to avoid the \"up front payment\" and \"pay as you go\" with `#method_missing` by disabling the method replacement:\n\n```ruby\nBatchLoader.for(id).batch(replace_methods: false) do |ids, loader|\n  # ...\nend\n```\n\n## Installation\n\nAdd this line to your application's Gemfile:\n\n```ruby\ngem 'batch-loader'\n```\n\nAnd then execute:\n\n    $ bundle\n\nOr install it yourself as:\n\n    $ gem install batch-loader\n\n## API\n\n```ruby\nBatchLoader.for(item).batch(\n  default_value: default_value,\n  cache: cache,\n  replace_methods: replace_methods,\n  key: key\n) do |items, loader, args|\n  # ...\nend\n```\n\n| Argument Key      | Default                                                              | Description                                                                       
    |\n| ---------------   | ---------------------------------------------                        | -------------------------------------------------------------                         |\n| `item`            | -                                                                    | Item which will be collected and used for batching.                                   |\n| `default_value`   | `nil`                                                                | Value returned by default after batching.                                             |\n| `cache`           | `true`                                                               | Set `false` to disable caching between the same executions.                           |\n| `replace_methods` | `true`                                                               | Set `false` to use `#method_missing` instead of replacing the methods after batching. |\n| `key`             | `nil`                                                                | Pass custom key to uniquely identify the batch block.                                 |\n| `items`           | -                                                                    | List of collected items for batching.                                                 |\n| `loader`          | -                                                                    | Lambda which should be called to load values loaded in batch.                         |\n| `args`            | `{default_value: nil, cache: true, replace_methods: true, key: nil}` | Arguments passed to the `batch` method.                                               
|\n\n## Related tools\n\nThese gems are built by using `BatchLoader`:\n\n* [decidim-core](https://github.com/decidim/decidim/) – participatory democracy framework made with Ruby on Rails.\n* [ams_lazy_relationships](https://github.com/Bajena/ams_lazy_relationships/) – ActiveModel Serializers add-on for eliminating N+1 queries.\n* [batch-loader-active-record](https://github.com/mathieul/batch-loader-active-record/) – ActiveRecord lazy association generator to avoid N+1 DB queries.\n\n`BatchLoader` in other programming languages:\n\n* [batch_loader](https://github.com/exaspark/batch_loader) - Elixir implementation.\n\n## Implementation details\n\nSee the [slides](https://speakerdeck.com/exaspark/batching-a-powerful-way-to-solve-n-plus-1-queries) [37-42].\n\n## Development\n\nAfter checking out the repo, run `bin/setup` to install dependencies. Then, run `rake spec` to run the tests. You can also run `bin/console` for an interactive prompt that will allow you to experiment.\n\nTo install this gem onto your local machine, run `bundle exec rake install`. To release a new version, update the version number in `version.rb`, and then run `bundle exec rake release`, which will create a git tag for the version, push git commits and tags, and push the `.gem` file to [rubygems.org](https://rubygems.org).\n\n## Contributing\n\nBug reports and pull requests are welcome on GitHub at https://github.com/exAspArk/batch-loader. This project is intended to be a safe, welcoming space for collaboration, and contributors are expected to adhere to the [Contributor Covenant](http://contributor-covenant.org) code of conduct.\n\n## Alternatives\n\nThere are some other Ruby implementations for batching such as:\n\n* [shopify/graphql-batch](https://github.com/shopify/graphql-batch)\n* [sheerun/dataloader](https://github.com/sheerun/dataloader)\n\nHowever, `batch-loader` has some differences:\n\n* It is implemented for general usage and can be used not only with GraphQL. 
In fact, we use it for RESTful APIs and GraphQL in production at the same time.\n* It doesn't try to mimic implementations in other programming languages which have an asynchronous nature. So, it doesn't load extra dependencies to bring in such primitives as Promises, which are not very popular in the Ruby community.\nInstead, it uses the idea of lazy objects, which are included in the [Ruby standard library](https://ruby-doc.org/core-2.4.1/Enumerable.html#method-i-lazy). These lazy objects allow returning the necessary data only when it's actually needed.\n* It doesn't force you to share batching through variables or custom-defined classes; just pass a block to the `batch` method.\n* It doesn't require returning an array of the loaded objects in the same order as the passed items. Such constraints are difficult to satisfy: sorting the loaded objects and adding `nil` values for the missing ones. Instead, it provides the `loader` lambda, which simply maps an item to the loaded object.\n* It has no external dependencies. For example, there's no need to load huge external libraries for thread safety; the gem is thread-safe out of the box.\n\n## License\n\nThe gem is available as open source under the terms of the [MIT License](http://opensource.org/licenses/MIT).\n\n## Code of Conduct\n\nEveryone interacting in the Batch::Loader project’s codebases, issue trackers, chat rooms and mailing lists is expected to follow the [code of conduct](https://github.com/exAspArk/batch-loader/blob/master/CODE_OF_CONDUCT.md).\n","funding_links":[],"categories":["Ruby"],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2FexAspArk%2Fbatch-loader","html_url":"https://awesome.ecosyste.ms/projects/github.com%2FexAspArk%2Fbatch-loader","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2FexAspArk%2Fbatch-loader/lists"}