BatchIt
==============

BatchIt is a very simple F# library that allows you to write batched async functions and consume them as if they were non-batched.

It’s useful if you have DB queries that are executed very frequently. This may for example be the case when using GraphQL resolvers or similar (e.g. JSON-API with [Felicity](https://github.com/cmeeren/Felicity)).
BatchIt will also transparently take care of deduplicating your input arguments, so that the batched function receives a distinct set of inputs.

Installation
------------

Install the [BatchIt](https://www.nuget.org/packages/BatchIt) package from NuGet.

Quick start
-----------

BatchIt centers around these two function signatures:

* Normal: `'a -> Async<'b>`
  * Accepts a single argument `'a` (which can be a tuple or a record if you need multiple values) and returns a single value `'b`
* Batched: `'a [] -> Async<('a * 'b) []>`
  * Accepts an array of `'a` (all the values to use for a single batch) and returns an array of output values `'b` together with their corresponding input values (so that BatchIt can correlate inputs with outputs).

Your job is to implement the batched function, and then you use BatchIt to configure timing and convert it to the normal version. Any calls to the normal version will then be batched using your configuration and your batched function.

### Example

```fsharp
type Person = { Name: string }
type FindArgs = { PersonId: Guid; TenantId: Guid }

let getPersonBatched (args: FindArgs []) : Async<(FindArgs * Person option) []> =
  async {
    // BatchIt is not concerned with what actually happens inside this function.

    // Here you can e.g. query a DB with all the args. If using SQL, you might
    // use TVP or XML to get the args into the query and use INNER JOIN instead
    // of WHERE to filter your results.

    // Note that the query must return the input args for each row along with the
    // output values, so that you can create the FindArgs to return, too.

    // Example:

    let! rows = args |> ...
    // get rows for all input args

    return
      rows
      |> Seq.toArray
      |> Array.groupBy (fun r ->
          { PersonId = r.inputPersonId; TenantId = r.inputTenantId })
      |> Array.map (fun (batchKey, rows) ->
          batchKey,
          rows |> Array.tryHead |> Option.map (fun r -> { Name = r.name }))
  }

// Now create the "normal" function using BatchIt:

open BatchIt

let getPerson : FindArgs -> Async<Person option> =
  Batch.Create(getPersonBatched, 50, 100, 1000)

// Now any calls to the seemingly normal getPerson function will be batched.
// Each call resets the wait time to 50 ms, and the batch will wait at least
// 100 ms and at most 1000 ms before executing.
```

### Don’t add arguments to the non-batched function

The non-batched function needs to be defined without arguments (as a module-level function-typed value), as shown above. Don’t do the following:

```fsharp
// Don't do this!
let getPerson args = Batch.Create(getPersonBatched, 50, 100, 1000) args
```

The above would result in a new batched version being created for every invocation, and no batching would take place (all batch sizes would be 1, plus there’d be a big overhead for each call).

If you want to make the non-batched function prettier (e.g. to name the arguments), simply wrap it in another function:

```fsharp
let private getPerson' = Batch.Create(getPersonBatched, 50, 100, 1000)

let getPerson args = getPerson' args
```
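To make the batching and deduplication concrete, here is a hypothetical usage sketch (the ids and tenant value are placeholders, and it assumes the `FindArgs` type and the `getPerson` value from the Quick start example above). All three lookups start within the configured batching window, so BatchIt can serve them with a single call to `getPersonBatched`, and the duplicated argument value is deduplicated before the batch runs:

```fsharp
// Assumes FindArgs and getPerson as defined in the Quick start example.
let lookupPeople (tenantId: Guid) (id1: Guid) (id2: Guid) =
  async {
    let! results =
      [ { PersonId = id1; TenantId = tenantId }
        { PersonId = id2; TenantId = tenantId }
        { PersonId = id1; TenantId = tenantId } ] // duplicate of the first arg
      |> List.map getPerson
      |> Async.Parallel
    // results : Person option [] -- one element per call, in call order,
    // even though getPersonBatched only saw two distinct FindArgs values.
    return results
  }
```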
Configuration
-------------

BatchIt’s only public API is the `Batch.Create` overloads. You have the following options:

* Throttle-like: Execute the batch X ms after the first call
  * Optionally also limit the max batch size
* Debounce-like: Execute the batch X ms after the last call (the timer resets when new calls come in before X ms have passed), but wait at least Y ms and at most Z ms
  * Optionally also limit the max batch size
* Specify the value the “normal” function should return if the batched output did not contain an item corresponding to the supplied input
  * For `option`, `voption`, `list`, and `array`, BatchIt will automatically use `None`, `ValueNone`, `List.empty`, and `Array.empty`, respectively
  * For other types, you must supply the missing value yourself

Extra parameters to the batched function (e.g. connection strings)
------------------------------------------------------------------

BatchIt supports functions that accept one extra parameter in addition to the argument array. This can be used e.g. for passing a connection string. (To pass multiple values, use a tuple or record.) The extra argument will be passed through from the non-batched function returned by `Batch.Create`.

Example:

```fsharp
let getPersonBatched
    (connStr: string)
    (args: FindArgs array)
    : Async<(FindArgs * Person option) array> =
  ... // like shown previously, but obviously uses the connection string somewhere


open BatchIt

let getPerson : string -> FindArgs -> Async<Person option> =
  Batch.Create(getPersonBatched, 50, 100, 1000)
```

If the non-batched function is called with different values for the extra argument, all values are still collected and run as part of the same batch (regarding timing, max size, etc.), but the batched arguments are grouped by the corresponding extra argument value, and the batched function is then invoked once for each unique extra value (with the corresponding batched arguments).
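As a hypothetical sketch of that grouping behaviour (the connection strings and arguments are placeholders, reusing `FindArgs` and the two-parameter `getPerson` from the example above): two overlapping calls with different extra arguments fall into the same batch window, but `getPersonBatched` is invoked once per distinct connection string, each time with only the arguments collected for that connection string.

```fsharp
// Assumes FindArgs and the string -> FindArgs -> Async<Person option>
// getPerson shown above. Connection strings here are placeholders.
let lookupAcrossTenants (argsA: FindArgs) (argsB: FindArgs) =
  async {
    // Start both lookups so they land in the same batching window.
    let! a = getPerson "Server=db1;Database=TenantA" argsA |> Async.StartChild
    let! b = getPerson "Server=db2;Database=TenantB" argsB |> Async.StartChild
    let! personA = a // served by the batch invoked for the TenantA conn string
    let! personB = b // served by the batch invoked for the TenantB conn string
    return personA, personB
  }
```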
Contributing
------------

Contributions and ideas are welcome! Please see [Contributing.md](https://github.com/cmeeren/BatchIt/blob/master/.github/CONTRIBUTING.md) for details.