{"id":27070271,"url":"https://github.com/r-lib/mirai","last_synced_at":"2025-04-05T22:08:25.077Z","repository":{"id":50226368,"uuid":"459341940","full_name":"r-lib/mirai","owner":"r-lib","description":"mirai - Minimalist Async Evaluation Framework for R","archived":false,"fork":false,"pushed_at":"2025-04-01T20:45:58.000Z","size":11888,"stargazers_count":220,"open_issues_count":4,"forks_count":10,"subscribers_count":8,"default_branch":"main","last_synced_at":"2025-04-01T21:26:30.824Z","etag":null,"topics":["async","asynchronous-tasks","concurrency","distributed-computing","high-performance-computing","parallel-computing","r"],"latest_commit_sha":null,"homepage":"https://r-lib.github.io/mirai/","language":"R","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"gpl-3.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/r-lib.png","metadata":{"files":{"readme":"README.Rmd","changelog":"NEWS.md","contributing":null,"funding":null,"license":"LICENSE.md","code_of_conduct":".github/CODE_OF_CONDUCT.md","threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2022-02-14T22:11:55.000Z","updated_at":"2025-04-01T20:44:11.000Z","dependencies_parsed_at":"2023-10-15T03:25:08.780Z","dependency_job_id":"275d847b-1732-41d1-8b42-d7ec399ad990","html_url":"https://github.com/r-lib/mirai","commit_stats":{"total_commits":898,"total_committers":1,"mean_commits":898.0,"dds":0.0,"last_synced_commit":"9566bbc35bd89d24939bcc8864fc29f6fc11d486"},"previous_names":["r-lib/mirai"],"tags_count":40,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/r-lib%2Fmirai","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/r-lib%2Fmirai/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hos
ts/GitHub/repositories/r-lib%2Fmirai/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/r-lib%2Fmirai/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/r-lib","download_url":"https://codeload.github.com/r-lib/mirai/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":247406091,"owners_count":20933803,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["async","asynchronous-tasks","concurrency","distributed-computing","high-performance-computing","parallel-computing","r"],"created_at":"2025-04-05T22:08:24.188Z","updated_at":"2025-04-05T22:08:25.019Z","avatar_url":"https://github.com/r-lib.png","language":"R","readme":"---\noutput: github_document\n---\n\n\u003c!-- README.md is generated from README.Rmd. 
Please edit that file --\u003e\n\n```{r, include = FALSE}\nknitr::opts_chunk$set(\n  collapse = TRUE,\n  comment = \"#\u003e\",\n  fig.path = \"man/figures/README-\",\n  out.width = \"100%\"\n)\n```\n\n# mirai \u003ca href=\"https://mirai.r-lib.org/\" alt=\"mirai\"\u003e\u003cimg src=\"man/figures/logo.png\" alt=\"mirai logo\" align=\"right\" width=\"120\"/\u003e\u003c/a\u003e\n\n\u003c!-- badges: start --\u003e\n[![CRAN status](https://www.r-pkg.org/badges/version/mirai)](https://CRAN.R-project.org/package=mirai)\n[![R-universe status](https://r-lib.r-universe.dev/badges/mirai)](https://r-lib.r-universe.dev/mirai)\n[![R-CMD-check](https://github.com/r-lib/mirai/workflows/R-CMD-check/badge.svg)](https://github.com/r-lib/mirai/actions)\n[![codecov](https://codecov.io/gh/r-lib/mirai/graph/badge.svg)](https://app.codecov.io/gh/r-lib/mirai)\n[![DOI](https://zenodo.org/badge/459341940.svg)](https://zenodo.org/badge/latestdoi/459341940)\n\u003c!-- badges: end --\u003e\n\n### ミライ\n\u003cbr /\u003e\nみらい  未来\n\u003cbr /\u003e\u003cbr /\u003e\nMinimalist Async Evaluation Framework for R\n\u003cbr /\u003e\u003cbr /\u003e\nDesigned for simplicity, a 'mirai' evaluates an R expression asynchronously in a parallel process, locally or distributed over the network. The result is automatically available upon completion.\n\nModern networking and concurrency, built on [nanonext](https://github.com/r-lib/nanonext/) and [NNG](https://nng.nanomsg.org/) (Nanomsg Next Gen), ensures reliable and efficient scheduling over fast inter-process communications or TCP/IP secured by TLS. Distributed computing can launch remote resources via SSH or cluster managers.\n\nAn inherently queued architecture handles many more tasks than available processes, and requires no storage on the file system. 
Innovative features include support for otherwise non-exportable reference objects, event-driven promises, and asynchronous parallel map.\n\u003cbr /\u003e\u003cbr /\u003e\n\n### Quick Start\n\nUse `mirai()` to evaluate an expression asynchronously in a separate, clean R process.\n\nThe following mimics an expensive calculation that eventually returns a vector of random values.\n```{r exec}\nlibrary(mirai)\n\nm \u003c- mirai({Sys.sleep(n); rnorm(n, mean)}, n = 5L, mean = 7)\n```\nThe mirai expression is evaluated in another process and hence must be self-contained, not referring to variables that do not already exist there. Above, the variables `n` and `mean` are passed as part of the `mirai()` call.\n\nA 'mirai' object is returned immediately - creating a mirai never blocks the session.\n\nWhilst the async operation is ongoing, attempting to access a mirai's data yields an 'unresolved' logical NA.\n```{r do}\nm\nm$data\n```\nTo check whether a mirai remains unresolved (yet to complete):\n```{r unres}\nunresolved(m)\n```\nTo wait for and collect the return value, use the mirai's `[]` method:\n```{r call}\nm[]\n```\nAs a mirai represents an async operation, it is never necessary to wait for it. Other code can continue to run. 
Once it completes, the return value automatically becomes available at `$data`.\n```{r resolv}\nwhile (unresolved(m)) {\n  # do work here that does not depend on 'm'\n}\nm\nm$data\n```\n\n#### Daemons\n\nDaemons are persistent background processes for receiving mirai requests, and are created as easily as:\n\n```{r daemons}\ndaemons(4)\n```\n\nDaemons may also be deployed [remotely](https://mirai.r-lib.org/articles/mirai.html#distributed-computing-remote-daemons) for distributed computing, and [launchers](https://mirai.r-lib.org/articles/mirai.html#distributed-computing-launching-daemons) can start daemons across the network via (tunnelled) SSH or a cluster resource manager.\n\n[Secure TLS connections](https://mirai.r-lib.org/articles/mirai.html#distributed-computing-tls-secure-connections) can be used for remote daemon connections, with zero configuration required.\n\n#### Async Parallel Map\n\n`mirai_map()` maps a function over a list or vector, with each element processed in a separate parallel process. It also performs multiple map over the rows of a dataframe or matrix.\n\n```{r map}\ndf \u003c- data.frame(\n  fruit = c(\"melon\", \"grapes\", \"coconut\"),\n  price = c(3L, 5L, 2L)\n)\nm \u003c- mirai_map(df, \\(...) sprintf(\"%s: $%d\", ...))\n```\nA 'mirai_map' object is returned immediately. Other code can continue to run at this point. Its value may be retrieved at any time using its `[]` method to return a list, just like `purrr::map()` or `base::lapply()`. The `[]` method also provides options for flatmap, early stopping and/or progress indicators. \n```{r mapvalue}\nm\nm[.flat]\n```\nAll errors are returned as 'errorValues', facilitating recovery from partial failure. 
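For instance (a minimal sketch in which `results` and `ok` are hypothetical names; `is_error_value()` is re-exported by mirai from nanonext), failed elements can be filtered out after collection:\n\n```r\nresults \u003c- m[]\nok \u003c- !vapply(results, is_error_value, logical(1))\nresults[ok]\n```\n\n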
There are further [advantages](https://mirai.r-lib.org/articles/mirai.html#asynchronous-parallel-map) over alternative map implementations.\n\n### Design Concepts\n\nmirai is designed from the ground up to provide a production-grade experience.\n\n- Fast\n  + 1,000x more responsive compared to common alternatives [\u003csup\u003e[1]\u003c/sup\u003e](https://github.com/r-lib/mirai/pull/142#issuecomment-2457589563)\n  + Built for low-latency applications such as real time inference or Shiny apps\n\n- Reliable\n  + Consistent behaviour with no reliance on global options or variables\n  + Each mirai call is evaluated explicitly for transparent and predictable results\n\n- Scalable\n  + Launch millions of tasks simultaneously over thousands of connections\n  + Proven track record handling heavy-duty workloads in the life sciences industry\n\n[\u003cimg alt=\"Joe Cheng on mirai with Shiny\" src=\"https://img.youtube.com/vi/GhX0PcEm3CY/hqdefault.jpg\" width = \"300\" height=\"225\" /\u003e](https://youtu.be/GhX0PcEm3CY?t=1740) \u0026nbsp;\n[\u003cimg alt=\"Will Landau on mirai in clinical trials\" src=\"https://img.youtube.com/vi/cyF2dzloVLo/hqdefault.jpg\" width = \"300\" height=\"225\" /\u003e](https://youtu.be/cyF2dzloVLo?t=5127)\n\n\u003e *When I tried the mirai package, I was surprised at how fast it was*\n\n### Integrations\n\nmirai features the following core integrations, with usage examples in the linked vignettes:\n\n[\u003cimg alt=\"R parallel\" src=\"https://www.r-project.org/logo/Rlogo.png\" width=\"40\" height=\"31\" /\u003e](https://mirai.r-lib.org/articles/parallel.html) \u0026nbsp; Provides an alternative communications backend for R, implementing a new parallel cluster type, a feature request by R-Core at R Project Sprint 2023. 
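A minimal 'miraiCluster' sketch (hedged: `make_cluster()` and `stop_cluster()` are mirai exports; the worker count here is illustrative):\n\n```r\ncl \u003c- make_cluster(2)\nparallel::parLapply(cl, 1:4, function(x) x * 2)\nstop_cluster(cl)\n```\n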
'miraiCluster' may also be used with 'foreach' via 'doParallel'.\n\n[\u003cimg alt=\"promises\" src=\"https://solutions.posit.co/images/brand/posit-icon-fullcolor.svg\" width=\"40\" height=\"36\" /\u003e](https://mirai.r-lib.org/articles/promises.html) \u0026nbsp; Implements the next generation of completely event-driven, non-polling promises. 'mirai' and 'mirai_map' objects may be used interchangeably with 'promises', including with the promise pipe `%...\u003e%`.\n\n[\u003cimg alt=\"Shiny\" src=\"https://github.com/rstudio/shiny/raw/main/man/figures/logo.png\" width=\"40\" height=\"46\" /\u003e](https://mirai.r-lib.org/articles/shiny.html) \u0026nbsp; Asynchronous parallel / distributed backend, supporting the next level of responsiveness and scalability within Shiny, with native support for ExtendedTask.\n\n[\u003cimg alt=\"Plumber\" src=\"https://rstudio.github.io/cheatsheets/html/images/logo-plumber.png\" width=\"40\" height=\"46\" /\u003e](https://mirai.r-lib.org/articles/plumber.html) \u0026nbsp; Asynchronous parallel / distributed backend for scaling Plumber applications in production.\n\n[\u003cimg alt=\"Arrow\" src=\"https://arrow.apache.org/img/arrow-logo_hex_black-txt_white-bg.png\" width=\"40\" height=\"46\" /\u003e](https://mirai.r-lib.org/articles/databases.html) \u0026nbsp; Allows queries using the Apache Arrow format to be handled seamlessly over ADBC database connections hosted in background processes.\n\n[\u003cimg alt=\"torch\" src=\"https://torch.mlverse.org/css/images/hex/torch.png\" width=\"40\" height=\"46\" /\u003e](https://mirai.r-lib.org/articles/torch.html) \u0026nbsp; Allows Torch tensors and complex objects such as models and optimizers to be used seamlessly across parallel processes.\n\n### Powering Crew and Targets High Performance Computing\n\n[\u003cimg alt=\"targets\" src=\"https://github.com/ropensci/targets/raw/main/man/figures/logo.png\" width=\"40\" height=\"46\" /\u003e](https://docs.ropensci.org/targets/) \u0026nbsp; Targets, 
a Make-like pipeline tool for statistics and data science, has integrated and adopted the crew package as its default high-performance computing backend.\n\n[\u003cimg alt=\"crew\" src=\"https://github.com/wlandau/crew/raw/main/man/figures/logo.png\" width=\"40\" height=\"46\" /\u003e](https://wlandau.github.io/crew/) \u0026nbsp; Crew is a distributed worker-launcher extending mirai to different distributed computing platforms, from traditional clusters to cloud services.\n\n[\u003cimg alt=\"crew.cluster\" src=\"https://github.com/wlandau/crew.cluster/raw/main/man/figures/logo.png\" width=\"40\" height=\"46\" /\u003e](https://wlandau.github.io/crew.cluster/) \u0026nbsp; 'crew.cluster' enables mirai-based workflows on traditional high-performance computing clusters using LSF, PBS/TORQUE, SGE and Slurm.\n\n[\u003cimg alt=\"crew.aws.batch\" src=\"https://github.com/wlandau/crew.aws.batch/raw/main/man/figures/logo.png\" width=\"40\" height=\"46\" /\u003e](https://wlandau.github.io/crew.aws.batch/) \u0026nbsp; 'crew.aws.batch' extends mirai to cloud computing using AWS Batch.\n\n### Thanks\n\nWe would like to thank in particular:\n\n[Will Landau](https://github.com/wlandau/) for being instrumental in shaping development of the package, from initiating the original request for persistent daemons, through to orchestrating robustness testing for the high performance computing requirements of crew and targets.\n\n[Joe Cheng](https://github.com/jcheng5/) for integrating the 'promises' method to work seamlessly within Shiny, and prototyping event-driven promises.\n\n[Luke Tierney](https://github.com/ltierney/) of R Core, for discussion on L'Ecuyer-CMRG streams to ensure statistical independence in parallel processing, and making it possible for mirai to be the first 'alternative communications backend for R'.\n\n[Henrik Bengtsson](https://github.com/HenrikBengtsson/) for valuable insights leading to the interface accepting broader usage patterns.\n\n[Daniel 
Falbel](https://github.com/dfalbel/) for discussion around an efficient solution to serialization and transmission of torch tensors.\n\n[Kirill Müller](https://github.com/krlmlr/) for discussion on using parallel processes to host Arrow database connections.\n\n[\u003cimg alt=\"R Consortium\" src=\"https://r-consortium.org/images/RConsortium_Horizontal_Pantone.webp\" width=\"100\" height=\"22\" /\u003e](https://r-consortium.org/)\u0026nbsp; for funding work on the TLS implementation in nanonext, used to provide secure connections in mirai.\n\n### Installation\n\nInstall the latest release from CRAN:\n\n```{r cran, eval=FALSE}\ninstall.packages(\"mirai\")\n```\n\nThe current development version is available from R-universe:\n\n```{r universe, eval=FALSE}\ninstall.packages(\"mirai\", repos = \"https://r-lib.r-universe.dev\")\n```\n\n### Links \u0026 References\n\n◈ mirai R package: \u003chttps://mirai.r-lib.org/\u003e \u003cbr /\u003e\n◈ nanonext R package: \u003chttps://nanonext.r-lib.org/\u003e\n\nmirai is listed in CRAN High Performance Computing Task View: \u003cbr /\u003e\n\u003chttps://cran.r-project.org/view=HighPerformanceComputing\u003e\n\n--\n\nPlease note that this project is released with a [Contributor Code of Conduct](https://mirai.r-lib.org/CODE_OF_CONDUCT.html). By participating in this project you agree to abide by its terms.\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fr-lib%2Fmirai","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fr-lib%2Fmirai","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fr-lib%2Fmirai/lists"}