{"id":13423585,"url":"https://github.com/flashlight/flashlight","last_synced_at":"2025-04-23T20:58:16.777Z","repository":{"id":37105040,"uuid":"161376600","full_name":"flashlight/flashlight","owner":"flashlight","description":"A C++ standalone library for machine learning","archived":false,"fork":false,"pushed_at":"2025-03-28T02:16:30.000Z","size":16335,"stargazers_count":5361,"open_issues_count":122,"forks_count":502,"subscribers_count":118,"default_branch":"main","last_synced_at":"2025-04-23T20:58:10.533Z","etag":null,"topics":["autograd","cpp","deep-learning","flashlight","machine-learning","ml","neural-network"],"latest_commit_sha":null,"homepage":"https://fl.readthedocs.io/en/latest/","language":"C++","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/flashlight.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":"CONTRIBUTING.md","funding":null,"license":"LICENSE","code_of_conduct":"CODE_OF_CONDUCT.md","threat_model":null,"audit":null,"citation":"CITATION","codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2018-12-11T18:28:47.000Z","updated_at":"2025-04-22T09:30:57.000Z","dependencies_parsed_at":"2023-02-04T11:02:07.947Z","dependency_job_id":"830c30b5-3c53-4699-98ef-9d4c541484bc","html_url":"https://github.com/flashlight/flashlight","commit_stats":null,"previous_names":["facebookresearch/flashlight"],"tags_count":6,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/flashlight%2Fflashlight","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/flashlight%2Fflashlight/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/flashlight%2Fflashlight/releases","manifests_url":"https://repo
s.ecosyste.ms/api/v1/hosts/GitHub/repositories/flashlight%2Fflashlight/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/flashlight","download_url":"https://codeload.github.com/flashlight/flashlight/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":250514767,"owners_count":21443208,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["autograd","cpp","deep-learning","flashlight","machine-learning","ml","neural-network"],"created_at":"2024-07-31T00:00:38.191Z","updated_at":"2025-04-23T20:58:16.759Z","avatar_url":"https://github.com/flashlight.png","language":"C++","readme":"[![CircleCI](https://circleci.com/gh/flashlight/flashlight.svg?style=shield)](https://app.circleci.com/pipelines/github/flashlight/flashlight)\n[![Documentation Status](https://img.shields.io/readthedocs/fl.svg)](https://fl.readthedocs.io/en/latest/)\n[![Docker Image Build Status](https://img.shields.io/github/workflow/status/flashlight/flashlight/Publish%20Docker%20images?label=docker%20image%20build)](https://hub.docker.com/r/flml/flashlight/tags)\n[![Join the chat at https://gitter.im/flashlight-ml/community](https://img.shields.io/gitter/room/flashlight-ml/community)](https://gitter.im/flashlight-ml/community?utm_source=badge\u0026utm_medium=badge\u0026utm_campaign=pr-badge\u0026utm_content=badge)\n\n[![codecov](https://codecov.io/gh/flashlight/flashlight/branch/master/graph/badge.svg?token=rBp4AilMc0)](https://codecov.io/gh/flashlight/flashlight)\n\n[![Docker Image for CUDA 
backend](https://img.shields.io/docker/image-size/flml/flashlight/cuda-latest?label=docker%20%28cuda%29\u0026logo=docker)](https://hub.docker.com/r/flml/flashlight/tags?page=1\u0026ordering=last_updated\u0026name=cuda-latest)\n[![Docker Image for CPU backend](https://img.shields.io/docker/image-size/flml/flashlight/cpu-latest?label=docker%20%28cpu%29\u0026logo=docker)](https://hub.docker.com/r/flml/flashlight/tags?page=1\u0026ordering=last_updated\u0026name=cpu-latest)\n\n[![Install CUDA backend with vcpkg](https://img.shields.io/badge/dynamic/json?color=orange\u0026label=get%20%28cuda%29\u0026query=name\u0026url=https%3A%2F%2Fraw.githubusercontent.com%2Fmicrosoft%2Fvcpkg%2Fmaster%2Fports%2Fflashlight-cuda%2Fvcpkg.json\u0026prefix=vcpkg%20install%20)](https://vcpkg.info/port/flashlight-cuda)\n[![Install CPU backend with vcpkg](https://img.shields.io/badge/dynamic/json?color=orange\u0026label=get%20%28cpu%29\u0026query=name\u0026url=https%3A%2F%2Fraw.githubusercontent.com%2Fmicrosoft%2Fvcpkg%2Fmaster%2Fports%2Fflashlight-cpu%2Fvcpkg.json\u0026prefix=vcpkg%20install%20)](https://vcpkg.info/port/flashlight-cpu)\n\n\nFlashlight is a fast, flexible machine learning library written entirely in C++\nfrom Facebook AI Research and the creators of Torch, TensorFlow, Eigen, and\nDeep Speech. Its core features include:\n- **Total internal modifiability**, including [internal APIs for tensor computation](flashlight/fl/tensor/README.md).\n- **A small footprint**, with the core clocking in at under 10 MB and 20k lines of C++.\n- **High-performance defaults**, featuring just-in-time kernel compilation with modern C++ via the [*ArrayFire*](https://github.com/arrayfire/arrayfire)\ntensor library.\n- An emphasis on efficiency and scale.\n\nNative support in C++ and simple extensibility make Flashlight a powerful research framework that enables fast iteration on new experimental setups and algorithms with few imposed abstractions and without sacrificing performance. 
In a single repository, Flashlight provides [apps](https://github.com/flashlight/flashlight/tree/master/flashlight/app) for research across multiple domains:\n- [Automatic speech recognition](https://github.com/flashlight/flashlight/tree/master/flashlight/app/asr) (formerly the [wav2letter](https://github.com/flashlight/wav2letter/) project) — [Documentation](flashlight/app/asr) | [Tutorial](flashlight/app/asr/tutorial)\n- [Image classification](flashlight/app/imgclass)\n- [Object detection](flashlight/app/objdet)\n- [Language modeling](flashlight/app/lm)\n\n### Project Layout\n\nFlashlight is broken down into a few parts:\n- [**`flashlight/lib`**](flashlight/lib) contains kernels and standalone utilities for audio processing and more.\n- [**`flashlight/fl`**](flashlight/fl) is the core tensor interface and neural network library, using the [ArrayFire](https://github.com/arrayfire/arrayfire) tensor library by default.\n- [**`flashlight/pkg`**](flashlight/pkg) contains domain packages for speech, vision, and text built on the core.\n- [**`flashlight/app`**](flashlight/app) contains applications of the core library to machine learning across domains.\n\n## Quickstart\n\nFirst, [build and install Flashlight](#building-and-installing) and [link it to your own project](#building-your-own-project-with-flashlight).\n\n[`Sequential`](https://fl.readthedocs.io/en/latest/modules.html#sequential) forms a sequence of Flashlight [`Module`](https://fl.readthedocs.io/en/latest/modules.html#module)s for chaining computation.\n\n\u003cdetails\u003e\u003csummary\u003eImplementing a simple convnet is easy.\u003c/summary\u003e\n\n```c++\n#include \u003cflashlight/fl/flashlight.h\u003e\n\nSequential model;\n\nmodel.add(View(fl::Shape({IM_DIM, IM_DIM, 1, -1})));\nmodel.add(Conv2D(\n    1 /* input channels */,\n    32 /* output channels */,\n    5 /* kernel width */,\n    5 /* kernel height */,\n    1 /* stride x */,\n    1 /* stride y */,\n    PaddingMode::SAME /* padding x */,\n    PaddingMode::SAME /* padding y */));\nmodel.add(ReLU());\nmodel.add(Pool2D(\n    2 /* kernel width */,\n    2 /* kernel height */,\n    2 /* stride x */,\n    2 /* stride y */));\nmodel.add(Conv2D(32, 64, 5, 5, 1, 1, PaddingMode::SAME, PaddingMode::SAME));\nmodel.add(ReLU());\nmodel.add(Pool2D(2, 2, 2, 2));\nmodel.add(View(fl::Shape({7 * 7 * 64, -1})));\nmodel.add(Linear(7 * 7 * 64, 1024));\nmodel.add(ReLU());\nmodel.add(Dropout(0.5));\nmodel.add(Linear(1024, 10));\nmodel.add(LogSoftmax());\n```\n\nPerforming forward and backward computation is straightforward:\n```c++\nauto output = model.forward(input);\nauto loss = categoricalCrossEntropy(output, target);\nloss.backward();\n```\n\n\u003c/details\u003e\n\nSee the [MNIST example](https://fl.readthedocs.io/en/latest/mnist.html) for a full tutorial including a training loop and dataset abstractions.\n\n[`Variable`](https://fl.readthedocs.io/en/latest/variable.html) is a tape-based abstraction that wraps [Flashlight tensors](https://github.com/flashlight/flashlight/blob/main/flashlight/fl/tensor/TensorBase.h). Tape-based [automatic differentiation in Flashlight](https://fl.readthedocs.io/en/latest/autograd.html) is simple and works as you'd expect.\n\n\u003cdetails\u003e\u003csummary\u003eAutograd Example\u003c/summary\u003e\n\n```c++\nauto A = Variable(fl::rand({1000, 1000}), true /* calcGrad */);\nauto B = 2.0 * A;\nauto C = 1.0 + B;\nauto D = log(C);\nD.backward(); // populates A.grad() along with gradients for B, C, and D.\n```\n\n\u003c/details\u003e\n\n## Building and Installing\n[**Install with `vcpkg`**](#library-installation-with-vcpkg) | [**With Docker**](#building-and-running-flashlight-with-docker) | [**From Source**](#building-from-source) | [**From Source with `vcpkg`**](#from-source-build-with-vcpkg) | [**Build Your Project with Flashlight**](#building-your-own-project-with-flashlight)\n\n### Requirements\nAt minimum, compilation requires:\n- A C++ compiler with good C++17 support (e.g. 
gcc/g++ \u003e= 7)\n- [CMake](https://cmake.org/) — version 3.10 or later, and ``make``\n- A Linux-based operating system.\n\nSee the [full dependency](#dependencies) list for more details if [building from source](#building-from-source).\n\nInstructions for building/installing Python bindings [can be found here](bindings/python/README.md).\n\n### Flashlight Build Setups\n\nFlashlight can be broken down into several components as [described above](#project-layout). Each component can be incrementally built by specifying the correct [build options](#build-options).\n\nThere are two ways to work with Flashlight:\n1. **As an installed library** that you link to with your own project. This is best for building standalone applications dependent on Flashlight.\n2. **With in-source development** where the Flashlight project source is changed and rebuilt. This is best if customizing/hacking the core framework or the Flashlight-provided [app binaries](flashlight/app).\n\nFlashlight can be built in one of two ways:\n1. [**With `vcpkg`**](#installing-flashlight-with-vcpkg), a [C++ package manager](https://github.com/microsoft/vcpkg).\n2. [**From source**](#building-from-source) by installing dependencies as needed.\n\n### Installing Flashlight with `vcpkg`\n#### Library Installation with `vcpkg`\n\nFlashlight is most-easily built and installed with `vcpkg`. Both the CUDA and CPU backends are supported with `vcpkg`. For either backend, first install [Intel MKL](https://software.intel.com/content/www/us/en/develop/tools/oneapi/base-toolkit/download.html). For the CUDA backend, install [`CUDA` \u003e= 9.2](https://developer.nvidia.com/cuda-downloads), [`cuDNN`](https://docs.nvidia.com/deeplearning/cudnn/install-guide/index.html), and [`NCCL`](https://docs.nvidia.com/deeplearning/nccl/install-guide/index.html). 
Then, after [installing `vcpkg`](https://github.com/microsoft/vcpkg#getting-started), install the libraries and core with:\n```shell\n./vcpkg/vcpkg install flashlight-cuda # CUDA backend, OR\n./vcpkg/vcpkg install flashlight-cpu  # CPU backend\n```\nTo install [Flashlight apps](flashlight/app), check the features available for installation by running `./vcpkg search flashlight-cuda` or `./vcpkg search flashlight-cpu`. Each app is a \"feature\": for example, `./vcpkg install flashlight-cuda[asr]` installs the ASR app with the CUDA backend.\n\nBelow is the currently supported list of features (for each of [`flashlight-cuda`](https://vcpkg.info/port/flashlight-cuda) and [`flashlight-cpu`](https://vcpkg.info/port/flashlight-cpu)):\n```\nflashlight-{cuda/cpu}[lib]      # Flashlight libraries\nflashlight-{cuda/cpu}[nn]       # Flashlight neural net library\nflashlight-{cuda/cpu}[asr]      # Flashlight speech recognition app\nflashlight-{cuda/cpu}[lm]       # Flashlight language modeling app\nflashlight-{cuda/cpu}[imgclass] # Flashlight image classification app\n```\n\nFlashlight [app binaries](flashlight/app) are also built for the selected features and are installed into the `vcpkg` install tree's `tools` directory.\n\n[Integrating Flashlight into your own project](#with-a-vcpkg-flashlight-installation) is simple using `vcpkg`'s [CMake toolchain integration](https://vcpkg.readthedocs.io/en/latest/examples/installing-and-using-packages/#cmake).\n\n#### From-Source Build with `vcpkg`\n\nFirst, install the dependencies for your backend of choice using `vcpkg` (click to expand below):\n\n\u003cdetails\u003e\u003csummary\u003eInstalling CUDA Backend Dependencies with vcpkg\u003c/summary\u003e\n\nTo build the Flashlight CUDA backend from source using dependencies installed with `vcpkg`, install [`CUDA` \u003e= 9.2](https://developer.nvidia.com/cuda-downloads), [`cuDNN`](https://docs.nvidia.com/deeplearning/cudnn/install-guide/index.html), 
[`NCCL`](https://docs.nvidia.com/deeplearning/nccl/install-guide/index.html), and [Intel MKL](https://software.intel.com/content/www/us/en/develop/tools/oneapi/base-toolkit/download.html), then build the rest of the dependencies for the CUDA backend based on which Flashlight features you'd like to build:\n```shell\n./vcpkg install \\\n    cuda intel-mkl fftw3 cub kenlm                \\ # if building flashlight libraries\n    arrayfire[cuda] cudnn nccl openmpi cereal stb \\ # if building the flashlight neural net library\n    gflags glog                                   \\ # if building any flashlight apps\n    libsndfile                                    \\ # if building the flashlight asr app\n    gtest                                           # optional, if building tests\n```\n\u003c/details\u003e\n\n\u003cdetails\u003e\u003csummary\u003eInstalling CPU Backend Dependencies with vcpkg\u003c/summary\u003e\n\nTo build the Flashlight CPU backend from source using dependencies installed with `vcpkg`, install [Intel MKL](https://software.intel.com/content/www/us/en/develop/tools/oneapi/base-toolkit/download.html), then build the rest of the dependencies for the CPU backend based on which Flashlight features you'd like to build:\n```shell\n./vcpkg install \\\n    intel-mkl fftw3 kenlm                              \\ # for flashlight libraries\n    arrayfire[cpu] gloo[mpi] openmpi onednn cereal stb \\ # for the flashlight neural net library\n    gflags glog                                        \\ # for the flashlight runtime pkg (any flashlight apps using it)\n    libsndfile                                         \\ # for the flashlight speech pkg\n    gtest                                                # optional, for tests\n```\n\n\u003c/details\u003e\n\n##### Build Using the `vcpkg` Toolchain File\nTo build Flashlight from source with these dependencies, clone the repository:\n```shell\ngit clone https://github.com/flashlight/flashlight.git \u0026\u0026 cd 
flashlight\nmkdir -p build \u0026\u0026 cd build\n```\nThen, build from source using `vcpkg`'s [CMake toolchain](https://github.com/microsoft/vcpkg/blob/master/docs/users/integration.md#cmake-toolchain-file-recommended-for-open-source-cmake-projects):\n```shell\ncmake .. \\\n    -DCMAKE_BUILD_TYPE=Release \\\n    -DFL_BUILD_ARRAYFIRE=ON \\\n    -DCMAKE_TOOLCHAIN_FILE=[path to your vcpkg clone]/scripts/buildsystems/vcpkg.cmake\nmake -j$(nproc)\nmake install -j$(nproc) # only if you want to install Flashlight for external use\n```\nTo build a subset of Flashlight's features, see the [build options](#build-options) below.\n\n### Building from Source\nTo build from source, first install the below [dependencies](#dependencies). Most are available with your system's local package manager.\n\nSome dependencies marked below are downloaded and installed automatically if not found on the local system. `FL_BUILD_STANDALONE` determines this behavior — if disabled, dependencies won't be downloaded and built when building Flashlight.\n\n**Once all dependencies are installed**, clone the repository:\n```shell\ngit clone https://github.com/flashlight/flashlight.git \u0026\u0026 cd flashlight\nmkdir -p build \u0026\u0026 cd build\n```\nThen build all Flashlight components with:\n```\ncmake .. -DCMAKE_BUILD_TYPE=Release -DFL_BUILD_ARRAYFIRE=ON [...build options]\nmake -j$(nproc)\nmake install\n```\nSetting the `MKLROOT` environment variable (`export MKLROOT=/opt/intel/oneapi/mkl/latest` or `export MKLROOT=/opt/intel/mkl` on most Linux-based systems) can help CMake find Intel MKL if not initially found.\n\nTo build a smaller subset of Flashlight features/apps, see the [build options](#build-options) below for a complete list of options.\n\nTo install Flashlight in a custom directory, use CMake's [`CMAKE_INSTALL_PREFIX`](https://cmake.org/cmake/help/v3.10/variable/CMAKE_INSTALL_PREFIX.html) argument. 
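\n\nFor example (a sketch; the install prefix shown here is arbitrary), configuring a from-source build that installs into a custom directory:\n```shell\ncmake .. \\\n    -DCMAKE_BUILD_TYPE=Release \\\n    -DFL_BUILD_ARRAYFIRE=ON \\\n    -DCMAKE_INSTALL_PREFIX=$HOME/flashlight-install\nmake -j$(nproc)\nmake install\n```\n\n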
Flashlight libraries can be built as shared libraries using CMake's [`BUILD_SHARED_LIBS`](https://cmake.org/cmake/help/v3.10/variable/BUILD_SHARED_LIBS.html) argument.\n\nFlashlight uses modern CMake and `IMPORTED` targets for most dependencies. If a dependency isn't found, passing `-D\u003cpackage\u003e_DIR` to your `cmake` command or exporting `\u003cpackage\u003e_DIR` as an environment variable equal to the path to `\u003cpackage\u003eConfig.cmake` can help locate dependencies on your system. See [the documentation](https://cmake.org/cmake/help/v3.10/command/find_package.html) for more details. If CMake is failing to locate a package, check to see if a corresponding [issue](https://github.com/flashlight/flashlight/issues) has already been created before creating your own.\n\n#### Minimal setup on macOS\n\nOn macOS, ArrayFire can be installed with Homebrew and the Flashlight core can be built as follows:\n\n```\nbrew install arrayfire\ncmake .. \\\n      -DFL_ARRAYFIRE_USE_OPENCL=ON \\\n      -DFL_USE_ONEDNN=OFF \\\n      -DFL_BUILD_TESTS=OFF \\\n      -DFL_BUILD_EXAMPLES=OFF \\\n      -DFL_BUILD_SCRIPTS=OFF \\\n      -DFL_BUILD_DISTRIBUTED=OFF\nmake -j$(sysctl -n hw.ncpu)\n```\n\n#### Dependencies\n\nDependencies marked with `*` are automatically downloaded and built from source if not found on the system. Setting `FL_BUILD_STANDALONE` to `OFF` disables this behavior.\n\nDependencies marked with `^` are required if building with distributed training enabled (`FL_BUILD_DISTRIBUTED` — see the [build options](#build-options) below). Distributed training is required for all apps.\n\nDependencies marked with `†` are installable via `vcpkg`. 
See the [instructions for installing those dependencies](#from-source-build-with-vcpkg) above for doing a Flashlight from-source build.\n\n\u003ctable\u003e\n\u003cthead\u003e\n  \u003ctr\u003e\n    \u003cth\u003eComponent\u003c/th\u003e\n    \u003cth\u003eBackend\u003c/th\u003e\n    \u003cth\u003eDependencies\u003c/th\u003e\n  \u003c/tr\u003e\n\u003c/thead\u003e\n\u003ctbody\u003e\n  \u003ctr\u003e\n    \u003ctd rowspan=\"2\"\u003elibraries\u003c/td\u003e\n    \u003ctd\u003eCUDA\u003c/td\u003e\n    \u003ctd\u003e\u003ca href=\"https://developer.nvidia.com/cuda-downloads\"\u003eCUDA\u003c/a\u003e \u0026gt;= 9.2, \u003ca href=\"https://github.com/nvidia/cub\"\u003eCUB\u003c/a\u003e*† (if CUDA \u0026lt; 11)\u003c/td\u003e\n  \u003c/tr\u003e\n  \u003ctr\u003e\n    \u003ctd\u003eCPU\u003c/td\u003e\n    \u003ctd\u003eA BLAS library (\u003ca href=\"https://software.intel.com/content/www/us/en/develop/tools/oneapi/base-toolkit/download.html\"\u003eIntel MKL\u003c/a\u003e \u0026gt;= 2018, OpenBLAS†, etc)\u003c/td\u003e\n  \u003c/tr\u003e\n  \u003ctr\u003e\n    \u003ctd rowspan=\"3\"\u003ecore\u003c/td\u003e\n    \u003ctd\u003eAny\u003c/td\u003e\n    \u003ctd\u003e\u003ca href=\"https://github.com/arrayfire/arrayfire#installation\"\u003eArrayFire\u003c/a\u003e \u0026gt;= 3.7.3†, an MPI library^(\u003ca href=\"https://www.open-mpi.org/\"\u003eOpenMPI\u003c/a\u003e†, etc),\u0026nbsp;\u0026nbsp;\u003ca href=\"https://github.com/USCiLab/cereal\"\u003ecereal\u003c/a\u003e*† \u0026gt;= 1.3.0, \u003ca href=\"https://github.com/nothings/stb\"\u003estb\u003c/a\u003e*†\u003c/td\u003e\n  \u003c/tr\u003e\n  \u003ctr\u003e\n    \u003ctd\u003eCUDA\u003c/td\u003e\n    \u003ctd\u003e\u003ca href=\"https://developer.nvidia.com/cuda-downloads\"\u003eCUDA\u003c/a\u003e \u0026gt;= 9.2, \u003ca href=\"https://developer.nvidia.com/nccl\"\u003eNCCL\u003c/a\u003e^, \u003ca href=\"https://developer.nvidia.com/cuDNN\"\u003ecuDNN\u003c/a\u003e\u003c/td\u003e\n  \u003c/tr\u003e\n  \u003ctr\u003e\n    
\u003ctd\u003eCPU\u003c/td\u003e\n    \u003ctd\u003e\u003ca href=\"https://github.com/oneapi-src/oneDNN\"\u003eoneDNN\u003c/a\u003e† \u0026gt;= 2.5.2, \u003ca href=\"https://github.com/facebookincubator/gloo\"\u003egloo\u003c/a\u003e (\u003ca href=\"https://github.com/facebookincubator/gloo/blob/01e2c2660cd43963ce1fe3e21220ac01f07d9a4b/docs/rendezvous.md#using-mpi\"\u003ewith MPI\u003c/a\u003e)*^†\u003c/td\u003e\n  \u003c/tr\u003e\n  \u003ctr\u003e\n    \u003ctd\u003eapp: all\u003c/td\u003e\n    \u003ctd\u003eAny\u003c/td\u003e\n    \u003ctd\u003e\u003ca href=\"https://github.com/google/glog\"\u003eGoogle Glog\u003c/a\u003e†, \u003ca href=\"https://github.com/gflags/gflags\"\u003eGflags\u003c/a\u003e†\u003c/td\u003e\n  \u003c/tr\u003e\n  \u003ctr\u003e\n    \u003ctd\u003eapp: asr\u003c/td\u003e\n    \u003ctd\u003eAny\u003c/td\u003e\n    \u003ctd\u003e\u003ca href=\"https://github.com/libsndfile/libsndfile\"\u003elibsndfile\u003c/a\u003e*† \u0026gt;= 10.0.28, a BLAS library (\u003ca href=\"https://software.intel.com/content/www/us/en/develop/tools/oneapi/base-toolkit/download.html\"\u003eIntel MKL\u003c/a\u003e \u0026gt;= 2018, OpenBLAS†, etc), \u003ca href=\"https://github.com/flashlight/text\"\u003eflashlight/text\u003c/a\u003e*\u003c/td\u003e\n  \u003c/tr\u003e\n  \u003ctr\u003e\n    \u003ctd\u003eapp: imgclass\u003c/td\u003e\n    \u003ctd\u003eAny\u003c/td\u003e\n    \u003ctd\u003e-\u003c/td\u003e\n  \u003c/tr\u003e\n  \u003ctr\u003e\n    \u003ctd\u003eapp: objdet\u003c/td\u003e\n    \u003ctd\u003eAny\u003c/td\u003e\n    \u003ctd\u003e-\u003c/td\u003e\n  \u003c/tr\u003e\n  \u003ctr\u003e\n    \u003ctd\u003eapp: lm\u003c/td\u003e\n    \u003ctd\u003eAny\u003c/td\u003e\n    \u003ctd\u003e\u003ca href=\"https://github.com/flashlight/text\"\u003eflashlight/text\u003c/a\u003e*\u003c/td\u003e\n  \u003c/tr\u003e\n  \u003ctr\u003e\n    \u003ctd\u003etests\u003c/td\u003e\n    \u003ctd\u003eAny\u003c/td\u003e\n    \u003ctd\u003e\u003ca 
href=\"https://github.com/google/googletest\"\u003eGoogle Test (gtest, with gmock)\u003c/a\u003e*† \u0026gt;= 1.10.0\u003c/td\u003e\n  \u003c/tr\u003e\n\u003c/tbody\u003e\n\u003c/table\u003e\n\n#### Build Options\nThe Flashlight CMake build accepts the following build options (prefixed with `-D` when running CMake from the command line):\n\n\u003ctable\u003e\n\u003cthead\u003e\n  \u003ctr\u003e\n    \u003cth\u003eName\u003c/th\u003e\n    \u003cth\u003eOptions\u003c/th\u003e\n    \u003cth\u003eDefault Value\u003c/th\u003e\n    \u003cth\u003eDescription\u003c/th\u003e\n  \u003c/tr\u003e\n\u003c/thead\u003e\n\u003ctbody\u003e\n  \u003ctr\u003e\n    \u003ctd rowspan=\"2\"\u003eFL_BUILD_ARRAYFIRE\u003c/td\u003e\n    \u003ctd\u003eON, OFF\u003c/td\u003e\n    \u003ctd\u003eON\u003c/td\u003e\n    \u003ctd\u003eBuild Flashlight with the ArrayFire backend.\u003c/td\u003e\n  \u003c/tr\u003e\n  \u003ctr\u003e\n    \u003ctd\u003eON, OFF\u003c/td\u003e\n    \u003ctd\u003eON\u003c/td\u003e\n    \u003ctd\u003eDownloads/builds some dependencies if not found.\u003c/td\u003e\n  \u003c/tr\u003e\n  \u003ctr\u003e\n    \u003ctd rowspan=\"3\"\u003eFL_BUILD_LIBRARIES\u003c/td\u003e\n    \u003ctd\u003eON, OFF\u003c/td\u003e\n    \u003ctd\u003eON\u003c/td\u003e\n    \u003ctd\u003eBuild the Flashlight libraries.\u003c/td\u003e\n  \u003c/tr\u003e\n  \u003ctr\u003e\n    \u003ctd\u003eON, OFF\u003c/td\u003e\n    \u003ctd\u003eON\u003c/td\u003e\n    \u003ctd\u003eBuild the Flashlight neural net library.\u003c/td\u003e\n  \u003c/tr\u003e\n  \u003ctr\u003e\n    \u003ctd\u003eON, OFF\u003c/td\u003e\n    \u003ctd\u003eON\u003c/td\u003e\n    \u003ctd\u003eBuild with distributed training; required for apps.\u003c/td\u003e\n  \u003c/tr\u003e\n  \u003ctr\u003e\n    \u003ctd\u003eFL_BUILD_CONTRIB\u003c/td\u003e\n    \u003ctd\u003eON, OFF\u003c/td\u003e\n    \u003ctd\u003eON\u003c/td\u003e\n    \u003ctd\u003eBuild contrib APIs subject to breaking changes.\u003c/td\u003e\n  \u003c/tr\u003e\n  
\u003ctr\u003e\n    \u003ctd\u003eFL_BUILD_APPS\u003c/td\u003e\n    \u003ctd\u003eON, OFF\u003c/td\u003e\n    \u003ctd\u003eON\u003c/td\u003e\n    \u003ctd\u003eBuild applications (see below).\u003c/td\u003e\n  \u003c/tr\u003e\n  \u003ctr\u003e\n    \u003ctd\u003eFL_BUILD_APP_ASR\u003c/td\u003e\n    \u003ctd\u003eON, OFF\u003c/td\u003e\n    \u003ctd\u003eON\u003c/td\u003e\n    \u003ctd\u003eBuild the automatic speech recognition application.\u003c/td\u003e\n  \u003c/tr\u003e\n  \u003ctr\u003e\n    \u003ctd\u003eFL_BUILD_APP_IMGCLASS\u003c/td\u003e\n    \u003ctd\u003eON, OFF\u003c/td\u003e\n    \u003ctd\u003eON\u003c/td\u003e\n    \u003ctd\u003eBuild the image classification application.\u003c/td\u003e\n  \u003c/tr\u003e\n  \u003ctr\u003e\n    \u003ctd\u003eFL_BUILD_APP_LM\u003c/td\u003e\n    \u003ctd\u003eON, OFF\u003c/td\u003e\n    \u003ctd\u003eON\u003c/td\u003e\n    \u003ctd\u003eBuild the language modeling application.\u003c/td\u003e\n  \u003c/tr\u003e\n  \u003ctr\u003e\n    \u003ctd\u003eFL_BUILD_APP_ASR_TOOLS\u003c/td\u003e\n    \u003ctd\u003eON, OFF\u003c/td\u003e\n    \u003ctd\u003eON\u003c/td\u003e\n    \u003ctd\u003eBuild automatic speech recognition app tools.\u003c/td\u003e\n  \u003c/tr\u003e\n  \u003ctr\u003e\n    \u003ctd\u003eFL_BUILD_TESTS\u003c/td\u003e\n    \u003ctd\u003eON, OFF\u003c/td\u003e\n    \u003ctd\u003eON\u003c/td\u003e\n    \u003ctd\u003eBuild tests.\u003c/td\u003e\n  \u003c/tr\u003e\n  \u003ctr\u003e\n    \u003ctd\u003eFL_BUILD_EXAMPLES\u003c/td\u003e\n    \u003ctd\u003eON, OFF\u003c/td\u003e\n    \u003ctd\u003eON\u003c/td\u003e\n    \u003ctd\u003eBuild examples.\u003c/td\u003e\n  \u003c/tr\u003e\n  \u003ctr\u003e\n    \u003ctd\u003eFL_BUILD_EXPERIMENTAL\u003c/td\u003e\n    \u003ctd\u003eON, OFF\u003c/td\u003e\n    \u003ctd\u003eOFF\u003c/td\u003e\n    \u003ctd\u003eBuild experimental components.\u003c/td\u003e\n  \u003c/tr\u003e\n  \u003ctr\u003e\n    \u003ctd\u003eCMAKE_BUILD_TYPE\u003c/td\u003e\n    \u003ctd\u003eSee \u003ca 
href=\"https://cmake.org/cmake/help/v3.10/variable/CMAKE_BUILD_TYPE.html\"\u003edocs\u003c/a\u003e.\u003c/td\u003e\n    \u003ctd\u003eDebug\u003c/td\u003e\n    \u003ctd\u003eSee the \u003ca href=\"https://cmake.org/cmake/help/v3.10/variable/CMAKE_BUILD_TYPE.html\"\u003eCMake documentation\u003c/a\u003e.\u003c/td\u003e\n  \u003c/tr\u003e\n  \u003ctr\u003e\n    \u003ctd\u003eCMAKE_INSTALL_PREFIX\u003c/td\u003e\n    \u003ctd\u003e[Directory]\u003c/td\u003e\n    \u003ctd\u003eSee \u003ca href=\"https://cmake.org/cmake/help/v3.10/variable/CMAKE_INSTALL_PREFIX.html\"\u003edocs\u003c/a\u003e.\u003c/td\u003e\n    \u003ctd\u003eSee the \u003ca href=\"https://cmake.org/cmake/help/v3.10/variable/CMAKE_INSTALL_PREFIX.html\"\u003eCMake documentation\u003c/a\u003e.\u003c/td\u003e\n  \u003c/tr\u003e\n\u003c/tbody\u003e\n\u003c/table\u003e\n\n### Building Your Own Project with Flashlight\nFlashlight is most-easily linked to using CMake. Flashlight exports the following CMake targets when installed:\n- `flashlight::flashlight` — contains flashlight libraries as well as the flashlight core autograd and neural network library.\n- `flashlight::fl_pkg_runtime` — contains flashlight core as well as common utilities for training (logging / flags / distributed utils).\n- `flashlight::fl_pkg_vision` — contains flashlight core as well as common utilities for vision pipelines.\n- `flashlight::fl_pkg_text` — contains flashlight core as well as common utilities for dealing with text data.\n- `flashlight::fl_pkg_speech` — contains flashlight core as well as common utilities for dealing with speech data.\n- `flashlight::fl_pkg_halide` — contains flashlight core and extentions to easily interface with halide.\n\nGiven a simple `project.cpp` file that includes and links to Flashlight:\n```c++\n#include \u003ciostream\u003e\n\n#include \u003cflashlight/fl/flashlight.h\u003e\n\nint main() {\n  fl::init();\n  fl::Variable v(fl::full({1}, 1.), true);\n  auto result = v + 10;\n  std::cout \u003c\u003c 
\"Tensor value is \" \u003c\u003c result.tensor() \u003c\u003c std::endl; // 11.000\n  return 0;\n}\n```\n\nThe following CMake configuration links Flashlight and sets include directories:\n\n```cmake\ncmake_minimum_required(VERSION 3.10)\nset(CMAKE_CXX_STANDARD 17)\nset(CMAKE_CXX_STANDARD_REQUIRED ON)\n\nadd_executable(myProject project.cpp)\n\nfind_package(flashlight CONFIG REQUIRED)\ntarget_link_libraries(myProject PRIVATE flashlight::flashlight)\n```\n\n#### With a `vcpkg` Flashlight Installation\n\nIf you installed Flashlight with `vcpkg`, the above CMake configuration for `myProject` can be built by running:\n```shell\ncd project \u0026\u0026 mkdir build \u0026\u0026 cd build\ncmake .. \\\n  -DCMAKE_TOOLCHAIN_FILE=[path to vcpkg clone]/scripts/buildsystems/vcpkg.cmake \\\n  -DCMAKE_BUILD_TYPE=Release\nmake -j$(nproc)\n```\n\n#### With a From-Source Flashlight Installation\n\nIf using a from-source installation of Flashlight, Flashlight will be found automatically by CMake:\n```shell\ncd project \u0026\u0026 mkdir build \u0026\u0026 cd build\ncmake .. -DCMAKE_BUILD_TYPE=Release\nmake -j$(nproc)\n```\nIf Flashlight is installed in a custom location using a `CMAKE_INSTALL_PREFIX`, passing `-Dflashlight_DIR=[install prefix]/share/flashlight/cmake` as an argument to your `cmake` command can help CMake find Flashlight.\n\n### Building and Running Flashlight with Docker\nFlashlight and its dependencies can also be built with the provided Dockerfiles; see the accompanying [Docker documentation](.docker) for more information.\n\n### Contributing and Contact\nContact: vineelkpratap@fb.com, awni@fb.com, jacobkahn@fb.com, qiantong@fb.com, antares@fb.com, padentomasello@fb.com,\njcai@fb.com,  gab@fb.com, vitaliy888@fb.com, locronan@fb.com\n\nFlashlight is being very actively developed. 
See\n[CONTRIBUTING](CONTRIBUTING.md) for more on how to help out.\n\n#### Acknowledgments\nSome of Flashlight's code is derived from\n[arrayfire-ml](https://github.com/arrayfire/arrayfire-ml/).\n\n## Citing\nYou can cite [Flashlight](https://arxiv.org/abs/2201.12465) using:\n```\n@misc{kahn2022flashlight,\n      title={Flashlight: Enabling Innovation in Tools for Machine Learning},\n      author={Jacob Kahn and Vineel Pratap and Tatiana Likhomanenko and Qiantong Xu and Awni Hannun and Jeff Cai and Paden Tomasello and Ann Lee and Edouard Grave and Gilad Avidov and Benoit Steiner and Vitaliy Liptchinsky and Gabriel Synnaeve and Ronan Collobert},\n      year={2022},\n      eprint={2201.12465},\n      archivePrefix={arXiv},\n      primaryClass={cs.LG}\n}\n```\n\n## License\nFlashlight is under an MIT license. See [LICENSE](LICENSE) for more information.\n","funding_links":[],"categories":["C++","Artificial Intelligence","Neural Networks (NN) and Deep Neural Networks (DNN)","Table of Contents","\u003ca name=\"cpp\"\u003e\u003c/a\u003eC++","Flutter Apps","Deep Learning Framework","Computation and Communication Optimisation","Machine Learning","其他_机器学习与深度学习","Frameworks","Data Science"],"sub_categories":["NN/DNN Software Frameworks","AI - Frameworks and Toolkits","Flutter Utilities","High-Level DL APIs","Machine Learning"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fflashlight%2Fflashlight","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fflashlight%2Fflashlight","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fflashlight%2Fflashlight/lists"}