{"id":13768418,"url":"https://github.com/AztecProtocol/barretenberg","last_synced_at":"2025-05-10T23:31:21.616Z","repository":{"id":65219707,"uuid":"574999978","full_name":"AztecProtocol/barretenberg","owner":"AztecProtocol","description":null,"archived":false,"fork":false,"pushed_at":"2025-05-10T02:29:49.000Z","size":148082,"stargazers_count":168,"open_issues_count":446,"forks_count":120,"subscribers_count":31,"default_branch":"master","last_synced_at":"2025-05-10T03:36:58.565Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":null,"language":"C++","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/AztecProtocol.png","metadata":{"files":{"readme":"README.md","changelog":"CHANGELOG.md","contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":"security/descriptions/Protogalaxy recursive verifier transcript 
bug.md","support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null}},"created_at":"2022-12-06T14:34:53.000Z","updated_at":"2025-05-10T02:29:54.000Z","dependencies_parsed_at":"2023-09-27T08:13:52.679Z","dependency_job_id":"784c6105-db93-409e-a453-ec999c23cb0d","html_url":"https://github.com/AztecProtocol/barretenberg","commit_stats":null,"previous_names":[],"tags_count":35,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/AztecProtocol%2Fbarretenberg","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/AztecProtocol%2Fbarretenberg/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/AztecProtocol%2Fbarretenberg/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/AztecProtocol%2Fbarretenberg/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/AztecProtocol","download_url":"https://codeload.github.com/AztecProtocol/barretenberg/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":253497296,"owners_count":21917683,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2024-08-03T16:01:21.200Z","updated_at":"2025-05-10T23:31:16.597Z","avatar_url":"https://github.com/AztecProtocol.png","language":"C++","readme":"\u003e [!WARNING]\n\u003e :warning: This is not an actively developed repository, it is a mirror. 
See \u003chttps://github.com/AztecProtocol/aztec-packages\u003e :warning:\n\n\u003e [!WARNING]\n\u003e :warning: **\u003chttps://github.com/AztecProtocol/barretenberg\u003e is a mirror-only repository, please only use \u003chttps://github.com/AztecProtocol/aztec-packages\u003e. Do not use this for any purpose other than reference.** :warning:\n\n![banner](../.github/img/bb_banner.png)\n\n# Barretenberg\n\nBarretenberg (or `bb` for short) is an optimized elliptic curve library for the bn128 curve and a PLONK SNARK prover.\n\n- [Installation](#installation)\n- [Usage](#usage)\n  - [UltraHonk](#ultrahonk)\n    - [Proving](#proving)\n    - [Verifying](#verifying)\n    - [Solidity verifier](#solidity-verifier)\n  - [MegaHonk](#megahonk)\n- [Development](#development)\n  - [Bootstrap](#bootstrap)\n  - [Build Options and Instructions](#build-options-and-instructions)\n    - [WASM build](#wasm-build)\n    - [Fuzzing build](#fuzzing-build)\n    - [Test coverage build](#test-coverage-build)\n  - [Formatting](#formatting)\n  - [Testing](#testing)\n    - [Integration tests with Aztec in Monorepo](#integration-tests-with-aztec-in-monorepo)\n      - [Integration tests with Aztec in Barretenberg Standalone Repo](#integration-tests-with-aztec-in-barretenberg-standalone-repo)\n      - [Testing locally in docker](#testing-locally-in-docker)\n  - [Docs Build](#docs-build)\n  - [Benchmarks](#benchmarks)\n    - [x86\\_64](#x86_64)\n    - [WASM](#wasm)\n    - [How to run](#how-to-run)\n  - [Debugging](#debugging)\n    - [Debugging Verification Failures](#debugging-verification-failures)\n    - [Improving LLDB Debugging](#improving-lldb-debugging)\n    - [Using Tracy to Profile Memory/CPU](#using-tracy-to-profile-memorycpu)\n\n\u003e [!CAUTION]\n\u003e **This code is highly experimental, use at your own risk!**\n\n## Installation\n\nFor an easy installation, follow [bbup](https://github.com/AztecProtocol/aztec-packages/blob/master/barretenberg/bbup/README.md).\n\n## Usage\n\nTODO: 
\u003chttps://github.com/AztecProtocol/aztec-packages/issues/7600\u003e\n\nAll available `bb` commands:\n\u003chttps://github.com/AztecProtocol/aztec-packages/blob/barretenberg-v0.55.0/barretenberg/cpp/src/barretenberg/bb/main.cpp#L1369-L1512\u003e\n\n\u003e [!NOTE]\n\u003e Currently the binary downloads an SRS that can be used to prove the maximum circuit size. This maximum circuit size parameter is a constant in the code and has been set to $2^{23}$ as of writing. This maximum circuit size differs from the maximum circuit size that one can prove in the browser, due to WASM limits.\n\n\u003e [!NOTE]\n\u003e For commands which allow you to send the output to a file using `-o {filePath}`, there is also the option to send the output to stdout by using `-o -`.\n\n### UltraHonk\n\n\u003e [!TIP]\n\u003e Follow the [Noir Docs](https://noir-lang.org) for instructions on how to install Noir, write and compile programs, and generate witnesses.\n\nProve the valid execution of your program:\n\n```bash\nbb prove_ultra_honk -b ./target/hello_world.json -w ./target/witness-name.gz -o ./target/proof\n```\n\nYou can then compute the verification key for your Noir program and verify the proof:\n\n```bash\nbb write_vk_ultra_honk -b ./target/hello_world.json -o ./target/vk\nbb verify_ultra_honk -k ./target/vk -p ./target/proof\n\n```\n\nIf successful, the verification will complete in silence.\n\n### MegaHonk\n\nThe usage with MegaHonk is similar to the above UltraHonk. Refer to all the available `bb` commands, using the `bb \u003ccommand\u003e_mega_honk` syntax.\n\n\u003e[!WARNING]\n\u003e MegaHonk generates insecure recursion circuits when Goblin recursive verifiers are not present.\n\n### Solidity verifier\n\nBarretenberg can generate a smart contract that verifies proofs in Solidity (i.e. for usage in EVM chains). 
This feature is only available for UltraHonk, as the MegaHonk proving system is intended for use only by apps deploying on Aztec.\n\nFirst, prove the valid execution of your Noir program and export the verification key:\n\n```bash\nbb prove_ultra_keccak_honk -b ./target/hello_world.json -w ./target/witness-name.gz -o ./target/proof\nbb write_vk_ultra_honk -b ./target/hello_world.json -o ./target/vk\n```\n\n\u003e [!IMPORTANT]\n\u003e `prove_ultra_keccak_honk` generates UltraHonk proofs with Keccak hashes, making them gas-efficient. `prove_ultra_honk`, in comparison, generates proofs with Poseidon hashes, which are more efficient in recursion but not in on-chain verification.\n\nYou can now use the verification key to generate a Solidity verifier contract:\n\n```bash\nbb contract_ultra_honk -k ./target/vk -c $CRS_PATH -b ./target/hello_world.json -o ./target/Verifier.sol\n```\n\n\u003e[!CAUTION]\n\u003e Solidity verifier contracts are work-in-progress. Expect significant optimizations and breaking changes, and *do NOT use them in production!*\n\n## Development\n\nThe following packages are required for building from source:\n\n- cmake \u003e= 3.24\n- Ninja (used by the presets as the default generator)\n- clang \u003e= 16 or gcc \u003e= 10\n- clang-format\n- libstdc++ \u003e= 12\n- libomp (if multithreading is required; multithreading can be disabled with the CMake option `-DMULTITHREADING=OFF`)\n\nTo install these on Ubuntu, run:\n\n```bash\nsudo apt-get install cmake clang clang-format ninja-build libstdc++-12-dev\n```\n\nThe default cmake version on Ubuntu 22.04 is 3.22.1, so it must be updated. You can get the latest version [here](https://cmake.org/download).\n\nWhen running macOS Sonoma 14.2.1, the following steps are necessary:\n\n- update bash with `brew install bash`\n- update [cmake](https://cmake.org/download)\n\nIt is recommended to use Homebrew LLVM on macOS to enable std::execution parallel algorithms. 
To do so:\n\n- Install llvm with `brew install llvm`\n- Add it to the path with `export PATH=\"/opt/homebrew/opt/llvm/bin:$PATH\"` in your shell or profile file.\n\n\u003cdetails\u003e\n\u003csummary\u003e\u003ch3\u003eInstalling OpenMP (Linux)\u003c/h3\u003e\u003c/summary\u003e\n\nYou can get OpenMP from package managers, e.g. on Ubuntu:\n\n```bash\nsudo apt-get install libomp-dev\n```\n\nOr you can install it from source:\n\n```bash\ngit clone -b release/10.x --depth 1 https://github.com/llvm/llvm-project.git \\\n  \u0026\u0026 cd llvm-project \u0026\u0026 mkdir build-openmp \u0026\u0026 cd build-openmp \\\n  \u0026\u0026 cmake ../openmp -DCMAKE_C_COMPILER=clang -DCMAKE_CXX_COMPILER=clang++ -DLIBOMP_ENABLE_SHARED=OFF \\\n  \u0026\u0026 cmake --build . --parallel \\\n  \u0026\u0026 cmake --build . --parallel --target install \\\n  \u0026\u0026 cd ../.. \u0026\u0026 rm -rf llvm-project\n```\n\n\u003e [!NOTE]\n\u003e On a fresh Ubuntu Kinetic installation, installing OpenMP from source yields a `Could NOT find OpenMP_C (missing: OpenMP_omp_LIBRARY) (found version \"5.0\")` error when trying to build Barretenberg. 
Installing from apt worked fine.\n\n\u003c/details\u003e\n\n### Bootstrap\n\nThe bootstrap script will build both the native and wasm versions of barretenberg:\n\n```bash\ncd cpp\n./bootstrap.sh\n```\n\nYou can then install the library on your system:\n\n```sh\ncmake --install build\n```\n\n### Build Options and Instructions\n\nCMake can be passed various build options on its command line:\n\n- `-DCMAKE_BUILD_TYPE=Debug | Release | RelWithAssert`: Build types.\n- `-DDISABLE_ASM=ON | OFF`: Enable/disable x86 assembly.\n- `-DDISABLE_ADX=ON | OFF`: Enable/disable ADX assembly instructions (for older cpu support).\n- `-DMULTITHREADING=ON | OFF`: Enable/disable multithreading.\n- `-DOMP_MULTITHREADING=ON | OFF`: Enable/disable multithreading that uses OpenMP.\n- `-DTESTING=ON | OFF`: Enable/disable building of tests.\n- `-DBENCHMARK=ON | OFF`: Enable/disable building of benchmarks.\n- `-DFUZZING=ON | OFF`: Enable building various fuzzers.\n\nIf you are cross-compiling, you can use a preconfigured toolchain file:\n\n- `-DCMAKE_TOOLCHAIN_FILE=\u003cfilename in ./cmake/toolchains\u003e`: Use one of the preconfigured toolchains.\n\n#### WASM build\n\nTo build:\n\n```bash\ncmake --preset wasm\ncmake --build --preset wasm --target barretenberg.wasm\n```\n\nThe resulting wasm binary will be at `./build-wasm/bin/barretenberg.wasm`.\n\nTo run the tests, you'll need to install `wasmtime`.\n\n```\ncurl https://wasmtime.dev/install.sh -sSf | bash\n```\n\nTests can be built and run like:\n\n```bash\ncmake --build --preset wasm --target ecc_tests\nwasmtime --dir=.. ./bin/ecc_tests\n```\n\nTo add gtest filter parameters in a wasm context:\n\n```\nwasmtime --dir=.. 
./bin/ecc_tests run --gtest_filter=filtertext\n```\n\n#### Fuzzing build\n\nFor detailed instructions, see cpp/docs/Fuzzing.md.\n\nTo build:\n\n```bash\ncmake --preset fuzzing\ncmake --build --preset fuzzing\n```\n\nThe fuzzing build turns off building tests and benchmarks, since they are incompatible with the libFuzzer interface.\n\nTo turn on the address sanitizer, add `-DADDRESS_SANITIZER=ON`. Note that the address sanitizer can be used to explore crashes.\nSometimes you might have to specify the path to llvm-symbolizer; do so with `export ASAN_SYMBOLIZER_PATH=\u003cPATH_TO_SYMBOLIZER\u003e`.\nFor the undefined behaviour sanitizer, add `-DUNDEFINED_BEHAVIOUR_SANITIZER=ON`.\nNote that the fuzzer runs noticeably slower with ASan (2-3x) or UBSan enabled, so it is best to run a non-sanitized build first, minimize the testcase, and then run it for a while with sanitizers on.\n\n#### Test coverage build\n\nTo build:\n\n```bash\ncmake --preset coverage\ncmake --build --preset coverage\n```\n\nThen run the tests (on the mainframe, always use taskset and nice to limit your influence on the server; profiling instrumentation is very heavy):\n\n```\ntaskset 0xffffff nice -n10 make test\n```\n\nAnd generate the report:\n\n```\nmake create_full_coverage_report\n```\n\nThe report will land in the all_test_coverage_report directory inside the build directory.\n\nAlternatively, you can build separate test binaries, e.g. honk_tests or numeric_tests, and run **make test** just for them or even just for a single test. Then the report will only show coverage for those binaries.\n\n### Formatting\n\nCode is formatted using `clang-format` and the `./cpp/format.sh` script, which is called via a git pre-commit hook.\n\n\u003e[!TIP]\n\u003e A default configuration for VS Code is provided by the file [`barretenberg.code-workspace`](barretenberg.code-workspace). 
These settings can be overridden by placing configuration files in `.vscode/`.\n\u003e If you've installed the C++ VS Code extension, configure it to format on save!\n\n### Testing\n\nEach module has its own tests, e.g. to build and run the `ecc` tests:\n\n```bash\n# Replace the `default` preset with whichever preset you want to use\ncmake --build --preset default --target ecc_tests\ncd build\n./bin/ecc_tests\n```\n\nA shorthand for the above is:\n\n```bash\n# Replace the `default` preset with whichever preset you want to use\ncmake --build --preset default --target run_ecc_tests\n```\n\nTo run the entire suite of tests using `ctest`:\n\n```bash\ncmake --build --preset default --target test\n```\n\nYou can run specific tests, e.g.:\n\n```\n./bin/ecc_tests --gtest_filter=scalar_multiplication.*\n```\n\n#### Integration tests with Aztec in Monorepo\n\nCI will automatically run integration tests against Aztec; they are located in the `barretenberg` folder.\n\n##### Integration tests with Aztec in Barretenberg Standalone Repo\n\nWhen working on a PR, you may want to point this file to a different Aztec branch or commit, but it should probably be pointed back to master before merging.\n\n##### Testing locally in docker\n\nA common issue that arises is that our CI system has a different compiler version, namely for GCC. If you need to mimic the CI operating system locally, you can use bootstrap_docker.sh or run the dockerfiles directly. However, there is a more efficient workflow for iterative development:\n\n```\ncd barretenberg/cpp\n./scripts/docker_interactive.sh\nmv build build-native # your native build folders are mounted, but will not work! 
have to clear them\ncmake --preset gcc ;  cmake --build build\n```\n\nThis will allow you to rebuild as efficiently as if you were running native code, and not have to see a full compile cycle.\n\n### Docs Build\n\nIf doxygen is installed on the system, you can use the **build_docs** target to build documentation, which can be configured in vscode CMake extension or using\n\n```bash\ncmake --build . --target build_docs\n```\n\nin the cpp/build directory. The documentation will be generated in cpp/docs/build folder. You can then run a python http server in the folder:\n\n```bash\npython3 -m http.server \u003cport\u003e\n```\n\nand tunnel the port through ssh.\n\n### Benchmarks\n\nTable represents time in ms to build circuit and proof for each test on n threads.\nIgnores proving key construction.\n\n#### x86_64\n\n```\n+--------------------------+------------+---------------+-----------+-----------+-----------+-----------+-----------+\n| Test                     | Gate Count | Subgroup Size |         1 |         4 |        16 |        32 |        64 |\n+--------------------------+------------+---------------+-----------+-----------+-----------+-----------+-----------+\n| sha256                   | 38799      | 65536         |      5947 |      1653 |       729 |       476 |       388 |\n| ecdsa_secp256k1          | 41049      | 65536         |      6005 |      2060 |       963 |       693 |       583 |\n| ecdsa_secp256r1          | 67331      | 131072        |     12186 |      3807 |      1612 |      1351 |      1137 |\n| schnorr                  | 33740      | 65536         |      5817 |      1696 |       688 |       532 |       432 |\n| double_verify_proof      | 505513     | 524288        |     47841 |     15824 |      7970 |      6784 |      6082 |\n+--------------------------+------------+---------------+-----------+-----------+-----------+-----------+-----------+\n```\n\n#### 
WASM\n\n```\n+--------------------------+------------+---------------+-----------+-----------+-----------+-----------+-----------+\n| Test                     | Gate Count | Subgroup Size |         1 |         4 |        16 |        32 |        64 |\n+--------------------------+------------+---------------+-----------+-----------+-----------+-----------+-----------+\n| sha256                   | 38799      | 65536         |     18764 |      5116 |      1854 |      1524 |      1635 |\n| ecdsa_secp256k1          | 41049      | 65536         |     19129 |      5595 |      2255 |      2097 |      2166 |\n| ecdsa_secp256r1          | 67331      | 131072        |     38815 |     11257 |      4744 |      3633 |      3702 |\n| schnorr                  | 33740      | 65536         |     18649 |      5244 |      2019 |      1498 |      1702 |\n| double_verify_proof      | 505513     | 524288        |    149652 |     45702 |     20811 |     16979 |     15679 |\n+--------------------------+------------+---------------+-----------+-----------+-----------+-----------+-----------+\n```\n\n#### How to run\n\nSome modules have benchmarks. The build targets are named `\u003cmodule_name\u003e_bench`. 
To build and run, for example, the `ecc` benchmarks:\n\n```bash\n# Replace the `default` preset with whichever preset you want to use\ncmake --build --preset default --target ecc_bench\ncd build\n./bin/ecc_bench\n```\n\nA shorthand for the above is:\n\n```bash\n# Replace the `default` preset with whichever preset you want to use\ncmake --build --preset default --target run_ecc_bench\n```\n\n### Debugging\n\n#### Debugging Verification Failures\n\nThe CircuitChecker::check_circuit function is used to get the gate index and block information about a failing circuit constraint.\nIf you are in a scenario where you have a failing call to check_circuit and wish to get more information out of it than just the gate index, you can use this feature to get a stack trace; see the example below.\n\nUsage instructions:\n\n- On Ubuntu (or our mainframe accounts), use `sudo apt-get install libdw-dev` to support trace printing.\n- Use `cmake --preset clang16-dbg-fast-circuit-check-traces` and `cmake --build --preset clang16-dbg-fast-circuit-check-traces` to enable the backward-cpp dependency through the CHECK_CIRCUIT_STACKTRACES CMake variable.\n- Run any case where you have a failing check_circuit call; you will now have a stack trace illuminating where this constraint was added in code.\n\nCaveats:\n\n- This works best for code that is not overly generic, i.e. where just the sequence of function calls carries a lot of information. It is possible to tag extra data along with the stack trace; this can be done as a follow-up, so please leave feedback if desired.\n- There are certain functions like `assert_equals` that can cause gates that occur *before* them to fail. 
If this would be useful to automatically report, please leave feedback.\n\nExample:\n\n```\n[ RUN      ] standard_circuit_constructor.test_check_circuit_broken\nStack trace (most recent call last):\n#4    Source \"_deps/gtest-src/googletest/src/gtest.cc\", line 2845, in Run\n       2842:   if (!Test::HasFatalFailure() \u0026\u0026 !Test::IsSkipped()) {\n       2843:     // This doesn't throw as all user code that can throw are wrapped into\n       2844:     // exception handling code.\n      \u003e2845:     test-\u003eRun();\n       2846:   }\n       2847:\n       2848:   if (test != nullptr) {\n#3    Source \"_deps/gtest-src/googletest/src/gtest.cc\", line 2696, in Run\n       2693:   // GTEST_SKIP().\n       2694:   if (!HasFatalFailure() \u0026\u0026 !IsSkipped()) {\n       2695:     impl-\u003eos_stack_trace_getter()-\u003eUponLeavingGTest();\n      \u003e2696:     internal::HandleExceptionsInMethodIfSupported(this, \u0026Test::TestBody,\n       2697:                                                   \"the test body\");\n       2698:   }\n#2  | Source \"_deps/gtest-src/googletest/src/gtest.cc\", line 2657, in HandleSehExceptionsInMethodIfSupported\u003ctesting::Test, void\u003e\n    |  2655: #if GTEST_HAS_EXCEPTIONS\n    |  2656:     try {\n    | \u003e2657:       return HandleSehExceptionsInMethodIfSupported(object, method, location);\n    |  2658:     } catch (const AssertionException\u0026) {  // NOLINT\n    |  2659:       // This failure was reported already.\n      Source \"_deps/gtest-src/googletest/src/gtest.cc\", line 2621, in HandleExceptionsInMethodIfSupported\u003ctesting::Test, void\u003e\n       2618:   }\n       2619: #else\n       2620:   (void)location;\n      \u003e2621:   return (object-\u003e*method)();\n       2622: #endif  // GTEST_HAS_SEH\n       2623: }\n#1    Source \"/mnt/user-data/adam/aztec-packages/barretenberg/cpp/src/barretenberg/circuit_checker/standard_circuit_builder.test.cpp\", line 464, in TestBody\n        461:     uint32_t 
d_idx = circuit_constructor.add_variable(d);\n        462:     circuit_constructor.create_add_gate({ a_idx, b_idx, c_idx, fr::one(), fr::one(), fr::neg_one(), fr::zero() });\n        463:\n      \u003e 464:     circuit_constructor.create_add_gate({ d_idx, c_idx, a_idx, fr::one(), fr::neg_one(), fr::neg_one(), fr::zero() });\n        465:\n        466:     bool result = CircuitChecker::check(circuit_constructor);\n        467:     EXPECT_EQ(result, false);\n#0    Source \"/mnt/user-data/adam/aztec-packages/barretenberg/cpp/src/barretenberg/stdlib_circuit_builders/standard_circuit_builder.cpp\", line 22, in create_add_gate\n         19: {\n         20:     this-\u003eassert_valid_variables({ in.a, in.b, in.c });\n         21:\n      \u003e  22:     blocks.arithmetic.populate_wires(in.a, in.b, in.c);\n         23:     blocks.arithmetic.q_m().emplace_back(FF::zero());\n         24:     blocks.arithmetic.q_1().emplace_back(in.a_scaling);\n         25:     blocks.arithmetic.q_2().emplace_back(in.b_scaling);\ngate number4\n```\n\n#### Improving LLDB Debugging\n\nIt can be quite hard to make sense of field_t circuit values that indirectly reference their contents, and even plain field values that are typically in montgomery form.\nIn command-line LLDB or VSCode debug console, run:\n\n```\ncommand script import ~/aztec-packages/barretenberg/cpp/scripts/lldb_format.py\n```\n\nNow when you `print` things with e.g. `print bigfield_t.get_value()` or inspect in VSCode (if you opened the debug console and put in these commands) then you will get pretty-printing of these types. 
This can be expanded fairly easily with more types if needed.\n\n#### Using Tracy to Profile Memory/CPU\n\nSee the Tracy manual at \u003chttps://github.com/wolfpld/tracy\u003e for in-depth Tracy documentation.\n\nThe basic use of Tracy is to run a benchmark with the `cmake --preset tracy` build type, create a capture file, and then\ntransfer it to a local machine for interactive UI introspection.\n\nAll the steps to do this effectively are included in various scripts in cpp/scripts/.\n","funding_links":[],"categories":["Developer Tools","Get Coding"],"sub_categories":["Proving Backends"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2FAztecProtocol%2Fbarretenberg","html_url":"https://awesome.ecosyste.ms/projects/github.com%2FAztecProtocol%2Fbarretenberg","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2FAztecProtocol%2Fbarretenberg/lists"}