{"id":16543409,"url":"https://github.com/ajndkr/fastapi-concurrency-tests","last_synced_at":"2026-04-17T18:02:25.700Z","repository":{"id":192451601,"uuid":"686662512","full_name":"ajndkr/fastapi-concurrency-tests","owner":"ajndkr","description":"compare performance of sync and async workers in FastAPI","archived":false,"fork":false,"pushed_at":"2023-09-03T17:34:20.000Z","size":11,"stargazers_count":3,"open_issues_count":0,"forks_count":0,"subscribers_count":2,"default_branch":"main","last_synced_at":"2025-03-13T04:32:49.066Z","etag":null,"topics":["fastapi","http-benchmarking"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/ajndkr.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null}},"created_at":"2023-09-03T14:41:53.000Z","updated_at":"2023-09-11T09:37:03.000Z","dependencies_parsed_at":null,"dependency_job_id":"fada1708-06f7-4c25-97e7-33ea8f702b03","html_url":"https://github.com/ajndkr/fastapi-concurrency-tests","commit_stats":null,"previous_names":["ajndkr/sync-async-fastapi"],"tags_count":0,"template":false,"template_full_name":null,"purl":"pkg:github/ajndkr/fastapi-concurrency-tests","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ajndkr%2Ffastapi-concurrency-tests","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ajndkr%2Ffastapi-concurrency-tests/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ajndkr%2Ffastapi-concurrency-tests/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ajndkr%2Ffastapi-concurrency-tests/manifests","owner_url":"https:/
/repos.ecosyste.ms/api/v1/hosts/GitHub/owners/ajndkr","download_url":"https://codeload.github.com/ajndkr/fastapi-concurrency-tests/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ajndkr%2Ffastapi-concurrency-tests/sbom","scorecard":null,"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":286080680,"owners_count":31939788,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2026-04-17T17:29:20.459Z","status":"ssl_error","status_checked_at":"2026-04-17T17:28:47.801Z","response_time":62,"last_error":"SSL_connect returned=1 errno=0 peeraddr=140.82.121.6:443 state=error: unexpected eof while reading","robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":false,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["fastapi","http-benchmarking"],"created_at":"2024-10-11T19:00:14.616Z","updated_at":"2026-04-17T18:02:25.683Z","avatar_url":"https://github.com/ajndkr.png","language":"Python","readme":"# FastAPI Experiment: Async vs Sync\n\nAn experiment to compare performance of sync and async endpoints in FastAPI.\n\n## Overview:\n\n-   [Setup](#setup)\n-   [Run experiments](#run-experiments)\n-   [Results](#results)\n-   [Conclusion](#conclusion)\n\n## Setup\n\nThe repository uses Python 3.11. 
Follow the steps below to get started:\n\n-   Create a conda environment:\n\n    ```bash\n    conda create -n fastapi-tests python=3.11 -y\n    conda activate fastapi-tests\n    ```\n\n    You can use any environment manager of your choice.\n\n-   Install dependencies:\n\n    ```bash\n    pip install -r requirements.txt\n    ```\n\n    **Note**: All requirement files are generated using `pip-tools`.\n\n## Run experiments\n\n### FastAPI Application\n\nThe experiments are run on a simple FastAPI application. The application has two endpoints:\n\n-   `/sync`: A synchronous endpoint that sleeps for 10 seconds and returns a response.\n-   `/async`: An asynchronous endpoint that sleeps for 10 seconds and returns a response.\n\n### Gunicorn Configuration\n\nThe experiments use `gunicorn` as the server. The configuration is as follows:\n\n- experiment 1: 1 worker 1 thread\n\n```bash\ngunicorn -w 1 -k uvicorn.workers.UvicornWorker --threads 1 app:app\n```\n\n- experiment 2: 1 worker 4 threads\n\n```bash\ngunicorn -w 1 -k uvicorn.workers.UvicornWorker --threads 4 app:app\n```\n\n- experiment 3: 4 workers 1 thread\n\n```bash\ngunicorn -w 4 -k uvicorn.workers.UvicornWorker --threads 1 app:app\n```\n\n- experiment 4: 4 workers 4 threads\n\n```bash\ngunicorn -w 4 -k uvicorn.workers.UvicornWorker --threads 4 app:app\n```\n\n### wrk benchmarks\n\nThe experiments use `wrk` to benchmark the server. 
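All runs can be driven by a small script; a hypothetical sketch of such a `benchmark.sh` (the repo's actual script may differ):

```bash
#!/usr/bin/env bash
# Hypothetical driver: benchmark both endpoints with identical wrk settings.
set -e

for endpoint in sync async; do
    echo benchmarking /$endpoint
    wrk -t4 -c100 -d1m --timeout 30s http://localhost:8000/$endpoint
done
```
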
The configuration is as follows:\n\n```bash\nwrk -t4 -c100 -d1m --timeout 30s \u003cfastapi-endpoint\u003e\n```\n\n-   4 threads\n-   100 connections\n-   1 minute duration\n-   30 seconds timeout\n\nRun the benchmark script:\n\n```bash\n./benchmark.sh\n```\n\n## Results\n\n### Experiment 1: 1 worker 1 thread\n\n- sync benchmark result:\n\n```\nRunning 1m test @ http://localhost:8000/sync\n  4 threads and 100 connections\n  Thread Stats   Avg      Stdev     Max   +/- Stdev\n    Latency    18.04s     5.43s   30.00s    66.67%\n    Req/Sec    32.61     59.70   220.00     89.29%\n  200 requests in 1.00m, 28.52KB read\n  Socket errors: connect 0, read 0, write 0, timeout 50\nRequests/sec:      3.33\nTransfer/sec:     486.37B\n```\n\n- async benchmark result:\n\n```\nRunning 1m test @ http://localhost:8000/async\n  4 threads and 100 connections\n  Thread Stats   Avg      Stdev     Max   +/- Stdev\n    Latency    10.01s     6.23ms  10.03s    65.60%\n    Req/Sec    26.15     46.83   200.00     87.88%\n  500 requests in 1.00m, 71.29KB read\nRequests/sec:      8.33\nTransfer/sec:      1.19KB\n```\n\n### Experiment 2: 1 worker 4 threads\n\n- sync benchmark result:\n\n```\nRunning 1m test @ http://localhost:8000/sync\n  4 threads and 100 connections\n  Thread Stats   Avg      Stdev     Max   +/- Stdev\n    Latency    17.80s     5.21s   30.00s    68.03%\n    Req/Sec    34.61     54.21   210.00     83.87%\n  200 requests in 1.00m, 28.52KB read\n  Socket errors: connect 0, read 0, write 0, timeout 53\nRequests/sec:      3.33\nTransfer/sec:     486.52B\n```\n\n- async benchmark result:\n\n```\nRunning 1m test @ http://localhost:8000/async\n  4 threads and 100 connections\n  Thread Stats   Avg      Stdev     Max   +/- Stdev\n    Latency    10.00s     4.96ms  10.02s    84.80%\n    Req/Sec    16.08     41.37   180.00     87.50%\n  500 requests in 1.00m, 71.29KB read\nRequests/sec:      8.33\nTransfer/sec:      1.19KB\n```\n\n### Experiment 3: 4 workers 1 thread\n\n- sync benchmark 
result:\n\n```\nRunning 1m test @ http://localhost:8000/sync\n  4 threads and 100 connections\n  Thread Stats   Avg      Stdev     Max   +/- Stdev\n    Latency    10.00s     2.65ms  10.01s    60.00%\n    Req/Sec     2.00      0.00     2.00    100.00%\n  500 requests in 1.00m, 71.29KB read\nRequests/sec:      8.33\nTransfer/sec:      1.19KB\n```\n\n- async benchmark result:\n\n```\nRunning 1m test @ http://localhost:8000/async\n  4 threads and 100 connections\n  Thread Stats   Avg      Stdev     Max   +/- Stdev\n    Latency    10.00s     1.87ms  10.01s    74.40%\n    Req/Sec     2.00      0.00     2.00    100.00%\n  500 requests in 1.00m, 71.29KB read\nRequests/sec:      8.33\nTransfer/sec:      1.19KB\n```\n\n### Experiment 4: 4 workers 4 threads\n\n- sync benchmark result:\n\n```\nRunning 1m test @ http://localhost:8000/sync\n  4 threads and 100 connections\n  Thread Stats   Avg      Stdev     Max   +/- Stdev\n    Latency    10.01s     3.63ms  10.02s    67.60%\n    Req/Sec    12.42     26.35   100.00     83.33%\n  500 requests in 1.00m, 71.29KB read\nRequests/sec:      8.33\nTransfer/sec:      1.19KB\n```\n\n- async benchmark result:\n\n```\nRunning 1m test @ http://localhost:8000/async\n  4 threads and 100 connections\n  Thread Stats   Avg      Stdev     Max   +/- Stdev\n    Latency    10.00s     2.11ms  10.01s    79.20%\n    Req/Sec     2.00      0.00     2.00    100.00%\n  500 requests in 1.00m, 71.29KB read\nRequests/sec:      8.33\nTransfer/sec:      1.19KB\n```\n\n## Conclusion\n\n1.  Sync endpoint:\n\n    - shows high average latency with 1 worker\n    - shows similar latency as async endpoint when number of workers is increased\n\n2.  Async endpoint:\n\n    - shows consistent latency across all experiments. 
Increasing the number of\n    workers or threads does not affect the latency\n\nIn the case of sync endpoints, increasing the number of workers improves performance:\nFastAPI runs plain `def` endpoints on a per-worker thread pool (40 threads by default), so\n4 workers offer 160 threads, enough to serve all 100 benchmark connections concurrently.\nIn the case of async endpoints, increasing the number of workers or threads has no visible\neffect because a single event loop already serves all 100 connections concurrently\n(`asyncio.sleep` does not block the loop): with each request held for 10 seconds, throughput\nis capped at roughly 100 / 10 = 10 requests/sec regardless of worker count. Note also that\ngunicorn's `--threads` option applies only to the `gthread` worker class, so it likely has\nno effect with `uvicorn.workers.UvicornWorker` in any of the experiments above.\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fajndkr%2Ffastapi-concurrency-tests","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fajndkr%2Ffastapi-concurrency-tests","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fajndkr%2Ffastapi-concurrency-tests/lists"}