{"id":18376945,"url":"https://github.com/bbc/origin_simulator","last_synced_at":"2025-04-06T20:31:53.084Z","repository":{"id":41973857,"uuid":"165743705","full_name":"bbc/origin_simulator","owner":"bbc","description":"A tool to simulate a (flaky) upstream origin during load and stress tests.","archived":false,"fork":false,"pushed_at":"2025-02-14T14:00:13.000Z","size":679,"stargazers_count":17,"open_issues_count":10,"forks_count":8,"subscribers_count":13,"default_branch":"master","last_synced_at":"2025-04-04T20:42:25.594Z","etag":null,"topics":["elixir","latency","load-testing","stress-testing"],"latest_commit_sha":null,"homepage":"","language":"Elixir","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/bbc.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2019-01-14T22:12:45.000Z","updated_at":"2025-04-04T12:47:46.000Z","dependencies_parsed_at":"2024-11-06T00:27:51.168Z","dependency_job_id":"e5a016ac-5cdb-4826-ae66-deca8cdf1f34","html_url":"https://github.com/bbc/origin_simulator","commit_stats":null,"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/bbc%2Forigin_simulator","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/bbc%2Forigin_simulator/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/bbc%2Forigin_simulator/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/bbc%2Forigin_simulator/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hos
ts/GitHub/owners/bbc","download_url":"https://codeload.github.com/bbc/origin_simulator/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":247547694,"owners_count":20956599,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["elixir","latency","load-testing","stress-testing"],"created_at":"2024-11-06T00:25:32.378Z","updated_at":"2025-04-06T20:31:52.645Z","avatar_url":"https://github.com/bbc.png","language":"Elixir","readme":"# OriginSimulator [![Build Status](https://travis-ci.org/bbc/origin_simulator.svg?branch=master)](https://travis-ci.org/bbc/origin_simulator)\n\nA tool to simulate a (flaky) upstream origin during load and stress tests.\n\nIn our constant quest to improve our services to be more fault tolerant and handle faulty conditions without nasty surprises, we are trying to make load and stress tests more automated and reproducible.\n\nThis tool is designed to be a simple helper to simulate an upstream service behaving unexpectedly for a programmable prolonged period of time. 
We can then use a load test to see how our downstream service reacts.\n\nOriginSimulator can also be used to simulate continuous responses with a given latency from a fake service.\n\nThese are the moving parts of a simple load test:\n\n```\n┌────────────────────┐        ┌────────────────────┐        ┌────────────────────┐\n│                    ├────────▶                    ├────────▶                    │\n│  Load Test Client  │        │       Target       │        │  OriginSimulator   │\n│                    ◀────────┤                    ◀────────┤                    │\n└────────────────────┘        └────────────────────┘        └────────────────────┘\n```\n\nWhere:\n* A **Load Test Client** could be a tool like [WRK2](https://github.com/giltene/wrk2), [AB](https://httpd.apache.org/docs/2.4/programs/ab.html) or [Vegeta](https://github.com/tsenart/vegeta).\n* The load test **Target** is the service you want to test, such as NGINX, a custom app, or anything that fetches data from an upstream source.\n* **OriginSimulator** can simulate an origin and can be programmatically set to behave slowly or unreliably.\n\n## Scenarios\n\nA JSON recipe defines the different stages of the scenario. This is an example of specifying an origin with stages:\n\n```json\n{\n    \"origin\": \"https://www.bbc.co.uk/news\",\n    \"stages\": [\n        {\n            \"at\": 0,\n            \"latency\": \"50ms\",\n            \"status\": 404\n        },\n        {\n            \"at\": \"4s\",\n            \"latency\": \"2s\",\n            \"status\": 503\n        },\n        {\n            \"at\": \"6s\",\n            \"latency\": \"100ms\",\n            \"status\": 200\n        }\n    ]\n}\n```\n\nWhere `at` represents the point in time at which a state mutation occurs (an integer in milliseconds, or a duration string such as \"4s\"), and `latency` the simulated response time. 
In this case:\n\n```\n  0s                     4s                   6s                  ∞\n  *──────────────────────*────────────────────*───────────────────▶\n\n       HTTP 404 50ms           HTTP 503 2s       HTTP 200 100ms\n```\n\nThe recipe can also be a list of simulation scenarios, as described in [multi-route origin simulation](#multi-route-origin-simulation) below.\n\n```json\n[\n\t{\n\t\t\"origin\": \"...\",\n\t\t\"stages\": \"...\",\n\t\t..\n\t},\n\t{\n\t\t\"origin\": \"...\",\n\t\t\"stages\": \"...\",\n\t\t..\n\t},\n\t{\n\t\t\"origin\": \"..\",\n\t\t\"stages\": \"...\",\n\t\t..\n\t}\n]\n```\n\n## Latency\n\nEach stage defines the simulated latency. It is also possible to simulate random latency using a range of values. \nIn the example below any response will take a random amount of time within the range 1000ms..1500ms:\n\n```json\n{\n    \"random_content\": \"428kb\",\n    \"stages\": [\n        {\n            \"at\": 0,\n            \"latency\": \"1000ms..1500ms\",\n            \"status\": 200\n        }\n    ]\n}\n```\n\n\n## Sources\n\nOriginSimulator can be used in three ways.\n\n* Serving cached content from an origin.\n\n```json\n{\n    \"origin\": \"https://www.bbc.co.uk/news\",\n    \"stages\": [\n        {\n            \"at\": 0,\n            \"latency\": \"100ms\",\n            \"status\": 200\n        }\n    ]\n}\n```\n\n* Serving random sized content.\n\nIn this example we are requiring a continuous successful response with a simulated latency of 100ms, returning a 428kb payload:\n\n```json\n{\n    \"random_content\": \"428kb\",\n    \"stages\": [\n        {\n            \"at\": 0,\n            \"latency\": \"100ms\",\n            \"status\": 200\n        }\n    ]\n}\n```\n\n* Serving content posted to it.\n\nIn this example content is posted along with the recipe. 
Where the payload body section can be any content such as HTML or JSON.\n\n```json\n{\n    \"body\": \"{\\\"hello\\\":\\\"world\\\"}\",\n    \"stages\": [\n        {\n            \"at\": 0,\n            \"latency\": \"100ms\",\n            \"status\": 200\n        }\n    ]\n}\n```\n\nIt's also possible to define random content inside the posted body. This can be useful to\nsimulate JSON contracts, structured text, etc.\n\n```json\n{\n    \"body\": \"{\\\"data\\\":\\\"\u003c\u003c256kb\u003e\u003e\\\", \\\"metadata\\\":\\\"\u003c\u003c128b\u003e\u003eand\u003c\u003c16b\u003e\u003e\\\", \\\"collection\\\":[\\\"\u003c\u003c128kb\u003e\u003e\\\", \\\"\u003c\u003c256kb\u003e\u003e\\\"]}\\\"}\",\n    \"stages\": [\n        {\n            \"at\": 0,\n            \"latency\": \"100ms\",\n            \"status\": 200\n        }\n    ]\n}\n```\n\n## Multi-route origin simulation\n\nOriginSimulator can also simulate multiple origins. Each origin is specified with a recipe and accessible through a `route` (request path) on the simulator. This is an example of specifying multiple origins with different routes:\n\n```json\n[\n  {\n    \"route\": \"/\",\n    \"origin\": \"https://www.bbc.co.uk/\",\n    \"stages\": [\n      {\n        \"at\": 0,\n        \"status\": 200,\n        \"latency\": \"100ms\"\n      }\n    ]\n  },\n  {\n    \"route\": \"/news*\",\n    \"origin\": \"https://www.bbc.co.uk/news\",\n    \"stages\": [\n      {\n        \"at\": 0,\n        \"status\": 200,\n        \"latency\": 0\n      }\n    ]\n  },\n  {\n    \"route\": \"/sport\",\n    \"origin\": \"https://www.bbc.co.uk/sport\",\n    \"stages\": [\n      {\n        \"at\": 0,\n        \"status\": 200,\n        \"latency\": \"1s\"\n      },\n      {\n        \"at\": \"5s\",\n        \"status\": 200,\n        \"latency\": \"100ms\"\n      }\n    ]\n  }\n]\n```\n\nWhere `route` is the request path on the simulator from which the corresponding origin can be accessed. 
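For illustration, the route-matching semantics can be sketched in Python. This is only a sketch of the rules described here (exact match, trailing wildcard, most-specific route wins), not the actual Elixir implementation; the function names are hypothetical.

```python
# Illustrative sketch of the route-matching semantics described above.
# NOT the actual Elixir implementation; function names are hypothetical.

def match_route(route, path):
    """A trailing-wildcard route such as '/news*' matches any path
    sharing its prefix; other routes require an exact match."""
    if route.endswith("*"):
        return path.startswith(route[:-1])
    return path == route

def resolve(routes, path):
    """Pick the most specific (longest) matching route, if any."""
    matches = [r for r in routes if match_route(r, path)]
    return max(matches, key=len) if matches else None

routes = ["/", "/news*", "/sport"]
print(resolve(routes, "/news/business-51443421"))  # -> /news*
print(resolve(routes, "/sport"))                   # -> /sport
```

Note that with these routes a path like `/weather` resolves to nothing, which is why a wildcard root route is useful as a catch-all.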
A wildcard route may be used to match paths of the same domain, e.g. `/news*` (above) for `/news/business-51443421`. \n\nThe wildcard root route (`/*`) is the default if no route is specified for a scenario.\n\nMultiple origins of mixed sources can also be specified:\n\n```json\n[\n  {\n    \"route\": \"/data/api\",\n    \"body\": \"{\\\"data\\\":\\\"\u003c\u003c256kb\u003e\u003e\\\", \\\"metadata\\\":\\\"\u003c\u003c128b\u003e\u003eand\u003c\u003c16b\u003e\u003e\\\", \\\"collection\\\":[\\\"\u003c\u003c128kb\u003e\u003e\\\", \\\"\u003c\u003c256kb\u003e\u003e\\\"]}\\\"}\",\n    \"stages\": [\n        {\n            \"at\": 0,\n            \"latency\": \"100ms\",\n            \"status\": 200\n        }\n    ]\n  },\n  {\n    \"route\": \"/news\",\n    \"origin\": \"https://www.bbc.co.uk/news\",\n    \"stages\": [\n      {\n        \"at\": 0,\n        \"status\": 404,\n        \"latency\": \"50ms\"\n      },\n      {\n        \"at\": \"2s\",\n        \"status\": 503,\n        \"latency\": \"2s\"\n      },\n      {\n        \"at\": \"4s\",\n        \"status\": 200,\n        \"latency\": \"100ms\"\n      }\n    ]\n  }\n]\n```\n## Usage\n\nYou can post recipes using `curl` or the `mix upload_recipe` task.\n\nFirst run the Elixir app:\n```\n$ env MIX_ENV=prod iex -S mix\nErlang/OTP 21 [erts-10.1.2] [source] [64-bit] [smp:4:4] [ds:4:4:10] [async-threads:1] [hipe] [dtrace]\n\nInteractive Elixir (1.7.4) - press Ctrl+C to exit (type h() ENTER for help)\niex(1)\u003e\n```\n\nThe app is now ready, but still waiting for a recipe:\n```shell\n$ curl http://127.0.0.1:8080/_admin/current_recipe\n\"Recipe not set, please POST a recipe to /_admin/add_recipe\"⏎\n\n$ curl -i http://127.0.0.1:8080/\nHTTP/1.1 406 Not Acceptable\ncache-control: max-age=0, private, must-revalidate\ncontent-length: 2\ncontent-type: text/plain; charset=utf-8\ndate: Sat, 12 Jan 2019 23:18:10 GMT\nserver: Cowboy\n```\n\nLet's add a simple recipe:\n```shell\n$ cat examples/demo.json\n{\n    \"origin\": 
\"https://www.bbc.co.uk\",\n    \"stages\": [\n        { \"at\": 0,    \"status\": 200, \"latency\": \"200ms\"},\n        { \"at\": \"10s\", \"status\": 500, \"latency\": \"500ms\"},\n        { \"at\": \"30s\", \"status\": 200, \"latency\": \"200ms\"}\n    ]\n}\n\n$ cat examples/demo.json | curl -X POST -d @- http://127.0.0.1:8080/_admin/add_recipe\n```\n\nAll done! Now at different times the server will respond with the indicated HTTP status code and response time:\n```\n$ curl -i http://127.0.0.1:8080/\nHTTP/1.1 200 OK\n...\n\n$ curl -i http://127.0.0.1:8080/\nHTTP/1.1 500 Internal Server Error\n...\n\n$ curl -i http://127.0.0.1:8080/\nHTTP/1.1 200 OK\n...\n```\n\nAt any time you can reset the scenario by simply POSTing a new one to `/_admin/add_recipe`. \n\nIn a multi-origin scenario, new origins and routes can be added to the existing ones through `/_admin/add_recipe`. Existing scenarios can also be updated. For example you can \"take down\" an origin by updating its recipe with a 500 status.\n\n### Using Belfrage with Origin Simulator locally\n1. Change the dev [config value in belfrage](https://github.com/bbc/belfrage/blob/cf4278c0a9dcf3adee2c9d5c2599691338b6fb72/config/dev.exs#L4) for `:origin_simulator` to 'http://localhost:8080'\n2. Follow the steps above in 'usage' to run origin-simulator\n3. Run Belfrage locally using `iex -S mix`\n4. Accessing Belfrage locally (http://localhost:7080) will route requests through origin-simulator\n5. It may be helpful to place some debug code, e.g. `IO.inspect()`, to [view requests and responses](https://github.com/bbc/origin_simulator/blob/076b8c95e48e042f498227f1da446d53779ab3f2/lib/origin_simulator.ex#L25)\n\n#### Response headers\nOriginSimulator can serve HTTP headers in responses. 
The headers can be specified in recipes:\n\n```json\n{\n  \"route\": \"/news\",\n  \"origin\": \"https://www.bbc.co.uk/news\",\n  \"stages\": [\n    {\n      \"at\": 0,\n      \"latency\": \"100ms\",\n      \"status\": 200\n    }\n  ],\n  \"headers\": {\n    \"connection\": \"keepalive\",\n    \"cache-control\": \"private, max-age=0, no-cache\"\n  }\n}\n```\n\n#### Response compression\nResponse compression can be specified via the `content-encoding` header. For example, the following recipe returns gzip-compressed random content of 200kb.\n\n```json\n{\n  \"random_content\": \"200kb\",\n  \"stages\": [\n      { \"at\": 0, \"status\": 200, \"latency\": 0}\n  ],\n  \"headers\": {\n    \"content-encoding\": \"gzip\"\n  }\n}\n```\n\nA corresponding `content-type` header is required for a posted `body`, which can be of any type (e.g. JSON, HTML, XML):\n\n```json\n{\n  \"route\": \"/data/json\",\n  \"body\": \"{\\\"data\\\":\\\"\u003c\u003c256kb\u003e\u003e\\\", \\\"metadata\\\":\\\"\u003c\u003c128b\u003e\u003eand\u003c\u003c16b\u003e\u003e\\\", \\\"collection\\\":[\\\"\u003c\u003c128kb\u003e\u003e\\\", \\\"\u003c\u003c256kb\u003e\u003e\\\"]}\\\"}\",\n  \"stages\": [\n      { \"at\": 0, \"status\": 200, \"latency\": 0}\n  ],\n  \"headers\": {\n    \"content-encoding\": \"gzip\",\n    \"content-type\": \"application/json; charset=utf-8\"\n  }\n}\n```\n\nFor recipes with an origin, a gzip response may also be specified with the `\"content-encoding\": \"gzip\"` header. OriginSimulator will fetch content from the origin with an `accept-encoding: gzip` header. 
It will store and serve the gzip content from the origin (if provided) during simulation.\n\n```json\n{\n  \"route\": \"/news\",\n  \"origin\": \"https://www.bbc.co.uk/news\",\n  \"stages\": [\n      { \"at\": 0, \"status\": 200, \"latency\": 0}\n  ],\n  \"headers\": {\n    \"content-encoding\": \"gzip\"\n  }\n}\n```\n\n#### Using `mix upload_recipe`\n`mix upload_recipe demo` will upload the recipe located at `examples/demo.json` to an OriginSimulator instance running locally.\n\nIf you have deployed OriginSimulator, you can specify the host when uploading the recipe. For example:\n`mix upload_recipe \"http://origin-simulator.com\" demo`\n\n#### Admin routes\n\n* /_admin/status\n\nChecks if the simulator is running; returns `ok!`\n\n* /_admin/add_recipe\n\nPOST a recipe: updates existing origins or creates new ones\n\n* /_admin/current_recipe\n\nLists the existing recipes for all origins and routes\n\n* /_admin/restart\n\nResets the simulator: removes all recipes\n\n* /_admin/routes\n\nLists all origins and routes\n\n* /_admin/routes_status\n\nLists all origins and routes with the corresponding current status and latency values\n\n## Performance\n\nOriginSimulator should be performant: it leverages the concurrency and parallelism model offered by the Erlang BEAM VM and should sustain a significant amount of load.\n\nOur goal was to have performance comparable to Nginx serving static files. To demonstrate this, we have run a number of load tests using Nginx/OpenResty as a benchmark. We used [WRK2](https://github.com/giltene/wrk2) as the load test tool and ran the tests on AWS EC2 instances.\n\nFor the tests we used two EC2 instances. The load test client ran on a c5.2xlarge instance. We tried c5.large, c5.xlarge, c5.2xlarge and i3.xlarge instances for the Simulator and OpenResty targets. Interestingly the results didn't show major performance improvements with bigger instances; full results are [available here](https://gist.github.com/ettomatic/6d2ad680fc331b942a5f535f76eb9d02). 
In the next sections we'll use the results against i3.xlarge.\n\nThe Nginx/OpenResty configuration is very simple and available [here](confs/openresty.conf). While not perfect, we tried to keep it simple; the number of workers was adjusted depending on the instance type used.\n\n#### Successful responses with no additional latency\n\nIn this scenario we were looking for maximum throughput. Notice how OpenResty excels on smaller files, while results were roughly equal for bigger files.\n\nrecipe:\n```json\n{\n    \"origin\": \"https://www.bbc.co.uk/news\",\n    \"stages\": [\n        { \"at\": 0, \"status\": 200, \"latency\": \"0ms\"}\n    ]\n}\n```\n*Throughput with 0ms additional latency*\n\n| payload size | OriginSimulator | OpenResty |\n|--------------|----------------:|----------:|\n| 50kb         |          17,000 |    24,000 |\n| 100kb        |          12,000 |    12,000 |\n| 200kb        |           6,000 |     6,000 |\n| 428kb        |           2,900 |     2,800 |\n\n![No latency chart](/gnuplot/throughput_no_latency.png)\n\n#### Successful responses with 100ms additional latency\n\nIn this scenario we had almost identical results with 100 concurrent connections; only after 5,000 connections did we start seeing OpenResty falling over, possibly due to misconfiguration.\n\nrecipe:\n```json\n{\n    \"origin\": \"https://www.bbc.co.uk/news\",\n    \"stages\": [\n        { \"at\": 0, \"status\": 200, \"latency\": \"100ms\"}\n    ]\n}\n```\n\n*payload 428kb 100ms added latency*\n\n| concurrent connections | throughput | OriginSimulator |  OpenResty |\n|-----------------------:|-----------:|-----------------|-----------:|\n|                    100 |        900 | 104.10ms        |   101.46ms |\n|                  1,000 |      1,000 | 214.73ms        |   225.70ms |\n|                  3,000 |      2,000 | 220.50ms        | 244.30ms * |\n|                  5,000 |      1,400 | 161.81ms        | 397.67ms * |\n|   
              10,000 |      2,000 | 168.18ms        | 384.92ms * |\n\n\u003e **NOTE:** * OpenResty started increasingly timing out and 500ing after 3K\nconcurrent requests.\n\n![100ms latency chart](/gnuplot/response_time_100ms_latency.png)\n\n#### Successful responses with 1s additional latency\n\nWith 1s of latency we could not see any difference in terms of performance.\n\nrecipe:\n```json\n{\n    \"origin\": \"https://www.bbc.co.uk/news\",\n    \"stages\": [\n        { \"at\": 0, \"status\": 200, \"latency\": \"1s\"}\n    ]\n}\n```\n\n*payload 428kb 1s added latency*\n\n| concurrent connections | throughput | OriginSimulator | OpenResty |\n|-----------------------:|-----------:|----------------:|----------:|\n|                    100 |        100 |           1.03s |     1.02s |\n|                    500 |        500 |           1.05s |     1.03s |\n|                    600 |        600 |           1.24s |     1.20s |\n|                  2,000 |      1,000 |           1.10s |  1.11s ** |\n|                  4,000 |      2,000 |         1.09s * |  1.10s ** |\n\n\u003e **NOTE:** * OriginSimulator had a few timeouts at 4K concurrent connections. 
** OpenResty started increasingly timing out and 500ing after 2K\nconcurrent requests.\n\n![1s latency chart](/gnuplot/response_time_1s_latency.png)\n\n## Load Tests\n\nFor detailed load test results, visit the [Load Tests](docs/load-test-results/) docs.\n\n## Docker\n\n\u003e **NOTE:** if you plan to use OriginSimulator from Docker for Mac via `docker-compose up` you might notice slow response times.\n\u003e This is down to [Docker for Mac networking integration with the OS](https://github.com/docker/for-mac/issues/2814), which is still the case in 18.09.0.\n\u003e\n\u003e So don't use this setup for load tests, and why would you in any case!\n\n### Docker releases\n\nTo generate a release targeted at CentOS:\n\n``` shell\ndocker build -t origin_simulator .\ndocker run --mount=source=/path/to/build,target=/build,type=bind -it origin_simulator\n```\n\nYou'll find the package in `./build`\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fbbc%2Forigin_simulator","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fbbc%2Forigin_simulator","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fbbc%2Forigin_simulator/lists"}