https://github.com/copyleftdev/har-pilot
is a powerful tool designed to run HAR (HTTP Archive) files and perform load testing using these files.
- Host: GitHub
- URL: https://github.com/copyleftdev/har-pilot
- Owner: copyleftdev
- Created: 2024-07-22T19:43:59.000Z (11 months ago)
- Default Branch: main
- Last Pushed: 2024-07-23T19:33:01.000Z (11 months ago)
- Last Synced: 2024-07-24T22:32:16.778Z (11 months ago)
- Topics: aws, aws-s3, cli, load-testing, qa-automation, qa-automation-test, rust, test-automation
- Language: Rust
- Homepage:
- Size: 9.82 MB
- Stars: 0
- Watchers: 2
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: readme.md
# har-pilot

**har-pilot** is a powerful tool designed to run HAR (HTTP Archive) files and perform load testing using these files.
## What is a HAR File?
A HAR (HTTP Archive) file is a JSON-formatted archive file format that contains a record of web requests and responses. HAR files are typically generated by web browsers' developer tools and capture detailed information about network interactions.
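For reference, a minimal HAR file might look like the following (an abridged sketch; real browser exports include many more fields, such as timings, cookies, and page metadata):

```json
{
  "log": {
    "version": "1.2",
    "creator": { "name": "Chrome DevTools", "version": "126.0" },
    "entries": [
      {
        "startedDateTime": "2024-07-23T10:00:00.000Z",
        "time": 120.5,
        "request": {
          "method": "GET",
          "url": "https://example.com/",
          "headers": [{ "name": "Accept", "value": "text/html" }]
        },
        "response": {
          "status": 200,
          "statusText": "OK",
          "headers": [{ "name": "Content-Type", "value": "text/html" }]
        }
      }
    ]
  }
}
```

Each entry in `log.entries` pairs one request with its response, which is what makes HAR files replayable.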
### Values and History
The HAR file format was developed to facilitate the export and analysis of HTTP communication sessions. Initially, it was mainly used for debugging and performance analysis by web developers. Over time, it gained widespread adoption due to its ability to provide a comprehensive view of web transactions.
### RFC Details
Although HAR does not have an official RFC, it has become a de facto standard for HTTP session recording. The format was primarily influenced by the Web Performance Working Group at the W3C, which sought to create a unified format for logging network requests and responses.
### Adoption
HAR files are widely supported by major web browsers, including Chrome, Firefox, and Edge. Developer tools in these browsers can export network activity as HAR files, making them accessible for various performance analysis and debugging tools.
## Benefits of Using HAR Files
- **Debugging and Performance Analysis:** HAR files help developers debug network issues and analyze the performance of web applications.
- **Load Testing:** By replaying HAR files, developers can simulate real user interactions and test how their application performs under load.
- **Detailed Metrics:** HAR files provide detailed metrics about each request and response, including headers, cookies, request timings, and more.

## Features of har-pilot
- **Run HAR Files:** Execute HTTP requests captured in HAR files.
- **Load Testing:** Perform load testing by running HAR files multiple times.
- **Detailed Metrics:** Collect and store metrics such as response time, status codes, and response bodies in a SQLite database.
- **Concurrency:** Handle multiple requests concurrently for efficient load testing.
- **Progress Tracking:** Track progress with a detailed progress bar.
- **S3 Upload:** Optionally upload the results to an S3 bucket for storage and further analysis.

## How har-pilot Pairs with Testing Realistic User Workflows
har-pilot is designed to help developers test their applications under realistic conditions. By replaying HAR files, which capture actual user interactions with a web application, har-pilot can simulate real-world usage patterns. This allows developers to identify performance bottlenecks, assess the impact of concurrent requests, and ensure their applications can handle the expected load.
## Installation
To get started with `har-pilot`, clone the repository and install the dependencies:
```sh
git clone https://github.com/copyleftdev/har-pilot.git
cd har-pilot
cargo build
```

## Usage
### Running HAR Files
To run a HAR file and execute the HTTP requests contained within it, use the following command:
```sh
cargo run -- <har_file> --itercount <count> [--s3-bucket <bucket_name>]
```

- `<har_file>`: The path to the HAR file you want to run.
- `<count>`: The number of times to run the HAR file for load testing.
- `[--s3-bucket <bucket_name>]`: (Optional) The name of the S3 bucket to upload the results to.

For example:
```sh
# Running without S3 upload
cargo run -- example.har --itercount 5

# Running with S3 upload
cargo run -- example.har --itercount 5 --s3-bucket your-s3-bucket-name
```

### Storing Metrics
`har-pilot` stores detailed metrics for each request in a SQLite database. The database file is named with a unique identifier so that runs never conflict. Each record includes:
- **URL**: The URL of the request.
- **Method**: The HTTP method used (e.g., GET, POST).
- **Response**: The response body.
- **Status**: The HTTP status code.
- **Timestamp**: The timestamp when the request was made.
- **Duration**: The response time in milliseconds.

### Querying Metrics
You can query the SQLite database to analyze the metrics collected during the load testing. Here are some useful queries:
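The queries below assume a table along these lines (a sketch inferred from the fields listed above and the column names used in the queries; the actual schema generated by har-pilot may differ):

```sql
CREATE TABLE metrics (
    url         TEXT,
    method      TEXT,
    response    TEXT,
    status      INTEGER,
    timestamp   TEXT,
    duration_ms REAL
);
```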
#### Average Response Time Per Second
```sql
SELECT
    strftime('%Y-%m-%d %H:%M:%S', timestamp) AS second,
    AVG(duration_ms) AS avg_response_time_ms
FROM
    metrics
GROUP BY
    strftime('%Y-%m-%d %H:%M:%S', timestamp)
ORDER BY
    second;
```

#### Total Requests Per Second
```sql
SELECT
    strftime('%Y-%m-%d %H:%M:%S', timestamp) AS second,
    COUNT(*) AS total_requests
FROM
    metrics
GROUP BY
    strftime('%Y-%m-%d %H:%M:%S', timestamp)
ORDER BY
    second;
```

#### Combined Query for Both Average Response Time and Total Requests
```sql
SELECT
    strftime('%Y-%m-%d %H:%M:%S', timestamp) AS second,
    COUNT(*) AS total_requests,
    AVG(duration_ms) AS avg_response_time_ms
FROM
    metrics
GROUP BY
    strftime('%Y-%m-%d %H:%M:%S', timestamp)
ORDER BY
    second;
```
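The same analysis can also be scripted. The sketch below runs the combined query with Python's built-in `sqlite3` module against an in-memory database seeded with sample rows; the schema and column names are assumptions drawn from the queries above, since har-pilot's actual database file has a generated name:

```python
import sqlite3

# Sketch: har-pilot names its SQLite file with a unique identifier,
# so instead of a real path this example builds an in-memory database
# with the assumed schema and a few sample rows.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE metrics ("
    "url TEXT, method TEXT, response TEXT, "
    "status INTEGER, timestamp TEXT, duration_ms REAL)"
)
conn.executemany(
    "INSERT INTO metrics VALUES (?, ?, ?, ?, ?, ?)",
    [
        ("https://example.com/a", "GET", "", 200, "2024-07-23 10:00:01", 120.0),
        ("https://example.com/b", "GET", "", 200, "2024-07-23 10:00:01", 80.0),
        ("https://example.com/a", "POST", "", 201, "2024-07-23 10:00:02", 200.0),
    ],
)

# The combined query from above: total requests per second plus
# average response time in the same result set.
results = conn.execute(
    """
    SELECT strftime('%Y-%m-%d %H:%M:%S', timestamp) AS second,
           COUNT(*) AS total_requests,
           AVG(duration_ms) AS avg_response_time_ms
    FROM metrics
    GROUP BY second
    ORDER BY second
    """
).fetchall()

for second, total, avg_ms in results:
    print(f"{second}  requests={total}  avg_ms={avg_ms:.1f}")
```

Pointing `sqlite3.connect` at the database file produced by a real run should yield the same shape of output, one row per second of traffic.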