https://github.com/hjsblogger/async-io-python

Demonstration of asyncio in Python using a variety of use cases (or test scenarios)

# Asyncio in Python

In this *Asyncio in Python* repo, we have covered the following use cases:

* Fetching Pokemon names using [Pokemon APIs](https://pokeapi.co/api/v2/pokemon/)
* Fetching Weather information for certain cities in the US using [Openweather APIs](https://openweathermap.org/api)
* Reading Automation Builds & Session information using [LambdaTest APIs](https://www.lambdatest.com/support/api-doc/)
* Checking URL Health of links present on [LambdaTest Selenium Playground](https://lambdatest.com/selenium-playground)
* Web Scraping of content on [LambdaTest E-Commerce Playground](https://ecommerce-playground.lambdatest.io/index.php?route=product/category&path=57)

The scenarios mentioned above are executed in both synchronous (sync) and asynchronous (async) modes in Python. The async implementations are realized by leveraging the [asyncio](https://docs.python.org/3/library/asyncio.html) library in Python.
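
As a minimal, self-contained illustration of why async wins for I/O-bound work, the sketch below replaces the network calls with `asyncio.sleep` (a stand-in for I/O latency); `asyncio.gather` lets all the waits overlap:

```python
import asyncio
import time

async def fake_io(task_id: int) -> int:
    # Stand-in for a network call: each "request" waits ~0.2 seconds
    await asyncio.sleep(0.2)
    return task_id

async def run_async(n: int) -> list:
    # All n coroutines wait concurrently, so total time stays ~0.2s
    # instead of the n * 0.2s a sequential loop would take
    return await asyncio.gather(*(fake_io(i) for i in range(n)))

start = time.perf_counter()
results = asyncio.run(run_async(5))
elapsed = time.perf_counter() - start

print(results)  # [0, 1, 2, 3, 4] -- gather preserves submission order
```

The same pattern (many awaited I/O operations fanned out through `asyncio.gather`) underlies the speedups reported in the Execution section below.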

## Pre-requisites for test execution

**Step 1**

Create a virtual environment by running the *virtualenv venv* command on the terminal:

```bash
virtualenv venv
```

**Step 2**

Activate the newly created virtual environment by running the *source venv/bin/activate* command on the terminal:

```bash
source venv/bin/activate
```

Follow Steps (3) and (4) if you are using the LambdaTest Cloud Grid (used for the Web Scraping & URL Health Checking scenarios):

**Step 3**

Procure the LambdaTest User Name and Access Key by navigating to the [LambdaTest Account Page](https://accounts.lambdatest.com/security). You might need to create an account on LambdaTest since it is used for running tests (or scraping) on the cloud Grid.


**Step 4**

Add the LambdaTest User Name and Access Key to the *.env* file (or *Makefile*) that is located in the parent directory. Once done, save the file.


**Step 5**

For realizing the *Weather Information* scenario, you need to register on [OpenWeather](https://openweathermap.org/). Once done, set the environment variable *OPEN_WEATHER_API* in *.env*. The API keys can be found on the [OpenWeather API Page](https://home.openweathermap.org/api_keys).
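
At runtime, the code only needs that variable to be present in the environment. A minimal stdlib-only sketch of a fail-fast lookup (the helper name is illustrative, not from the repo):

```python
import os

def get_openweather_key(env=os.environ) -> str:
    # OPEN_WEATHER_API is the variable name used by this repo's .env file;
    # fail fast with a clear message instead of sending requests with no key
    key = env.get("OPEN_WEATHER_API", "")
    if not key:
        raise RuntimeError("OPEN_WEATHER_API is not set; add it to the .env file")
    return key
```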


## Dependency/Package Installation

Run the *make install* command on the terminal to install the required packages (or dependencies) - Pytest, Selenium, Beautiful Soup, etc.

```bash
make install
```

For benchmarking (over a certain number of runs), we have used [hyperfine](https://github.com/sharkdp/hyperfine), a command-line benchmarking tool. On macOS, hyperfine can be installed by running ```brew install hyperfine``` on the terminal.
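
For reference, a typical hyperfine invocation comparing the two modes might look like this (the make targets shown are the ones used later in this README):

```shell
# Benchmark sync vs. async weather fetching over 5 timed runs each
hyperfine --runs 5 'make fetch-sync-weather-info' 'make fetch-async-weather-info'
```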


With this, all the dependencies and environment variables are set. It's time for some action!

## Execution

Follow the steps below to benchmark the performance of the sync and async code (via the hyperfine tool):

**Step 1**

For the URL health checking scenario, the Chrome browser is invoked in non-headless mode. It is recommended to install Chrome on your machine before you proceed to Step (3).

**Step 2**

Trigger the command *make clean* to remove the *__pycache__* folder(s) and *.pyc* files. It also removes the *.DS_Store* files from the project.


**Step 3**

Trigger the *make* command on the terminal to realize the use cases in the sync & async modes.

### Fetching Pokemon names using Pokemon APIs

Trigger the command *make fetch-pokemon-names* to fetch the Pokemon names using the [Pokemon APIs](https://pokeapi.co/api/v2/pokemon/), running the code in both sync & async modes (using asyncio in Python).
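
Under the hood, a fetch along these lines can be sketched as follows. The helpers are illustrative (PokeAPI's list endpoint takes `limit`/`offset` query parameters and returns a `results` array of `{name, url}` objects); the repo's actual implementation may differ:

```python
import asyncio
import json
from urllib.request import urlopen  # blocking fetch; the repo may use requests/aiohttp

POKEAPI = "https://pokeapi.co/api/v2/pokemon"

def build_page_url(limit: int, offset: int) -> str:
    # PokeAPI paginates its list endpoint via limit/offset query parameters
    return f"{POKEAPI}?limit={limit}&offset={offset}"

def extract_names(payload: dict) -> list:
    # The list endpoint returns {"results": [{"name": ..., "url": ...}, ...]}
    return [entry["name"] for entry in payload.get("results", [])]

def fetch_page(limit: int, offset: int) -> list:
    # Blocking fetch of one page of Pokemon names
    with urlopen(build_page_url(limit, offset)) as resp:
        return extract_names(json.load(resp))

async def fetch_all(pages: int, per_page: int = 20) -> list:
    # Run the blocking page fetches concurrently in worker threads
    chunks = await asyncio.gather(
        *(asyncio.to_thread(fetch_page, per_page, i * per_page) for i in range(pages))
    )
    return [name for chunk in chunks for name in chunk]
```

For example, `asyncio.run(fetch_all(5))` would pull the first 100 names, with the five page requests overlapping instead of running back to back.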


### Result: Asyncio - 5.77+ seconds faster than sync execution

### Fetching Weather information for certain cities in the US using Openweather APIs

In this scenario, the city names from [Page-1](https://www.latlong.net/category/cities-236-15-1.html) through [Page-15](https://www.latlong.net/category/cities-236-15-13.html) are first scraped using BeautifulSoup.


Now that we have the city names, let's fetch the weather of those cities by providing Latitude & Longitude to the [OpenWeather API](https://api.openweathermap.org/data/2.5/weather?lat=&lon=&appid=)
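
Plugging those coordinates into the endpoint is then a matter of building the query string. A small sketch (the helper name is illustrative):

```python
from urllib.parse import urlencode

WEATHER_ENDPOINT = "https://api.openweathermap.org/data/2.5/weather"

def weather_url(lat: float, lon: float, api_key: str) -> str:
    # Fills the lat/lon/appid parameters of the endpoint shown above
    return f"{WEATHER_ENDPOINT}?{urlencode({'lat': lat, 'lon': lon, 'appid': api_key})}"

print(weather_url(40.71, -74.0, "YOUR_KEY"))
# https://api.openweathermap.org/data/2.5/weather?lat=40.71&lon=-74.0&appid=YOUR_KEY
```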

Trigger the command *make fetch-sync-weather-info* to fetch the weather information using the [OpenWeather APIs](https://openweathermap.org/api), running the code in sync mode.


Trigger the command *make fetch-async-weather-info* to fetch the weather information using the [OpenWeather APIs](https://openweathermap.org/api), running the code in async mode (using asyncio in Python).


### Result: Asyncio - 318.53+ seconds faster than sync execution

### Checking URL health

In this scenario, the hyperlinks on [LambdaTest Selenium Playground](https://lambdatest.com/selenium-playground) are first scraped using BeautifulSoup. Once the links are available, their health is checked using the response code via the [requests](https://pypi.org/project/requests/) library.
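
The core of the check can be sketched as below. Here `fetch_status` stands in for the blocking requests call (e.g. `lambda url: requests.head(url).status_code`), and `asyncio.to_thread` lets those blocking calls overlap:

```python
import asyncio

def is_healthy(status_code: int) -> bool:
    # Assumption for this sketch: 2xx and 3xx responses count as healthy
    return 200 <= status_code < 400

async def check_links(links, fetch_status):
    # fetch_status is a blocking callable returning a status code for a URL;
    # asyncio.to_thread runs the blocking calls in worker threads so they overlap
    codes = await asyncio.gather(*(asyncio.to_thread(fetch_status, url) for url in links))
    return {url: is_healthy(code) for url, code in zip(links, codes)}
```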


Trigger the command *make check-url-health* to check the health (i.e. response code) of the links present on the LambdaTest Selenium Playground.


### Result: Asyncio - 1.41+ seconds faster than sync execution

### Web Scraping

In this scenario, the product details from [Page-1](https://ecommerce-playground.lambdatest.io/index.php?route=product/category&path=57) through [Page-5](https://ecommerce-playground.lambdatest.io/index.php?route=product/category&path=57&page=5) are scraped using the BeautifulSoup library.

The sync implementation is similar to the web scraping scenario covered in the [Web Scraping with Python](https://github.com/hjsblogger/web-scraping-with-python/blob/main/tests/beautiful-soup/test_ecommerce_scraping.py) repo.
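
The extraction itself boils down to pulling the product titles out of the page markup. The repo uses BeautifulSoup; the dependency-free sketch below shows the same idea with the stdlib `html.parser`, and the `h4.title` selector is an assumption about the page's markup:

```python
from html.parser import HTMLParser

class ProductTitleParser(HTMLParser):
    # Collects the text of <h4 class="title"> elements; adjust the
    # tag/class check to match the real page structure
    def __init__(self):
        super().__init__()
        self._in_title = False
        self.titles = []

    def handle_starttag(self, tag, attrs):
        if tag == "h4" and ("class", "title") in attrs:
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "h4":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title and data.strip():
            self.titles.append(data.strip())

sample = '<h4 class="title">iPhone</h4><p>desc</p><h4 class="title">MacBook</h4>'
parser = ProductTitleParser()
parser.feed(sample)
print(parser.titles)  # ['iPhone', 'MacBook']
```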

Trigger the command *make perform-web-scraping* to scrape the product details on the test page mentioned earlier.


### Result: Asyncio - 2.93+ seconds faster than sync execution

## Have feedback or need assistance?
Feel free to fork the repo and contribute to make it better! Email [himanshu[dot]sheth[at]gmail[dot]com](mailto:[email protected]) for any queries, or ping me on the following social media sites:

LinkedIn: [@hjsblogger](https://linkedin.com/in/hjsblogger)

Twitter: [@hjsblogger](https://www.twitter.com/hjsblogger)