# EffortX Automation Testing Project

## Overview
This repository contains an automated testing suite built with Selenium WebDriver, TestNG, and Maven for the EffortX platform (https://secure.spoors.in/effortx/login). Developed over roughly 1.5 years, the suite performs end-to-end testing of the web application's features, with a particular focus on basic sanity checks after each deployment to the test, staging, and live environments.

### Project Goals:
- Perform automated sanity testing on Test, Staging, and Live environments after every deployment.
- Ensure the most critical application features (e.g., login functionality, basic workflows) work correctly after each deployment.
- Build a scalable and reusable test suite that can be run in different environments.
- Integrate with Maven for project management and dependency resolution.
- Ensure a smooth CI/CD pipeline using Jenkins.
- Generate detailed and understandable reports after each test run.

## Technologies Used
- Java – Programming language used to write the test cases.
- Selenium WebDriver – For browser automation and interaction with the web application.
- TestNG – Testing framework to organize and run tests.
- Maven – For project management and dependency resolution (a sample dependency block follows this list).
- Log4j – For logging.
- Jenkins (optional) – For CI/CD pipeline integration.
- WebDriverManager – For automatic management of browser drivers.
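
As a hedged sketch of how these libraries might be declared in `pom.xml`, the fragment below lists the dependencies named above; the version numbers are illustrative assumptions, not the versions pinned in this repository.

```xml
<!-- Illustrative dependency block; versions are assumptions, not the project's actual pins. -->
<dependencies>
  <dependency>
    <groupId>org.seleniumhq.selenium</groupId>
    <artifactId>selenium-java</artifactId>
    <version>4.21.0</version>
  </dependency>
  <dependency>
    <groupId>org.testng</groupId>
    <artifactId>testng</artifactId>
    <version>7.8.0</version>
    <scope>test</scope>
  </dependency>
  <dependency>
    <groupId>io.github.bonigarcia</groupId>
    <artifactId>webdrivermanager</artifactId>
    <version>5.8.0</version>
  </dependency>
  <dependency>
    <groupId>org.apache.logging.log4j</groupId>
    <artifactId>log4j-core</artifactId>
    <version>2.20.0</version>
  </dependency>
</dependencies>
```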

## Project Structure
The project follows a Page Object Model (POM) design pattern for better maintainability and scalability. The structure is as follows:

```
Effort-Automation/
├── src/
│   ├── main/
│   │   └── java/
│   │       └── com/
│   │           └── effortx/
│   │               ├── pages/        # Page Object Model classes (POM)
│   │               └── utils/        # Utility classes (e.g., WebDriver setup, waits)
│   └── test/
│       └── java/
│           └── com/
│               └── effortx/
│                   ├── tests/        # TestNG test classes
│                   ├── reports/      # Test execution reports (HTML, XML)
│                   └── sanity/       # Sanity check tests (TestNG XML config, etc.)
├── pom.xml                           # Maven configuration and dependencies
└── README.md                         # This file
```
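
To illustrate the Page Object Model convention used in `com/effortx/pages`, here is a minimal sketch of what a page class could look like; the class name, locators, and methods are assumptions for illustration, not copied from the project's code.

```java
package com.effortx.pages;

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;

// Hypothetical page object for the EffortX login screen.
// Locators and method names are illustrative assumptions.
public class LoginPage {

    private final WebDriver driver;

    // Example locators; the real application's element IDs may differ.
    private final By usernameField = By.id("username");
    private final By passwordField = By.id("password");
    private final By loginButton = By.id("loginBtn");

    public LoginPage(WebDriver driver) {
        this.driver = driver;
    }

    public void login(String username, String password) {
        driver.findElement(usernameField).sendKeys(username);
        driver.findElement(passwordField).sendKeys(password);
        driver.findElement(loginButton).click();
    }
}
```

Keeping locators and page interactions inside page classes means that when the UI changes, only the affected page object needs updating, while the TestNG tests remain untouched.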

## Sanity Testing Overview

One of the primary goals of this project was to perform sanity testing after every deployment on Test, Staging, and Live servers. These sanity tests focus on validating the most critical functionalities of the application, ensuring the system is stable and ready for further user testing or production use. Here's how the sanity tests are structured:

### Sanity Testing Breakdown
1. Post-Deployment Testing:
- After each deployment to Test, Staging, or Live servers, the basic sanity tests are executed to verify that core features like login, form submissions, and essential workflows are functioning as expected.
- This helps identify any immediate issues or regressions introduced by recent changes.

2. Test Environment:
- The Test server is where initial development builds are deployed. This environment is used for running sanity checks to verify that basic features are operational before moving to more detailed functional or regression tests.

3. Staging Environment:
- The Staging server mimics the production environment and is used for more comprehensive testing. However, basic sanity checks are executed here as well to ensure that deployment has not broken critical functionality.
- This is the last step before moving to Live, so it’s essential that the sanity tests pass in this environment.

4. Live Environment:
- After deployment to the Live server, the final round of sanity tests ensures that everything is working as expected in the production environment.
- These tests are performed on the actual live URL to ensure the system behaves as intended under real-world conditions.

### Basic Sanity Test Scenarios:
- Login Verification: Ensure that valid credentials allow login and invalid credentials show appropriate error messages (a sample TestNG check follows this list).
- Navigation: Check that core pages (dashboard, settings, etc.) are loading correctly.
- Basic Form Submission: Validate that forms can be submitted without errors.
- UI Elements: Verify that essential UI elements (buttons, input fields, links) are accessible and functional.
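
As a hedged example of the first scenario, the sketch below shows how a login sanity check could be written with TestNG and WebDriverManager; the `LoginPage` class sketched earlier, the `base.url` system property, and the dashboard URL assertion are illustrative assumptions rather than the project's actual test code.

```java
package com.effortx.tests;

import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.testng.Assert;
import org.testng.annotations.AfterMethod;
import org.testng.annotations.BeforeMethod;
import org.testng.annotations.Test;

import com.effortx.pages.LoginPage;
import io.github.bonigarcia.wdm.WebDriverManager;

// Illustrative sanity test; URLs, credentials, and assertions are placeholders.
public class LoginSanityTest {

    private WebDriver driver;

    @BeforeMethod
    public void setUp() {
        WebDriverManager.chromedriver().setup(); // resolves a matching ChromeDriver binary
        driver = new ChromeDriver();
        driver.get(System.getProperty("base.url", "https://secure.spoors.in/effortx/login"));
    }

    @Test
    public void validLoginLandsOnDashboard() {
        // LoginPage is the hypothetical page object sketched above.
        new LoginPage(driver).login("valid-user", "valid-password");
        Assert.assertTrue(driver.getCurrentUrl().contains("dashboard"),
                "Expected to land on the dashboard after a valid login");
    }

    @AfterMethod
    public void tearDown() {
        if (driver != null) {
            driver.quit();
        }
    }
}
```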

## Setup Instructions

### Prerequisites
- Java (version 8 or higher) should be installed on your system.
- Maven should be installed and properly set up.
- IDE: Use an IDE like IntelliJ IDEA or Eclipse for editing and running tests.
- WebDriver: WebDriverManager resolves compatible browser drivers automatically; if you prefer manual setup, a matching driver (ChromeDriver, GeckoDriver, etc.) should be available on the system PATH.
- Test Servers: Ensure that you have the correct URLs for Test, Staging, and Live environments.

### Clone the Repository
To get started, clone this repository to your local machine:

```bash
git clone https://github.com/Vamshi5S/Effort-Automation.git
```

### Install Dependencies
Navigate to the project directory and run the following command to download necessary dependencies:

```bash
mvn clean install
```

### Configure Environment URLs
In the `src/main/resources/config.properties` file, configure the URLs for Test, Staging, and Live environments. Example:

```properties
test.url=https://test.spoors.in/effortx
staging.url=https://staging.spoors.in/effortx
live.url=https://secure.spoors.in/effortx
```

### Run the Tests
To run the automated sanity tests against a specific environment, pass the environment name as a system property in the Maven command:

```bash
mvn test -Denvironment=test
```

This command will execute the sanity tests on the Test environment. You can replace `test` with `staging` or `live` based on the environment you want to test.
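
One way the `-Denvironment` flag could be mapped to the URLs in `config.properties` is a small utility like the following; the class name `EnvironmentConfig` and its location under `com.effortx.utils` are assumptions, not necessarily how this project resolves the property.

```java
package com.effortx.utils;

import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

// Hypothetical helper: maps -Denvironment=test|staging|live to the matching
// <environment>.url entry in src/main/resources/config.properties.
public final class EnvironmentConfig {

    private EnvironmentConfig() {
    }

    public static String baseUrl() {
        String environment = System.getProperty("environment", "test");
        Properties properties = new Properties();
        try (InputStream in = EnvironmentConfig.class.getClassLoader()
                .getResourceAsStream("config.properties")) {
            if (in == null) {
                throw new IllegalStateException("config.properties not found on the classpath");
            }
            properties.load(in);
        } catch (IOException e) {
            throw new IllegalStateException("Could not read config.properties", e);
        }
        String url = properties.getProperty(environment + ".url");
        if (url == null) {
            throw new IllegalArgumentException("No URL configured for environment: " + environment);
        }
        return url;
    }
}
```

A test's setup method could then call `EnvironmentConfig.baseUrl()` instead of hard-coding a URL.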

### Running Tests in Parallel
TestNG supports parallel test execution out of the box. To run tests in parallel, set the parallel mode and thread count in the testng.xml file; TestNG will then distribute the configured tests or methods across threads.
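
A minimal `testng.xml` fragment for parallel execution might look like the sketch below; the suite name, listed test class, and thread count are assumptions rather than this project's actual configuration.

```xml
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">
<!-- Illustrative configuration: runs test methods in parallel on up to 3 threads. -->
<suite name="SanitySuite" parallel="methods" thread-count="3">
  <test name="Sanity">
    <classes>
      <!-- Placeholder class name; point this at the project's sanity test classes. -->
      <class name="com.effortx.tests.LoginSanityTest"/>
    </classes>
  </test>
</suite>
```

If the suite file is registered with the Maven Surefire plugin, the same `mvn test` command will pick up this configuration.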

## Test Reporting

Once the tests have executed, a test report will be generated in the target/test-classes/test-output directory. The reports will be available in both HTML and XML formats, and the HTML report will provide a detailed overview of test results, including pass/fail status for each test case.

## Contributing

If you'd like to contribute to this project, feel free to fork the repository, make your changes, and submit a pull request. Please ensure that the new tests you add are appropriate and run as expected. Also, remember to add any new configuration parameters that may be needed for other environments.

## Contact

For any questions or feedback, feel free to reach out:

- Email: [email protected]
- GitHub: [Vamshi5S/Effort-Automation](https://github.com/Vamshi5S/Effort-Automation)

## Acknowledgements

- Selenium WebDriver – For browser automation.
- TestNG – For the testing framework.
- Maven – For project management and dependency resolution.
- EffortX Team – For providing the platform to develop and automate the test cases.

### License
This project is licensed under the MIT License.