ITI Graduation Project
- Host: GitHub
- URL: https://github.com/nouraalgohary/examination-system
- Owner: NouraAlgohary
- Created: 2024-02-06T19:51:03.000Z (almost 2 years ago)
- Default Branch: main
- Last Pushed: 2024-03-21T22:04:34.000Z (almost 2 years ago)
- Last Synced: 2025-04-08T18:38:51.402Z (10 months ago)
- Topics: automation, data-warehouse, elt, etl, power-apps, power-automate, power-bi, sql-server, ssas, ssis, ssrs, stored-procedures
- Language: TSQL
- Size: 44.9 MB
- Stars: 0
- Watchers: 2
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
# Examination-System
This repository contains the documentation for an automated Examination System designed to facilitate online exams and manage exam-related data efficiently. The system includes a desktop application and a SQL database to support various functionalities and reports.
https://github.com/NouraAlgohary/Examination-System/assets/103903785/ef0027cc-0722-4196-9b61-46b1bbb3c6bb
## Project Milestones
- **Initial Planning**: Project scope, requirements, and milestones were defined.
- **Database Design and Creation**: The entity-relationship diagram (ERD) and database schema were developed.
- **ETL Implementation (CSV to Database)**: ETL processes were implemented to efficiently transfer data from CSV files into the MS SQL Server database.
- **Stored Procedures Implementation**: Stored procedures for database operations were designed and implemented.
- **Data Warehouse (DWH) Design and Creation**: The schema and architecture for the Data Warehouse were designed.
- **ETL Implementation (Database to Data Warehouse)**: Additional ETL processes were implemented to transfer data from the MS SQL Server database to a Data Warehouse (DWH).
- **Desktop Application Development**: The desktop application for administering exams and managing data was created.
- **Reporting Functionality**: Reporting tools were integrated, and various reports were implemented as per requirements.
- **Testing and Debugging**: Thorough testing was conducted to ensure system functionality and reliability.
- **Documentation**: The system architecture, functionalities, and usage guidelines were documented.
## 2. Database Design and Creation
### Entity-Relationship Diagram (ERD)
The Entity-Relationship Diagram (ERD) serves as a graphical representation of the Examination System's database structure. It illustrates the entities, attributes, and relationships between them. The ERD is instrumental in conceptualizing and designing the database schema, providing a visual roadmap for database developers and stakeholders.

#### Components of the ERD:
- **Entities**: These represent real-world objects or concepts within the Examination System, such as students, courses, exams, and instructors. Each entity is depicted as a rectangle in the ERD.
- **Attributes**: Attributes describe the properties or characteristics of entities. They are represented as ovals connected to their respective entities in the ERD. Examples of attributes include student ID, course name, exam date, and instructor name.
- **Relationships**: Relationships define how entities are connected within the database. They depict associations between entities and can be one-to-one, one-to-many, or many-to-many relationships. Relationship lines connecting entities in the ERD indicate these associations.
- **Cardinalities**: Cardinalities specify how many instances of one entity can be related to instances of another entity in a given relationship. They are indicated using notation such as "1:1" for one-to-one relationships, "1:M" for one-to-many relationships, and "M:N" for many-to-many relationships.
### Mapping
The mapping process involves translating the conceptual model represented by the ERD into a physical database schema. This entails defining tables, columns, primary keys, foreign keys, and constraints based on the entities and relationships identified in the ERD.

#### Steps in Entity-Relationship Mapping:
1. **Identifying Entities and Attributes**: Entities and their attributes are identified from the ERD, with each entity corresponding to a database table and each attribute mapping to a table column.
2. **Defining Relationships**: Relationships between entities are translated into foreign key constraints in the database schema, establishing connections between related tables.
3. **Establishing Primary Keys**: Primary keys are defined for each table to uniquely identify records. Typically, primary keys correspond to attributes designated as unique identifiers in the ERD.
4. **Enforcing Referential Integrity**: Referential integrity constraints are implemented to maintain data consistency and enforce relationships between related tables. These constraints ensure that foreign key values in child tables reference valid primary key values in parent tables.
5. **Indexing**: Indexes are created on columns frequently used in queries to improve data retrieval performance. Indexes enhance database efficiency by facilitating rapid access to specific data subsets.
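As a minimal illustration of steps 1–5 above, the following T-SQL sketch maps a hypothetical Student–Exam relationship from an ERD into tables. All names here are illustrative, not the project's actual schema (which is defined in the repository's creation script).

```sql
-- Hypothetical mapping of two ERD entities and a one-to-many relationship.
CREATE TABLE dbo.Student (
    StudentId INT IDENTITY(1,1) PRIMARY KEY,   -- unique identifier from the ERD
    FullName  NVARCHAR(100) NOT NULL
);

CREATE TABLE dbo.Exam (
    ExamId    INT IDENTITY(1,1) PRIMARY KEY,
    StudentId INT NOT NULL,
    ExamDate  DATE NOT NULL,
    -- Referential integrity: every exam must reference an existing student.
    CONSTRAINT FK_Exam_Student FOREIGN KEY (StudentId)
        REFERENCES dbo.Student (StudentId)
);

-- Index a frequently filtered column to speed up lookups by student.
CREATE INDEX IX_Exam_StudentId ON dbo.Exam (StudentId);
```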
### Advantages of Entity-Relationship Mapping:
- **Clarity and Consistency**: Mapping ensures that the database schema accurately reflects the conceptual model depicted in the ERD, promoting clarity and consistency in database design.
- **Data Integrity**: By enforcing referential integrity constraints, mapping helps maintain data integrity and prevents inconsistencies in the database.
- **Optimized Performance**: Well-designed mappings, including appropriate indexing strategies, contribute to optimized database performance and efficient query processing.
- **Scalability and Flexibility**: A carefully mapped database schema provides a scalable and flexible foundation for accommodating future changes and enhancements to the Examination System.
The entity-relationship mapping process lays the groundwork for the development of a robust and efficient database that meets the data storage and retrieval needs of the Examination System.
### Database Creation
The creation of the database for the Examination System involved several steps to ensure its proper setup and configuration. This section outlines the process of creating the database, including the tools used, configuration settings, and post-creation tasks.
### Steps:
1. **Database Management System Selection**:
- Microsoft SQL Server was chosen as the Database Management System (DBMS) for the Examination System due to its robust features, scalability, and compatibility with the project requirements.
2. **Database Creation Script**:
- A SQL script was prepared to define the database schema, including tables and other database objects. This script served as a blueprint for creating the database structure.
3. **Execution of SQL Script**:
- The SQL script was executed using SQL Server Management Studio (SSMS) or another SQL client tool. This process created the database and its associated objects according to the specifications outlined in the script.
- You can find the script [here](https://github.com/NouraAlgohary/Examination-System/blob/main/1.%20Database/Creation%20Queries/1-%20Compelete%20DB.sql).
4. **Data Population**:
- Mockaroo was configured to generate sample data for entities such as students, courses, exams, and instructors. The generated data was exported to CSV files for further processing.
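For quick manual testing outside the SSIS pipeline described in section 3, a Mockaroo CSV export can also be bulk-loaded directly with T-SQL. The sketch below assumes a hypothetical file path and the illustrative `Student` table from the mapping section.

```sql
-- Sketch: loading a Mockaroo-generated CSV straight into a table.
-- The project routes these files through SSIS; BULK INSERT is shown only
-- as a quick manual alternative. Path and table names are hypothetical.
BULK INSERT dbo.Student
FROM 'C:\data\students.csv'
WITH (
    FORMAT = 'CSV',          -- native CSV parsing (SQL Server 2017+)
    FIRSTROW = 2,            -- skip the header row
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
);
```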

### Tools Used:
- **Microsoft SQL Server Management Studio (SSMS)**: SSMS was used as the primary tool for executing SQL scripts, managing database objects, and configuring database settings.
- **[Mockaroo](https://mockaroo.com/)**: A web-based tool for generating realistic test data, used to populate the database with initial data. Mockaroo offers customizable data templates and supports various data formats, including CSV.
### Benefits:
- **Data Integrity**: The structured approach to database creation ensured data integrity and consistency within the Examination System.
- **Scalability**: The database design allowed for scalability, accommodating future growth and evolving requirements of the system.
- **Security**: Robust security measures were implemented to protect the confidentiality, integrity, and availability of the database.
- **Maintainability**: Well-documented database schemas and configuration settings facilitated system maintenance and troubleshooting efforts.
The creation of the database using Microsoft SQL Server laid a solid foundation for storing, managing, and securing data within the Examination System.
## 3. ETL Implementation (CSV to SQL Server Using SSIS)
The ETL (Extract, Transform, Load) process for transferring data from CSV files to the Microsoft SQL Server database was a pivotal step in establishing the Examination System's database. This section provides a detailed explanation of the ETL implementation using SQL Server Integration Services (SSIS), focusing on extracting data from CSV files and loading it into SQL Server.
### Tools Used:
1. **SQL Server Integration Services (SSIS)**: SSIS served as the core tool for orchestrating the ETL process. SSIS provides a comprehensive platform for designing, deploying, and managing ETL workflows, making it well-suited for data integration tasks.
2. **Microsoft SQL Server**: The destination database for the ETL process was hosted on Microsoft SQL Server.
### ETL Implementation Steps:
1. **Project Setup**:
- A new SSIS project was created within Visual Studio to encapsulate the ETL workflows.
2. **Data Source Configuration**:
- CSV files containing the source data were identified and configured as the data source for the SSIS package. Connection managers were established to connect to the CSV files.
3. **Data Extraction**:
- SSIS Data Flow Tasks were employed to extract data from the CSV files. Flat File Source components within the Data Flow Task were configured to read data from the CSV files and pass it downstream for further processing.
4. **Data Transformation**:
- Transformation tasks were applied to the extracted data to prepare it for loading into the SQL Server database. This involved cleansing, validating, and formatting the data as per the requirements of the target database schema.
- SSIS provides a wide range of transformation components, such as Derived Column, Data Conversion, and Lookup, to facilitate various data manipulation operations.
5. **Data Loading**:
- The transformed data was loaded into the SQL Server database tables using SSIS destination components, which were configured to map the incoming data columns to the corresponding database table columns.
6. **Error Handling and Logging**:
- SSIS includes built-in features for error handling and logging to capture and manage errors encountered during the ETL process. Error outputs and logging mechanisms were configured to track data validation errors, integrity violations, and other issues.
7. **Execution and Monitoring**:
- The SSIS package was executed to initiate the ETL process. Execution options, debugging settings, and logging levels were configured as needed to monitor the progress and performance of the ETL workflow.
- SSIS logging and monitoring tools were utilized to track the execution status, diagnose errors, and optimize the ETL process for efficiency.
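SSIS packages are assembled in a visual designer rather than written as code, so there is no single script to reproduce here. Conceptually, though, the cleanse-and-load step corresponds to T-SQL along the lines of the sketch below; the staging table, columns, and validation rule are all hypothetical.

```sql
-- Conceptual T-SQL equivalent of an SSIS cleanse-and-load step.
-- In SSIS this logic would live in Derived Column, Data Conversion,
-- and conditional-split components.
INSERT INTO dbo.Student (FullName)
SELECT LTRIM(RTRIM(s.FullName))          -- cleanse: trim stray whitespace
FROM dbo.StagingStudent AS s
WHERE LTRIM(RTRIM(s.FullName)) <> N'';   -- validate: drop empty names
```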
### Benefits of Using SSIS for ETL:
- **Visual Development Environment**: SSIS offers a visual development environment with drag-and-drop functionality, enabling easy design and management of complex ETL workflows.
- **Performance Optimization**: SSIS provides features for parallel processing, data streaming, and in-memory operations, facilitating high-performance ETL solutions.
- **Scalability and Flexibility**: SSIS supports scalable and flexible ETL architectures, accommodating diverse data sources and large data volumes with ease.
- **Integration with SQL Server**: SSIS seamlessly integrates with Microsoft SQL Server, offering native connectivity and optimized performance for SQL Server databases.
The successful implementation of ETL processes from CSV files to the MS SQL Server database lays the foundation for efficient data management and utilization within the Examination System.
## 4. Stored Procedures Implementation
Stored procedures play a crucial role in streamlining database operations and enhancing system functionality within the Examination System project. This section outlines the implementation of stored procedures for key functionalities, including CRUD operations (Create, Read, Update, Delete) on database tables and specific procedures for exam generation, handling exam answers, and conducting exam corrections.
### Select, Insert, Update, and Delete Operations
Stored procedures have been created to facilitate CRUD operations on database tables. These procedures encapsulate SQL queries and operations, promoting code reuse, security, and maintainability. Each CRUD operation has a corresponding stored procedure with parameters to specify input data and conditions for filtering, updating, or deleting records.
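A minimal sketch of one such procedure, with illustrative object and parameter names rather than the project's actual ones:

```sql
-- Sketch of a parameterized update procedure in the style described above.
CREATE PROCEDURE dbo.UpdateStudentName
    @StudentId INT,
    @FullName  NVARCHAR(100)
AS
BEGIN
    SET NOCOUNT ON;
    UPDATE dbo.Student
    SET    FullName  = @FullName
    WHERE  StudentId = @StudentId;   -- parameter filters the target record
END;
GO

-- Usage:
EXEC dbo.UpdateStudentName @StudentId = 1, @FullName = N'Updated Name';
```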
### Exam Generation
A stored procedure automates the generation of exams within the system. This procedure selects random questions from the question bank based on predefined criteria such as course and student. Each exam consists of 10 multiple-choice questions (MCQs), which are dynamically included in the exam papers. The stored procedure organizes the selected questions into exam papers and assigns unique identifiers to each exam for tracking and management.
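A sketch of how such a procedure might pick the questions, using the common `ORDER BY NEWID()` trick to randomize the selection; all table and column names are hypothetical:

```sql
-- Sketch: assign 10 random MCQs from a course's question bank to an exam.
CREATE PROCEDURE dbo.GenerateExam
    @ExamId   INT,
    @CourseId INT
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO dbo.ExamQuestion (ExamId, QuestionId)
    SELECT TOP (10) @ExamId, q.QuestionId
    FROM dbo.Question AS q
    WHERE q.CourseId = @CourseId
      AND q.QuestionType = 'MCQ'
    ORDER BY NEWID();                -- NEWID() shuffles the candidate rows
END;
```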
### Exam Answers
Stored procedures handle exam answers submitted by students. These procedures validate and store student responses, associating them with the corresponding exam and student identifiers.
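A sketch of an answer-submission procedure under the same hypothetical schema; the validation here simply checks that the question actually belongs to the exam before storing the response:

```sql
-- Sketch: validate and store one student answer.
CREATE PROCEDURE dbo.SubmitAnswer
    @ExamId     INT,
    @StudentId  INT,
    @QuestionId INT,
    @Choice     CHAR(1)
AS
BEGIN
    SET NOCOUNT ON;
    -- Only accept answers to questions that are part of this exam.
    IF EXISTS (SELECT 1 FROM dbo.ExamQuestion
               WHERE ExamId = @ExamId AND QuestionId = @QuestionId)
        INSERT INTO dbo.StudentAnswer (ExamId, StudentId, QuestionId, Choice)
        VALUES (@ExamId, @StudentId, @QuestionId, @Choice);
END;
```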
### Exam Correction
Stored procedures automate the correction of exams by grading student responses against predefined answer keys, calculating scores based on grading criteria, and generating feedback reports. They update student records with graded scores and performance metrics for further analysis and reporting.
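A sketch of the grading step with the same hypothetical names: each stored answer is compared against the question's answer key, and the total is written back to the student's exam record:

```sql
-- Sketch: grade an exam by counting answers that match the answer key.
CREATE PROCEDURE dbo.CorrectExam
    @ExamId    INT,
    @StudentId INT
AS
BEGIN
    SET NOCOUNT ON;
    UPDATE se
    SET se.Score =
        (SELECT COUNT(*)
         FROM dbo.StudentAnswer AS sa
         JOIN dbo.Question      AS q ON q.QuestionId = sa.QuestionId
         WHERE sa.ExamId    = @ExamId
           AND sa.StudentId = @StudentId
           AND sa.Choice    = q.CorrectChoice)   -- one point per correct MCQ
    FROM dbo.StudentExam AS se
    WHERE se.ExamId = @ExamId AND se.StudentId = @StudentId;
END;
```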
### Benefits of Stored Procedures Implementation
- **Centralized Logic**: Stored procedures centralize database logic, reducing redundancy and promoting consistency in data manipulation operations.
- **Enhanced Security**: Stored procedures offer an additional layer of security by controlling access to database functionality and limiting exposure to SQL injection attacks.
- **Improved Performance**: Stored procedures can improve performance by reducing network traffic and optimizing query execution plans through parameterization and caching.
- **Ease of Maintenance**: Centralized stored procedures simplify maintenance and updates, allowing for easier troubleshooting, debugging, and version control of database logic.
The implementation of stored procedures within the Examination System project enhances efficiency, security, and maintainability in database operations, contributing to a robust and scalable system architecture.
## 5. Data Warehouse (DWH) Design and Creation Using SSMS
The Data Warehouse (DWH) is a central component of the Examination System project, serving as a single repository for consolidated, organized data from multiple sources. This section describes how the Data Warehouse was designed and created using SQL Server Management Studio (SSMS), covering the key aspects of its structure and deployment.
### Design Phase
1. **Requirement Analysis**: Initially, a thorough analysis of reporting and analytical needs was conducted to understand the data requirements of the Examination System stakeholders.
2. **Data Modeling**: Based on the requirements, a dimensional model was designed to support reporting needs. This included identifying fact tables and dimension tables and defining the relationships between them (a star-schema sketch follows this list).
3. **ETL Planning**: The Extract, Transform, Load (ETL) processes required to populate the Data Warehouse from the source database were planned, and SSIS packages were designed for the ETL operations (built in Visual Studio, as in section 3).
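A minimal star-schema sketch for an exam-results subject area; the fact and dimension names are illustrative, not the project's actual model:

```sql
-- Sketch: one dimension and one fact table from a star schema.
CREATE TABLE dbo.DimStudent (
    StudentKey INT IDENTITY(1,1) PRIMARY KEY,  -- surrogate key for the DWH
    StudentId  INT NOT NULL,                   -- business key from the source DB
    FullName   NVARCHAR(100) NOT NULL
);

CREATE TABLE dbo.FactExamResult (
    StudentKey INT NOT NULL REFERENCES dbo.DimStudent (StudentKey),
    DateKey    INT NOT NULL,   -- e.g. 20240321; joins to a date dimension
    Score      INT NOT NULL    -- the measure being analyzed
);
```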
### Creation Process
1. **Database Creation**: Using SSMS, the Data Warehouse database schema was created based on the dimensional model designed during the planning phase. Tables were created to represent fact and dimension entities.
2. **ETL Development**: SSIS packages were developed for the ETL processes (in Visual Studio, as in section 3). These packages extracted data from the source database, transformed it according to the dimensional model, and loaded it into the Data Warehouse tables (a load sketch follows this list).
3. **Data Quality Assurance**: Data quality checks and validation processes were implemented within the SSIS packages to ensure the accuracy and integrity of data loaded into the Data Warehouse.
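As a sketch of the load logic such a package might issue against a dimension table, a `MERGE` upsert keeps repeated runs idempotent; all names are hypothetical:

```sql
-- Sketch: upsert source rows into a dimension so reruns are safe.
MERGE dbo.DimStudent AS tgt
USING dbo.Student    AS src
    ON tgt.StudentId = src.StudentId          -- match on the business key
WHEN MATCHED AND tgt.FullName <> src.FullName THEN
    UPDATE SET tgt.FullName = src.FullName    -- refresh changed attributes
WHEN NOT MATCHED BY TARGET THEN
    INSERT (StudentId, FullName)
    VALUES (src.StudentId, src.FullName);
```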
### Benefits of Data Warehouse Implementation
- **Centralized Data**: The Data Warehouse consolidates data from multiple sources into a single repository, providing a unified view of organizational data.
- **Improved Reporting**: With a structured dimensional model in place, stakeholders can run efficient reports and analyses and derive valuable insights from the data.
- **Historical Analysis**: Historical data stored in the Data Warehouse allows for trend analysis and longitudinal studies over time.
- **Scalability and Flexibility**: SQL Server provides scalability and flexibility in managing the Data Warehouse, allowing it to accommodate growing data volumes and evolving business requirements.
- **Data Governance and Security**: SQL Server offers robust data governance and security features, helping to ensure data privacy, integrity, and compliance with regulatory requirements.
The use of SQL Server and SSMS for Data Warehouse design and creation in the Examination System project provides a robust and scalable architecture, empowering stakeholders with timely and actionable insights.
-------------------------------------------------------------------------------------------------------------------------------------------------
## *To be continued*
-------------------------------------------------------------------------------------------------------------------------------------------------