# MultilayerPerceptron Project

This project is an implementation of a multilayer perceptron in C++ using the C++17 standard. The perceptron classifies images of handwritten Latin letters and has from 2 to 5 hidden layers. The program is built using the MVC pattern, and the GUI is based on the Qt library.
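The repository's class layout is not shown here, but as a rough illustration of the forward pass through one fully connected layer, a sketch is given below; the `Layer` struct and the sigmoid activation are assumptions for the sketch, not the project's actual code.

```cpp
// Minimal sketch of one fully connected layer's forward pass (sigmoid activation).
// The struct and member names are illustrative, not taken from this repository.
#include <cmath>
#include <cstddef>
#include <vector>

struct Layer {
  std::vector<std::vector<double>> weights;  // [neuron][input]
  std::vector<double> biases;                // [neuron]

  std::vector<double> Forward(const std::vector<double>& input) const {
    std::vector<double> out(weights.size());
    for (std::size_t n = 0; n < weights.size(); ++n) {
      double sum = biases[n];
      for (std::size_t i = 0; i < input.size(); ++i) {
        sum += weights[n][i] * input[i];
      }
      out[n] = 1.0 / (1.0 + std::exp(-sum));  // sigmoid
    }
    return out;
  }
};
```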

## Table of Contents
- [Dependencies](#dependencies)
- [Build and Installation](#build-and-installation)
- [Usage](#usage)
- [License](#license)

## Dependencies

The following dependencies are required to build and run this project:
- C++17
- CMake
- Qt5 Widgets
- Qt5 Charts
- Google Test library

## Build and Installation

To build and install this project, please follow the instructions below:

1. Clone this repository to your local machine.
2. Open a terminal and navigate to the project directory.
3. Run `cmake -S . -B ./build` to generate the build files.
4. Run `cmake --build ./build` to build the project.
5. Run `./build/MLP` to launch the program.

If you want to remove the build artifacts, you can run `find ./ -name "build" -type d -exec rm -rf {} +` to delete the build directory.

## Usage

### Running the Program

To run the program, please follow the instructions below:

1. Launch the program by running `./build/MLP`.
2. Click on the "Load Dataset" button to load the dataset.
3. Click on the "Train" button to train the perceptron.
4. Click on the "Test" button to test the perceptron.
5. Use the other buttons and input fields to customize the settings of the perceptron.

### Saving and Loading Weights

To save or load weights of the perceptron, please follow the instructions below (a sketch of a possible weight-file format is given after the steps):

1. Click on the "Save Weights" button to save the current weights of the perceptron to a file.
2. Click on the "Load Weights" button to load the weights of the perceptron from a file.
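The README does not specify the weight-file format. A minimal sketch of one possible plain-text serialization is shown below; the `SaveWeights`/`LoadWeights` helpers and the layout (layer count, then per-layer dimensions and values) are assumptions for illustration only.

```cpp
// Sketch of a hypothetical plain-text weight file: layer count, then for each
// layer its dimensions followed by the weight values. Not the project's actual format.
#include <cstddef>
#include <fstream>
#include <string>
#include <vector>

using Matrix = std::vector<std::vector<double>>;

bool SaveWeights(const std::string& path, const std::vector<Matrix>& layers) {
  std::ofstream out(path);
  if (!out) return false;
  out << layers.size() << '\n';
  for (const Matrix& m : layers) {
    out << m.size() << ' ' << (m.empty() ? 0 : m.front().size()) << '\n';
    for (const auto& row : m) {
      for (double w : row) out << w << ' ';
      out << '\n';
    }
  }
  return true;
}

bool LoadWeights(const std::string& path, std::vector<Matrix>& layers) {
  std::ifstream in(path);
  if (!in) return false;
  std::size_t count = 0;
  in >> count;
  layers.assign(count, Matrix{});
  for (Matrix& m : layers) {
    std::size_t rows = 0, cols = 0;
    in >> rows >> cols;
    m.assign(rows, std::vector<double>(cols));
    for (auto& row : m) {
      for (double& w : row) in >> w;
    }
  }
  return static_cast<bool>(in);
}
```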

### Drawing Images

To draw images, please follow the instructions below:

1. Click on the "Draw Image" button to open the drawing window.
2. Draw an image by clicking and dragging the mouse.
3. Click on the "Classify" button to classify the drawn image.

### Real-Time Training

To start the real-time training process, please follow the instructions below (a plotting sketch is given after the steps):

1. Click on the "Real-Time Training" button to open the training window.
2. Input the number of epochs to train for and click on the "Start" button.
3. The error control values for each training epoch will be displayed in the graph.
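Since Qt5 Charts is among the dependencies, the per-epoch error graph is presumably a line series that grows as training progresses. A minimal, self-contained sketch follows; the error values and widget setup are placeholders, not taken from the project.

```cpp
// Minimal Qt5 Charts sketch: append one point per epoch to a line series.
// In the real program the values would come from the training loop.
#include <QtCharts/QChart>
#include <QtCharts/QChartView>
#include <QtCharts/QLineSeries>
#include <QtWidgets/QApplication>

int main(int argc, char* argv[]) {
  QApplication app(argc, argv);

  auto* series = new QtCharts::QLineSeries;
  const double errors[] = {0.42, 0.31, 0.24, 0.19, 0.16};  // placeholder values
  for (int epoch = 0; epoch < 5; ++epoch) {
    series->append(epoch + 1, errors[epoch]);
  }

  auto* chart = new QtCharts::QChart;
  chart->addSeries(series);
  chart->createDefaultAxes();
  chart->setTitle("Training error per epoch");

  QtCharts::QChartView view(chart);
  view.resize(640, 480);
  view.show();
  return app.exec();
}
```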

### Cross-Validation

To run the training process using cross-validation, please follow the instructions below (a sketch of the reported metrics is given after the steps):

1. Click on the "Cross-Validation" button to open the cross-validation window.
2. Input the number of groups k to use and click on the "Start" button.
3. The average accuracy, precision, recall, f-measure, and total time spent on the experiment will be displayed on the screen.
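For reference, these metrics can be derived from a fold's predictions roughly as follows; the macro-averaging over classes is an assumption, since the README does not state how the averages are computed.

```cpp
// Sketch of the reported metrics for one validation fold; fold splitting and
// training are omitted. Macro-averaging over classes is assumed here.
#include <cstddef>
#include <vector>

struct Metrics {
  double accuracy, precision, recall, f_measure;
};

Metrics Evaluate(const std::vector<int>& truth, const std::vector<int>& predicted,
                 int num_classes) {
  std::size_t correct = 0;
  std::vector<double> tp(num_classes), fp(num_classes), fn(num_classes);
  for (std::size_t i = 0; i < truth.size(); ++i) {
    if (truth[i] == predicted[i]) {
      ++correct;
      ++tp[truth[i]];
    } else {
      ++fp[predicted[i]];
      ++fn[truth[i]];
    }
  }
  double precision = 0.0, recall = 0.0;
  for (int c = 0; c < num_classes; ++c) {
    if (tp[c] + fp[c] > 0) precision += tp[c] / (tp[c] + fp[c]);
    if (tp[c] + fn[c] > 0) recall += tp[c] / (tp[c] + fn[c]);
  }
  precision /= num_classes;
  recall /= num_classes;
  const double f = (precision + recall > 0.0)
                       ? 2.0 * precision * recall / (precision + recall)
                       : 0.0;
  return {static_cast<double>(correct) / truth.size(), precision, recall, f};
}
```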

## License

This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.

MLP

Main window view
1. Basic application settings
2. Perceptron settings area
3. Perceptron learning control area
4. Manual image input area
5. Image processing result
6. Brush settings area for manual input zone
7. Trained network statistics

All of the above-mentioned areas can be manipulated in various ways: they can be shown over the display area, opened or closed, resized, and relocated.
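A minimal sketch, assuming these movable panels are implemented as Qt dock widgets around a central display area; the actual mechanism used in the project may differ.

```cpp
// Sketch of a movable/closable panel using QDockWidget; widget contents are placeholders.
#include <QtWidgets/QApplication>
#include <QtWidgets/QDockWidget>
#include <QtWidgets/QLabel>
#include <QtWidgets/QMainWindow>

int main(int argc, char* argv[]) {
  QApplication app(argc, argv);

  QMainWindow window;
  window.setCentralWidget(new QLabel("manual input area"));  // central display area

  auto* dock = new QDockWidget("Perceptron settings", &window);
  dock->setWidget(new QLabel("settings controls"));
  dock->setAllowedAreas(Qt::LeftDockWidgetArea | Qt::RightDockWidgetArea);
  window.addDockWidget(Qt::RightDockWidgetArea, dock);  // closable, floatable, resizable

  window.show();
  return app.exec();
}
```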


Basic settings


File tab:

1. Load network edge weights from file
2. Save network edge weights to file
3. Load image to manual input area for recognition


Window tab:

1. Display brush settings area
2. Display image processing result
3. Display Perceptron settings
4. Display trained network statistics



Test MLP tab:

1. Load data for network training
2. Load data for network testing


Perceptron settings area

1. Ability to change the perceptron implementation type (matrix or graph)
2. Ability to change the number of training epochs
3. Ability to change the learning rate
4. Ability to use cross-validation
5. Ability to change the number of hidden layers of the perceptron (from 2 to 5) and the size of each layer

Note: the user cannot change the input and output layers. A rough sketch of these settings as a data structure is given below.
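The field names, defaults, and the validation helper in this sketch are assumptions, not the project's actual API.

```cpp
// Hypothetical settings bundle mirroring the options listed above.
#include <stdexcept>
#include <vector>

enum class PerceptronType { kMatrix, kGraph };

struct PerceptronSettings {
  PerceptronType type = PerceptronType::kMatrix;
  int epochs = 5;
  double learning_rate = 0.1;
  bool use_cross_validation = false;
  std::vector<int> hidden_layer_sizes = {128, 64};  // between 2 and 5 entries

  void Validate() const {
    if (hidden_layer_sizes.size() < 2 || hidden_layer_sizes.size() > 5) {
      throw std::invalid_argument("hidden layer count must be between 2 and 5");
    }
  }
};
```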



Perceptron learning control area


In this program block, the user can observe the process of training and testing the network in real time.




Manual image input area

1. While the left mouse button (LMB) is held, an image is drawn in the input area following the mouse movements.
2. When the right mouse button (RMB) is pressed, the image is completely erased (the area is filled with white). A Qt sketch of such a canvas is given below.
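In this sketch the class name, canvas size, and pen width are illustrative, not taken from this repository.

```cpp
// Sketch of a drawing canvas: left button paints, right button clears to white.
#include <QtCore/QPoint>
#include <QtGui/QImage>
#include <QtGui/QMouseEvent>
#include <QtGui/QPainter>
#include <QtGui/QPen>
#include <QtWidgets/QWidget>

class Canvas : public QWidget {
 public:
  Canvas() : image_(280, 280, QImage::Format_RGB32) { image_.fill(Qt::white); }

 protected:
  void mousePressEvent(QMouseEvent* e) override {
    if (e->button() == Qt::RightButton) {
      image_.fill(Qt::white);  // erase everything
      update();
    }
    last_ = e->pos();
  }
  void mouseMoveEvent(QMouseEvent* e) override {
    if (e->buttons() & Qt::LeftButton) {
      QPainter p(&image_);
      p.setPen(QPen(Qt::black, 16, Qt::SolidLine, Qt::RoundCap));
      p.drawLine(last_, e->pos());
      last_ = e->pos();
      update();
    }
  }
  void paintEvent(QPaintEvent*) override {
    QPainter p(this);
    p.drawImage(0, 0, image_);
  }

 private:
  QImage image_;
  QPoint last_;
};
```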


Image processing result


Displays a chart with the result of processing the image from the manual input area. The result may be ambiguous, i.e. the network may find several plausible matches; the chart shows which option it leans towards most.
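A hedged sketch of how the chart data might be derived from the network's output vector, assuming one output value per Latin letter ('A' through 'Z'); the helper name is hypothetical.

```cpp
// Turn the raw output vector into (letter, score) pairs sorted by descending score;
// the top entry is the network's strongest guess, the rest show the ambiguity.
#include <algorithm>
#include <cstddef>
#include <utility>
#include <vector>

std::vector<std::pair<char, double>> RankLetters(const std::vector<double>& output) {
  std::vector<std::pair<char, double>> ranked;
  for (std::size_t i = 0; i < output.size(); ++i) {
    ranked.emplace_back(static_cast<char>('A' + i), output[i]);
  }
  std::sort(ranked.begin(), ranked.end(),
            [](const auto& a, const auto& b) { return a.second > b.second; });
  return ranked;
}
```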




Brush settings area

1. Ability to choose the brush mode:
   - Brush - paintbrush (draws)
   - Erase - eraser (erases)
2. Ability to choose the brush width (from 1 to 100)



Statistics of the Trained Network

Shows:

  • Average accuracy
  • Precision
  • Error rate
  • Recall
  • Training time
  • Testing time
  • Error plot


Research

| Implementation    | 10 runs  | 100 runs  | 1000 runs  | Average runtime per run |
| ----------------- | -------- | --------- | ---------- | ----------------------- |
| Matrix Perceptron | 3510 sec | 35100 sec | 351000 sec | 351 sec                 |
| Graph Perceptron  | 5940 sec | 59400 sec | 594000 sec | 594 sec                 |


Program Output Examples