https://github.com/ilikepizza2/perceptron
A neat, lightweight, single-neuron perceptron written in C++ from scratch, without any external libraries, trained using the perceptron trick and a loss function
deep-learning loss-function perceptron perceptron-trick
- Host: GitHub
- URL: https://github.com/ilikepizza2/perceptron
- Owner: Ilikepizza2
- Created: 2023-12-19T11:07:00.000Z (about 2 years ago)
- Default Branch: main
- Last Pushed: 2023-12-20T11:05:46.000Z (about 2 years ago)
- Last Synced: 2023-12-20T17:03:06.254Z (about 2 years ago)
- Topics: deep-learning, loss-function, perceptron, perceptron-trick
- Language: C++
- Size: 26.4 KB
- Stars: 1
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
# Perceptron Implementation in C++
This C++ program demonstrates a basic implementation of a perceptron trained using the loss function and the perceptron trick. The perceptron is a fundamental building block of neural networks and is capable of binary classification.
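At its core, a single neuron computes a weighted sum of its inputs plus a bias and applies a step function to produce a binary output. A minimal sketch of that idea (the function and variable names here are illustrative, not necessarily those used in `perceptron.cpp`):

```cpp
#include <numeric> // std::inner_product
#include <vector>

// Step activation: output 1 if the weighted sum (plus bias) is
// non-negative, otherwise 0.
int classify(const std::vector<double>& weights, double bias,
             const std::vector<double>& inputs) {
    // bias + sum over i of weights[i] * inputs[i]
    double z = std::inner_product(weights.begin(), weights.end(),
                                  inputs.begin(), bias);
    return z >= 0.0 ? 1 : 0;
}
```

For example, with weights `{1.0, 1.0}` and bias `-1.5`, this neuron behaves like a logical AND over binary inputs.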
## Getting Started
### Prerequisites
- C++ Compiler
### Usage
1. Clone the repository:
```bash
git clone https://github.com/ilikepizza2/perceptron.git
```
2. Compile the C++ program:
```bash
g++ perceptron.cpp -o perceptron
```
3. Run the executable:
```bash
./perceptron
```
### Input Data
The program expects two CSV files:
1. `train.csv` - Training data containing input features and corresponding binary labels.
2. `test.csv` - Testing data for evaluating the trained perceptron.
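Parsing a row of such a CSV file typically means splitting the line on commas and converting each field to a double, as the repo's `split_nums` helper suggests. A possible sketch (the exact signature in `perceptron.cpp` may differ):

```cpp
#include <sstream>
#include <string>
#include <vector>

// Split a comma-separated line into a vector of doubles,
// e.g. "1.5,2,-3" -> {1.5, 2.0, -3.0}.
std::vector<double> split_nums(const std::string& line) {
    std::vector<double> values;
    std::stringstream ss(line);
    std::string cell;
    while (std::getline(ss, cell, ',')) {
        values.push_back(std::stod(cell));
    }
    return values;
}
```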
## Code Overview
- `split_nums`: Function to split a string into a vector of doubles.
- `printCols`: Function to print input features and labels.
- `sum`: Function to calculate the sum of two vectors, including a bias term.
- `classifier`: Function to classify input using the trained perceptron.
The program performs training using the perceptron trick and then tests the perceptron on a separate dataset.
## Parameters
- Learning Rate: 0.1
- Epochs: 1000
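The perceptron trick updates the weights only on misclassified samples, nudging the decision boundary toward points labeled 1 and away from points labeled 0. A sketch of such a training loop using the hyperparameters above (this is an illustration of the technique, not the exact code in `perceptron.cpp`):

```cpp
#include <vector>

// Train weights w and bias b in place with the perceptron trick.
// Defaults mirror the README: learning rate 0.1, 1000 epochs.
void train(std::vector<double>& w, double& b,
           const std::vector<std::vector<double>>& X,
           const std::vector<int>& y,
           double lr = 0.1, int epochs = 1000) {
    for (int e = 0; e < epochs; ++e) {
        for (std::size_t i = 0; i < X.size(); ++i) {
            // Forward pass: weighted sum plus bias, then step function.
            double z = b;
            for (std::size_t j = 0; j < w.size(); ++j) z += w[j] * X[i][j];
            int pred = z >= 0.0 ? 1 : 0;
            int err = y[i] - pred; // +1, 0, or -1
            if (err != 0) {
                // Misclassified: move the boundary by lr in the
                // direction of the error.
                for (std::size_t j = 0; j < w.size(); ++j)
                    w[j] += lr * err * X[i][j];
                b += lr * err;
            }
        }
    }
}
```

On linearly separable data (such as the AND function), this loop is guaranteed to converge; on non-separable data it will keep oscillating, which is one reason the README warns that outputs can vary.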
## Considerations
- This is a very basic single-neuron perceptron initialized with random weights, and the sample data is small (~100 rows), so outputs may vary considerably between runs. It is intended for learning purposes only.
## Contributing
Feel free to contribute by opening issues or submitting pull requests.
## License
This project is licensed under the MIT License - see the [LICENSE.md](LICENSE.md) file for details.
## Acknowledgments
- Inspired by the concept of perceptrons and neural networks.