Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/atharva309/functionapproximation_backpropagation
function approximations (sine, cos, log, user defined function) using back propagation
- Host: GitHub
- URL: https://github.com/atharva309/functionapproximation_backpropagation
- Owner: Atharva309
- Created: 2023-05-05T20:05:43.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2024-08-16T22:01:45.000Z (5 months ago)
- Last Synced: 2024-08-16T23:20:38.810Z (5 months ago)
- Topics: back-propagation
- Language: Jupyter Notebook
- Homepage:
- Size: 138 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
Awesome Lists containing this project
README
# Function Approximation Using Backpropagation
This project uses backpropagation to approximate mathematical functions such as sine, cosine, logarithm, and user-defined functions. The goal is to train neural networks to learn these functions accurately through iterative weight updates.
## What is Backpropagation?
Backpropagation is a key algorithm used for training artificial neural networks. It calculates the gradient of the loss function with respect to each weight in the network, allowing the model to update weights efficiently. The process involves two main phases:
1. **Forward Pass**: The input is propagated through the network to produce an output.
2. **Backward Pass**: The error between the predicted output and the target output is propagated back through the network, and the weights are adjusted using gradient descent.

This iterative process continues until the model accurately approximates the target function.
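The two phases above can be sketched in a minimal, self-contained example. This is not the repository's code: the network shape (1-16-1), tanh activation, learning rate, and epoch count are illustrative assumptions, shown here approximating sine with hand-written backpropagation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Training data: x in [-pi, pi] as a column vector, target y = sin(x)
x = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
y = np.sin(x)

# Random initialization of a 1-16-1 network (sizes are an assumption)
W1 = rng.normal(0.0, 0.5, (1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.5, (16, 1)); b2 = np.zeros(1)

lr = 0.1
for epoch in range(5000):
    # Forward pass: input -> tanh hidden layer -> linear output
    h = np.tanh(x @ W1 + b1)
    pred = h @ W2 + b2
    err = pred - y                    # gradient of 0.5 * MSE w.r.t. pred

    # Backward pass: chain rule gives the gradient for each weight
    dW2 = h.T @ err / len(x)
    db2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1.0 - h**2)  # tanh'(z) = 1 - tanh(z)^2
    dW1 = x.T @ dh / len(x)
    db1 = dh.mean(axis=0)

    # Gradient-descent update
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

mse = float(np.mean((np.tanh(x @ W1 + b1) @ W2 + b2 - y) ** 2))
print(f"final MSE: {mse:.4f}")
```

After training, the mean squared error should be well below the ~0.5 you would get from predicting zero everywhere, showing that the weight updates have moved the network toward the target function.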
## Functions Covered
- **Sine Function**
- **Cosine Function**
- **Logarithmic Function**
- **User-Defined Functions**

## How to Run
1. Clone the repository and set up the environment.
2. Define the function you want to approximate or select from the existing functions.
3. Run the training script to see how the model approximates the function.

## Conclusion
This project demonstrates how neural networks can be effectively used for function approximation using the backpropagation algorithm, highlighting its versatility in solving complex mathematical problems.