https://github.com/nazanin1369/miniflow
Building a Neural Network from scratch
- Host: GitHub
- URL: https://github.com/nazanin1369/miniflow
- Owner: Nazanin1369
- Created: 2017-02-28T04:41:59.000Z (about 8 years ago)
- Default Branch: master
- Last Pushed: 2017-03-04T22:57:53.000Z (about 8 years ago)
- Last Synced: 2025-01-07T20:49:44.875Z (4 months ago)
- Topics: backpropagation-learning-algorithm, neural-network
- Language: Python
- Size: 9.77 KB
- Stars: 2
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
### miniFlow
#### Building a Neural Network from scratch :tada: :cat:

Here I have built a library called MiniFlow, which will be my own version of TensorFlow!
TensorFlow is one of the most popular open source neural network libraries, built by the team at Google Brain over just the last few years.
So why build MiniFlow?
The goal of this lab is to demystify two concepts at the heart of neural networks - **backpropagation** and **differentiable graphs**.
* Backpropagation is the process by which neural networks update their weights over time.
* Differentiable graphs are graphs where the nodes are differentiable functions. They are also useful as visual aids for understanding and calculating complicated derivatives.
This is the fundamental abstraction of TensorFlow - it's a framework for creating differentiable graphs.
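To make the node-as-differentiable-function idea concrete, here is a minimal sketch of a differentiable-graph node in Python. The `Node`, `Input`, and `Multiply` classes below are illustrative, assumed for this example, and are not MiniFlow's actual API:

```python
# A node in a differentiable graph is just a function that knows how to
# compute its value (forward) and its derivatives (backward).

class Node:
    """Base class: holds input nodes, a computed value, and gradients."""
    def __init__(self, inbound_nodes=None):
        self.inbound_nodes = inbound_nodes or []
        self.value = None
        self.gradients = {}   # gradient of the output w.r.t. each input node

class Input(Node):
    """A leaf node whose value is set externally."""
    def forward(self):
        pass

class Multiply(Node):
    """f(x, y) = x * y, with hand-derived partial derivatives."""
    def forward(self):
        x, y = self.inbound_nodes
        self.value = x.value * y.value

    def backward(self, upstream=1.0):
        # chain rule: d(x*y)/dx = y and d(x*y)/dy = x
        x, y = self.inbound_nodes
        self.gradients[x] = upstream * y.value
        self.gradients[y] = upstream * x.value

x, y = Input(), Input()
x.value, y.value = 3.0, 4.0
m = Multiply([x, y])
m.forward()
m.backward()
print(m.value)          # 12.0
print(m.gradients[x])   # 4.0  (= y.value)
print(m.gradients[y])   # 3.0  (= x.value)
```

Because every node exposes the same forward/backward interface, larger graphs are differentiated by calling `backward` on each node in reverse topological order and passing gradients upstream.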
With graphs and backpropagation, we will be able to create our own nodes and properly compute the derivatives.
Even more importantly, we will be able to think and reason in terms of these graphs.
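As a sketch of what reasoning in graph terms buys us: below, a two-node graph (a linear node feeding a squared-error node) is differentiated by the chain rule, and the resulting gradient drives a few gradient-descent weight updates. The function names and constants here are illustrative assumptions, not taken from MiniFlow:

```python
# Tiny end-to-end example of backpropagation driving a weight update.
#   pred = w * x
#   loss = (pred - target) ** 2
# Backpropagation multiplies local derivatives along the graph (chain rule).

def forward(w, x, target):
    pred = w * x
    return (pred - target) ** 2

def backward(w, x, target):
    pred = w * x
    dloss_dpred = 2.0 * (pred - target)   # local derivative of the loss node
    dpred_dw = x                          # local derivative of the linear node
    return dloss_dpred * dpred_dw         # chain rule: dloss/dw

w, x, target, lr = 0.5, 2.0, 3.0, 0.1
for _ in range(25):
    w -= lr * backward(w, x, target)      # gradient-descent weight update

print(round(w, 6))   # 1.5 -- the weight converges toward target / x
```

Each iteration shrinks the error by a constant factor here, which is exactly the "update the weights over time" behavior backpropagation provides in a full network.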