https://github.com/emilwallner/Deep-Learning-From-Scratch

Six snippets of code that made deep learning what it is today.
- Host: GitHub
- URL: https://github.com/emilwallner/Deep-Learning-From-Scratch
- Owner: emilwallner
- License: MIT
- Created: 2017-08-16T07:17:27.000Z (over 8 years ago)
- Default Branch: master
- Last Pushed: 2019-10-10T17:13:10.000Z (about 6 years ago)
- Last Synced: 2025-03-31T12:39:08.234Z (9 months ago)
- Topics: backpropagation, deep-learning, gradient-descent, least-squares, linear-regression, mnist, perceptron
- Language: Jupyter Notebook
- Size: 10.9 MB
- Stars: 261
- Watchers: 13
- Forks: 57
- Open Issues: 3
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# Deep Learning From Scratch
There are six snippets of code that made deep learning what it is today. [Coding the History of Deep Learning](https://medium.com/@emilwallner/the-history-of-deep-learning-explored-through-6-code-snippets-d0a0e8545202) covers the inventors and the background to their breakthroughs. In this repo, you can find all the code samples from the story. Minimal illustrative sketches of each idea follow the list below.
- **The Method of Least Squares**: The first cost function
- **Gradient Descent**: Finding the minimum of the cost function
- **Linear Regression**: Automatically decreasing the cost function
- **The Perceptron**: Using a linear-regression-style equation to mimic a neuron
- **Artificial Neural Networks**: Leveraging backpropagation to solve non-linear problems
- **Deep Neural Networks**: Neural networks with more than one hidden layer
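
As a quick orientation (these sketches are not the repo's notebooks, and the toy data, learning rates, and seeds are illustrative assumptions), the first three ideas combine into a few lines of NumPy: a least-squares cost, minimized by gradient descent, is linear regression in its simplest form.

```python
import numpy as np

# Toy data: points scattered around the line y = 2x + 1 (illustrative values)
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

m, b = 0.0, 0.0       # slope and intercept, initialized at zero
lr = 0.01             # learning rate (an assumed value)

for _ in range(5000):
    error = (m * x + b) - y
    cost = (error ** 2).mean()          # the method of least squares: mean squared error
    # Gradient descent: step each parameter against its cost gradient
    m -= lr * 2 * (error * x).mean()
    b -= lr * 2 * error.mean()

print(f"fitted line: y = {m:.2f}x + {b:.2f}  (cost {cost:.4f})")
```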
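
The perceptron keeps the weighted-sum form of linear regression but thresholds the result to mimic a neuron firing. Here is a minimal sketch of the perceptron learning rule, trained on the logical OR function (the gate and learning rate are assumptions chosen for illustration):

```python
import numpy as np

# OR gate: a linearly separable toy problem (illustrative choice)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
targets = np.array([0, 1, 1, 1])

weights, bias, lr = np.zeros(2), 0.0, 0.1

for _ in range(10):                      # a few passes over the data suffice here
    for inputs, target in zip(X, targets):
        # Weighted sum pushed through a step function: the artificial neuron
        output = 1 if inputs @ weights + bias > 0 else 0
        # Perceptron learning rule: nudge the weights toward the correct answer
        weights += lr * (target - output) * inputs
        bias += lr * (target - output)

print(weights, bias)
```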
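
A lone perceptron cannot solve non-linear problems such as XOR; a hidden layer trained with backpropagation can. Below is a minimal sketch of a one-hidden-layer network on XOR (layer size, seed, and learning rate are assumed); stacking further hidden layers in the same fashion is what makes a network deep.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR: the classic problem a single perceptron cannot solve
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer of 4 units; more hidden layers would make this a deep network
W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)
lr = 0.5

for _ in range(10000):
    # Forward pass
    hidden = sigmoid(X @ W1 + b1)
    output = sigmoid(hidden @ W2 + b2)
    # Backpropagation: the chain rule applied layer by layer
    output_delta = (output - y) * output * (1 - output)
    hidden_delta = (output_delta @ W2.T) * hidden * (1 - hidden)
    # Gradient descent step on every weight and bias
    W2 -= lr * hidden.T @ output_delta
    b2 -= lr * output_delta.sum(axis=0)
    W1 -= lr * X.T @ hidden_delta
    b1 -= lr * hidden_delta.sum(axis=0)

print(output.round(2).ravel())  # typically converges toward [0, 1, 1, 0]
```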