https://github.com/emilwallner/deep-learning-from-scratch
Six snippets of code that made deep learning what it is today.
- Host: GitHub
- URL: https://github.com/emilwallner/deep-learning-from-scratch
- Owner: emilwallner
- License: MIT
- Created: 2017-08-16T07:17:27.000Z (almost 8 years ago)
- Default Branch: master
- Last Pushed: 2019-10-10T17:13:10.000Z (over 5 years ago)
- Last Synced: 2025-04-18T07:50:12.573Z (2 months ago)
- Topics: backpropagation, deep-learning, gradient-descent, least-squares, linear-regression, mnist, perceptron
- Language: Jupyter Notebook
- Size: 10.9 MB
- Stars: 261
- Watchers: 13
- Forks: 57
- Open Issues: 3
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# Deep Learning From Scratch
There are six snippets of code that made deep learning what it is today. [Coding the History of Deep Learning](https://medium.com/@emilwallner/the-history-of-deep-learning-explored-through-6-code-snippets-d0a0e8545202) covers the inventors and the background to their breakthroughs. In this repo, you can find all the code samples from the story. The list below summarizes the six ideas; a minimal sketch of each follows the list.
- **The Method of Least Squares**: The first cost function
- **Gradient Descent**: Finding the minimum of the cost function
- **Linear Regression**: Using gradient descent to automatically minimize the least-squares cost
- **The Perceptron**: Using a linear-regression-style equation to mimic a neuron
- **Artificial Neural Networks**: Leveraging backpropagation to solve non-linear problems
- **Deep Neural Networks**: Neural networks with more than one hidden layer
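
A minimal sketch of the least-squares idea, assuming a line `y = m*x + b`; the data and variable names here are illustrative, not taken from the repo's notebooks:

```python
import numpy as np

def least_squares_cost(m, b, x, y):
    """Sum of squared errors between the line m*x + b and the targets y."""
    return np.sum((m * x + b - y) ** 2)

# Toy data that roughly follows y = 2x
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 4.2, 5.9, 8.1])

print(least_squares_cost(2.0, 0.0, x, y))  # small: the line fits well
print(least_squares_cost(0.0, 0.0, x, y))  # large: a flat line fits badly
```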
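
Gradient descent in one dimension, sketched on `f(theta) = theta**2`, whose derivative is `2*theta`; the learning rate and step count are illustrative choices:

```python
theta = 5.0           # arbitrary starting point
learning_rate = 0.1

for _ in range(100):
    gradient = 2 * theta                # derivative of theta**2 at theta
    theta -= learning_rate * gradient   # step downhill along the slope

print(theta)  # approaches 0.0, the minimum of theta**2
```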
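
Combining the two, gradient descent on the least-squares cost fits the line automatically; again, the data and hyperparameters are illustrative:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 4.2, 5.9, 8.1])

m, b, lr, n = 0.0, 0.0, 0.01, len(x)
for _ in range(5000):
    error = m * x + b - y
    # Partial derivatives of the mean squared error w.r.t. m and b
    m -= lr * (2.0 / n) * np.sum(error * x)
    b -= lr * (2.0 / n) * np.sum(error)

print(m, b)  # converges toward slope ~2 and intercept ~0
```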
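
Rosenblatt's perceptron update rule, sketched on the OR function (which, unlike XOR, is linearly separable and therefore learnable by a single neuron):

```python
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 1, 1, 1])  # OR

w, b, lr = np.zeros(2), 0.0, 0.1
for _ in range(20):
    for xi, target in zip(X, y):
        output = 1 if xi @ w + b > 0 else 0   # step activation
        # On a mistake, nudge the weights toward the target
        w += lr * (target - output) * xi
        b += lr * (target - output)

print([1 if xi @ w + b > 0 else 0 for xi in X])  # [0, 1, 1, 1]
```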
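
Backpropagation lets a network with a hidden layer solve non-linear problems like XOR, where the perceptron fails. A compact sketch with sigmoid activations; the layer sizes, seed, learning rate, and iteration count are illustrative, and convergence varies with the random initialization:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)  # XOR: not linearly separable

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)

lr = 1.0
for _ in range(10000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: chain rule through each sigmoid layer
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Gradient-descent updates
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print(out.round())  # typically [[0], [1], [1], [0]]
```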
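
A deep neural network is the same construction with more than one hidden layer. A forward-pass sketch sized for a flattened 28×28 MNIST image (MNIST is one of the repo's topics); the layer widths and the ReLU/softmax choices are illustrative:

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

rng = np.random.default_rng(1)
x = rng.normal(size=(1, 784))            # stand-in for a flattened 28x28 image
W1 = rng.normal(size=(784, 128)) * 0.01  # first hidden layer
W2 = rng.normal(size=(128, 64)) * 0.01   # second hidden layer makes it "deep"
W3 = rng.normal(size=(64, 10)) * 0.01    # output layer: ten digit classes

h1 = relu(x @ W1)
h2 = relu(h1 @ W2)
logits = h2 @ W3
probs = np.exp(logits) / np.exp(logits).sum()  # softmax over the ten classes

print(probs.shape, probs.sum())  # (1, 10) ~1.0
```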