Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/morvanzhou/pytorch-tutorial
Build your neural network easily and quickly. Chinese-language tutorials by 莫烦Python (Morvan Python).
- Host: GitHub
- URL: https://github.com/morvanzhou/pytorch-tutorial
- Owner: MorvanZhou
- License: MIT
- Created: 2017-05-05T15:12:04.000Z (over 7 years ago)
- Default Branch: master
- Last Pushed: 2023-03-23T05:01:42.000Z (almost 2 years ago)
- Last Synced: 2025-01-08T18:06:35.302Z (13 days ago)
- Topics: autoencoder, batch, batch-normalization, classification, cnn, dqn, dropout, gan, generative-adversarial-network, machine-learning, neural-network, python, pytorch, pytorch-tutorial, pytorch-tutorials, regression, reinforcement-learning, rnn, tutorial
- Language: Jupyter Notebook
- Homepage: https://mofanpy.com/tutorials/machine-learning/torch/
- Size: 14.7 MB
- Stars: 8,196
- Watchers: 214
- Forks: 3,112
- Open Issues: 30
Metadata Files:
- Readme: README.md
Awesome Lists containing this project
README
### If you'd like to use **Tensorflow**, no worries, I made a new **Tensorflow Tutorial** just like PyTorch. Here is the link: [https://github.com/MorvanZhou/Tensorflow-Tutorial](https://github.com/MorvanZhou/Tensorflow-Tutorial)
# pyTorch Tutorials
In these tutorials for pyTorch, we will build our first Neural Network and try to build some advanced Neural Network architectures developed in recent years.
Thanks to [liufuyang's](https://github.com/liufuyang) [**notebook files**](tutorial-contents-notebooks),
which are a great contribution to this tutorial.

* pyTorch basic
* [torch and numpy](tutorial-contents/201_torch_numpy.py)
* [Variable](tutorial-contents/202_variable.py)
* [Activation](tutorial-contents/203_activation.py)
* Build your first network
* [Regression](tutorial-contents/301_regression.py)
* [Classification](tutorial-contents/302_classification.py)
* [An easy way](tutorial-contents/303_build_nn_quickly.py)
* [Save and reload](tutorial-contents/304_save_reload.py)
* [Train on batch](tutorial-contents/305_batch_train.py)
* [Optimizers](tutorial-contents/306_optimizer.py)
* Advanced neural network
* [CNN](tutorial-contents/401_CNN.py)
* [RNN-Classification](tutorial-contents/402_RNN_classifier.py)
* [RNN-Regression](tutorial-contents/403_RNN_regressor.py)
* [AutoEncoder](tutorial-contents/404_autoencoder.py)
* [DQN Reinforcement Learning](tutorial-contents/405_DQN_Reinforcement_learning.py)
* [A3C Reinforcement Learning](https://github.com/MorvanZhou/pytorch-A3C)
* [GAN (Generative Adversarial Nets)](tutorial-contents/406_GAN.py) / [Conditional GAN](tutorial-contents/406_conditional_GAN.py)
* Others (WIP)
* [Why torch dynamic](tutorial-contents/501_why_torch_dynamic_graph.py)
* [Train on GPU](tutorial-contents/502_GPU.py)
* [Dropout](tutorial-contents/503_dropout.py)
* [Batch Normalization](tutorial-contents/504_batch_normalization.py)

**For Chinese speakers: all methods mentioned below have video and text tutorials in Chinese.
Visit [莫烦 Python](https://mofanpy.com/tutorials/) for more.
You can watch my [YouTube channel](https://www.youtube.com/channel/UCdyjiB5H8Pu7aDTNVXTTpcg) as well.**

### [Regression](tutorial-contents/301_regression.py)
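For context, here is a minimal regression sketch in the spirit of the linked script: a small fully connected net fit to noisy 1-D data with MSE loss. The toy data, layer sizes, learning rate, and step count are illustrative choices, not the repo's exact values.

```python
import torch

# Toy data: y = x^2 plus noise, shaped [100, 1]
x = torch.unsqueeze(torch.linspace(-1, 1, 100), dim=1)
y = x.pow(2) + 0.2 * torch.rand(x.size())

# A small fully connected network for 1-D regression
net = torch.nn.Sequential(
    torch.nn.Linear(1, 10),
    torch.nn.ReLU(),
    torch.nn.Linear(10, 1),
)
optimizer = torch.optim.SGD(net.parameters(), lr=0.2)
loss_func = torch.nn.MSELoss()

for step in range(200):
    prediction = net(x)            # forward pass
    loss = loss_func(prediction, y)
    optimizer.zero_grad()          # clear gradients from the previous step
    loss.backward()                # backpropagate
    optimizer.step()               # update weights
```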
### [Classification](tutorial-contents/302_classification.py)
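A minimal classification sketch along the same lines, assuming two toy 2-D clusters and cross-entropy loss; the data and hyperparameters below are made up for illustration and are not the repo's exact script.

```python
import torch

# Toy data: two 2-D Gaussian blobs with integer labels 0 and 1
n = 100
data0 = torch.normal(2 * torch.ones(n, 2), 1.0)   # class 0 around (2, 2)
data1 = torch.normal(-2 * torch.ones(n, 2), 1.0)  # class 1 around (-2, -2)
x = torch.cat((data0, data1), dim=0)
y = torch.cat((torch.zeros(n), torch.ones(n))).long()

net = torch.nn.Sequential(
    torch.nn.Linear(2, 10),
    torch.nn.ReLU(),
    torch.nn.Linear(10, 2),        # two output logits, one per class
)
optimizer = torch.optim.SGD(net.parameters(), lr=0.02)
loss_func = torch.nn.CrossEntropyLoss()  # expects raw logits and integer labels

for step in range(100):
    out = net(x)
    loss = loss_func(out, y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```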
### [CNN](tutorial-contents/401_CNN.py)
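A compact ConvNet sketch for 28x28 grayscale inputs (MNIST-sized), showing the conv / pool / flatten / linear pattern; the layer sizes here are illustrative and the tutorial script itself may differ.

```python
import torch
import torch.nn as nn

# A small ConvNet for 28x28 grayscale images with 10 output classes
class CNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, padding=2),   # -> [16, 28, 28]
            nn.ReLU(),
            nn.MaxPool2d(2),                              # -> [16, 14, 14]
            nn.Conv2d(16, 32, kernel_size=5, padding=2),  # -> [32, 14, 14]
            nn.ReLU(),
            nn.MaxPool2d(2),                              # -> [32, 7, 7]
        )
        self.out = nn.Linear(32 * 7 * 7, 10)

    def forward(self, x):
        x = self.conv(x)
        x = x.view(x.size(0), -1)    # flatten to [batch, 32*7*7]
        return self.out(x)

cnn = CNN()
logits = cnn(torch.randn(8, 1, 28, 28))  # dummy batch of 8 images
print(logits.shape)                      # torch.Size([8, 10])
```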
### [RNN](tutorial-contents/403_RNN_regressor.py)
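A small recurrent-regression sketch, assuming the task of predicting a cosine wave from a sine wave with `nn.RNN`; the hidden size, optimizer, and sequence length are illustrative assumptions.

```python
import torch
import torch.nn as nn

# Sequence-to-sequence regression: predict cos(t) from sin(t)
rnn = nn.RNN(input_size=1, hidden_size=32, batch_first=True)
out_layer = nn.Linear(32, 1)

t = torch.linspace(0, 2 * 3.14159, steps=50)
x = torch.sin(t).view(1, 50, 1)   # [batch, time, feature]
y = torch.cos(t).view(1, 50, 1)

params = list(rnn.parameters()) + list(out_layer.parameters())
optimizer = torch.optim.Adam(params, lr=0.02)
loss_func = nn.MSELoss()

for step in range(100):
    r_out, h = rnn(x)              # r_out: [1, 50, 32]
    prediction = out_layer(r_out)  # map each hidden state to one value
    loss = loss_func(prediction, y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```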
### [Autoencoder](tutorial-contents/404_autoencoder.py)
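A tiny fully connected autoencoder sketch for flattened 28x28 inputs, compressing to a 3-D code and reconstructing with MSE loss; the sizes are illustrative, not the repo's exact architecture.

```python
import torch
import torch.nn as nn

# Encoder compresses 784 pixels to a 3-D code; decoder reconstructs them
class AutoEncoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(28 * 28, 128), nn.Tanh(),
            nn.Linear(128, 3),                       # low-D code, easy to plot
        )
        self.decoder = nn.Sequential(
            nn.Linear(3, 128), nn.Tanh(),
            nn.Linear(128, 28 * 28), nn.Sigmoid(),   # pixel values in [0, 1]
        )

    def forward(self, x):
        code = self.encoder(x)
        return self.decoder(code), code

ae = AutoEncoder()
x = torch.rand(16, 28 * 28)                  # dummy batch of flattened images
optimizer = torch.optim.Adam(ae.parameters(), lr=0.005)
recon, _ = ae(x)
loss = nn.MSELoss()(recon, x)                # reconstruct the input itself
loss.backward()
optimizer.step()
```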
### [GAN (Generative Adversarial Nets)](tutorial-contents/406_GAN.py)
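A bare-bones GAN training-loop sketch: a generator maps noise to short fake "curves" and a discriminator scores real versus fake. All shapes, learning rates, and the toy data are illustrative assumptions, not the linked script.

```python
import torch
import torch.nn as nn

# G: noise (8-D) -> 5-point curve; D: 5-point curve -> probability of "real"
G = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 5))
D = nn.Sequential(nn.Linear(5, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid())
opt_G = torch.optim.Adam(G.parameters(), lr=0.001)
opt_D = torch.optim.Adam(D.parameters(), lr=0.001)

for step in range(200):
    real = torch.sin(torch.linspace(0, 3, 5)) + 0.1 * torch.randn(64, 5)
    fake = G(torch.randn(64, 8))

    # Discriminator step: push D(real) up and D(fake) down
    d_loss = -torch.mean(torch.log(D(real)) + torch.log(1.0 - D(fake.detach())))
    opt_D.zero_grad()
    d_loss.backward()
    opt_D.step()

    # Generator step: try to fool the discriminator
    g_loss = -torch.mean(torch.log(D(fake)))
    opt_G.zero_grad()
    g_loss.backward()
    opt_G.step()
```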
### [Dropout](tutorial-contents/503_dropout.py)
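A quick sketch of using `nn.Dropout` and toggling it with `train()`/`eval()`; the network below is a stand-in for illustration, not the tutorial's exact model.

```python
import torch
import torch.nn as nn

# Dropout randomly zeroes hidden activations during training to reduce overfitting
net_dropped = nn.Sequential(
    nn.Linear(1, 300),
    nn.Dropout(0.5),   # drop 50% of the hidden units on each forward pass
    nn.ReLU(),
    nn.Linear(300, 300),
    nn.Dropout(0.5),
    nn.ReLU(),
    nn.Linear(300, 1),
)

x = torch.randn(10, 1)

net_dropped.train()              # dropout active while training
print(net_dropped(x).shape)

net_dropped.eval()               # dropout disabled for evaluation
with torch.no_grad():
    print(net_dropped(x).shape)
```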
### [Batch Normalization](tutorial-contents/504_batch_normalization.py)
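A short sketch of inserting `nn.BatchNorm1d` between linear layers and of the training-versus-evaluation distinction; the layer sizes are illustrative assumptions.

```python
import torch
import torch.nn as nn

# BatchNorm1d normalizes each feature over the batch dimension
net = nn.Sequential(
    nn.Linear(1, 10),
    nn.BatchNorm1d(10),   # placed between the linear layer and the activation
    nn.Tanh(),
    nn.Linear(10, 10),
    nn.BatchNorm1d(10),
    nn.Tanh(),
    nn.Linear(10, 1),
)

x = torch.randn(32, 1)    # training mode needs batch statistics, so batch size > 1
net.train()
print(net(x).shape)       # torch.Size([32, 1])

net.eval()                # evaluation mode uses the stored running statistics
print(net(x[:1]).shape)   # single samples are fine in eval mode
```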
# Donation
*If this does help you, please consider donating to support me for better tutorials. Any contribution is greatly appreciated!*