
# Deep Learning: Random Explore

A set of notebooks exploring deep-learning-related topics. Short illustrative code sketches for several of the topics follow the list.

* [CNN architectures](CNN_archs/cnn_archs.ipynb): look into the structures of common CNN architectures, such as ResNet, [ResNeXt](https://arxiv.org/abs/1611.05431), [SENet](https://arxiv.org/pdf/1709.01507.pdf), [Densenet](https://arxiv.org/pdf/1608.06993.pdf), [Inception V4](https://arxiv.org/pdf/1602.07261.pdf), [WRN](https://arxiv.org/pdf/1605.07146.pdf), [Xception](https://arxiv.org/pdf/1610.02357.pdf), [Dual Path Networks](https://arxiv.org/abs/1707.01629), [NASNet](https://arxiv.org/abs/1707.07012), [Progressive Neural Architecture Search](https://arxiv.org/abs/1712.00559), [VGG](https://arxiv.org/pdf/1409.1556.pdf), etc., and how to use them in [fastai](https://docs.fast.ai/).

* [EfficientNet paper study](efficientnet/EfficientNet.ipynb): study the official implementation. Its building blocks, the mobile inverted residual (MBConv) blocks and the Squeeze-and-Excitation blocks, are studied here as well.

* [WGAN paper study](wgan/wgan.ipynb): replicate some of the results from the WGAN paper.

* [An easy way to do the backward propagation math](backward_propagation_for_all/README.md): use a simple rule to derive the backward propagation for different kinds of neural networks, such as LSTMs and CNNs.

* [Resume interrupted 1cycle policy training](divide_1cycle/README.md): split a long training run into several shorter ones and resume training where it left off.

* [How does the LSTM's memory work?](LSTM_memory_cells/README.md): dig into the LSTM's internal states to see how it manages to generate valid XML text.

* [Parameter counts for popular CNN architectures](CNN_archs_param_counts/README.md)

* _To be continued ..._
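
For the CNN-architectures notebook: a minimal sketch (plain PyTorch/torchvision, not the notebook's own code) of what "looking into the structure" amounts to, namely listing a ResNet's top-level children and cutting off its classifier head, which is roughly what fastai does when it wraps a torchvision body in a learner.

```python
import torch
import torch.nn as nn
import torchvision.models as models

# Build a ResNet-34; random weights are enough for structural inspection.
resnet = models.resnet34()

# Top-level children: conv stem, 4 residual stages, average pool, linear head.
for name, child in resnet.named_children():
    print(name, child.__class__.__name__)

# "Body" = everything up to (and excluding) the pooling + linear head.
body = nn.Sequential(*list(resnet.children())[:-2])

# Attach a custom head, e.g. for a hypothetical 10-class problem.
head = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(512, 10))
model = nn.Sequential(body, head)

x = torch.randn(2, 3, 224, 224)
print(model(x).shape)  # torch.Size([2, 10])
```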
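
The Squeeze-and-Excitation block mentioned in the EfficientNet item is compact enough to sketch from the paper alone. Below is a generic PyTorch rendition, shown only to make the "building block" concrete; the official EfficientNet code is in TensorFlow and differs in details.

```python
import torch
import torch.nn as nn

class SqueezeExcite(nn.Module):
    """Channel attention: squeeze (global average pool) then excite (gating MLP)."""
    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),                        # squeeze: B x C x 1 x 1
            nn.Conv2d(channels, channels // reduction, 1),  # bottleneck
            nn.SiLU(),                                      # EfficientNet uses swish/SiLU
            nn.Conv2d(channels // reduction, channels, 1),  # expand back to C
            nn.Sigmoid(),                                   # per-channel weights in (0, 1)
        )

    def forward(self, x):
        return x * self.gate(x)  # rescale each channel of the input

se = SqueezeExcite(32)
print(se(torch.randn(2, 32, 16, 16)).shape)  # torch.Size([2, 32, 16, 16])
```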
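
For the WGAN item, the pieces that distinguish WGAN from a standard GAN fit in a few lines. Here is a self-contained toy loop on random 2-D data (a PyTorch sketch, not the notebook's experiments): a critic without a sigmoid, a loss that is a difference of mean critic scores, several critic updates per generator update, and weight clipping after each critic step, as described in the paper.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
critic = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 1))  # no sigmoid
gen = nn.Sequential(nn.Linear(8, 64), nn.ReLU(), nn.Linear(64, 2))

opt_c = torch.optim.RMSprop(critic.parameters(), lr=5e-5)  # the paper uses RMSprop
opt_g = torch.optim.RMSprop(gen.parameters(), lr=5e-5)

for step in range(100):
    # --- critic: maximise E[f(real)] - E[f(fake)] (minimise the negative) ---
    for _ in range(5):                              # n_critic updates per generator update
        real = torch.randn(64, 2) * 0.5 + 2.0       # toy "real" distribution
        fake = gen(torch.randn(64, 8)).detach()
        loss_c = critic(fake).mean() - critic(real).mean()
        opt_c.zero_grad(); loss_c.backward(); opt_c.step()
        for p in critic.parameters():               # weight clipping (Lipschitz constraint)
            p.data.clamp_(-0.01, 0.01)

    # --- generator: maximise E[f(fake)] ---
    fake = gen(torch.randn(64, 8))
    loss_g = -critic(fake).mean()
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()

print(float(loss_c), float(loss_g))
```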
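
The backward-propagation write-up derives gradients with its own simple rule, which is not reproduced here. As a generic illustration of the same kind of hand derivation, the NumPy sketch below computes the chain-rule gradient of a squared-error loss through one dense layer and verifies it against a numerical gradient.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 3))   # batch of 5 inputs
W = rng.normal(size=(3, 4))
b = rng.normal(size=(4,))
t = rng.normal(size=(5, 4))   # targets

def loss(W):
    y = x @ W + b             # forward pass through a dense layer
    return 0.5 * np.sum((y - t) ** 2)

# Hand-derived backward pass: dL/dy = y - t, dL/dW = x^T @ dL/dy
y = x @ W + b
dW_manual = x.T @ (y - t)

# Central-difference numerical gradient for comparison
dW_numeric = np.zeros_like(W)
eps = 1e-6
for i in range(W.shape[0]):
    for j in range(W.shape[1]):
        Wp, Wm = W.copy(), W.copy()
        Wp[i, j] += eps
        Wm[i, j] -= eps
        dW_numeric[i, j] = (loss(Wp) - loss(Wm)) / (2 * eps)

print(np.max(np.abs(dW_manual - dW_numeric)))  # tiny: the derivation checks out
```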
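
The resumable-1cycle notebook is built on fastai's `fit_one_cycle`; the sketch below shows the same idea in plain PyTorch (an assumption, not the notebook's method): run part of a `OneCycleLR` schedule, checkpoint the model, optimizer and scheduler state, and restore all three to finish the same cycle later.

```python
import torch
import torch.nn as nn

def make_training_objects():
    """Recreate the exact same model / optimizer / scheduler on every (re)start."""
    model = nn.Linear(10, 1)
    opt = torch.optim.SGD(model.parameters(), lr=0.01)
    sched = torch.optim.lr_scheduler.OneCycleLR(opt, max_lr=0.1, total_steps=1000)
    return model, opt, sched

def run_steps(model, opt, sched, n_steps):
    for _ in range(n_steps):
        loss = model(torch.randn(32, 10)).pow(2).mean()  # dummy objective
        opt.zero_grad(); loss.backward(); opt.step(); sched.step()

# First session: 400 of the 1000 steps, then checkpoint all stateful objects.
model, opt, sched = make_training_objects()
run_steps(model, opt, sched, 400)
torch.save({"model": model.state_dict(), "opt": opt.state_dict(),
            "sched": sched.state_dict()}, "one_cycle_ckpt.pt")

# Second session: rebuild, restore, and finish the same cycle.
model, opt, sched = make_training_objects()
ckpt = torch.load("one_cycle_ckpt.pt")
model.load_state_dict(ckpt["model"])
opt.load_state_dict(ckpt["opt"])
sched.load_state_dict(ckpt["sched"])
run_steps(model, opt, sched, 600)
print("last lr:", sched.get_last_lr())
```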
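
The LSTM-memory notebook digs into the internal states of a trained language model. As a minimal orientation (a plain, untrained `nn.LSTM`, not the notebook's fastai model), the sketch below shows where the hidden and cell states live in PyTorch and how to record the cell state at every time step.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
lstm = nn.LSTM(input_size=8, hidden_size=16, batch_first=True)
x = torch.randn(1, 20, 8)               # batch of 1, 20 time steps

# One call returns the outputs for all steps plus the final hidden/cell state.
out, (h_n, c_n) = lstm(x)
print(out.shape, h_n.shape, c_n.shape)  # (1, 20, 16) (1, 1, 16) (1, 1, 16)

# To watch the memory evolve, step through the sequence and keep every cell state.
h = c = torch.zeros(1, 1, 16)
cell_states = []
for t in range(x.size(1)):
    _, (h, c) = lstm(x[:, t:t + 1, :], (h, c))
    cell_states.append(c.squeeze().detach())
cell_states = torch.stack(cell_states)  # (20, 16): one row per time step
print(cell_states.shape)
```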
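
Counting parameters, as in the last item, is a one-liner over `model.parameters()`. The sketch below applies it to a few torchvision models; the linked table was produced from the notebook's own model list, which may differ.

```python
import torchvision.models as models

def count_params(model):
    """Total number of trainable parameters."""
    return sum(p.numel() for p in model.parameters() if p.requires_grad)

for name, ctor in [("resnet34", models.resnet34),
                   ("densenet121", models.densenet121),
                   ("vgg16", models.vgg16)]:
    print(f"{name}: {count_params(ctor()) / 1e6:.1f}M parameters")
```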