Tutorial for International Summer School on Deep Learning, 2019
https://github.com/dpressel/dliss-tutorial
- Host: GitHub
- URL: https://github.com/dpressel/dliss-tutorial
- Owner: dpressel
- Created: 2019-06-26T02:59:24.000Z (over 5 years ago)
- Default Branch: master
- Last Pushed: 2022-04-04T15:09:40.000Z (over 2 years ago)
- Last Synced: 2024-12-20T00:07:35.296Z (10 days ago)
- Topics: deep-learning, machine-learning, nlp
- Language: Jupyter Notebook
- Homepage: none
- Size: 1.14 MB
- Stars: 317
- Watchers: 15
- Forks: 57
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# dliss-tutorial
Tutorial for [International Summer School on Deep Learning, 2019](http://dl-lab.eu/) in Gdansk, Poland

## Sections
### Overview Talk
https://docs.google.com/presentation/d/1DJI1yX4U5IgApGwavt0AmOCLWwso7ou1Un93sMuAWmA/
### Tutorial
There are currently 3 hands-on sections to this tutorial; a minimal sketch of the first topic follows the list.

- The [first section](1_pretrained_vectors.ipynb) covers pre-trained word embeddings [(colab)](https://colab.research.google.com/github/dpressel/dlss-tutorial/blob/master/1_pretrained_vectors.ipynb)
- The [second section](2_context_vectors.ipynb) covers pre-trained contextual embeddings [(colab)](https://colab.research.google.com/github/dpressel/dlss-tutorial/blob/master/2_context_vectors.ipynb)
- The [third section](3_finetuning.ipynb) covers fine-tuning a pre-trained model [(colab)](https://colab.research.google.com/github/dpressel/dlss-tutorial/blob/master/3_finetuning.ipynb)
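To give a flavor of what the first section works with, here is a minimal sketch of loading pre-trained word vectors and querying nearest neighbors by cosine similarity. This is not the notebook's actual code; the file name `glove.6B.100d.txt` and the query word are placeholder assumptions, and any word2vec/GloVe-style text file of `word v1 ... vN` lines would work.

```python
# Minimal sketch (not the notebook's code): load pre-trained word vectors
# from a plain-text file and find nearest neighbors by cosine similarity.
import numpy as np

def load_vectors(path):
    vectors = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            word, *values = line.rstrip().split(" ")
            vectors[word] = np.asarray(values, dtype=np.float32)
    return vectors

def nearest(vectors, word, k=5):
    # Normalize the query once, then score every other word by cosine.
    q = vectors[word]
    q = q / np.linalg.norm(q)
    scores = {
        w: float(v @ q) / float(np.linalg.norm(v))
        for w, v in vectors.items() if w != word
    }
    return sorted(scores, key=scores.get, reverse=True)[:k]

# "glove.6B.100d.txt" is a hypothetical local path to downloaded GloVe vectors.
vectors = load_vectors("glove.6B.100d.txt")
print(nearest(vectors, "summer"))
```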
### Updates

- *April 2022* If you are interested in learning how to build different Transformer architectures from the ground up, I have a [new set of tutorials](https://github.com/dpressel/tfs) with in-depth details and full implementations of several popular Transformer models. They show how to build the models step by step, how to pretrain them, and how to use them for downstream tasks. There is an accompanying Python package that contains all of the tutorial pieces put together.
- *July 2020* I have posted a set of [Colab tutorials](https://github.com/dpressel/mead-tutorials) using [MEAD](https://github.com/dpressel/mead-baseline), which is referenced in these tutorials. The new notebooks cover similar material, including transfer learning for classification and tagging (a generic sketch of that recipe appears below), as well as training Transformer-based models from scratch on TPUs using the [MEAD API](https://github.com/dpressel/mead-baseline/tree/master/layers). MEAD makes it easy to train powerful NLP models from a simple YAML configuration, and to extend the code with new models while comparing against strong baselines.
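As a rough, generic illustration of the transfer-learning recipe these notebooks cover (this is plain PyTorch, not MEAD's API), the sketch below freezes a stand-in "pre-trained" encoder and trains only a small classification head on top. The `PretrainedEncoder` class, the dimensions, and the toy batch are all placeholder assumptions; in practice the encoder weights would come from a checkpoint.

```python
# Hedged transfer-learning sketch in plain PyTorch (not MEAD's API):
# freeze a pre-trained encoder, train only a linear classification head.
import torch
import torch.nn as nn

class PretrainedEncoder(nn.Module):
    # Stand-in encoder; in practice its weights are loaded from a checkpoint.
    def __init__(self, vocab_size=10000, dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.rnn = nn.LSTM(dim, dim, batch_first=True)

    def forward(self, tokens):
        _, (h, _) = self.rnn(self.embed(tokens))
        return h[-1]  # final hidden state as a pooled sequence representation

encoder = PretrainedEncoder()
for p in encoder.parameters():
    p.requires_grad = False  # freeze the encoder; only the head is trained

head = nn.Linear(128, 2)  # e.g. a binary sentiment classifier
optimizer = torch.optim.Adam(head.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# One toy training step on random data, just to show the wiring.
tokens = torch.randint(0, 10000, (4, 16))  # batch of 4 sequences, length 16
labels = torch.randint(0, 2, (4,))
optimizer.zero_grad()
loss = loss_fn(head(encoder(tokens)), labels)
loss.backward()
optimizer.step()
```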