https://github.com/abhilash1910/transformers-workshop

Transformers Workshop on behalf of ML India. Contains resource notebooks for training and running inference with large-scale transformer models on different downstream tasks.

bert distilbert electra gpt2 keras t5 tf2 torch transformers


# Transformers-Workshop :rocket:

# NLP-Workshop-ML-India

This repository contains the code and notebooks for the NLP Workshop organized by ML India from June 19 to July 11.

## Contents

The [Notebook](https://www.kaggle.com/abhilash1910/nlp-workshop-ml-india-autoregressive-models) contains the material for the Transformers part of the session, which focuses mainly on Transformer models.

The contents include:

1. Encoder Decoder Architecture
2. Disadvantages of Encoder Decoders
3. Transformer Architectures
4. Attention Mechanism
5. Bahdanau and Luong Attention
6. Self-Attention and Multi-Head Attention
7. Designing a Keras Transformer
8. Extracting DistilBERT/BERT embeddings for fine-tuning on a classification task
9. Working with input IDs, tokens, and attention masks for Transformer models
10. Inference Tasks using different transformers
11. Bert based QA inference
12. Encoder Decoder T5 architecture for Summarization Inference
13. GPT2 model for Text Generation Inference
14. Encoder Decoder Electra Model for NER Inference
15. DialogRPT Model for Text Classification Inference
16. T5 for Text-to-Text Paraphrasing/Generation
17. BART encoder decoder model for Zero Shot Classification
18. Samples for training Transformers on downstream tasks such as Token Classification, SQuAD, etc.
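The attention items above (4–6) all build on scaled dot-product attention. The following NumPy version is an illustrative sketch (not the workshop's notebook code) of how attention weights are computed and how a mask hides padded positions:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(q, k, v, mask=None):
    # q, k, v: (batch, seq_len, d_k); mask: (batch, seq_len, seq_len), 1 = attend, 0 = hide
    d_k = q.shape[-1]
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_k)   # query-key similarities
    if mask is not None:
        scores = np.where(mask.astype(bool), scores, -1e9)  # mask out before softmax
    weights = softmax(scores)        # each row sums to 1
    return weights @ v, weights      # weighted sum of values

rng = np.random.default_rng(0)
x = rng.normal(size=(2, 4, 8))
out, w = scaled_dot_product_attention(x, x, x)  # self-attention: q == k == v
```

Multi-head attention repeats this computation over several learned projections of `q`, `k`, and `v` and concatenates the results.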
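Item 9's input IDs and attention masks can be illustrated without any library: pad every sequence to the batch maximum and mark real tokens with 1 and padding with 0. This is a hand-rolled sketch; Hugging Face tokenizers return the same `input_ids`/`attention_mask` fields automatically:

```python
def pad_and_mask(batch_token_ids, pad_id=0):
    # Pad each sequence of token IDs to the longest in the batch, and build a
    # parallel attention mask (1 = real token, 0 = padding) so the model
    # ignores the padded positions.
    max_len = max(len(seq) for seq in batch_token_ids)
    input_ids, attention_mask = [], []
    for seq in batch_token_ids:
        pad = max_len - len(seq)
        input_ids.append(seq + [pad_id] * pad)
        attention_mask.append([1] * len(seq) + [0] * pad)
    return input_ids, attention_mask

# Example IDs chosen for illustration only
ids, mask = pad_and_mask([[101, 2054, 102], [101, 2054, 2003, 1037, 102]])
```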
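The inference items (10–17) are typically driven through the Hugging Face `pipeline` API. A minimal question-answering sketch, assuming a SQuAD-finetuned DistilBERT checkpoint (the notebook may use a different model; requires `transformers` installed and a one-time model download):

```python
from transformers import pipeline

# Checkpoint name is an assumption; any SQuAD-finetuned BERT/DistilBERT works here.
qa = pipeline("question-answering",
              model="distilbert-base-cased-distilled-squad")

result = qa(question="Who organized the workshop?",
            context="The NLP workshop was organized by ML India "
                    "from June 19 to July 11.")
print(result["answer"])
```

The other inference tasks follow the same pattern with different pipeline tags, e.g. `"summarization"` for T5, `"text-generation"` for GPT-2, and `"zero-shot-classification"` for BART.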

## Guidelines

This code has been released under the [Apache License](https://www.apache.org/licenses/LICENSE-2.0). The resources for the notebooks, particularly the embedding files, are hosted on Kaggle. They can be used locally by downloading them from Kaggle manually, or within Kaggle notebooks via the "Add Data" tab.