
Commit 51dd97d

master: Applications of transformers.
1 parent 1c8b0ef commit 51dd97d

File tree

2 files changed: +24 −1 lines

• Chapter-wise code/Code - PyTorch/7. Attention Models/2. Neural Text Summarization


Chapter-wise code/Code - PyTorch/7. Attention Models/2. Neural Text Summarization/1. Transformer Models/Readme.md

Lines changed: 24 additions & 1 deletion
@@ -10,7 +10,7 @@ LSTMs and GRUs can help to overcome the vanishing gradient problem, but even tho
2. In a conventional encoder-decoder architecture, the model would again take T timesteps to compute the translation.<br><br>
<img src="../images/2. basic encoder-decoder.png" width="50%"></img><br>

-## Transformers - Basics
+## RNN vs. Transformers

```
TLDR:
1. In RNNs, parallel computing is difficult to implement.
@@ -30,3 +30,26 @@ TLDR:
<img src="../images/5. positional encoding.png" width="50%"></img><br>

6. Unlike a recurrent layer, the multi-head attention layer computes the output for each input in the sequence independently, which allows us to parallelize the computation. However, it fails to model the order of the inputs within a sequence. That is why the positional encoding stage must be incorporated into the transformer model.
+
+## Applications of Transformers
+
+Some of the applications of transformers include:
+1. Text summarization.
+2. Auto-complete.
+3. Named-entity recognition (NER).
+4. Automatic question-answering.
+5. Neural machine translation (NMT).
+6. Chatbots.
+7. Other NLP tasks:
+   * Sentiment analysis.
+   * Market intelligence.
+   * Text classification.
+   * Character recognition.
+   * Spell checking.
+
+## State-of-the-art Transformers
+
+1. *GPT-2*: Generative Pre-trained Transformer 2.
+2. *BERT*: Bidirectional Encoder Representations from Transformers.
+3. *T5*: Text-To-Text Transfer Transformer.<br>
+<img src="../images/6. T5 model.png" width="50%"></img><br>
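The positional-encoding stage discussed above can be sketched in code. Below is an illustrative NumPy implementation of the sinusoidal encoding from the original transformer paper ("Attention Is All You Need"), not code from this repository; the function name and the shapes used are assumptions for the sketch.

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    """Sinusoidal positional encoding:
    PE[pos, 2i]   = sin(pos / 10000**(2i / d_model))
    PE[pos, 2i+1] = cos(pos / 10000**(2i / d_model))
    """
    positions = np.arange(seq_len)[:, np.newaxis]   # shape (seq_len, 1)
    dims = np.arange(d_model)[np.newaxis, :]        # shape (1, d_model)
    # Each pair of dimensions (2i, 2i+1) shares one frequency.
    angle_rates = 1.0 / np.power(10000.0, (2 * (dims // 2)) / d_model)
    angles = positions * angle_rates                # shape (seq_len, d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles[:, 0::2])           # even dims get sine
    pe[:, 1::2] = np.cos(angles[:, 1::2])           # odd dims get cosine
    return pe

pe = positional_encoding(seq_len=50, d_model=128)
print(pe.shape)  # (50, 128)
```

The resulting matrix is added to the token embeddings, giving each position a unique signature so that the otherwise order-agnostic attention layers can distinguish token positions.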
Binary image file (518 KB) not shown.
