Commit 7d3fef2

Update rnn.md
1 parent 0b46d70 commit 7d3fef2

File tree

1 file changed: +1 −1 lines changed


rnn.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -207,7 +207,7 @@ So far we have seen only a simple recurrence formula for the Vanilla RNN. In pra
 rarely ever use Vanilla RNN formula. Instead, we will use what we call a Long-Short Term Memory (LSTM)
 RNN.
 
-### Vanilla RNN Gradient Flow
+### Vanilla RNN Gradient Flow & Vanishing Gradient Problem
 An RNN block takes in input $$x_t$$ and previous hidden representation $$h_{t-1}$$ and learn a transformation, which is then passed through tanh to produce the hidden representation $$h_{t}$$ for the next time step and output $$y_{t}$$ as shown in the equation below.
 
 $$ h_t = tanh(W_{hh}h_{t-1} + W_{xh}x_t) $$
```
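The renamed heading points at the vanishing gradient problem of this recurrence: backpropagating through many steps repeatedly multiplies the gradient by $$W_{hh}^T$$ and the tanh derivative $$1 - h_t^2$$, so with small weights the gradient shrinks geometrically. A minimal NumPy sketch (the sizes, weight scales, and random inputs are illustrative assumptions, not part of the commit) of the step above and of the resulting gradient decay:

```python
import numpy as np

rng = np.random.default_rng(0)
H, D, T = 8, 4, 50  # hidden size, input size, time steps (assumed for illustration)

# Small random weights -> spectral norm of W_hh below 1 -> vanishing gradients.
W_hh = rng.normal(scale=0.1, size=(H, H))
W_xh = rng.normal(scale=0.1, size=(H, D))

# Forward pass: h_t = tanh(W_hh h_{t-1} + W_xh x_t), the recurrence in the diff.
h = np.zeros(H)
hs = []
for _ in range(T):
    x_t = rng.normal(size=D)
    h = np.tanh(W_hh @ h + W_xh @ x_t)
    hs.append(h)

# Backprop a unit gradient from the last hidden state toward the first:
# dh_{t-1} = W_hh^T (dh_t * (1 - h_t^2)), using tanh'(z) = 1 - tanh(z)^2.
grad = np.ones(H)
norms = []
for h_t in reversed(hs):
    grad = W_hh.T @ (grad * (1.0 - h_t ** 2))
    norms.append(np.linalg.norm(grad))

print(f"after 1 step back: {norms[0]:.3e}, after {T} steps back: {norms[-1]:.3e}")
```

Each backward step contracts the gradient, so the norm after 50 steps is many orders of magnitude smaller than after one step; this is the motivation for replacing the vanilla recurrence with an LSTM, whose cell state gives the gradient a more direct path through time.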
