
Commit c32a9d5

Update rnn.md
1 parent b3fa8b8

File tree

1 file changed: +7 additions, −3 deletions


rnn.md

@@ -54,13 +54,17 @@ of variable length. Convnets can only take in inputs with a fixed size of width
 inputs with different sizes. In order to tackle this problem, we introduce Recurrent Neural Networks (RNNs).
 
 ### Recurrent Neural Network
-An RNN is essentially a black box (Figure 2) with an “internal state” that is updated as a sequence is processed. At every timestep, we feed an input vector into the RNN, which modifies that state as a function of what it receives. When we tune the RNN weights,
+An RNN is essentially a black box (Left of Figure 2) with an “internal state” that is updated as a sequence is processed. At every timestep, we feed an input vector into the RNN, which modifies that state as a function of what it receives. When we tune the RNN weights,
 the RNN will show different behaviors in terms of how its state evolves as it receives these inputs.
 We are also interested in producing an output based on the RNN state, so we can produce these output vectors on top of the RNN (as depicted in Figure 2).
 
+If we unroll an RNN model (Right of Figure 2), there are inputs (e.g. video frames) at different timesteps, shown as $$x_1, x_2, x_3, \ldots, x_t$$.
+At each timestep the RNN takes in two inputs -- an input frame ($$x_i$$) and its previous representation of what it has seen so far (i.e. its history) -- to generate an output $$y_i$$ and update its history, which is propagated forward in time. All the RNN blocks in Figure 2 (Right) are the same block, sharing the same parameters, but with different inputs and history at each timestep.
+
 <div class="fig figcenter fighighlight">
-<img src="/assets/rnn/rnn_blackbox.png" width="20%" >
-<div class="figcaption"><b>Figure 2. </b>Simplified RNN box.</div>
+<img src="/assets/rnn/rnn_blackbox.png" width="16%" >
+<img src="/assets/rnn/unrolledRNN.png" width="60%" >
+<div class="figcaption"><b>Figure 2. </b>Simplified RNN box (Left) and Unrolled RNN (Right).</div>
 </div>
 
 More precisely, an RNN can be represented as a recurrence formula of some function $$f_W$$ with
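The unrolled computation the added lines describe can be sketched in NumPy. This is a minimal illustrative sketch, not code from the file: the function name `rnn_forward`, the weight names `W_xh`, `W_hh`, `W_hy`, and the tanh update rule are assumptions standing in for whatever $$f_W$$ the text goes on to define. The key point it demonstrates is that the same weights are reused at every timestep, while the input and history differ.

```python
import numpy as np

def rnn_forward(xs, W_xh, W_hh, W_hy, h0):
    """Unroll a vanilla RNN: the same weights are applied at every timestep."""
    h = h0
    ys = []
    for x in xs:                          # one input frame x_t per timestep
        h = np.tanh(W_hh @ h + W_xh @ x)  # update the "history" h_t (assumed tanh update)
        ys.append(W_hy @ h)               # read an output y_t off the current state
    return ys, h

# Toy usage: 5 timesteps of 4-dim inputs, 3-dim hidden state, 2-dim outputs.
rng = np.random.default_rng(0)
D, H, O, T = 4, 3, 2, 5
xs = [rng.standard_normal(D) for _ in range(T)]
ys, h_final = rnn_forward(
    xs,
    rng.standard_normal((H, D)),  # W_xh
    rng.standard_normal((H, H)),  # W_hh
    rng.standard_normal((O, H)),  # W_hy
    np.zeros(H),                  # initial history h_0
)
print(len(ys))  # one output vector per timestep -> 5
```

Note that the loop carries `h` forward from one iteration to the next: that single vector is the only channel through which earlier inputs can influence later outputs.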
