
Commit 7635595

Fix minor typos in char rnn notebooks
1 parent 4af6b16 commit 7635595

File tree

2 files changed, +3 -3 lines changed


recurrent-neural-networks/char-rnn/Character_Level_RNN_Exercise.ipynb

Lines changed: 2 additions & 2 deletions
@@ -324,7 +324,7 @@
 "In `__init__` the suggested structure is as follows:\n",
 "* Create and store the necessary dictionaries (this has been done for you)\n",
 "* Define an LSTM layer that takes as params: an input size (the number of characters), a hidden layer size `n_hidden`, a number of layers `n_layers`, a dropout probability `drop_prob`, and a batch_first boolean (True, since we are batching)\n",
-"* Define a dropout layer with `dropout_prob`\n",
+"* Define a dropout layer with `drop_prob`\n",
 "* Define a fully-connected layer with params: input size `n_hidden` and output size (the number of characters)\n",
 "* Finally, initialize the weights (again, this has been given)\n",
 "\n",
@@ -557,7 +557,7 @@
 },
 "outputs": [],
 "source": [
-"## TODO: set you model hyperparameters\n",
+"## TODO: set your model hyperparameters\n",
 "# define and print the net\n",
 "n_hidden=\n",
 "n_layers=\n",

recurrent-neural-networks/char-rnn/Character_Level_RNN_Solution.ipynb

Lines changed: 1 addition & 1 deletion
@@ -383,7 +383,7 @@
 "In `__init__` the suggested structure is as follows:\n",
 "* Create and store the necessary dictionaries (this has been done for you)\n",
 "* Define an LSTM layer that takes as params: an input size (the number of characters), a hidden layer size `n_hidden`, a number of layers `n_layers`, a dropout probability `drop_prob`, and a batch_first boolean (True, since we are batching)\n",
-"* Define a dropout layer with `dropout_prob`\n",
+"* Define a dropout layer with `drop_prob`\n",
 "* Define a fully-connected layer with params: input size `n_hidden` and output size (the number of characters)\n",
 "* Finally, initialize the weights (again, this has been given)\n",
 "\n",
