
Commit 7f35115

Text correction
We do not use a leaky ReLU in the Generator.
1 parent d5a4e7c commit 7f35115

1 file changed: +1 −1


dcgan-svhn/DCGAN_Exercise.ipynb

Lines changed: 1 addition & 1 deletion
@@ -251,7 +251,7 @@
 "\n",
 "What's new here is we'll use transpose convolutional layers to create our new images. \n",
 "* The first layer is a fully connected layer which is reshaped into a deep and narrow layer, something like 4x4x512. \n",
-"* Then, we use batch normalization and a leaky ReLU activation. \n",
+"* Then, we use batch normalization and ReLU activation. \n",
 "* Next is a series of [transpose convolutional layers](https://pytorch.org/docs/stable/nn.html#convtranspose2d), where you typically halve the depth and double the width and height of the previous layer. \n",
 "* And, we'll apply batch normalization and ReLU to all but the last of these hidden layers. Where we will just apply a `tanh` activation.\n",
 "\n",
