Commit fec314c

Fixed a typo
The explanation line under Activation Functions read 'softmax' instead of 'sigmoid'
1 parent 92534a8 commit fec314c

File tree

1 file changed: +1 -1 lines changed


intro-to-pytorch/Part 2 - Neural Networks in PyTorch (Exercises).ipynb

Lines changed: 1 addition & 1 deletion
@@ -333,7 +333,7 @@
 "source": [
 "### Activation functions\n",
 "\n",
-"So far we've only been looking at the softmax activation, but in general any function can be used as an activation function. The only requirement is that for a network to approximate a non-linear function, the activation functions must be non-linear. Here are a few more examples of common activation functions: Tanh (hyperbolic tangent), and ReLU (rectified linear unit).\n",
+"So far we've only been looking at the sigmoid activation function, but in general any function can be used as an activation function. The only requirement is that for a network to approximate a non-linear function, the activation functions must be non-linear. Here are a few more examples of common activation functions: Tanh (hyperbolic tangent), and ReLU (rectified linear unit).\n",
 "\n",
 "<img src=\"assets/activation.png\" width=700px>\n",
 "\n",

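The line touched by this commit names sigmoid, tanh, and ReLU as common activation functions. As an illustrative aside (not part of the diff or the notebook), here is a minimal PyTorch sketch of the three, assuming a standard torch installation:

import torch

# Minimal sketch of the activation functions mentioned in the edited line,
# applied elementwise to a sample input range.
x = torch.linspace(-5, 5, steps=11)

print(torch.sigmoid(x))  # squashes values into (0, 1)
print(torch.tanh(x))     # squashes values into (-1, 1)
print(torch.relu(x))     # clamps negative values to 0

All three are non-linear, which is the requirement the notebook text states for a network to approximate non-linear functions.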
0 commit comments