Commit d3f2732

Merge pull request #109 from underscoreorcus/patch-1
Fixed a typo
2 parents: 92534a8 + fec314c

intro-to-pytorch/Part 2 - Neural Networks in PyTorch (Exercises).ipynb

Lines changed: 1 addition & 1 deletion
@@ -333,7 +333,7 @@
     "source": [
      "### Activation functions\n",
      "\n",
-     "So far we've only been looking at the softmax activation, but in general any function can be used as an activation function. The only requirement is that for a network to approximate a non-linear function, the activation functions must be non-linear. Here are a few more examples of common activation functions: Tanh (hyperbolic tangent), and ReLU (rectified linear unit).\n",
+     "So far we've only been looking at the sigmoid activation function, but in general any function can be used as an activation function. The only requirement is that for a network to approximate a non-linear function, the activation functions must be non-linear. Here are a few more examples of common activation functions: Tanh (hyperbolic tangent), and ReLU (rectified linear unit).\n",
      "\n",
      "<img src=\"assets/activation.png\" width=700px>\n",
      "\n",

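For context, the notebook paragraph this commit corrects introduces sigmoid, Tanh, and ReLU activations. A minimal sketch of applying each of these to a tensor in PyTorch; this code is not part of the commit, and the input values are illustrative only:

    import torch

    # A small batch of pre-activation values; shape (1, 5), chosen for illustration.
    x = torch.tensor([[-2.0, -0.5, 0.0, 0.5, 2.0]])

    # Sigmoid squashes inputs into the range (0, 1).
    print(torch.sigmoid(x))

    # Tanh (hyperbolic tangent) squashes inputs into (-1, 1).
    print(torch.tanh(x))

    # ReLU (rectified linear unit) zeroes out negative inputs.
    print(torch.relu(x))

All three are non-linear, which is the requirement the notebook text states a network's activations must satisfy to approximate non-linear functions.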