Commit 6a699ad

Add periods to summary (#245)
1 parent a763ed6 commit 6a699ad

File tree: 1 file changed (+2, −2 lines)

neural-networks-1.md

Lines changed: 2 additions & 2 deletions
@@ -214,8 +214,8 @@ The takeaway is that you should not be using smaller networks because you are af
 
 In summary,
 
-- We introduced a very coarse model of a biological **neuron**
-- We discussed several types of **activation functions** that are used in practice, with ReLU being the most common choice
+- We introduced a very coarse model of a biological **neuron**.
+- We discussed several types of **activation functions** that are used in practice, with ReLU being the most common choice.
 - We introduced **Neural Networks** where neurons are connected with **Fully-Connected layers** where neurons in adjacent layers have full pair-wise connections, but neurons within a layer are not connected.
 - We saw that this layered architecture enables very efficient evaluation of Neural Networks based on matrix multiplications interwoven with the application of the activation function.
 - We saw that Neural Networks are **universal function approximators**, but we also discussed the fact that this property has little to do with their ubiquitous use. They are used because they make certain "right" assumptions about the functional forms of functions that come up in practice.
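One of the summary bullets in the diff describes the forward pass as matrix multiplications interwoven with the activation function. As a minimal sketch of that idea in NumPy (the layer sizes, weight names, and ReLU choice here are illustrative assumptions, not taken from the notes or this commit):

```python
import numpy as np

def relu(x):
    # Elementwise activation applied between the matrix multiplies.
    return np.maximum(0, x)

rng = np.random.default_rng(0)

# A small fully-connected network: 4 inputs -> 5 hidden units -> 3 outputs.
# Sizes are arbitrary for illustration.
W1, b1 = rng.standard_normal((5, 4)), np.zeros(5)
W2, b2 = rng.standard_normal((3, 5)), np.zeros(3)

x = rng.standard_normal(4)    # input vector
h = relu(W1 @ x + b1)         # hidden layer: matmul, then activation
out = W2 @ h + b2             # output layer: plain matmul (raw scores)
print(out.shape)
```

Each layer is a single matrix multiply plus bias, with the nonlinearity applied elementwise in between, which is why the layered, fully-connected architecture evaluates so efficiently.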
