The notebook `FullyConnectedNets.ipynb` will have you implement fully connected networks of arbitrary depth. To optimize these models you will implement several popular update rules.
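One of the most popular update rules is SGD with momentum. As an illustration, here is a minimal numpy sketch; the function name, `config` dictionary, and its keys are illustrative assumptions, not necessarily the exact API the notebook asks you to implement:

```python
import numpy as np

def sgd_momentum(w, dw, config=None):
    """One SGD-with-momentum step (illustrative sketch).

    The `config` keys below (learning_rate, momentum, velocity) are
    assumed names, not taken from the assignment's API.
    """
    if config is None:
        config = {}
    config.setdefault("learning_rate", 1e-2)
    config.setdefault("momentum", 0.9)
    v = config.get("velocity", np.zeros_like(w))

    # Decay the running velocity, add the current gradient step,
    # then move the weights along the velocity.
    v = config["momentum"] * v - config["learning_rate"] * dw
    next_w = w + v
    config["velocity"] = v
    return next_w, config
```

The velocity term lets steps accumulate along directions of consistent gradient, which typically speeds convergence over plain SGD.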
### Q2: Batch Normalization (34%)
In notebook `BatchNormalization.ipynb` you will implement batch normalization, and use it to train deep fully connected networks.
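For intuition, the training-mode forward pass normalizes each feature over the minibatch and then applies a learnable scale and shift. A minimal sketch (running-mean bookkeeping omitted; the names and signature are assumptions, not the notebook's exact interface):

```python
import numpy as np

def batchnorm_forward(x, gamma, beta, eps=1e-5):
    """Training-mode batch normalization over a (N, D) minibatch.

    Illustrative sketch: the running statistics needed at test time
    are omitted here.
    """
    mu = x.mean(axis=0)                    # per-feature minibatch mean
    var = x.var(axis=0)                    # per-feature minibatch variance
    x_hat = (x - mu) / np.sqrt(var + eps)  # normalize to ~zero mean, unit variance
    return gamma * x_hat + beta            # learnable scale and shift
```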
### Q3: Dropout (10%)
The notebook `Dropout.ipynb` will help you implement dropout and explore its effects on model generalization.
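The common "inverted dropout" variant rescales activations at training time so that test time is a no-op. A sketch, assuming the convention that `p` is the probability of *keeping* a unit (the notebook's convention may differ):

```python
import numpy as np

def dropout_forward(x, p=0.5, train=True):
    """Inverted dropout (illustrative sketch).

    Assumed convention: p is the keep probability. Dividing the mask
    by p preserves the expected value of the activations, so the
    test-time pass can simply return x unchanged.
    """
    if not train:
        return x
    mask = (np.random.rand(*x.shape) < p) / p  # keep with prob p, rescale survivors
    return x * mask
```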
### Q4: Convolutional Neural Networks (30%)
In the notebook `ConvolutionalNetworks.ipynb` you will implement several new layers that are commonly used in convolutional networks.
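At the heart of these layers is the convolution forward pass. A deliberately slow, loop-based reference sketch (the function name and shape conventions are assumptions; the notebook may expect a different, vectorized interface):

```python
import numpy as np

def conv_forward_naive(x, w, b, stride=1, pad=0):
    """Naive convolution forward pass with explicit loops.

    Assumed shapes: input x is (N, C, H, W), filters w are (F, C, HH, WW),
    biases b are (F,). A slow reference sketch for clarity, not an
    efficient implementation.
    """
    N, C, H, W = x.shape
    F, _, HH, WW = w.shape
    # Zero-pad only the two spatial dimensions.
    xp = np.pad(x, ((0, 0), (0, 0), (pad, pad), (pad, pad)))
    Ho = 1 + (H + 2 * pad - HH) // stride
    Wo = 1 + (W + 2 * pad - WW) // stride
    out = np.zeros((N, F, Ho, Wo))
    for n in range(N):            # over images
        for f in range(F):        # over filters
            for i in range(Ho):   # over output rows
                for j in range(Wo):  # over output cols
                    patch = xp[n, :, i*stride:i*stride+HH, j*stride:j*stride+WW]
                    out[n, f, i, j] = np.sum(patch * w[f]) + b[f]
    return out
```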
### Q5: PyTorch/TensorFlow on CIFAR-10 (10%)
For this last part, you will work in either TensorFlow or PyTorch, two popular and powerful deep learning frameworks. **You only need to complete ONE of these two notebooks.** You are welcome to explore both for your own learning, but there is no extra credit for completing both.