Commit a7bbe6a

Merge pull request #60 from AllanHasegawa/fix/crash_when_batch_size_uneven

Fix crash when using an uneven batch size

2 parents 46cb729 + d7e456b

File tree

3 files changed: +3 −3 lines changed


convolutional-neural-networks/mnist-mlp/mnist_mlp_exercise.ipynb

Lines changed: 1 addition & 1 deletion

@@ -309,7 +309,7 @@
     " # compare predictions to true label\n",
     " correct = np.squeeze(pred.eq(target.data.view_as(pred)))\n",
     " # calculate test accuracy for each object class\n",
-    " for i in range(batch_size):\n",
+    " for i in range(len(target)):\n",
     " label = target.data[i]\n",
     " class_correct[label] += correct[i].item()\n",
     " class_total[label] += 1\n",

convolutional-neural-networks/mnist-mlp/mnist_mlp_solution.ipynb

Lines changed: 1 addition & 1 deletion

@@ -424,7 +424,7 @@
     " # compare predictions to true label\n",
     " correct = np.squeeze(pred.eq(target.data.view_as(pred)))\n",
     " # calculate test accuracy for each object class\n",
-    " for i in range(batch_size):\n",
+    " for i in range(len(target)):\n",
     " label = target.data[i]\n",
     " class_correct[label] += correct[i].item()\n",
     " class_total[label] += 1\n",

convolutional-neural-networks/mnist-mlp/mnist_mlp_solution_with_validation.ipynb

Lines changed: 1 addition & 1 deletion

@@ -509,7 +509,7 @@
     " # compare predictions to true label\n",
     " correct = np.squeeze(pred.eq(target.data.view_as(pred)))\n",
     " # calculate test accuracy for each object class\n",
-    " for i in range(batch_size):\n",
+    " for i in range(len(target)):\n",
     " label = target.data[i]\n",
     " class_correct[label] += correct[i].item()\n",
     " class_total[label] += 1\n",
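The same one-line change is applied in all three notebooks. The reason it matters: when a DataLoader is created with `drop_last=False` (the default), the final batch can be smaller than `batch_size` whenever the dataset size is not an exact multiple of it, so indexing `0..batch_size-1` overruns the last batch and raises an `IndexError`. Iterating over `len(target)` always matches the actual batch size. The sketch below reproduces the per-class accuracy loop from the diff on a simulated short final batch; the tensor shapes and the `class_correct`/`class_total` lists are assumptions modeled on the surrounding notebook code, not taken from the notebooks themselves.

```python
import numpy as np
import torch

num_classes = 10
batch_size = 20
class_correct = [0.0] * num_classes
class_total = [0.0] * num_classes

# Simulate an uneven final batch: only 13 samples, not batch_size (20).
# (Hypothetical data for illustration; real code gets these from a DataLoader.)
target = torch.randint(0, num_classes, (13,))   # true labels
pred = torch.randint(0, num_classes, (13, 1))   # predicted labels, shape (N, 1)

# compare predictions to true labels, as in the notebook
correct = np.squeeze(pred.eq(target.data.view_as(pred)))

# the fixed loop: bound by the actual batch length, not batch_size
for i in range(len(target)):    # range(batch_size) would raise IndexError here
    label = target.data[i]
    class_correct[label] += correct[i].item()
    class_total[label] += 1

print(sum(class_total))  # 13.0 — every sample in the short batch was counted
```

With `range(batch_size)`, the loop would attempt `target.data[13]` on a 13-element tensor and crash, which is exactly the bug the PR title describes.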
