Minor edit when training with a custom dataset (#484)

When training on a custom dataset, the code should be flexible enough to accommodate a variable number of classes.
Kumar Ashutosh 2021-10-22 02:27:06 -05:00 committed by GitHub
parent ca70f0a6f6
commit dae1a1d112


@@ -293,8 +293,8 @@
 "source": [
 "def test(model, epoch):\n",
 " model.eval()\n",
-" class_correct = list(0. for i in range(10))\n",
-" class_total = list(0. for i in range(10))\n",
+" class_correct = list(0. for i in range(len(classes)))\n",
+" class_total = list(0. for i in range(len(classes)))\n",
 " with torch.no_grad():\n",
 " for idx, (sounds, sample_rate, inputs, labels) in enumerate(test_loader):\n",
 " inputs = inputs.to(device)\n",