Project 1: FNN FOR MNIST
- Due Jan 22, 2020 by 11:59pm
- Points 100
- Submitting a website url or a file upload
In this section we practice the workflow described in the sections above through a "Hello World!" project: using the MNIST data set, we walk through the key steps in building a deep neural network.
The starter code can be found at
https://www.kaggle.com/scaomath/simple-mnist-numpy-from-scratch
For this starter project, we need to do the following:
- Split the 42,000 training samples further into a training set and a cross-validation set.
- Add an extra hidden layer to the model, so that it now has 2 hidden layers.
- Train the network using mini-batch SGD instead of the full-batch gradient descent method.
- Validate the performance of the model (e.g., change the combination of the sizes of the hidden layers and the strength of the regularization) to keep it from over-fitting, using the two datasets from the first step.
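The first step above can be sketched as a simple random split. This is a minimal illustration, not the starter notebook's code: the array names, the 42,000/784 shapes, and the choice of holding out 10,000 samples are assumptions (the held-out fraction is a hyperparameter you may pick differently).

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder arrays standing in for the MNIST training data:
# X has one 784-pixel row per image, y holds the digit labels 0-9.
n = 42000
X = rng.standard_normal((n, 784))
y = rng.integers(0, 10, size=n)

# Shuffle the indices, then carve off a held-out cross-validation set.
idx = rng.permutation(n)
n_val = 10000  # illustrative choice; roughly a quarter of the data
val_idx, train_idx = idx[:n_val], idx[n_val:]

X_train, y_train = X[train_idx], y[train_idx]
X_val, y_val = X[val_idx], y[val_idx]
```

Shuffling before splitting matters: it keeps the class distribution of the validation set close to that of the training set.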
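The remaining steps (a 2-hidden-layer network trained with mini-batch SGD and L2 regularization) might look like the following numpy sketch. It is not the starter code: the layer sizes, learning rate, batch size, regularization strength, and the use of ReLU with a softmax/cross-entropy output are all illustrative assumptions for you to tune against the validation set.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in data; in the project these come from the training split.
X_train = rng.standard_normal((512, 784))
y_train = rng.integers(0, 10, size=512)

def one_hot(y, k=10):
    out = np.zeros((y.size, k))
    out[np.arange(y.size), y] = 1.0
    return out

# Two hidden layers; the sizes are hyperparameters to validate.
h1, h2 = 128, 64
W1 = rng.standard_normal((784, h1)) * np.sqrt(2.0 / 784)
b1 = np.zeros(h1)
W2 = rng.standard_normal((h1, h2)) * np.sqrt(2.0 / h1)
b2 = np.zeros(h2)
W3 = rng.standard_normal((h2, 10)) * np.sqrt(2.0 / h2)
b3 = np.zeros(10)

def forward(X):
    a1 = np.maximum(X @ W1 + b1, 0)   # ReLU hidden layer 1
    a2 = np.maximum(a1 @ W2 + b2, 0)  # ReLU hidden layer 2
    z3 = a2 @ W3 + b3
    z3 -= z3.max(axis=1, keepdims=True)  # numerically stable softmax
    p = np.exp(z3)
    p /= p.sum(axis=1, keepdims=True)
    return a1, a2, p

lr, batch_size, reg = 0.1, 64, 1e-4  # hyperparameters to tune
for epoch in range(3):
    # Mini-batch SGD: reshuffle each epoch, update on small batches
    # instead of computing the gradient over the full training set.
    perm = rng.permutation(len(X_train))
    for start in range(0, len(X_train), batch_size):
        batch = perm[start:start + batch_size]
        Xb, Yb = X_train[batch], one_hot(y_train[batch])
        a1, a2, p = forward(Xb)
        m = len(batch)
        d3 = (p - Yb) / m                       # softmax + cross-entropy grad
        dW3 = a2.T @ d3 + reg * W3; db3 = d3.sum(0)
        d2 = (d3 @ W3.T) * (a2 > 0)             # backprop through ReLU
        dW2 = a1.T @ d2 + reg * W2; db2 = d2.sum(0)
        d1 = (d2 @ W2.T) * (a1 > 0)
        dW1 = Xb.T @ d1 + reg * W1; db1 = d1.sum(0)
        W3 -= lr * dW3; b3 -= lr * db3
        W2 -= lr * dW2; b2 -= lr * db2
        W1 -= lr * dW1; b1 -= lr * db1

_, _, p = forward(X_train)
train_acc = (p.argmax(1) == y_train).mean()
```

For the validation step, evaluate the same accuracy on the held-out set after each epoch and compare it with the training accuracy: a growing gap between the two is the over-fitting signal that stronger regularization or smaller hidden layers should reduce.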