Classified handwritten characters in images by training a neural network. Examined the learning cost and prediction accuracy of the backpropagation algorithm, achieving 93% accuracy. Orthogonalization and early stopping are examined, and the hyperparameter space is then explored to check whether the model selection was correct. Tuning over the hyperparameter space is done by varying the number of hidden nodes, the regularization parameter lambda, the number of training examples, and the number of iterations.
Language used: MATLAB and Octave
Dataset: MNIST database of handwritten digits
Server: AWS EC2 Linux instance
Paper published: “Searching the hyperparameter space for tuning of neural networks” in IJIRCCE, an international scholarly journal.
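The tuning loop described above can be sketched as follows. This is a minimal illustration in Python/NumPy rather than the project's MATLAB/Octave, using a toy dataset in place of MNIST; the network size, learning rate, and grid values are illustrative assumptions, not the project's actual settings. It trains a one-hidden-layer network by backpropagation with L2 regularization (lambda) and picks the best (hidden nodes, lambda) pair from a small grid.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_nn(X, y, hidden, lam, iters, lr=1.0, seed=0):
    # One-hidden-layer network trained by backpropagation.
    # lam is the L2 regularization strength (the "lambda" hyperparameter).
    rng = np.random.default_rng(seed)
    n, d = X.shape
    k = y.shape[1]
    W1 = rng.normal(scale=0.5, size=(d, hidden))
    W2 = rng.normal(scale=0.5, size=(hidden, k))
    for _ in range(iters):
        # forward pass
        H = sigmoid(X @ W1)
        P = sigmoid(H @ W2)
        # backward pass: sigmoid + cross-entropy output gradient,
        # plus the L2 penalty term lam/n * W
        dP = (P - y) / n
        dW2 = H.T @ dP + lam * W2 / n
        dH = (dP @ W2.T) * H * (1.0 - H)
        dW1 = X.T @ dH + lam * W1 / n
        W1 -= lr * dW1
        W2 -= lr * dW2
    return W1, W2

def accuracy(X, y, W1, W2):
    P = sigmoid(sigmoid(X @ W1) @ W2)
    return float(np.mean(np.argmax(P, axis=1) == np.argmax(y, axis=1)))

# Toy two-class data standing in for the MNIST digits.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 4))
labels = (X[:, 0] + X[:, 1] > 0).astype(int)
y = np.eye(2)[labels]

# Grid search over two of the hyperparameters: hidden nodes and lambda.
# (The project also varied training-set size and iteration count.)
results = {}
for hidden in (5, 10):
    for lam in (0.0, 1.0):
        W1, W2 = train_nn(X, y, hidden, lam, iters=500)
        results[(hidden, lam)] = accuracy(X, y, W1, W2)

best = max(results, key=results.get)
```

Each grid point retrains the network from scratch; in practice the same loop would also sweep the number of training examples and iterations, and early stopping would cut off training once validation cost stops improving.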