Goal:
Analyzed the differences between batch gradient descent, mini-batch gradient descent, and stochastic gradient descent.
Implemented the Momentum, RMSProp, and Adam optimization algorithms.
Language/libraries used: Python (NumPy)
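A minimal sketch of the three optimizer update rules named above, written in NumPy. The function names and hyperparameter defaults here are illustrative assumptions, not the project's actual code; the update equations follow the standard formulations of Momentum, RMSProp, and Adam.

```python
import numpy as np

def momentum_update(w, grad, v, lr=0.01, beta=0.9):
    # Momentum: exponentially weighted average of past gradients
    v = beta * v + (1 - beta) * grad
    return w - lr * v, v

def rmsprop_update(w, grad, s, lr=0.01, beta=0.999, eps=1e-8):
    # RMSProp: divide the step by the root of a moving average
    # of squared gradients, damping oscillating directions
    s = beta * s + (1 - beta) * grad ** 2
    return w - lr * grad / (np.sqrt(s) + eps), s

def adam_update(w, grad, v, s, t, lr=0.01,
                beta1=0.9, beta2=0.999, eps=1e-8):
    # Adam: combines Momentum and RMSProp, with bias correction
    # for the zero-initialized moving averages (t is the step count, from 1)
    v = beta1 * v + (1 - beta1) * grad
    s = beta2 * s + (1 - beta2) * grad ** 2
    v_hat = v / (1 - beta1 ** t)
    s_hat = s / (1 - beta2 ** t)
    return w - lr * v_hat / (np.sqrt(s_hat) + eps), v, s
```

For example, running `adam_update` repeatedly on the gradient of a simple quadratic loss drives the parameters toward the minimum, whereas plain gradient descent with the same learning rate converges more slowly on badly scaled problems.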