
Learner Reviews & Feedback for Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization by DeepLearning.AI

4.9 stars (63,175 ratings)

About the Course

In the second course of the Deep Learning Specialization, you will open the deep learning black box to understand the processes that drive performance and generate good results systematically. By the end, you will know the best practices for setting up train, dev, and test sets and analyzing bias/variance when building deep learning applications; be able to use standard neural network techniques such as initialization, L2 and dropout regularization, hyperparameter tuning, batch normalization, and gradient checking; implement and apply a variety of optimization algorithms, such as mini-batch gradient descent, Momentum, RMSprop, and Adam, and check their convergence; and implement a neural network in TensorFlow. The Deep Learning Specialization is our foundational program that will help you understand the capabilities, challenges, and consequences of deep learning and prepare you to participate in the development of leading-edge AI technology. It provides a pathway for you to gain the knowledge and skills to apply machine learning to your work, level up your technical career, and take the definitive step in the world of AI.
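The description names several concrete techniques. As a rough illustration of how they fit together, a minimal TensorFlow (Keras) sketch combining L2 regularization, dropout, batch normalization, the Adam optimizer, and mini-batch training might look like the following; the layer sizes and hyperparameter values are illustrative assumptions, not values taken from the course materials.

```python
# Minimal sketch (not from the course) of the techniques listed above,
# using the Keras API that ships with TensorFlow.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(
        64, activation="relu",
        kernel_regularizer=tf.keras.regularizers.l2(1e-4)),  # L2 regularization
    tf.keras.layers.BatchNormalization(),                     # batch normalization
    tf.keras.layers.Dropout(0.5),                             # dropout regularization
    tf.keras.layers.Dense(10, activation="softmax"),          # example 10-class output
])

model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),   # Adam optimizer
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)

# Mini-batch gradient descent: batch_size sets the mini-batch size.
# model.fit(x_train, y_train, epochs=10, batch_size=64)  # assumes x_train, y_train exist
```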

Top reviews

AS

Apr 18, 2020

Very good course that gives you deep insight into how to enhance your algorithms and neural networks and improve their accuracy. It also teaches you TensorFlow. Highly recommended, especially after the 1st course.

XG

Oct 30, 2017

Thank you, Andrew!! I have now started to use TensorFlow; however, this tool is not well suited for research purposes. Maybe PyTorch could be considered in the future!! And let us know how to use PyTorch on Windows.


7,251 - 7,253 of 7,253 Reviews for Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization

By anas e o • Nov 12, 2024

I worked hard but I didn't get any certificate.

By G V V K I • Jun 8, 2021

Not understandable

By Daniel A M R • Oct 2, 2024

Don't do it.