[Google_Bootcamp_Day9]

Updated:

RMSprop (Root Mean Square prop)

(figure: the RMSprop update — S_dW = beta * S_dW + (1 - beta) * dW^2, then W := W - alpha * dW / (sqrt(S_dW) + epsilon), which damps oscillations along steep directions)
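As a minimal sketch of the update rule above (names like `rmsprop_step`, `alpha`, `beta`, and the toy objective are illustrative, not from the lecture):

```python
import numpy as np

def rmsprop_step(w, dw, s, alpha=0.01, beta=0.9, eps=1e-8):
    """One RMSprop update. s holds the moving average of squared gradients."""
    s = beta * s + (1 - beta) * dw ** 2       # S_dW: exponentially weighted average of dW^2
    w = w - alpha * dw / (np.sqrt(s) + eps)   # divide by sqrt(S_dW) to damp large-variance directions
    return w, s

# toy usage: minimize f(w) = w^2, whose gradient is 2w
w, s = 5.0, 0.0
for _ in range(200):
    w, s = rmsprop_step(w, 2.0 * w, s, alpha=0.1)
```

Dividing by the root mean square of recent gradients makes the effective step size roughly uniform across parameters, which is why a larger alpha can be used safely.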

Adam optimization algorithm (RMSprop + momentum)

(figure: the Adam update, combining the momentum average V_dW and the RMSprop average S_dW, each with bias correction)

  • alpha : needs to be tuned
  • beta_1 : 0.9 (moving average of dW) -> recommended default value
  • beta_2 : 0.999 (moving average of dW^2) -> recommended default value
  • epsilon : 10^(-8) -> recommended default value
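Putting the two pieces together, a sketch of one Adam step with bias correction (the function name and toy usage are illustrative; t is the 1-based iteration count):

```python
import numpy as np

def adam_step(w, dw, v, s, t, alpha=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update. v: momentum term (dW average), s: RMSprop term (dW^2 average)."""
    v = beta1 * v + (1 - beta1) * dw            # first moment (momentum)
    s = beta2 * s + (1 - beta2) * dw ** 2       # second moment (RMSprop)
    v_hat = v / (1 - beta1 ** t)                # bias correction: compensates for
    s_hat = s / (1 - beta2 ** t)                # the zero-initialized averages
    w = w - alpha * v_hat / (np.sqrt(s_hat) + eps)
    return w, v, s

# toy usage: minimize f(w) = w^2, whose gradient is 2w
w, v, s = 5.0, 0.0, 0.0
for t in range(1, 1001):
    w, v, s = adam_step(w, 2.0 * w, v, s, t, alpha=0.1)
```

In practice only alpha is tuned; beta_1, beta_2, and epsilon are almost always left at the defaults above.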

Learning rate decay

(figures: learning rate decay, alpha = alpha_0 / (1 + decay_rate * epoch_num), so the step size shrinks as training approaches the minimum)

Other learning rate decay methods
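The common decay schedules from the lecture can be sketched as simple functions (alpha0 is the initial learning rate; decay_rate, base, and k are illustrative hyperparameter names):

```python
import math

def inverse_decay(alpha0, epoch_num, decay_rate=1.0):
    """alpha = alpha0 / (1 + decay_rate * epoch_num)"""
    return alpha0 / (1 + decay_rate * epoch_num)

def exponential_decay(alpha0, epoch_num, base=0.95):
    """alpha = base^epoch_num * alpha0 (exponential decay)"""
    return (base ** epoch_num) * alpha0

def sqrt_decay(alpha0, epoch_num, k=1.0):
    """alpha = (k / sqrt(epoch_num)) * alpha0, for epoch_num >= 1"""
    return k / math.sqrt(epoch_num) * alpha0

# with alpha0 = 0.2 and decay_rate = 1, epochs 1..4 give 0.1, 0.067, 0.05, 0.04
```

Discrete "staircase" decay (halving alpha every few epochs) and manual decay (watching the loss and lowering alpha by hand) are also used.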

Local optima in neural networks

(figure: in high-dimensional spaces, points of zero gradient are far more likely to be saddle points than local optima)

Problem of plateaus

  • Plateaus : a region where the derivative stays close to zero for a long time, which can slow learning considerably

[Source] https://www.coursera.org/learn/deep-neural-network
