[Google_Bootcamp_Day9]
RMSprop (Root Mean Square prop)
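A minimal NumPy sketch of one RMSprop step as described in the lecture: keep an exponentially weighted average S_dW of the squared gradient and divide the update by its square root, which damps oscillations. Function and variable names are illustrative, and the toy problem (minimizing w^2) is my own, not from the course:

```python
import numpy as np

def rmsprop_step(w, dw, s_dw, alpha=0.1, beta=0.9, eps=1e-8):
    s_dw = beta * s_dw + (1 - beta) * dw ** 2    # moving average of dw^2
    w = w - alpha * dw / (np.sqrt(s_dw) + eps)   # scaled, damped update
    return w, s_dw

# toy usage: minimize f(w) = w^2, so dw = 2w
w, s_dw = 5.0, 0.0
for _ in range(200):
    w, s_dw = rmsprop_step(w, 2 * w, s_dw)
print(w)  # ends up near 0 (each step is roughly alpha in size)
```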

Adam (Adaptive Moment Estimation) optimization algorithm: RMSprop + momentum (a sketch follows the hyperparameter list)

- alpha : needs to be tuned
- beta_1 : 0.9 (weighting for the dW momentum term) -> recommended default value
- beta_2 : 0.999 (weighting for the dW^2 RMSprop term) -> recommended default value
- epsilon : 10^(-8) -> recommended default value
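A sketch of one Adam step under the same assumptions (illustrative names): the momentum average v uses beta_1, the RMSprop average s uses beta_2, and both are bias-corrected by 1 - beta^t before the update:

```python
import numpy as np

def adam_step(w, dw, v, s, t, alpha=0.1, beta1=0.9, beta2=0.999, eps=1e-8):
    v = beta1 * v + (1 - beta1) * dw             # momentum term (dW)
    s = beta2 * s + (1 - beta2) * dw ** 2        # RMSprop term (dW^2)
    v_hat = v / (1 - beta1 ** t)                 # bias correction
    s_hat = s / (1 - beta2 ** t)
    w = w - alpha * v_hat / (np.sqrt(s_hat) + eps)
    return w, v, s

# toy usage: minimize f(w) = w^2, so dw = 2w
w, v, s = 5.0, 0.0, 0.0
for t in range(1, 201):
    w, v, s = adam_step(w, 2 * w, v, s, t)
print(w)  # ends up near 0
```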
Learning rate decay
- Slowly reduce alpha during training: alpha = alpha_0 / (1 + decay_rate * epoch_num)
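A tiny sketch of that schedule (the helper name is mine, not from the course). With alpha_0 = 0.2 and decay_rate = 1, the rate goes 0.1, 0.0667, 0.05, ... over the first epochs:

```python
def decayed_lr(alpha0, decay_rate, epoch):
    # alpha = alpha_0 / (1 + decay_rate * epoch_num)
    return alpha0 / (1 + decay_rate * epoch)

for epoch in range(1, 5):
    print(epoch, round(decayed_lr(alpha0=0.2, decay_rate=1.0, epoch=epoch), 4))
# 1 0.1 | 2 0.0667 | 3 0.05 | 4 0.04
```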

Other learning rate decay methods
- Exponential decay : alpha = 0.95^epoch_num * alpha_0
- alpha = (k / sqrt(epoch_num)) * alpha_0, or k / sqrt(t) for mini-batch t
- Discrete staircase : drop alpha by a fixed factor every fixed number of epochs
- Manual decay : watch training and reduce alpha by hand
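Minimal sketches of those alternatives, with illustrative constants and function names (not from the course materials):

```python
import numpy as np

def exponential_decay(alpha0, epoch, k=0.95):
    return (k ** epoch) * alpha0                  # alpha shrinks geometrically

def sqrt_decay(alpha0, epoch, k=1.0):
    return (k / np.sqrt(epoch)) * alpha0          # use epoch >= 1

def staircase_decay(alpha0, epoch, step=10, factor=0.5):
    return alpha0 * factor ** (epoch // step)     # drop alpha every `step` epochs

for epoch in (1, 10, 20):
    print(exponential_decay(0.2, epoch), sqrt_decay(0.2, epoch), staircase_decay(0.2, epoch))
```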
Local optima in neural networks
- In a high-dimensional parameter space, most points where the gradient is zero are saddle points rather than local optima, so getting stuck in a bad local optimum is relatively unlikely

Problem of plateaus
- Plateau : a region where the derivative stays close to zero for a long time, which can slow learning considerably
- Algorithms such as momentum, RMSprop, and Adam help the optimizer move along and off plateaus faster

[Source] https://www.coursera.org/learn/deep-neural-network