[Google_Bootcamp_Day5]
Parameters W and b
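As a quick reminder of the shapes from the course: for layer l with n[l] units, W[l] has shape (n[l], n[l-1]) and b[l] has shape (n[l], 1). A minimal NumPy initialization sketch (the layer sizes below are made up for illustration):

```python
import numpy as np

# Hypothetical layer sizes: 4 input features, two hidden layers, one output unit
layer_dims = [4, 5, 3, 1]   # n[0], n[1], n[2], n[3]

parameters = {}
for l in range(1, len(layer_dims)):
    # W[l] has shape (n[l], n[l-1]); b[l] has shape (n[l], 1)
    parameters[f"W{l}"] = np.random.randn(layer_dims[l], layer_dims[l - 1]) * 0.01
    parameters[f"b{l}"] = np.zeros((layer_dims[l], 1))

print({name: p.shape for name, p in parameters.items()})
```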
Vectorized Implementation
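A minimal sketch of one vectorized layer step, assuming the m examples are stacked as columns so that A[l-1] has shape (n[l-1], m) and b[l] broadcasts across the columns. The function name is illustrative, not fixed by the course:

```python
import numpy as np

def linear_activation_forward(A_prev, W, b, activation="relu"):
    """One vectorized forward step over all m examples at once."""
    Z = W @ A_prev + b                    # shape (n[l], m); b of shape (n[l], 1) broadcasts over columns
    if activation == "relu":
        A = np.maximum(0, Z)
    else:                                 # sigmoid, typically for the output layer
        A = 1 / (1 + np.exp(-Z))
    return A, Z                           # Z is cached for the backward pass
```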
Intuition about deep representation
Building blocks of deep neural networks
Forward and Backward propagation
- Forward propagation for layer l
- Backward propagation for layer l
- Summary (the layer-l equations are sketched below)
Hyperparameters
- Parameters: W[1], b[1], W[2], b[2], …
- Hyperparameters (illustrated in the sketch after this list):
  - learning rate
  - number of iterations
  - number of hidden layers
  - number of hidden units
  - choice of activation functions
  - momentum
  - mini-batch size
  - regularization
  - etc.
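The distinction in practice: W[l] and b[l] are learned by gradient descent, while the items above are chosen before training and tuned empirically. A hypothetical configuration sketch; the names and values here are illustrative only:

```python
# Hypothetical hyperparameter settings (illustrative values, not recommendations)
hyperparameters = {
    "learning_rate": 0.01,
    "num_iterations": 3000,
    "num_hidden_layers": 3,
    "hidden_units": [5, 5, 3],
    "activation": "relu",
    "mini_batch_size": 64,
}
# Parameters W[l], b[l] are what gradient descent updates;
# hyperparameters control how that update is performed.
```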
Final review for forward and backward propagation
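Tying the pieces together, here is a minimal end-to-end sketch, assuming ReLU hidden layers, a sigmoid output layer, binary cross-entropy loss, and inputs X of shape (n[0], m). Function and variable names are illustrative reconstructions, not the course's exact assignment code:

```python
import numpy as np

def forward(X, parameters):
    """Forward pass: ReLU hidden layers, sigmoid output. Returns AL and per-layer caches."""
    caches = []
    A = X
    L = len(parameters) // 2                      # number of layers (W1..WL, b1..bL)
    for l in range(1, L + 1):
        A_prev = A
        W, b = parameters[f"W{l}"], parameters[f"b{l}"]
        Z = W @ A_prev + b
        A = 1 / (1 + np.exp(-Z)) if l == L else np.maximum(0, Z)
        caches.append((A_prev, W, Z))             # cached for the backward pass
    return A, caches

def backward(AL, Y, caches):
    """Backward pass for binary cross-entropy loss; returns gradients dW[l], db[l]."""
    grads = {}
    L = len(caches)
    m = Y.shape[1]
    dA = -(np.divide(Y, AL) - np.divide(1 - Y, 1 - AL))   # dL/dA[L]
    for l in reversed(range(1, L + 1)):
        A_prev, W, Z = caches[l - 1]
        if l == L:                                 # sigmoid output layer
            A = 1 / (1 + np.exp(-Z))
            dZ = dA * A * (1 - A)                  # simplifies to AL - Y
        else:                                      # ReLU hidden layer
            dZ = dA * (Z > 0)
        grads[f"dW{l}"] = (dZ @ A_prev.T) / m
        grads[f"db{l}"] = np.sum(dZ, axis=1, keepdims=True) / m
        dA = W.T @ dZ                              # becomes dA[l-1] for the next iteration
    return grads
```

A gradient-descent step would then update each parameter as W[l] ← W[l] − α·dW[l] and b[l] ← b[l] − α·db[l], where α is the learning rate listed under Hyperparameters above.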
[Source] https://www.coursera.org/learn/neural-networks-deep-learning