[Google_Bootcamp_Day2]
Logistic regression recap

Logistic regression derivatives

A single step of gradient descent on m examples

- Two nested for loops (one over the m training examples and one over the number of features) are inefficient
- A vectorized implementation is needed
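A minimal NumPy sketch of the loop-based version the notes describe, so the inefficiency is concrete. The layout (X of shape (n, m), one column per example) follows the course's convention; the function name and toy data are illustrative, not from the lecture.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient_step_loops(w, b, X, Y, lr=0.1):
    """One gradient-descent step with explicit for loops.
    X: (n, m) with one column per example; Y: (m,) labels in {0, 1}."""
    n, m = X.shape
    J = 0.0
    dw = np.zeros(n)
    db = 0.0
    for i in range(m):              # loop 1: over the m training examples
        z = b
        for j in range(n):          # loop 2: over the n features
            z += w[j] * X[j, i]
        a = sigmoid(z)
        J += -(Y[i] * np.log(a) + (1 - Y[i]) * np.log(1 - a))
        dz = a - Y[i]               # dL/dz for this example
        for j in range(n):          # loop 2 again: accumulate dw
            dw[j] += X[j, i] * dz
        db += dz
    J, dw, db = J / m, dw / m, db / m
    return w - lr * dw, b - lr * db, J
```

Every multiplication here is a separate Python bytecode step, which is exactly what vectorization removes.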
Vectorization

- In neural network programming, whenever possible, avoid explicit for-loops
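A toy demonstration of the point: `np.dot` replaces an explicit Python for-loop with one optimized call and returns the same answer. The arrays here are an assumed example, not from the lecture.

```python
import numpy as np

a = np.arange(1000, dtype=float)
b = np.arange(1000, dtype=float)

loop_sum = 0.0
for i in range(len(a)):      # explicit loop: interpreted, element by element
    loop_sum += a[i] * b[i]

vec_sum = np.dot(a, b)       # vectorized: a single optimized C routine

assert np.isclose(loop_sum, vec_sum)
```

On large arrays the vectorized call is typically orders of magnitude faster, since the loop runs in compiled code rather than the Python interpreter.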

Vectorizing Logistic regression
- Before vectorization (with a for loop)
- After vectorization (without a for loop)
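The vectorized forward pass can be sketched as below: all m pre-activations and predictions are computed in one matrix product, with no loop over examples. Shapes follow the course's column-per-example layout; the sample values are assumptions for illustration.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

n, m = 4, 8                       # 4 features, 8 examples (toy sizes)
rng = np.random.default_rng(1)
X = rng.normal(size=(n, m))       # one column per training example
w = rng.normal(size=(n, 1))       # weight column vector
b = 0.5                           # scalar bias, broadcast over all columns

Z = np.dot(w.T, X) + b            # (1, m): all m pre-activations at once
A = sigmoid(Z)                    # (1, m): all m predictions at once
```

The scalar `b` is broadcast across the row by NumPy, which is what lets a single expression cover every example.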

Vectorizing Logistic regression’s gradient computation
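The gradient computation vectorizes the same way: one subtraction replaces the per-example loop and one matrix product replaces the per-feature loop. A self-contained sketch under the same layout assumptions (X is (n, m), Y is (1, m)); the toy data is illustrative.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

n, m = 3, 10
rng = np.random.default_rng(2)
X = rng.normal(size=(n, m))
Y = rng.integers(0, 2, size=(1, m)).astype(float)
w = np.zeros((n, 1))
b = 0.0

A = sigmoid(np.dot(w.T, X) + b)   # forward pass, (1, m)
dZ = A - Y                        # (1, m): replaces the loop over examples
dw = np.dot(X, dZ.T) / m          # (n, 1): replaces the loop over features
db = np.sum(dZ) / m               # scalar
```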

Implementing Logistic Regression
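Putting the pieces together, a minimal training-loop sketch: only the iteration loop remains explicit, while each step's forward pass and gradients are fully vectorized. Function names, hyperparameters, and the 0.5 decision threshold are assumptions for illustration, not prescribed by the notes.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_logreg(X, Y, lr=0.1, iters=1000):
    """Vectorized logistic regression. X: (n, m), Y: (1, m)."""
    n, m = X.shape
    w = np.zeros((n, 1))
    b = 0.0
    for _ in range(iters):                 # only the iteration loop is explicit
        A = sigmoid(np.dot(w.T, X) + b)    # forward pass for all m examples
        dZ = A - Y
        w -= lr * np.dot(X, dZ.T) / m      # vectorized gradient updates
        b -= lr * np.sum(dZ) / m
    return w, b

def predict(w, b, X):
    return (sigmoid(np.dot(w.T, X) + b) > 0.5).astype(float)
```

On a linearly separable toy problem this converges quickly; for real use one would also track the cost per iteration and tune the learning rate.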

[Source] https://www.coursera.org/learn/neural-networks-deep-learning