[Google_Bootcamp_Day2]


Logistic regression recap

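For reference, a quick sketch of the model being recapped, in the course's usual notation (x: input, w and b: parameters, a = ŷ: predicted probability, L: per-example loss, J: cost over m examples); this is a reconstruction, since the original slide image is not shown here:

```latex
z = w^{\top} x + b, \qquad \hat{y} = a = \sigma(z) = \frac{1}{1 + e^{-z}}
\mathcal{L}(a, y) = -\bigl( y \log a + (1 - y) \log (1 - a) \bigr), \qquad
J(w, b) = \frac{1}{m} \sum_{i=1}^{m} \mathcal{L}\bigl(a^{(i)}, y^{(i)}\bigr)
```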

Logistic regression derivatives

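A sketch of the single-example derivatives covered here, assuming the cross-entropy loss above and the course's shorthand dz = ∂L/∂z, dw_j = ∂L/∂w_j, db = ∂L/∂b:

```latex
dz = \frac{\partial \mathcal{L}}{\partial z} = a - y, \qquad
dw_j = \frac{\partial \mathcal{L}}{\partial w_j} = x_j \, dz, \qquad
db = \frac{\partial \mathcal{L}}{\partial b} = dz
```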

A single step of Gradient descent on m examples


  • 2 for loops (1 over the m training examples and 1 over the number of features) are inefficient
  • Needs a vectorized implementation instead; a loop-based sketch is shown below for contrast
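
A minimal sketch of the loop-based step described above, assuming NumPy, X of shape (n_x, m), Y of shape (1, m), and illustrative names (gradient_step_with_loops is not from the lecture):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient_step_with_loops(w, b, X, Y, learning_rate=0.01):
    """One gradient-descent step with explicit for loops (inefficient on purpose).

    X: (n_x, m) features, Y: (1, m) binary labels, w: (n_x,), b: scalar.
    """
    n_x, m = X.shape
    J, db = 0.0, 0.0
    dw = np.zeros(n_x)
    for i in range(m):                       # loop 1: over the m training examples
        z = np.dot(w, X[:, i]) + b
        a = sigmoid(z)
        y = Y[0, i]
        J += -(y * np.log(a) + (1 - y) * np.log(1 - a))
        dz = a - y
        for j in range(n_x):                 # loop 2: over the n_x features
            dw[j] += X[j, i] * dz
        db += dz
    J, dw, db = J / m, dw / m, db / m
    w = w - learning_rate * dw               # gradient-descent update
    b = b - learning_rate * db
    return w, b, J
```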

Vectorization


  • In neural network programming, whenever possible, avoid explicit for-loops (see the comparison below)
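
A small illustration of why: the same dot product computed with an explicit Python loop and with np.dot (timings will vary by machine; this script is just illustrative):

```python
import time
import numpy as np

n = 1_000_000
a = np.random.rand(n)
b = np.random.rand(n)

# Explicit for loop
tic = time.time()
c = 0.0
for i in range(n):
    c += a[i] * b[i]
print("for loop:  ", time.time() - tic, "s")

# Vectorized: a single call into optimized native code
tic = time.time()
c = np.dot(a, b)
print("vectorized:", time.time() - tic, "s")
```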

Logistic regression derivatives


Vectorizing Logistic regression

  • Before vectorization (with a for loop)

  • After vectorization (without a for loop); both versions are sketched below
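
A sketch of both versions of the forward pass, assuming w of shape (n_x, 1), scalar b, and X of shape (n_x, m); the helper names are mine:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Before: compute the activation one example at a time
def forward_with_loop(w, b, X):
    n_x, m = X.shape
    A = np.zeros((1, m))
    for i in range(m):
        z_i = np.dot(w[:, 0], X[:, i]) + b    # scalar z for example i
        A[0, i] = sigmoid(z_i)
    return A

# After: compute all m activations at once
def forward_vectorized(w, b, X):
    Z = np.dot(w.T, X) + b                    # shape (1, m); scalar b is broadcast
    A = sigmoid(Z)
    return A
```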

Vectorizing Logistic regression’s gradient computation

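A sketch of the corresponding vectorized gradients, assuming A comes from the vectorized forward pass above and Y has shape (1, m); the function name is illustrative:

```python
import numpy as np

def gradients_vectorized(X, Y, A):
    """Vectorized gradients of the cost (no explicit loops)."""
    m = X.shape[1]
    dZ = A - Y                      # shape (1, m)
    dw = np.dot(X, dZ.T) / m        # shape (n_x, 1)
    db = np.sum(dZ) / m             # scalar
    return dw, db
```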

Implementing Logistic Regression

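Putting the pieces together, a minimal end-to-end sketch of vectorized logistic regression trained with gradient descent (hyperparameter values and function names are illustrative, not from the lecture):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_logistic_regression(X, Y, num_iterations=1000, learning_rate=0.01):
    """Vectorized logistic regression: X is (n_x, m), Y is (1, m)."""
    n_x, m = X.shape
    w = np.zeros((n_x, 1))
    b = 0.0
    for _ in range(num_iterations):       # only the loop over iterations remains
        Z = np.dot(w.T, X) + b            # forward pass, shape (1, m)
        A = sigmoid(Z)
        dZ = A - Y                        # backward pass
        dw = np.dot(X, dZ.T) / m
        db = np.sum(dZ) / m
        w = w - learning_rate * dw        # parameter update
        b = b - learning_rate * db
    return w, b

def predict(w, b, X):
    """Predicted labels: 1 where the estimated probability exceeds 0.5."""
    A = sigmoid(np.dot(w.T, X) + b)
    return (A > 0.5).astype(int)
```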

[Source] https://www.coursera.org/learn/neural-networks-deep-learning
