@aleksagordic9593

2:10 - 7:20 gradient descent setup
7:20 - 17:55 building intuition (n = 1 case, digression to Taylor's theorem and the Lagrange remainder)
17:55 - 23:40 building intuition for n > 1
23:40 - 33:15 gradient descent constraints on the function (alpha-strong convexity, beta-smoothness)
33:15 - 51:00 theorem statement and proof
51:30 - 59:50 key lemma proof
1:02:15 - 1:05:10 least-squares example
1:05:10 - 1:09:00 stochastic gradient descent
1:09:00 - 1:14:00 gradient descent on non-convex functions
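For anyone following the outline above, here is a minimal sketch of plain gradient descent on a least-squares objective (the 1:02:15 example), assuming a step size eta = 1/beta where beta is the smoothness constant; the matrix, sizes, and variable names are illustrative, not taken from the lecture:

```python
import numpy as np

# Least-squares objective f(x) = 0.5 * ||A x - b||^2, which is
# beta-smooth with beta = largest eigenvalue of A^T A.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

beta = np.linalg.eigvalsh(A.T @ A).max()  # smoothness constant
eta = 1.0 / beta                          # step size eta <= 1/beta

x = np.zeros(5)
for _ in range(500):
    grad = A.T @ (A @ x - b)              # gradient of f at x
    x = x - eta * grad                    # gradient descent update

x_star, *_ = np.linalg.lstsq(A, b, rcond=None)  # closed-form minimizer
print(np.linalg.norm(x - x_star))
```

After enough iterations, the iterate should agree with the closed-form least-squares solution to high precision.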

@aparajitanath8644

Really good lecture!

@PANKAJBEHERA-t1g

Good analysis.

@ramneet-singh

In the proof of the geometric bound on the distance between x_t and x*, showing that (eta^2 - 2*eta*beta_prime) <= 0 only requires eta <= 1/beta. There was some doubt in the lecture about whether the theorem statement should require eta <= 1/(2*beta) or eta <= 1/beta. Perhaps this shows that the 1/beta condition is all we need?
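As a quick numerical sanity check of that point (a sketch, not the lecture's proof; the quadratic, its eigenvalues, and all names below are my own choices), eta = 1/beta already gives a geometric decrease of ||x_t - x*|| on an alpha-strongly convex, beta-smooth function:

```python
import numpy as np

# Quadratic f(x) = 0.5 * x^T H x with minimizer x* = 0; f is
# alpha-strongly convex and beta-smooth with alpha/beta = min/max eig(H).
H = np.diag([1.0, 4.0, 10.0])  # alpha = 1, beta = 10
beta = 10.0
eta = 1.0 / beta               # only eta <= 1/beta, not 1/(2*beta)

x = np.array([1.0, 1.0, 1.0])
dists = []
for _ in range(50):
    dists.append(np.linalg.norm(x))  # ||x_t - x*||, since x* = 0
    x = x - eta * (H @ x)            # gradient descent step

# Per-step contraction factors; all stay below 1 - alpha/beta = 0.9.
ratios = [d2 / d1 for d1, d2 in zip(dists, dists[1:])]
print(max(ratios))
```

Each step shrinks the distance to x* by at least the factor 1 - alpha/beta, consistent with the geometric bound holding under eta <= 1/beta alone.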

@fahimsanghariyat9898

Why is this video buffering?