Thank you for saving my summer project essay! It's really helpful to have someone show me the details rather than one sentence saying "doing this with Lagrange multipliers" in Boswell's paper 🥳
Yoooooo!! With super crisp explanations of the maths behind it, super-awesome visualizations, and picture-perfect presentation, this video is a great contribution to the ML community. Super awesome work!! Keep it up.
Why doesn't anyone go through an example with real numbers so we can actually see these formulas in action?
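Here is a minimal numeric sketch along those lines, using made-up 2-D points (the data, the linear kernel, and the very large C are illustrative assumptions, not values from the video):

```python
# Toy SVM with real numbers: four linearly separable 2-D points.
import numpy as np
from sklearn.svm import SVC

X = np.array([[1.0, 1.0], [2.0, 2.0],    # class -1
              [4.0, 4.0], [5.0, 5.0]])   # class +1
y = np.array([-1, -1, 1, 1])

clf = SVC(kernel="linear", C=1e6)        # very large C ~ hard margin
clf.fit(X, y)

w = clf.coef_[0]                         # w = sum_n alpha_n * y_n * x_n
b = clf.intercept_[0]
print("w =", w, ", b =", b)              # ~ w = [0.5, 0.5], b = -3
print("support vectors:", clf.support_vectors_)   # (2,2) and (4,4)
print("margin = 2/||w|| =", 2 / np.linalg.norm(w))  # ~ 2.83
```

Plugging the support vectors back in gives $y_n(w^\top x_n + b) = 1$ for both, which is exactly the margin condition the formulas describe.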
Great stuff, hats off! Dude, kindly keep making videos of "Maths you should know". Can you please go for Hidden Markov Models and Kalman filters?
You were just wonderful! You explained the concept amazingly, exactly what I needed to hear and at the correct speed.
The best SVM video! Thank you for your excellent work!
This is exactly what I was looking for. End to end explanation clearly showing the steps involved. Thanks a ton man!❤
Maybe a tad too rushed, but balancing time to make a video against releasing it is probably a tricky business :) That being said, I like your videos 1000x more than Siraj's; wish I could move some of his views to you ;) Keep it up!
Why is phi used to denote the transformation function at 1:22? I asked ChatGPT and it doesn't know why, and I looked online and did not see phi used in this context at all. At the risk of sounding ignorant, I don't believe this symbol should have been used; any other alternative would have been suitable without the additional confusion over the symbol's meaning in the context of the equation. Why use phi when you'd assume it adds some additional context, but instead it adds nothing to the understanding of the function?
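For what it's worth, φ is in fact the conventional symbol for the feature map in the kernel-methods literature (Bishop's PRML, among others, writes the kernel as $k(x, x') = \phi(x)^\top \phi(x')$). A quick numeric check with one concrete φ, the explicit feature map of the quadratic kernel (the example points are arbitrary):

```python
# For the quadratic kernel K(x, z) = (x . z)^2 on R^2, one explicit
# feature map is phi(x) = (x1^2, sqrt(2)*x1*x2, x2^2), since
# phi(x) . phi(z) = (x . z)^2.
import numpy as np

def phi(x):
    return np.array([x[0]**2, np.sqrt(2) * x[0] * x[1], x[1]**2])

x = np.array([1.0, 2.0])
z = np.array([3.0, 0.5])

lhs = phi(x) @ phi(z)   # inner product in feature space
rhs = (x @ z) ** 2      # kernel computed directly in input space
print(lhs, rhs)         # both 16.0
```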
I think you have a typo in the equation appearing at 9:06, last term. It should be $\sum_n \xi_n (C - \alpha_n - \lambda_n)$.
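For reference, the standard soft-margin Lagrangian (which may differ cosmetically from the slide at 9:06, but has the same structure) does collect the $\xi_n$ terms exactly as this comment says:

```latex
\mathcal{L}(w, b, \xi, \alpha, \lambda)
  = \tfrac{1}{2}\lVert w \rVert^2 + C \sum_n \xi_n
  - \sum_n \alpha_n \bigl[ y_n \bigl( w^\top \phi(x_n) + b \bigr) - 1 + \xi_n \bigr]
  - \sum_n \lambda_n \xi_n
```

Grouping the coefficients of each $\xi_n$ gives $\sum_n \xi_n (C - \alpha_n - \lambda_n)$, matching the suggested correction.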
Please, for God's sake, keep doing this.
Very well done on the explanation, and I am obsessed with your math explanations, especially the terminology and how you use it.
Most crisp and to the point explanation.
Thank you for the deep dive into RBF kernels. My hope is to fill in the gaps in my maths so I can watch these videos and get better intuition for these topics. Not quite there yet, but we're still getting there!
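As a small companion to that deep dive, here is a minimal sketch of the RBF kernel itself (the points and γ = 0.5 are arbitrary choices for illustration):

```python
# RBF (Gaussian) kernel: K(x, z) = exp(-gamma * ||x - z||^2).
# Nearby points score near 1, distant points near 0.
import numpy as np

def rbf_kernel(x, z, gamma=0.5):
    return np.exp(-gamma * np.sum((x - z) ** 2))

x = np.array([0.0, 0.0])
print(rbf_kernel(x, np.array([0.1, 0.1])))  # ~0.99 (close points)
print(rbf_kernel(x, np.array([3.0, 3.0])))  # ~1e-4 (far points)
```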
0:42 that “kernalization” had me laughing 😂😂😂😂
I would be thankful if you could tell me what maths are required for SVM, along with any reference to learn them. I have searched the internet for a looooooong time and every single source just says "the basic math". I'm too frustrated with the internet rn.
The most informative and easily understandable video... Thanks a lot ❤❤❤
Hey, that slack variable you introduced in the objective function basically gives us control over the margin from the decision boundary, right (mathematically speaking, from what you wrote)? Because in practical cases data points are definitely not always linearly separable due to the presence of outliers, so whenever our model misclassifies, the slack term makes sure that the margin from the decision boundary is increased. Hence we take a step somewhere between underfitting (no slack variable) and overfitting (slack variable present along with the penalty term). Am I right in my intuition? Please reply, dada...
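A hedged sketch of that trade-off on toy data (the points, the outlier, and the two C values are my assumptions, not from the video): small C tolerates slack and keeps a wide margin despite the outlier, while large C shrinks the margin trying to drive the slack down.

```python
# Effect of C on the soft-margin SVM: small C = wide, outlier-tolerant
# margin; large C = narrow margin that fights the outlier via slack.
import numpy as np
from sklearn.svm import SVC

X = np.array([[1, 1], [2, 1], [1, 2],      # class -1
              [4, 4], [5, 4], [4, 5],      # class +1
              [1.5, 1.5]])                 # outlier labeled +1
y = np.array([-1, -1, -1, 1, 1, 1, 1])

for C in (0.1, 1000.0):
    clf = SVC(kernel="linear", C=C).fit(X, y)
    w = clf.coef_[0]
    print(f"C={C}: margin = {2 / np.linalg.norm(w):.2f}, "
          f"support vectors = {len(clf.support_vectors_)}")
```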
At 6:09, shouldn't that be $\xi > 2$, so that $1 - \xi < -1$?
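Assuming the usual soft-margin constraint (I can't see the slide at 6:09, so this is the standard form), the arithmetic checks out:

```latex
y_n \bigl( w^\top x_n + b \bigr) \ge 1 - \xi_n ,
\qquad
1 - \xi_n < -1 \;\Longleftrightarrow\; \xi_n > 2 ,
```

so a point strictly past the opposite margin ($y_n(w^\top x_n + b) < -1$) does indeed require $\xi_n > 2$.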