
Backpropagation Details Pt. 1: Optimizing 3 parameters simultaneously.

The main ideas behind Backpropagation are super simple, but there are tons of details when it comes time to implement it. This video shows how to optimize three parameters in a Neural Network simultaneously and introduces some Fancy Notation.
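The idea of optimizing three parameters simultaneously can be sketched in a few lines of Python. This is a minimal illustration, not the video's exact setup: the toy data, the fixed hidden-layer values, and the starting points for the three parameters are all assumptions made up for the example.

```python
import math

# Toy training data (hypothetical): inputs and observed outputs.
inputs = [0.0, 0.5, 1.0]
observed = [0.0, 1.0, 0.0]

# Hidden-layer weights and biases, held fixed here (hypothetical values).
w1, b1 = 3.34, -1.43
w2, b2 = -3.53, 0.57

def softplus(x):
    return math.log(1.0 + math.exp(x))

def hidden(x):
    """Activations of the two hidden nodes."""
    return softplus(w1 * x + b1), softplus(w2 * x + b2)

# The three parameters optimized simultaneously: the two final
# weights and the final bias (illustrative starting values).
w3, w4, b3 = 0.36, 0.63, 0.0
lr = 0.05  # learning rate

for step in range(2000):
    dw3 = dw4 = db3 = 0.0
    for x, obs in zip(inputs, observed):
        y1, y2 = hidden(x)
        pred = w3 * y1 + w4 * y2 + b3
        # Chain rule: d(SSR)/d(w3) = sum of -2 * (obs - pred) * y1, etc.
        dw3 += -2.0 * (obs - pred) * y1
        dw4 += -2.0 * (obs - pred) * y2
        db3 += -2.0 * (obs - pred)
    # Gradient Descent updates all three parameters at once, using
    # derivatives evaluated at the *current* parameter values.
    w3 -= lr * dw3
    w4 -= lr * dw4
    b3 -= lr * db3

# Sum of squared residuals after training.
ssr = sum((obs - (w3 * h[0] + w4 * h[1] + b3)) ** 2
          for x, obs in zip(inputs, observed)
          for h in [hidden(x)])
print(round(ssr, 4))
```

The key detail: each derivative is computed with all parameters at their current values, and only then are all three updated together.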

NOTE: This StatQuest assumes that you already know the main ideas behind Backpropagation:    • Neural Networks Pt. 2: Backpropagatio...  
...and that also means you should be familiar with...
Neural Networks:    • The Essential Main Ideas of Neural Ne...  
The Chain Rule:    • The Chain Rule  
Gradient Descent:    • Gradient Descent, Step-by-Step  

LAST NOTE: When I was researching this 'Quest, I found this page by Sebastian Raschka to be helpful: https://sebastianraschka.com/faq/docs...

For a complete index of all the StatQuest videos, check out:
https://statquest.org/video-index/

If you'd like to support StatQuest, please consider...

Patreon:   / statquest  
...or...
YouTube Membership:    / @statquest  

...buying one of my books, a study guide, a t-shirt or hoodie, or a song from the StatQuest store...
https://statquest.org/statquest-store/

...or just donating to StatQuest!
https://www.paypal.me/statquest

Lastly, if you want to keep up with me as I research and create new StatQuests, follow me on Twitter:
  / joshuastarmer  

0:00 Awesome song and introduction
3:01 Derivatives do not change when we optimize multiple parameters
6:28 Fancy Notation
10:51 Derivatives with respect to two different weights
15:02 Gradient Descent for three parameters
17:19 Fancy Gradient Descent Animation
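The "Fancy Notation" and "Derivatives with respect to two different weights" chapters boil down to chain-rule expressions like the one below. This is a sketch with placeholder symbols (here y_1 stands for a hidden-node activation feeding the weight w_3), not necessarily the video's exact notation:

```latex
\frac{\partial \mathrm{SSR}}{\partial w_3}
  = \sum_{i=1}^{n}
      \frac{\partial \mathrm{SSR}}{\partial \mathrm{Predicted}_i}
      \times
      \frac{\partial \mathrm{Predicted}_i}{\partial w_3}
  = \sum_{i=1}^{n} -2\,(\mathrm{Observed}_i - \mathrm{Predicted}_i)\, y_{1,i}
```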

#StatQuest #NeuralNetworks #Backpropagation
