We'll build a recurrent neural network (RNN) in NumPy. RNNs can process sequences of data, like sentences. We'll start with the theory of RNNs, then build the forward and backward pass in NumPy.
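To preview the core idea, here is a minimal sketch of a single RNN forward step in NumPy. The weight names (`W_xh`, `W_hh`, `b_h`) and sizes are illustrative assumptions, not necessarily the naming used in the video:

```python
import numpy as np

# One RNN step: combine the current input with the previous hidden state,
# then squash the result into (-1, 1) with tanh.
def rnn_step(x, h_prev, W_xh, W_hh, b_h):
    return np.tanh(x @ W_xh + h_prev @ W_hh + b_h)

rng = np.random.default_rng(0)
input_size, hidden_size = 3, 4
W_xh = rng.normal(scale=0.1, size=(input_size, hidden_size))  # input-to-hidden weights
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))  # hidden-to-hidden weights
b_h = np.zeros(hidden_size)

# Process a sequence of 5 inputs, carrying the hidden state forward.
h = np.zeros(hidden_size)
for x in rng.normal(size=(5, input_size)):
    h = rnn_step(x, h, W_xh, W_hh, b_h)
print(h.shape)  # (4,)
```

Because the hidden state `h` is fed back in at every step, the final state depends on the whole sequence, which is what lets an RNN model ordered data.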
You can find a text version of this video here - github.com/VikParuchuri/zero_to_gpt/blob/master/ex… .
And all of the previous lessons here - github.com/VikParuchuri/zero_to_gpt .
Chapters
0:00 RNN overview
6:32 Step by step forward pass
15:10 tanh activation function
19:23 Full forward pass
22:59 Full backward pass
39:43 Complete implementation
This video is part of our new course, Zero to GPT - a guide to building your own GPT model from scratch. By taking this course, you'll learn deep learning skills from the ground up. Even if you're a complete beginner, you can start with the prerequisite courses we offer at Dataquest.
If you're dreaming of building deep learning models, this course is for you.
Best of all, you can access the course for free while it's still in beta!
Sign up today!
bit.ly/4016NfK