Backpropagation Explained


The most popular optimization strategy in machine learning is called gradient descent. When gradient descent is applied to neural networks, it's called back-propagation. In this video, I'll use analogies, animations, equations, and code to give you an in-depth understanding of this technique. It uses calculus to help us update our machine learning models. Once you feel comfortable with back-propagation, everything else becomes easier. Enjoy! Code for this video:
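To make the idea concrete, here is a minimal sketch of gradient descent driving back-propagation through a tiny two-layer network, written with NumPy. This is not the code linked from the video; the network size, learning rate, and XOR toy dataset are assumptions chosen for brevity.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy dataset (assumed for illustration): XOR inputs and targets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 4))   # input -> hidden weights
b1 = np.zeros((1, 4))
W2 = rng.normal(size=(4, 1))   # hidden -> output weights
b2 = np.zeros((1, 1))
lr = 0.5                       # learning rate (gradient descent step size)

for step in range(10000):
    # Forward pass: compute the prediction layer by layer.
    h = sigmoid(X @ W1 + b1)
    y_hat = sigmoid(h @ W2 + b2)

    # Mean squared error loss.
    loss = np.mean((y_hat - y) ** 2)

    # Backward pass: apply the chain rule from the loss back to each weight.
    d_yhat = 2 * (y_hat - y) / len(X)      # dL/d(y_hat)
    d_z2 = d_yhat * y_hat * (1 - y_hat)    # through the output sigmoid
    dW2 = h.T @ d_z2
    db2 = d_z2.sum(axis=0, keepdims=True)

    d_h = d_z2 @ W2.T                      # propagate error to the hidden layer
    d_z1 = d_h * h * (1 - h)               # through the hidden sigmoid
    dW1 = X.T @ d_z1
    db1 = d_z1.sum(axis=0, keepdims=True)

    # Gradient descent update: step each parameter against its gradient.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print("final loss:", loss)
print("predictions:", y_hat.round(3).ravel())
```

The forward pass builds the prediction, the backward pass walks the chain rule from the loss to every weight, and the final lines take one gradient descent step per parameter, which is the whole mechanism the video unpacks.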
