Lecture 2 - ML Refresher / Softmax Regression

Lecture 2 of the online course Deep Learning Systems: Algorithms and Implementation. This lecture gives a refresher on the basic principles of (supervised) machine learning, as exemplified by the softmax regression algorithm. We go through the derivation of the softmax regression method and of stochastic gradient descent as applied to training this class of model. Sign up for the course for free at .

Contents:
00:00 - Introduction
01:08 - Machine learning and data-driven programming
05:34 - Three ingredients of a machine learning algorithm
08:40 - Multi-class classification setting
12:04 - Linear hypothesis function
16:52 - Matrix batch notation
22:34 - Loss function #1: classification error
26:44 - Loss function #2: softmax / cross-entropy loss
35:28 - The softmax regression optimization problem
39:16 - Optimization: gradient descent
50:35 - Stochastic gradient descent
55:26 - The gradient of the softmax objective
1:08:16 - The slide I'm embarrassed to include...
1:16:49 - Putting it all together
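
As a companion to the topics listed above, here is a minimal sketch of softmax regression trained with minibatch stochastic gradient descent. It assumes NumPy and a synthetic dataset; the function names, hyperparameters, and data are illustrative only and are not taken from the lecture.

import numpy as np

def softmax_loss_and_grad(Theta, X, y):
    """Average cross-entropy loss and its gradient for the linear hypothesis
    h(x) = Theta^T x, over a batch X (m x n) with integer labels y (m,)."""
    logits = X @ Theta                            # m x k matrix of class scores
    logits -= logits.max(axis=1, keepdims=True)   # stabilize the exponentials
    Z = np.exp(logits)
    Z /= Z.sum(axis=1, keepdims=True)             # softmax probabilities, m x k
    m = X.shape[0]
    loss = -np.log(Z[np.arange(m), y]).mean()
    I_y = np.zeros_like(Z)
    I_y[np.arange(m), y] = 1.0                    # one-hot encoding of the labels
    grad = X.T @ (Z - I_y) / m                    # gradient of the average loss w.r.t. Theta
    return loss, grad

def sgd_softmax_regression(X, y, num_classes, lr=0.1, batch_size=32, epochs=10, seed=0):
    """Train softmax regression with plain minibatch SGD."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    Theta = np.zeros((n, num_classes))
    for _ in range(epochs):
        perm = rng.permutation(m)
        for start in range(0, m, batch_size):
            idx = perm[start:start + batch_size]
            _, grad = softmax_loss_and_grad(Theta, X[idx], y[idx])
            Theta -= lr * grad                    # gradient descent step on the minibatch
    return Theta

if __name__ == "__main__":
    # Synthetic 3-class problem purely for illustration.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(300, 5))
    true_Theta = rng.normal(size=(5, 3))
    y = (X @ true_Theta).argmax(axis=1)
    Theta = sgd_softmax_regression(X, y, num_classes=3)
    acc = ((X @ Theta).argmax(axis=1) == y).mean()
    print(f"training accuracy: {acc:.3f}")

The gradient expression X.T @ (Z - I_y) / m mirrors the batch-matrix form derived in the lecture, where Z holds the softmax probabilities and I_y the one-hot labels.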
