01 Mini-Batch Gradient Descent
02 Understanding Mini-Batch Gradient Descent
03 Exponentially Weighted Averages
04 Understanding Exponentially Weighted Averages
05 Bias Correction of Exponentially Weighted Averages
06 Gradient Descent with Momentum
07 RMSProp
08 Adam Optimization Algorithm
09 Learning Rate Decay