Beyond neural scaling laws | Paper Explained | NVIDIA GTC giveaway!

"Beyond neural scaling laws: beating power law scaling via data pruning" paper explained with animations. You do not need to train your neural network on the entire dataset!

Sponsor: NVIDIA ❗ use this link to register for the GTC 👉
Google Form to enter DLI credits giveaway:
📺 PaLM model explained:
Check out our daily #MachineLearning Quiz Questions:

Paper 📜: Sorscher, Ben, Robert Geirhos, Shashank Shekhar, Surya Ganguli, and Ari S. Morcos. "Beyond neural scaling laws: beating power law scaling via data pruning." arXiv preprint arXiv: (2022).

Outline:
00:00 Stable Diffusion is a Latent Diffusion Model
01:43 NVIDIA (sponsor): Register for the GTC!
03:00 What are neural scaling laws? Power laws explained.
05:15 Exponential scaling in theory
07:40 What the theory predicts
09:50 Unsupervised data pruning with foundation models

Thanks to our Patrons who support us in Tier 2, 3, 4: 🙏 Don Rosenthal, Dres. Trost GbR, Julián Salazar, Edvard Grødem, Vignesh Valliappan, Mutual Information, Mike Ton

▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
🔥 Optionally, pay us a coffee to help with our Coffee Bean production! ☕
Patreon:
Ko-fi:
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀

🔗 Links:
AICoffeeBreakQuiz:
Twitter:
Reddit:
YouTube:

#AICoffeeBreak #MsCoffeeBean #MachineLearning #AI #research
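The "unsupervised data pruning with foundation models" idea from the video can be sketched in a few lines: embed the dataset with a pretrained model, cluster the embeddings with k-means, score each example by its distance to the nearest centroid, and keep only the "hardest" (most distant) examples. This is an illustrative sketch of that metric, not the paper's exact implementation; the function names and the toy 2-D "embeddings" are assumptions for the demo.

```python
import numpy as np

def ssl_prune_scores(embeddings, k=3, iters=10, seed=0):
    """Score each example by distance to its nearest k-means centroid.
    Larger distance = harder / more informative example (a sketch of the
    self-supervised pruning metric described in the video's paper)."""
    rng = np.random.default_rng(seed)
    # Initialize centroids from random examples, then run plain Lloyd k-means.
    centroids = embeddings[rng.choice(len(embeddings), k, replace=False)].astype(float)
    for _ in range(iters):
        dists = np.linalg.norm(embeddings[:, None] - centroids[None], axis=-1)
        assign = dists.argmin(axis=1)
        for j in range(k):
            members = embeddings[assign == j]
            if len(members):
                centroids[j] = members.mean(axis=0)
    dists = np.linalg.norm(embeddings[:, None] - centroids[None], axis=-1)
    return dists.min(axis=1)  # distance to nearest prototype

def prune(embeddings, keep_frac=0.5, k=3):
    """Return indices of the hardest keep_frac of examples (largest scores)."""
    scores = ssl_prune_scores(embeddings, k=k)
    n_keep = int(len(embeddings) * keep_frac)
    return np.argsort(scores)[::-1][:n_keep]
```

Usage: with `X` a matrix of foundation-model embeddings (one row per example), `prune(X, keep_frac=0.3)` yields the 30% of examples to keep, so training runs on a fraction of the data while discarding mostly "easy", prototypical points near cluster centers.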
