Continual Learning in Neural Networks: on Catastrophic Forgetting and Beyond (in Russian)

Slides:

Speaker: Polina Kirichenko, New York University

Continually learning new tasks on a constantly changing data distribution, without forgetting previously learned ones, is essential for real-world problems but remains challenging for modern deep learning. Deep learning models suffer from catastrophic forgetting: when presented with a sequence of tasks, a deep neural network can successfully learn each new task, but its performance on the old tasks degrades. In this talk, I will present an overview of continual learning algorithms, including well-established methods as well as recent state-of-the-art approaches. We will cover several continual learning scenarios (task-, class-, and domain-incremental learning), review the most common approaches to alleviating forgetting, and discuss other challenges in the field beyond catastrophic forgetting, including forward & backward transfer, learning on continuously drifting data, and continua…
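The failure mode the abstract describes is easy to reproduce. Below is a minimal sketch (an illustration, not code from the talk): it assumes PyTorch and invents two synthetic 2-D classification tasks whose decision rules conflict. Training sequentially on task B typically erases performance on task A, while mixing a small replay buffer of task-A examples into task-B training, one common mitigation, largely preserves it. All names here (`make_task`, `train`, the blob shifts, the buffer size of 20) are hypothetical choices for the demo.

```python
# Minimal sketch of catastrophic forgetting (illustration only, not from the talk).
# Assumes PyTorch. Two synthetic 2-D tasks are constructed so their decision
# rules conflict: task B flips the label assignment used by task A.
import torch
import torch.nn as nn

torch.manual_seed(0)

def make_task(x_shift, flip_labels):
    """Two Gaussian blobs separated along the second coordinate."""
    lo = torch.randn(200, 2) + torch.tensor([x_shift, -2.0])
    hi = torch.randn(200, 2) + torch.tensor([x_shift, +2.0])
    x = torch.cat([lo, hi])
    y = torch.cat([torch.zeros(200), torch.ones(200)]).long()
    return x, (1 - y if flip_labels else y)

def accuracy(model, x, y):
    with torch.no_grad():
        return (model(x).argmax(dim=1) == y).float().mean().item()

def train(model, x, y, steps=500, replay=None):
    """Train on one task; optionally mix a replay buffer into every step."""
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(steps):
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        if replay is not None:  # experience replay on stored old-task data
            loss = loss + loss_fn(model(replay[0]), replay[1])
        loss.backward()
        opt.step()

model = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 2))

xa, ya = make_task(-4.0, flip_labels=False)  # task A
xb, yb = make_task(+4.0, flip_labels=True)   # task B, conflicting rule

train(model, xa, ya)
print(f"after task A, acc on A: {accuracy(model, xa, ya):.2f}")  # high

train(model, xb, yb)  # sequential training, no access to task A data
print(f"after task B, acc on A: {accuracy(model, xa, ya):.2f}")  # typically collapses
print(f"after task B, acc on B: {accuracy(model, xb, yb):.2f}")  # high

# Mitigation: retrain on task B while replaying a tiny buffer of task-A data.
model = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 2))
train(model, xa, ya)
idx = torch.randperm(len(xa))[:20]  # small memory buffer sampled from task A
train(model, xb, yb, replay=(xa[idx], ya[idx]))
print(f"with replay, acc on A: {accuracy(model, xa, ya):.2f}")  # mostly preserved
print(f"with replay, acc on B: {accuracy(model, xb, yb):.2f}")
```

Replay is only one of the common families of mitigations; regularization-based methods such as elastic weight consolidation and architecture-based parameter isolation are other well-established directions in the field.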
