Theoretical Deep Learning #2: Worst-case bounds. Part 4

We conclude by bounding the test-train risk difference of a deep fully-connected network with a 1-Lipschitz non-linearity. The resulting bound grows with the Lipschitz constant of the learned network and with the number of layers. Find all relevant info on the GitHub page:
Our open-source framework for developing and deploying conversational assistants:
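
As a rough illustration of the kind of statement described above (a minimal sketch of the generic Lipschitz-based form; the exact depth dependence, constants, and log factors in the lecture may differ), such worst-case bounds typically look like
\[
R(f) - \hat{R}_m(f) \;\lesssim\; \frac{\mathrm{Lip}(f)\,\cdot\,\mathrm{poly}(L)}{\sqrt{m}},
\qquad
\mathrm{Lip}(f) \;\le\; \prod_{i=1}^{L} \|W_i\|_2,
\]
where $f$ is an $L$-layer fully-connected network with 1-Lipschitz non-linearities, $W_i$ are its weight matrices, $m$ is the number of training samples, and $\|\cdot\|_2$ is the spectral norm. The product of spectral norms upper-bounds the Lipschitz constant of the learned network, so the gap grows both with that constant and with the number of layers.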
