Presentation given by Matthew Colbrook on 11th May 2022 in the One World Seminar on the Mathematics of Machine Learning, on the topic "Smale's 18th Problem and the Barriers of Deep Learning".

Abstract: Deep learning (DL) has had unprecedented success and is now rapidly entering scientific computing (SC). However, DL suffers from a universal phenomenon: instability, despite universal approximation results that often guarantee the existence of stable and accurate neural networks (NNs). We show the following paradox. There are well-conditioned problems in SC where one can prove the existence of NNs with great approximation qualities; however, there does not exist any algorithm that can train such a NN. For any positive integers n (greater than 2) and M, there are cases where simultaneously: (a) no algorithm can train a NN correct to n digits, (b) there exists an algorithm that trains a NN with n-1 correct digits, but any such algorithm needs arbitrarily many training data, and (c) there exists an algorithm that trains a NN with n-2 correct digits using no more than M training data.
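As a reading aid, here is a minimal LaTeX sketch of the trichotomy (a)-(c) above. The notation is assumed for illustration and does not come from the talk itself: \Xi denotes the solution map of the computational problem, \iota an input instance, \Gamma a training algorithm, and "correct to k digits" is read as error below 10^{-k}.

% Hypothetical formalisation of (a)-(c); all symbols are assumptions,
% chosen only to make the parallel structure of the three claims explicit.
\begin{align*}
\text{(a)}\quad & \forall\, \Gamma \;\exists\, \iota :\;
  \|\Gamma(\iota) - \Xi(\iota)\| > 10^{-n}, \\
\text{(b)}\quad & \exists\, \Gamma \;\forall\, \iota :\;
  \|\Gamma(\iota) - \Xi(\iota)\| \le 10^{-(n-1)},
  \text{ but no uniform bound on the training data $\Gamma$ needs}, \\
\text{(c)}\quad & \exists\, \Gamma \;\forall\, \iota :\;
  \|\Gamma(\iota) - \Xi(\iota)\| \le 10^{-(n-2)},
  \text{ with $\Gamma$ reading at most $M$ training data.}
\end{align*}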