We show that deep networks are better than shallow networks at approximating functions that can be expressed as a composition of functions described by a directed acyclic graph, because the deep networks can be designed to have the same compositional structure, while a shallow network cannot exploit this knowledge. Thus, the blessing of compositionality mitigates the curse of dimensionality. On the other hand, a theorem called "good propagation of errors" allows one to "lift" theorems about shallow networks to theorems about deep networks, with an appropriate choice of norms, smoothness, etc. We illustrate this in three contexts, in which each channel in the deep network calculates a spherical polynomial, a non-smooth ReLU network, or another zonal function network closely related to the ReLU network.

%8 05/2019
%1 https://arxiv.org/pdf/1905.12882.pdf

%2https://hdl.handle.net/1721.1/121183

%0 Journal Article
%J Analysis and Applications
%D 2016
%T Deep vs. shallow networks: An approximation theory perspective
%A Mhaskar, H. N.
%A Tomaso Poggio
%K blessed representation
%K deep and shallow networks
%K Gaussian networks
%K ReLU networks
%X The paper briefly reviews several recent results on hierarchical architectures for learning from examples that may formally explain the conditions under which Deep Convolutional Neural Networks perform much better in function approximation problems than shallow, one-hidden-layer architectures. The paper announces new results for a non-smooth activation function — the ReLU function — used in present-day neural networks, as well as for Gaussian networks. We propose a new definition of *relative dimension* to encapsulate different notions of sparsity of a function class that can possibly be exploited by deep networks, but not by shallow ones, to drastically reduce the complexity required for approximation and learning.

%B Analysis and Applications
%V 14
%P 829 - 848
%8 01/2016
%G eng
%U http://www.worldscientific.com/doi/abs/10.1142/S0219530516400042
%N 06
%! Anal. Appl.
%R 10.1142/S0219530516400042