What if...

Title: What if...
Publication Type: Views & Reviews
Year of Publication: 2015
Authors: Poggio, TA
Date Published: 06/2015
Abstract

The background: DCLNs (Deep Convolutional Learning Networks) are doing very well

Over the last three years, and increasingly so in the last few months, I have seen supervised DCLNs — feedforward and recurrent — do more and more of everything quite well. They seem to learn good representations for a growing number of speech and text problems (for a review by the pioneers of the field, see LeCun, Bengio, and Hinton, 2015). More interestingly, it is increasingly clear, as I will discuss later, that instead of being trained on millions of labeled examples they can be trained in implicitly supervised ways. This breakthrough in machine learning triggers a few dreams. What if we now have the basic answer to how to develop brain-like intelligence and its basic building blocks?...
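As a minimal illustration of the feedforward architecture the abstract refers to, the sketch below runs one convolution-nonlinearity-pooling stage followed by a linear readout, using NumPy and hypothetical random weights (the layer sizes and kernel are illustrative assumptions, not taken from any model in the text):

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(x, k):
    """Valid 2-D cross-correlation of a single-channel image x with kernel k."""
    h, w = x.shape
    kh, kw = k.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

def relu(x):
    """Pointwise rectification, the standard DCLN nonlinearity."""
    return np.maximum(x, 0.0)

def max_pool(x, size=2):
    """Non-overlapping max pooling over size x size blocks."""
    h, w = x.shape
    h, w = h - h % size, w - w % size
    return x[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))

# One conv-relu-pool stage, then a linear classifier readout over 10 classes.
image = rng.standard_normal((8, 8))        # toy input "image"
kernel = rng.standard_normal((3, 3))       # one learned filter (here random)
features = max_pool(relu(conv2d(image, kernel)))   # 3 x 3 feature map
readout_w = rng.standard_normal((features.size, 10))
logits = features.ravel() @ readout_w      # 10 class scores
print(logits.shape)
```

In a supervised setting of the kind the abstract describes, the kernel and readout weights would be fit by gradient descent on labeled examples; here they are random placeholders so the forward pass alone is visible.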

