%0 Generic
%D 2015
%T What if...
%A Tomaso Poggio
%X

The background: DCLNs (Deep Convolutional Learning Networks) are doing very well

Over the last three years, and increasingly so in the last few months, I have seen supervised DCLNs, feedforward and recurrent, do more and more of everything quite well. They seem to learn good representations for a growing number of speech and text problems (for a review by the pioneers in the field, see LeCun, Bengio & Hinton, 2015). More interestingly, it is increasingly clear, as I will discuss later, that instead of being trained on millions of labeled examples, they can be trained in implicitly supervised ways. This breakthrough in machine learning triggers a few dreams. What if we now have the basic answer to how to develop brain-like intelligence and its basic building blocks?...

%8 06/2015