September 15, 2020 - 4:00 pm
Hosted via Zoom
Prof. George Em Karniadakis, Brown University
Abstract: It is widely known that neural networks (NNs) are universal approximators of continuous functions; however, a lesser-known but powerful result is that an NN with a single hidden layer can accurately approximate any nonlinear continuous operator. This universal approximation theorem of...
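To make the statement concrete, the sketch below fits a small branch/trunk network to the antiderivative operator G(u)(y) = ∫₀ʸ u(t) dt. It is a minimal illustration only, assuming PyTorch; the architecture, synthetic data, and hyperparameters are illustrative choices, not the speaker's method.

```python
# Minimal sketch (not the speaker's implementation): a toy operator network with a
# branch net (encoding the input function u at fixed sensor points) and a trunk net
# (encoding the query location y), trained on the antiderivative operator.
import math
import torch
import torch.nn as nn

m = 50                                    # sensor points where the input function u is sampled
sensors = torch.linspace(0, 1, m)

class ToyOperatorNet(nn.Module):
    def __init__(self, width=64):
        super().__init__()
        # branch net: encodes u(sensors); trunk net: encodes the query location y
        self.branch = nn.Sequential(nn.Linear(m, width), nn.Tanh(), nn.Linear(width, width))
        self.trunk = nn.Sequential(nn.Linear(1, width), nn.Tanh(), nn.Linear(width, width))

    def forward(self, u_vals, y):
        # G(u)(y) is approximated by the inner product of the two encodings
        return (self.branch(u_vals) * self.trunk(y)).sum(-1, keepdim=True)

def sample_batch(n=256):
    # random inputs u(t) = a*sin(2*pi*b*t); their antiderivatives are known in closed form
    a = torch.rand(n, 1) * 2 - 1
    b = torch.randint(1, 4, (n, 1)).float()
    u_vals = a * torch.sin(2 * math.pi * b * sensors)
    y = torch.rand(n, 1)
    target = a * (1 - torch.cos(2 * math.pi * b * y)) / (2 * math.pi * b)
    return u_vals, y, target

net = ToyOperatorNet()
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for step in range(2000):
    u_vals, y, target = sample_batch()
    loss = ((net(u_vals, y) - target) ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()
```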
June 23, 2020 - 2:00 pm
Zoom
Marco Baroni, Facebook AI Research (Paris) and Catalan Institute for Research and Advanced Studies (Barcelona...
Title:
Is compositionality over-rated? A view from emergent neural network language analysis
Abstract:
Compositionality is the property whereby linguistic expressions that denote new composite meanings are derived by a rule-based combination of expressions denoting their parts. Linguists agree that...
April 21, 2020 - 4:00 pm
Luca Carlone
Abstract:
Spatial perception has witnessed unprecedented progress in the last decade. Robots are now able to detect objects and create large-scale maps of an unknown environment, which are crucial capabilities for navigation and manipulation. Despite these advances, both researchers and...
March 31, 2020 - 1:00 pm
Zoom Webinar - Registration Required
Profs. Amnon Shashua and Shai Shalev-Shwartz, The Hebrew University of Jerusalem, Israel
Registration is required, please see details below.
Abstract: We present an analysis of a risk-based selective quarantine model in which the population is divided into low- and high-risk groups. The high-risk group is quarantined until the low-risk group achieves herd immunity. We tackle the question...
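As background for the kind of quantity involved, the sketch below runs a textbook single-group SIR simulation (not the speakers' model) for the low-risk group alone, estimating how long it takes that group to reach the standard herd-immunity threshold 1 - 1/R0. Every parameter value is an assumption chosen for illustration.

```python
# Hedged sketch: plain SIR dynamics for the low-risk group, with the high-risk group
# assumed fully quarantined and therefore omitted. Not the speakers' analysis.
R0, gamma = 2.5, 1 / 10           # basic reproduction number, recovery rate (1/days) -- assumed
beta = R0 * gamma
S, I, R = 0.999, 0.001, 0.0       # fractions of the low-risk population
herd_threshold = 1 - 1 / R0       # fraction that must be immune

day = 0
while R < herd_threshold and day < 2000:
    new_inf = beta * S * I        # daily new infections (Euler step, dt = 1 day)
    new_rec = gamma * I           # daily recoveries
    S, I, R = S - new_inf, I + new_inf - new_rec, R + new_rec
    day += 1

print(f"herd-immunity fraction: {herd_threshold:.2f}, reached after ~{day} days")
```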
March 16, 2020 - 4:00 pm
Singleton Auditorium
Lior Wolf, Tel Aviv University and Facebook AI Research.
Please note that this talk has been canceled.
We will reschedule it at the earliest opportunity.
 
Abstract: Hypernetworks, also known as dynamic networks, are neural networks in which the weights of at least some of the layers vary dynamically based on the input. Such networks have...
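For readers unfamiliar with the idea, the sketch below shows a minimal hypernetwork layer in which one network emits the weights of another as a function of a conditioning input. It assumes PyTorch, and all shapes and architectural choices are illustrative, not the speaker's design.

```python
# Minimal hypernetwork sketch (illustrative assumptions throughout): a linear layer
# whose weight matrix and bias are produced, per example, by a small hypernetwork.
import torch
import torch.nn as nn

class HyperLinear(nn.Module):
    def __init__(self, ctx_dim, in_dim, out_dim):
        super().__init__()
        self.in_dim, self.out_dim = in_dim, out_dim
        # the hypernetwork maps the conditioning input to a flattened weight + bias
        self.hyper = nn.Sequential(
            nn.Linear(ctx_dim, 64), nn.ReLU(),
            nn.Linear(64, in_dim * out_dim + out_dim),
        )

    def forward(self, x, ctx):
        params = self.hyper(ctx)                                      # (batch, in*out + out)
        W = params[:, : self.in_dim * self.out_dim].view(-1, self.out_dim, self.in_dim)
        b = params[:, self.in_dim * self.out_dim :]
        # batched y = W x + b, with a different weight matrix for each example
        return torch.bmm(W, x.unsqueeze(-1)).squeeze(-1) + b

layer = HyperLinear(ctx_dim=8, in_dim=16, out_dim=4)
x = torch.randn(32, 16)       # primary input
ctx = torch.randn(32, 8)      # conditioning input that determines the weights
print(layer(x, ctx).shape)    # torch.Size([32, 4])
```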
February 25, 2020 - 4:00 pm
Singleton Auditorium
Michael Douglas, Stony Brook
Title: How will we do mathematics in 2030?
Abstract:
We make the case that over the coming decade, computer-assisted reasoning will become far more widely used in the mathematical sciences. This includes interactive and automatic theorem verification, symbolic algebra, and emerging technologies...
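As a small taste of the kind of machine-checked symbolic work the abstract alludes to, the snippet below evaluates and verifies a classical identity with the SymPy library. The choice of tool and example is an assumption for illustration; the talk does not prescribe any particular system.

```python
# Illustrative sketch only: a computer verifies a symbolic identity,
# here the Gaussian integral, using SymPy (an assumed tool choice).
import sympy as sp

x = sp.symbols('x')
expr = sp.integrate(sp.exp(-x**2), (x, -sp.oo, sp.oo))
print(expr)                                       # sqrt(pi)
assert sp.simplify(expr - sp.sqrt(sp.pi)) == 0    # a machine-checked identity
```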
February 4, 2020 - 4:00 pm
Singleton Auditorium
Leslie Pack Kaelbling, CSAIL
Abstract: We, as robot engineers, have to think hard about our role in the design of robots and how it interacts with learning, both in "the factory" (that is, at engineering time) and in "the wild" (that is, when the robot is delivered to a customer). I will share some general thoughts about the...
November 5, 2019 - 4:00 pm
Singleton Auditorium
Thomas Serre, Cognitive, Linguistic & Psychological Sciences Department, Carney Institute for Brain...
Title: Feedforward and feedback processes in visual recognition
Abstract: Progress in deep learning has spawned great successes in many engineering applications. As a prime example, convolutional neural networks, a type of feedforward neural network, are now approaching – and sometimes even...
October 29, 2019 - 4:00 pm
Star Seminar Room (Stata D463)
Thomas Icard, Stanford
Abstract: How might we assess the expressive capacity of different classes of probabilistic generative models? The subject of this talk is an approach that appeals to machines of increasing strength (finite-state, recursive, etc.), or, equivalently, to probabilistic grammars of increasing complexity...
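To make the phrase "probabilistic grammars" concrete, the sketch below samples strings from a tiny probabilistic context-free grammar. The grammar, its probabilities, and the vocabulary are illustrative assumptions, not material from the talk.

```python
# Minimal PCFG sampler (illustrative only): each nonterminal expands according to a
# probability distribution over right-hand sides; terminals are returned as-is.
import random

PCFG = {
    "S":  [(0.7, ["NP", "VP"]), (0.3, ["VP"])],
    "NP": [(0.6, ["the", "N"]), (0.4, ["N"])],
    "VP": [(0.5, ["V", "NP"]), (0.5, ["V"])],
    "N":  [(0.5, ["robot"]), (0.5, ["model"])],
    "V":  [(0.5, ["learns"]), (0.5, ["samples"])],
}

def sample(symbol="S"):
    if symbol not in PCFG:                       # terminal symbol
        return [symbol]
    r, acc = random.random(), 0.0
    for p, rhs in PCFG[symbol]:
        acc += p
        if r <= acc:
            return [w for s in rhs for w in sample(s)]
    return []

random.seed(0)
print(" ".join(sample()))                        # e.g. "the robot learns"
```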
October 28, 2019 - 4:00 pm
Singleton Auditorium
Mikhail Belkin, Professor, The Ohio State University - Department of Computer Science and Engineering,...
Title: Beyond Empirical Risk Minimization: the lessons of deep learning
Abstract: "A model with zero training error is  overfit to the training data and  will typically generalize poorly"  goes statistical textbook wisdom.  Yet, in modern practice, over-parametrized deep networks with   near ...
October 2, 2019 - 11:00 am
Singleton Auditorium
Jack Hidary, Alphabet X, formerly Google X
Abstract: Jack Hidary will take us through the nascent but promising field of quantum computing and his new book, Quantum Computing: An Applied Approach.
Bio: Jack D. Hidary is a research scientist in quantum computing and in AI at Alphabet X, formerly Google X. He and his group develop and...
September 17, 2019 - 4:00 pm
MIT Building 46-3002 (Singleton Auditorium)
Maia Fraser, Assistant Professor, University of Ottawa
Abstract: Hierarchical learning is found widely in biological organisms. There are several compelling arguments for advantages of this structure. Modularity (reusable components) and function approximation are two where theoretical support is readily available. Other, more statistical, arguments...
April 26, 2019 - 4:00 pm
Singleton Auditorium (46-3002)
Blake Richards, Assistant Professor, Associate Fellow of the Canadian Institute for Advanced Research (CIFAR)
Abstract: 
Theoretical and empirical results in the neural networks literature demonstrate that effective learning at a real-world scale requires changes to synaptic weights that approximate the gradient of a global loss function. For neuroscientists, this means that the brain must have mechanisms...
April 2, 2019 - 4:00 pm
MIT Building 46-3002 (Singleton Auditorium)
Dr. Jon Bloom, Broad Institute
Abstract: When trained to minimize reconstruction error, a linear autoencoder (LAE) learns the subspace spanned by the top principal directions but cannot learn the principal directions themselves. In this talk, I'll explain how this observation became the focus of a project on representation...
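The observation in the abstract can be checked numerically. The sketch below trains a linear autoencoder on synthetic anisotropic data and compares the subspace spanned by its decoder columns with the top principal subspace from an SVD: the subspaces coincide, while the individual columns generally do not align with principal directions. Data, sizes, and training details are illustrative assumptions, not the speaker's experiment.

```python
# Hedged sketch: an LAE trained with plain reconstruction error recovers the top
# principal subspace, but its individual columns are an arbitrary mixing of the
# principal directions rather than the directions themselves.
import numpy as np
import torch
import torch.nn as nn

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 10)) @ np.diag([5, 4, 3, 1, 1, 1, 1, 1, 1, 1])  # anisotropic data
X -= X.mean(0)

k = 3
enc = nn.Linear(10, k, bias=False)
dec = nn.Linear(k, 10, bias=False)
opt = torch.optim.Adam(list(enc.parameters()) + list(dec.parameters()), lr=1e-2)
Xt = torch.tensor(X, dtype=torch.float32)
for _ in range(3000):
    loss = ((dec(enc(Xt)) - Xt) ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()

# Top-k principal directions from the SVD of the data
U, S, Vt = np.linalg.svd(X, full_matrices=False)
P_pca = Vt[:k].T @ Vt[:k]                          # projector onto the PCA subspace
D = dec.weight.detach().numpy()                    # (10, k) decoder columns
Q, _ = np.linalg.qr(D)
P_lae = Q @ Q.T                                    # projector onto the LAE decoder subspace
print("subspace gap:", np.linalg.norm(P_pca - P_lae))            # ~0 once training converges
D_norm = D / np.linalg.norm(D, axis=0)
print("column/direction cosines:\n", np.abs(D_norm.T @ Vt[:k].T))  # generally not the identity
```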
March 22, 2019 - 4:00 pm
Julio Martinez-Trujillo
Abstract: The brain’s memory systems are like time machines for thought: they transport sensory experiences from the past to the present, to guide our current decisions and actions. Memories have been classified into long-term, stored for time intervals of days, months, or years, and short-term,...