At a glance, you can tell whether a stack of dishes is about to topple, whether a tipping glass of wine will splatter on your clothes, or whether the plate you just dropped is likely to shatter. Given the complexity of the underlying physics in these scenarios, and the fact that we cannot directly perceive the relevant parameters (mass, density, position, etc.), how do we do this? Humans appear to compute a coarse approximation of the physics in real time, under perceptual uncertainty. Inspired by the graphics literature, we are developing "particle-based simulation" as a general framework for human physical reasoning about the mechanics of objects and materials in our environments. This framework seeks to explain not only the underpinnings of physical reasoning in adults, but also its developmental origins. The current project develops probabilistic simulation models to explain people's predictions about how a liquid will flow through solid obstacles.
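To make the idea concrete, a probabilistic particle simulation of this kind might be sketched as follows. This is a toy illustration only, not the project's actual model: the function name `simulate_pour`, the obstacle geometry, and all numeric parameters are hypothetical. The key ingredients it shares with the framework are (a) representing the liquid as many discrete particles, (b) injecting noise to model perceptual uncertainty about the initial conditions, and (c) reading off a probabilistic prediction (here, the fraction of liquid ending up on one side of an obstacle) from the ensemble of simulated particles.

```python
import random


def simulate_pour(n_particles=200, n_steps=400, dt=0.01,
                  source_x=0.5, noise=0.05, seed=0):
    """Toy probabilistic particle simulation of liquid poured from above
    a slanted obstacle. Each particle falls under gravity; if it hits
    the obstacle it is deflected along the slope. Returns the fraction
    of particles that land to the right of the obstacle's lower edge.
    All geometry and constants are illustrative, not fit to data.
    """
    rng = random.Random(seed)
    g = 9.8          # gravitational acceleration (m/s^2)
    right = 0
    for _ in range(n_particles):
        # Perceptual uncertainty: noisy estimate of the pour location.
        x = source_x + rng.gauss(0.0, noise)
        y, vx, vy = 1.0, 0.0, 0.0
        for _ in range(n_steps):
            vy -= g * dt          # gravity
            x += vx * dt          # Euler integration step
            y += vy * dt
            # Obstacle: slanted plane y = 0.6 - 0.4*x for 0.2 <= x <= 0.8.
            # Particles that penetrate it are pushed back to the surface
            # and deflected down the slope (a crude contact response).
            if 0.2 <= x <= 0.8 and y <= 0.6 - 0.4 * x:
                y = 0.6 - 0.4 * x
                vx, vy = 0.5, -0.2
            if y <= 0.0:          # particle has reached the floor
                break
        if x > 0.8:
            right += 1
    return right / n_particles
```

Running the simulation many times with different noise levels yields a distribution over outcomes rather than a single trajectory; with small perceptual noise nearly all particles are deflected to the right of the obstacle, while larger noise spreads the predicted landing positions, which is the sense in which the simulation is probabilistic.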