|Title|AI Feynman 2.0: Pareto-optimal symbolic regression exploiting graph modularity|
|Publication Type|Conference Paper|
|Year of Publication|2020|
|Authors|Udrescu, S-M, Tan, A, Feng, J, Neto, O, Wu, T, Tegmark, M|
|Conference Name|Advances in Neural Information Processing Systems 33 pre-proceedings (NeurIPS 2020)|
We present an improved method for symbolic regression that seeks to fit data to formulas that are Pareto-optimal, in the sense of having the best accuracy for a given complexity. It improves on the previous state-of-the-art by typically being orders of magnitude more robust toward noise and bad data, and also by discovering many formulas that stumped previous methods. We develop a method for discovering generalized symmetries (arbitrary modularity in the computational graph of a formula) from gradient properties of a neural network fit. We use normalizing flows to generalize our symbolic regression method to probability distributions from which we only have samples, and employ statistical hypothesis testing to accelerate robust brute-force search.
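To illustrate the Pareto-optimality criterion used above (best accuracy for a given complexity), here is a minimal sketch, not the authors' code: given hypothetical candidate formulas scored by a complexity measure and a fit error, a candidate is kept only if no other candidate is both simpler and more accurate. The example formulas and scores are invented for illustration.

```python
def pareto_front(candidates):
    """Return the Pareto-optimal subset of (complexity, error, formula) tuples.

    A candidate is Pareto-optimal if no other candidate has both lower
    (or equal) complexity and strictly lower error.
    """
    front = []
    # Sort by complexity, breaking ties by error, so we scan simplest-first.
    for complexity, error, formula in sorted(candidates):
        # Keep only candidates strictly more accurate than every simpler one.
        if not front or error < front[-1][1]:
            front.append((complexity, error, formula))
    return front

# Hypothetical candidate formulas with (complexity, fit error) scores.
cands = [
    (3, 0.50, "a*x"),
    (5, 0.10, "a*x + b"),
    (5, 0.20, "a*x**2"),          # dominated by "a*x + b" at same complexity
    (9, 0.01, "a*sin(x) + b"),
    (12, 0.05, "a*sin(x) + b*x"), # dominated: more complex and less accurate
]
print(pareto_front(cands))
```

The returned frontier traces the best achievable accuracy at each complexity level; the paper's method searches for formulas along this frontier rather than for a single best fit.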
- CBMM Funded