AI Feynman 2.0: Pareto-optimal symbolic regression exploiting graph modularity

Title: AI Feynman 2.0: Pareto-optimal symbolic regression exploiting graph modularity
Publication Type: Conference Paper
Year of Publication: 2020
Authors: Udrescu, S-M, Tan, A, Feng, J, Neto, O, Wu, T, Tegmark, M
Conference Name: Advances in Neural Information Processing Systems 33 pre-proceedings (NeurIPS 2020)
Date Published: 12/2020
Abstract

We present an improved method for symbolic regression that seeks to fit data to formulas that are Pareto-optimal, in the sense of having the best accuracy for a given complexity. It improves on the previous state-of-the-art by typically being orders of magnitude more robust toward noise and bad data, and also by discovering many formulas that stumped previous methods. We develop a method for discovering generalized symmetries (arbitrary modularity in the computational graph of a formula) from gradient properties of a neural network fit. We use normalizing flows to generalize our symbolic regression method to probability distributions from which we only have samples, and employ statistical hypothesis testing to accelerate robust brute-force search.
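The Pareto-optimality criterion above means a candidate formula is kept only if no other candidate is both simpler and at least as accurate. As a minimal illustration (not the paper's implementation), the frontier over (complexity, error) pairs can be extracted in one pass over sorted candidates; the function name and tuple representation are hypothetical:

```python
def pareto_front(candidates):
    """Return the Pareto-optimal subset of (complexity, error) pairs.

    A candidate is dominated if some other candidate has complexity <= its
    complexity and error <= its error, with at least one inequality strict.
    """
    front = []
    # Sort by complexity, breaking ties by error, so we scan simplest-first.
    for complexity, error in sorted(candidates):
        # A point survives only if it beats the error of every simpler
        # (or equally complex) point already on the frontier.
        if not front or error < front[-1][1]:
            front.append((complexity, error))
    return front

# Example: five hypothetical formulas with (description length, fit error).
candidates = [(1, 0.9), (2, 0.5), (3, 0.6), (4, 0.1), (2, 0.4)]
print(pareto_front(candidates))  # → [(1, 0.9), (2, 0.4), (4, 0.1)]
```

Here (3, 0.6) is dominated by (2, 0.4), and (2, 0.5) by (2, 0.4); the survivors trace the best-accuracy-per-complexity curve the paper optimizes along.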

GitHub: https://github.com/SJ001/AI-Feynman

Readthedocs: https://ai-feynman.readthedocs.io/en/latest/

Database: https://space.mit.edu/home/tegmark/aifeynman.html

Download: 2006.10782.pdf

CBMM Relationship: 

  • CBMM Funded