Deep Symbolic Regression for Recurrent Sequences

June 28, 2022

Abstract

Symbolic regression, i.e. predicting a function from the observation of its values, is well-known to be a challenging task. In this paper, we train Transformers to infer the function or recurrence relation underlying sequences of integers or floats, a typical task in human IQ tests which has hardly been tackled in the machine learning literature. We evaluate our integer model on a subset of OEIS sequences, and show that it outperforms built-in Mathematica functions for recurrence prediction. We also demonstrate that our float model is able to yield informative approximations of out-of-vocabulary functions and constants, e.g. bessel0(x) ≈ (sin(x)+cos(x))/√(πx) and 1.644934 ≈ π^2/6.
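The two closing approximations can be checked numerically. Below is a minimal sketch in Python, not taken from the paper, assuming SciPy is available for the order-zero Bessel function (scipy.special.j0); the sample point x = 50.0 is an arbitrary illustrative choice.

    # Check bessel0(x) ~ (sin(x) + cos(x)) / sqrt(pi * x) for large x,
    # and the constant 1.644934 ~ pi^2 / 6 (the Basel problem value).
    import math
    from scipy.special import j0  # Bessel function of the first kind, order 0

    x = 50.0  # the asymptotic approximation improves as x grows
    approx = (math.sin(x) + math.cos(x)) / math.sqrt(math.pi * x)
    print(j0(x), approx)       # the two values agree to a few decimal places

    print(math.pi ** 2 / 6)    # 1.6449340668..., matching 1.644934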

AUTHORS

François Charton

Guillaume Lample

Pierre-Alexandre Kamienny

Stéphane d'Ascoli

Publisher

ICML

Research Topics

Human & Machine Intelligence

Core Machine Learning
