Paper
OmniPred: Language Models as Universal Regressors
Published Feb 22, 2024 · Xingyou Song, Oscar Li, Chansoo Lee, et al.
ArXiv · 16 Citations · 3 Influential Citations
Abstract
Regression is a powerful tool to accurately predict the outcome metric of a system given a set of parameters, but has traditionally been restricted to methods which are only applicable to a specific task. In this paper, we propose OmniPred, a framework for training language models as universal end-to-end regressors over $(x,y)$ data from arbitrary formats. Using data sourced from Google Vizier, one of the largest proprietary blackbox optimization databases in the world, our extensive experiments demonstrate that language models are capable of very precise numerical regression using only textual representations of mathematical parameters and values, and if given the opportunity to train at scale over multiple tasks, can significantly outperform traditional regression models.
TL;DR: OmniPred enables language models to perform precise numerical regression using textual representations of mathematical parameters and values, outperforming traditional regression models.
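The abstract's central mechanism, regressing over purely textual renderings of parameters and outcome values, can be made concrete with a small sketch. Everything below is a hypothetical illustration, not the paper's exact encoding: the `serialize_x`/`serialize_y` helpers, the prompt layout, and the sign/digit/exponent token scheme are all assumptions about what such a textualization could look like.

```python
import math

# Hypothetical sketch of textualizing (x, y) pairs for an LM regressor,
# in the spirit of OmniPred; the exact format here is an assumption.

def serialize_x(params: dict) -> str:
    """Render a parameter dict as plain text, e.g. 'lr:0.0003,layers:12'."""
    return ",".join(f"{k}:{v}" for k, v in sorted(params.items()))

def serialize_y(y: float, sig_digits: int = 4) -> str:
    """Encode a float as sign/digit/exponent tokens so the model predicts
    the value digit by digit rather than as one opaque number string."""
    sign = "<+>" if y >= 0 else "<->"
    y = abs(y)
    exp = 0 if y == 0 else math.floor(math.log10(y))
    mantissa = 0 if y == 0 else round(y / 10 ** (exp - sig_digits + 1))
    digits = "".join(f"<{d}>" for d in str(mantissa).zfill(sig_digits))
    return f"{sign}{digits}<E{exp - sig_digits + 1}>"

# Each training example is a plain-text prompt/target pair; a standard
# seq-to-seq LM trained on many tasks' pairs acts as the regressor.
x = {"learning_rate": 3e-4, "num_layers": 12}
y = 0.8732  # e.g. a validation accuracy observed for x
prompt = f"x: {serialize_x(x)} -> y:"
target = serialize_y(y)
print(prompt, target)  # x: learning_rate:0.0003,num_layers:12 -> y: <+><8><7><3><2><E-4>
```

At inference time one would decode the predicted y-tokens and invert the serialization to recover a number; sampling several decodes gives a rough predictive distribution. This framing, text in and text out with no task-specific featurization, is what lets a single model train across the many heterogeneous tasks in a database like Vizier.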