Drift-Resilient TabPFN is an evolution of TabPFN, specifically designed to handle temporal distribution shifts in tabular data. Using In-Context Learning with a Prior-Data Fitted Network, it learns to recognize and adapt to changes in data distributions over time.
Pre-trained on millions of synthetic datasets generated by evolving structural causal models (SCMs), this framework predicts effectively even when data distributions are non-stationary.
We are in the process of preparing the public repository for this work. It will include the code and an interactive demo notebook, allowing users to reproduce the results from our paper and to experiment with the pre-trained models through a scikit-learn interface. The repository will be available a few weeks after the NeurIPS conference.
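To give a rough sense of what such a scikit-learn-style workflow could look like, here is a minimal sketch. The import path, the `DriftResilientTabPFNClassifier` class, and the temporal-domain arguments passed to `fit` and `predict_proba` are placeholders invented for this illustration; the released interface may differ.

```python
import numpy as np

# Hypothetical import: the package and class names below are placeholders,
# not the released API.
from drift_resilient_tabpfn import DriftResilientTabPFNClassifier

# Toy data with a drifting decision boundary: features X, labels y, and a
# per-sample temporal domain index t indicating when each sample was observed.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 5))
t = np.repeat(np.arange(3), 100)          # three observed time periods
y = (X[:, 0] > 0.5 * t).astype(int)       # the decision boundary shifts over time

# In-context "fit" on the observed periods, then predict on a later, unseen period.
clf = DriftResilientTabPFNClassifier()    # placeholder constructor
clf.fit(X, y, t)                          # hypothetical signature with domain indices
X_future = rng.normal(size=(50, 5))
t_future = np.full(50, 3)                 # a later period not seen during fitting
proba = clf.predict_proba(X_future, t_future)  # hypothetical signature
```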
For any questions or issues before the release, please contact Kai Helli or David Schnurr.
For more detailed information, please refer to our NeurIPS 2024 paper:
Drift-Resilient TabPFN: In-Context Learning Temporal Distribution Shifts on Tabular Data
If you use our work in your research, please cite us:
@inproceedings{helli2024driftresilient,
  title={Drift-Resilient Tab{PFN}: In-Context Learning Temporal Distribution Shifts on Tabular Data},
  author={Kai Helli and David Schnurr and Noah Hollmann and Samuel M{\"u}ller and Frank Hutter},
  booktitle={The Thirty-eighth Annual Conference on Neural Information Processing Systems},
  year={2024},
  url={https://openreview.net/forum?id=p3tSEFMwpG}
}