Relational In-Context Learning via Synthetic Pre-training with Structural Prior
This paper introduces RDB-PFN, the first relational foundation model trained exclusively on synthetic data: over 2 million databases produced by a Relational Prior Generator. This synthetic pre-training enables strong few-shot in-context learning on real-world relational tasks, where high-quality data is scarce and often private.
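The paper does not spell out the Relational Prior Generator here, but the core idea of sampling synthetic relational databases can be illustrated with a minimal sketch: draw a random schema of a few tables, fill each with feature columns sampled from a simple prior, and link later tables to earlier ones through foreign-key columns. All names (`generate_synthetic_db`, `table0_id`, etc.) are hypothetical and chosen for illustration only.

```python
import random

def generate_synthetic_db(n_tables=3, max_cols=4, max_rows=20, seed=0):
    """Generate one toy synthetic relational database as a dict of tables.

    Hypothetical sketch: the actual Relational Prior Generator is not
    specified in this summary, so structure and priors are illustrative.
    Each non-root table gets a foreign-key column into an earlier table.
    """
    rng = random.Random(seed)
    db = {}
    for t in range(n_tables):
        n_rows = rng.randint(2, max_rows)
        table = {"id": list(range(n_rows))}
        # Numeric feature columns drawn from a simple Gaussian prior.
        for c in range(rng.randint(1, max_cols)):
            table[f"x{c}"] = [rng.gauss(0, 1) for _ in range(n_rows)]
        # Link to a randomly chosen earlier table via a foreign key.
        if t > 0:
            parent = f"table{rng.randrange(t)}"
            table[f"{parent}_id"] = [
                rng.choice(db[parent]["id"]) for _ in range(n_rows)
            ]
        db[f"table{t}"] = table
    return db

db = generate_synthetic_db(seed=1)
```

Repeating this with fresh seeds yields an unbounded stream of structurally diverse databases, which is what makes pre-training on millions of synthetic relational tasks feasible without any real (and potentially private) data.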