# Relational Transformer – PluRel Checkpoints
Relational Transformer (RT) model checkpoints pretrained on synthetic relational databases generated by PluRel.
Relational Transformer is a foundation model architecture for relational data that enables zero-shot transfer across heterogeneous schemas and tasks. It was introduced in:
Relational Transformer: Toward Zero-Shot Foundation Models for Relational Data
Rishabh Ranjan, Valter Hudovernik, Mark Znidar, Charilaos Kanatsoulis, Roshan Upendra, Mahmoud Mohammadi, Joe Meyer, Tom Palczewski, Carlos Guestrin, Jure Leskovec – arXiv:2510.06377 (ICLR 2026)
The checkpoints provided in this repository were trained using the methodology described in:
PluRel: Synthetic Data unlocks Scaling Laws for Relational Foundation Models
Kothapalli, Ranjan, Hudovernik, Dwivedi, Hoffart, Guestrin, Leskovec – arXiv:2602.04029 (2026)
## Model Architecture
The Relational Transformer operates on multi-tabular relational databases, treating rows across linked tables as a sequence via BFS-ordered context sampling. It utilizes a Relational Attention mechanism over columns, rows, and primary-foreign key links.
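The BFS-ordered context sampling described above can be sketched as a breadth-first traversal over primary-foreign key links, collecting rows until the context is full. This is an illustrative assumption of how such a sampler might look — the adjacency representation, function name, and defaults (mirroring the max BFS width of 128 and the 1,024-token context from the hyperparameter table) are hypothetical, not the repository's actual implementation.

```python
from collections import deque

def bfs_context(adj, seed, max_width=128, max_rows=1024):
    """Collect rows for a context window by breadth-first search from a
    seed row, following primary-foreign key links.

    `adj` maps a row id to the row ids it links to. Hypothetical helper
    for illustration; the real sampler also handles tokenization and
    per-table budgets.
    """
    visited = {seed}
    order = [seed]          # rows in BFS order, seed first
    frontier = deque([seed])
    while frontier and len(order) < max_rows:
        row = frontier.popleft()
        # Cap the number of neighbours expanded per row (max BFS width).
        for nbr in adj.get(row, [])[:max_width]:
            if nbr not in visited:
                visited.add(nbr)
                order.append(nbr)
                frontier.append(nbr)
                if len(order) >= max_rows:
                    break
    return order
```

For example, seeding at a `users` row would first pull in its directly linked `orders` rows, then the `items` rows those orders reference, breadth-first, until the row budget is exhausted.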
| Hyperparameter | Value |
|---|---|
| Transformer blocks | 12 |
| Model dimension (d_model) | 256 |
| FFN dimension (d_ff) | 1,024 |
| Context length | 1,024 tokens |
| Text encoder | all-MiniLM-L12-v2 (d_text = 384) |
| Max BFS width | 128 |
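As a rough sanity check on model size, the dense transformer blocks alone account for about 9.4M parameters under standard counting (4·d_model² for the attention projections plus 2·d_model·d_ff for the FFN). This back-of-the-envelope sketch ignores embeddings, biases, layer norms, the text encoder, and any extra projections specific to Relational Attention, so the true checkpoint size will be larger.

```python
def transformer_param_estimate(n_blocks=12, d_model=256, d_ff=1024):
    """Rough parameter count for the dense transformer blocks only.

    Counts the Q, K, V, and output projections (4 * d_model^2) and the
    two FFN projections (2 * d_model * d_ff) per block; biases, norms,
    and embeddings are deliberately omitted.
    """
    attn = 4 * d_model * d_model   # Q, K, V, output projections
    ffn = 2 * d_model * d_ff       # up- and down-projection matrices
    return n_blocks * (attn + ffn)

# With the table's values: 12 * (4*256^2 + 2*256*1024) = 9,437,184
print(transformer_param_estimate())
```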
The architecture and training loop build on the Relational Transformer codebase.
## Download
```shell
huggingface-cli download kvignesh1420/relational-transformer-plurel \
  --repo-type model \
  --local-dir ~/scratch/rt_hf_ckpts
```