# Awesome Self-Supervised Learning for Non-Sequential Tabular Data (SSL4NSTD)

This repository collects frontier research on **self-supervised learning** for non-sequential tabular data, a topic that has attracted growing attention recently.
This list is maintained by [Wei-Wei Du](https://wwweiwei.github.io/) and [Wei-Yao Wang](https://github.com/wywyWang), and is actively updated.
If you come across relevant resources or find errors in this repository, feel free to open an issue or submit a PR.

## Our Survey Paper
[A Survey on Self-Supervised Learning for Non-Sequential Tabular Data (ACML-24 Journal Track)](https://arxiv.org/abs/2402.01204)

### Citation
```bibtex
@article{DBLP:journals/corr/abs-2402-01204,
  author  = {Wei{-}Yao Wang and
             Wei{-}Wei Du and
             Derek Xu and
             Wei Wang and
             Wen{-}Chih Peng},
  title   = {A Survey on Self-Supervised Learning for Non-Sequential Tabular Data},
  journal = {CoRR},
  volume  = {abs/2402.01204},
  year    = {2024}
}
```

## Papers

### Predictive Learning
* VIME: Extending the Success of Self- and Semi-supervised Learning to Tabular Domain (NeurIPS'20) [[Paper]](https://proceedings.neurips.cc/paper/2020/file/7d97667a3e056acab9aaf653807b4a03-Paper.pdf) [[Supplementary]](https://proceedings.neurips.cc/paper/2020/file/7d97667a3e056acab9aaf653807b4a03-Supplemental.pdf) [[Code]](https://github.com/jsyoon0823/VIME)
* TaBERT: Pretraining for Joint Understanding of Textual and Tabular Data (ACL'20) [[Paper]](https://arxiv.org/abs/2005.08314)
* TABBIE: Pretrained Representations of Tabular Data (NAACL'21) [[Paper]](https://arxiv.org/abs/2105.02584) [[Code]](https://github.com/SFIG611/tabbie)
* CORE: Self- and Semi-supervised Tabular Learning with COnditional REgularizations (NeurIPS'21) [[Paper]](https://sslneurips21.github.io/files/CameraReady/CORE_workshop.pdf)
* TabTransformer: Tabular Data Modeling Using Contextual Embeddings [[Paper]](https://arxiv.org/abs/2012.06678)
* TabNet: Attentive Interpretable Tabular Learning (AAAI'21) [[Paper]](https://arxiv.org/abs/1908.07442) [[Code]](https://github.com/dreamquark-ai/tabnet)
* Self-Supervision Enhanced Feature Selection with Correlated Gates (ICLR'22) [[Paper]](https://openreview.net/pdf?id=oDFvtxzPOx) [[Code]](https://github.com/chl8856/SEFS)
* TransTab: Learning Transferable Tabular Transformers Across Tables (NeurIPS'22) [[Paper]](https://arxiv.org/abs/2205.09328) [[Code]](https://github.com/RyanWangZf/transtab) [[Blog]](https://realsunlab.medium.com/transtab-learning-transferable-tabular-transformers-across-tables-1e34eec161b8)
* LIFT: Language-Interfaced Fine-Tuning for Non-language Machine Learning Tasks (NeurIPS'22) [[Paper]](https://arxiv.org/pdf/2206.06565.pdf) [[Code]](https://github.com/UW-Madison-Lee-Lab/LanguageInterfacedFineTuning)
* Self Supervised Pre-training for Large Scale Tabular Data (NeurIPS'22 TRL Workshop) [[Paper]](https://table-representation-learning.github.io/assets/papers/self_supervised_pre_training_f.pdf) [[Blog]](https://www.amazon.science/publications/self-supervised-pre-training-for-large-scale-tabular-data)
* Local Contrastive Feature Learning for Tabular Data (CIKM'22) [[Paper]](https://dl.acm.org/doi/pdf/10.1145/3511808.3557630)
* Revisiting Self-Training with Regularized Pseudo-Labeling for Tabular Data (preprint'23) [[Paper]](https://arxiv.org/abs/2302.14013)
* Generative Table Pre-training Empowers Models for Tabular Prediction (EMNLP'23) [[Paper]](https://arxiv.org/pdf/2305.09696.pdf) [[Code]](https://github.com/ZhangTP1996/TapTap)
* TabPFN: A Transformer That Solves Small Tabular Classification Problems in a Second (ICLR'23) [[Paper]](https://arxiv.org/pdf/2207.01848.pdf) [[Code]](https://github.com/automl/TabPFN)
* STUNT: Few-shot Tabular Learning with Self-generated Tasks from Unlabeled Tables (ICLR'23) [[Paper]](https://arxiv.org/pdf/2303.00918.pdf) [[Code]](https://github.com/jaehyun513/STUNT)
* Language Models are Realistic Tabular Data Generators (ICLR'23) [[Paper]](https://arxiv.org/pdf/2210.06280.pdf) [[Code]](https://github.com/kathrinse/be_great)
* Self-supervised Representation Learning from Random Data Projectors (NeurIPS'23 TRL Workshop) [[Paper]](https://arxiv.org/pdf/2310.07756.pdf) [[Code]](https://github.com/layer6ai-labs/lfr)
* SwitchTab: Switched Autoencoders Are Effective Tabular Learners (AAAI'24) [[Paper]](https://arxiv.org/pdf/2401.02013.pdf)
* Making Pre-trained Language Models Great on Tabular Prediction (ICLR'24) [[Paper]](https://openreview.net/pdf?id=anzIzGZuLi)
* Binning as a Pretext Task: Improving Self-Supervised Learning in Tabular Domains (ICML'24) [[Paper]](https://arxiv.org/abs/2405.07414) [[Code]](https://github.com/kyungeun-lee/tabularbinning)
* Large Scale Transfer Learning for Tabular Data via Language Modeling (NeurIPS'24) [[Paper]](https://arxiv.org/abs/2406.12031) [[Code]](https://github.com/mlfoundations/rtfm)
* Accurate predictions on small data with a tabular foundation model (Nature'25) [[Paper]](https://www.nature.com/articles/s41586-024-08328-6)

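As context for the entries above: mask-and-reconstruct pretext tasks (VIME being the canonical example) corrupt each row by replacing a random subset of its entries with draws from the corresponding feature's empirical marginal, then train an encoder to predict the mask and reconstruct the original values. A minimal numpy sketch of the corruption step, assuming a purely numerical feature matrix (function name and the `p_mask` default are illustrative, not taken from any listed paper):

```python
import numpy as np

def vime_corrupt(x, p_mask=0.3, rng=None):
    """VIME-style corruption: each entry is replaced, with probability
    p_mask, by a value drawn from that feature's empirical marginal
    (approximated by a column-wise shuffle of the batch). Returns the
    corrupted batch and the binary mask of replaced entries, which are
    the two pretext targets (mask estimation + feature reconstruction)."""
    rng = rng or np.random.default_rng(0)
    mask = rng.binomial(1, p_mask, size=x.shape)
    # Shuffling each column independently samples from its marginal.
    x_bar = np.stack([rng.permutation(x[:, j]) for j in range(x.shape[1])], axis=1)
    x_tilde = mask * x_bar + (1 - mask) * x
    return x_tilde, mask
```

Categorical columns would need the same treatment after encoding; the column-wise shuffle is just a common stand-in for sampling each feature's marginal.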
### Contrastive Learning
* SCARF: Self-Supervised Contrastive Learning using Random Feature Corruption (ICLR'22) [[Paper]](https://arxiv.org/pdf/2106.15147.pdf) [[Code]](https://github.com/clabrugere/pytorch-scarf)
* STab: Self-supervised Learning for Tabular Data (NeurIPS'22 Workshop on TRL) [[Paper]](https://openreview.net/pdf?id=EfR55bFcrcI)
* TransTab: Learning Transferable Tabular Transformers Across Tables (NeurIPS'22) [[Paper]](https://arxiv.org/pdf/2205.09328.pdf)
* PTaRL: Prototype-based Tabular Representation Learning via Space Calibration (ICLR'24) [[Paper]](https://openreview.net/pdf?id=G32oY4Vnm8)

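Contrastive methods such as SCARF pair each row's embedding with that of a feature-corrupted view of the same row and optimize an InfoNCE objective, treating the other rows in the batch as negatives. A minimal numpy sketch of that loss (the temperature default and shapes are illustrative):

```python
import numpy as np

def info_nce(z1, z2, tau=0.5):
    """InfoNCE over a batch of embedding pairs (z1[i], z2[i]), where z2
    typically encodes a corrupted view of the row encoded in z1.
    Positives sit on the diagonal of the similarity matrix; every other
    row in the batch serves as a negative."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / tau                      # (n, n) cosine logits
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))
```

Matching pairs should score a lower loss than mismatched ones; in practice `z1` and `z2` come from a shared encoder plus projection head.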
### Hybrid Learning
* SubTab: Subsetting Features of Tabular Data for Self-Supervised Representation Learning (NeurIPS'21) [[Paper]](https://arxiv.org/pdf/2110.04361.pdf) [[Supplementary]](https://openreview.net/attachment?id=vrhNQ7aYSdr&name=supplementary_material) [[Code]](https://github.com/AstraZeneca/SubTab)
* SAINT: Improved Neural Networks for Tabular Data via Row Attention and Contrastive Pre-Training (NeurIPS'22 Workshop on TRL) [[Paper]](https://arxiv.org/pdf/2106.01342.pdf) [[Code]](https://github.com/somepago/saint)
* Transfer Learning with Deep Tabular Models (ICLR'23) [[Paper]](https://arxiv.org/pdf/2206.15306.pdf) [[Code]](https://github.com/LevinRoman/tabular-transfer-learning)
* DoRA: Domain-Based Self-Supervised Learning Framework for Low-Resource Real Estate Appraisal (CIKM'23) [[Paper]](https://arxiv.org/abs/2309.00855) [[Code]](https://github.com/wwweiwei/DoRA)
* ReConTab: Regularized Contrastive Representation Learning for Tabular Data (NeurIPS'23 Workshop on TRL) [[Paper]](https://arxiv.org/pdf/2310.18541.pdf)
* XTab: Cross-table Pretraining for Tabular Transformers (ICML'23) [[Paper]](https://arxiv.org/abs/2305.06090)
* UniTabE: A Universal Pretraining Protocol for Tabular Foundation Model in Data Science (ICLR'24) [[Paper]](https://arxiv.org/pdf/2307.09249.pdf)

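Hybrid methods combine several pretext signals; SubTab, for instance, autoencodes multiple overlapping feature subsets of the same rows and aggregates their representations. A rough sketch of one plausible subsetting scheme, contiguous column chunks extended into their neighbors (the exact split used in the paper may differ):

```python
import numpy as np

def feature_subsets(x, n_subsets=4, overlap=0.5):
    """SubTab-style splitting: divide the feature columns into n_subsets
    contiguous chunks, each extended backward by `overlap` of the chunk
    width, yielding several overlapping views of the same rows. Each
    view is then autoencoded (and optionally contrasted) against the
    others."""
    n_feat = x.shape[1]
    base = int(np.ceil(n_feat / n_subsets))
    ext = int(base * overlap)
    views = []
    for k in range(n_subsets):
        start = max(0, k * base - ext)           # reach into the left neighbor
        stop = min(n_feat, (k + 1) * base)
        views.append(x[:, start:stop])
    return views
```

Every column lands in at least one view, and interior views share `ext` columns with their left neighbor, which is what lets the per-subset representations be compared or averaged.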
## Benchmarks
| Benchmark | Task | #Datasets | Paper |
|--------------|---------------------------------------|-----------|-------|
| MLPCBench | Classification | 40 | [Kadra et al., 2021](https://arxiv.org/abs/2106.11189) |
| DLBench | Classification, Regression | 11 | [Shwartz-Ziv and Armon, 2022](https://arxiv.org/abs/2106.03253) |
| TabularBench | Classification, Regression | 45 | [Grinsztajn et al., 2022](https://arxiv.org/abs/2207.08815) |
| TabZilla | Classification | 36 | [McElfresh et al., 2023](https://arxiv.org/abs/2305.02997) |
| TabPretNet | Unlabeled, Classification, Regression | 2000 | [Ye et al., 2023](https://arxiv.org/abs/2307.04308) |
| The Tremendous TabLib Trawl (T4) | Unlabeled | 3.1M | [Gardner et al., 2024](https://arxiv.org/abs/2406.12031v1) |

## Tutorials
* Self-Supervised Learning: Self-Prediction and Contrastive Learning (NeurIPS'21) [[Website]](https://neurips.cc/virtual/2021/tutorial/21895)

## Workshops
* Table Representation Learning (NeurIPS) [[Website]](https://table-representation-learning.github.io/)

## Related Surveys
* Deep Neural Networks and Tabular Data: A Survey [[Paper]](https://arxiv.org/abs/2110.01889)
* Self-Supervised Learning for Recommender Systems: A Survey (TKDE) [[Paper]](https://arxiv.org/pdf/2203.15876.pdf)
* Beyond Just Vision: A Review on Self-Supervised Representation Learning on Multimodal and Temporal Data [[Paper]](https://arxiv.org/abs/2206.02353)
* Self-Supervised Learning for Time Series Analysis: Taxonomy, Progress, and Prospects [[Paper]](https://arxiv.org/abs/2306.10125)
* On the Opportunities and Challenges of Foundation Models for Geospatial Artificial Intelligence [[Paper]](https://arxiv.org/abs/2304.06798)
* A Survey on Time-Series Pre-Trained Models [[Paper]](https://arxiv.org/abs/2305.10716)

## Tools & Libraries
* PyTorch Frame: A modular deep learning framework for building neural network models on heterogeneous tabular data [[Link]](https://github.com/pyg-team/pytorch-frame#implemented-deep-tabular-models)
* PyTorch Tabular: A framework for deep learning with tabular data [[Link]](https://github.com/manujosephv/pytorch_tabular)
* pytorch-widedeep: A flexible package for multimodal deep learning, combining tabular data with text and images using Wide and Deep models in PyTorch [[Link]](https://github.com/jrzaurin/pytorch-widedeep)