2 Pith papers cite this work, both from 2026; both verdicts are currently unverdicted, as polarity classification is still indexing.
Citing papers:
- Amortizing Causal Sensitivity Analysis via Prior Data-Fitted Networks
  A prior-data fitted network amortizes causal sensitivity analysis by generating training labels via Lagrangian scalarization, achieving orders-of-magnitude faster bounds computation than per-instance methods.
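The abstract above does not spell out the scalarization, so the following is only a generic sketch of the idea: a constrained sensitivity bound (here, a standard marginal-sensitivity-model bound with per-instance weights in [1/Γ, Γ]) can be computed by bisection on the scalar λ in the Lagrangian max_w Σ w_i (y_i − λ). The function name, the MSM weight interval, and the toy data are illustrative assumptions, not the paper's method.

```python
import numpy as np

def msm_upper_bound(y, gamma, tol=1e-8):
    """Upper bound on the mean of y under a marginal sensitivity model:
    weights w_i in [1/gamma, gamma], bound = max_w sum(w*y)/sum(w).

    Found by bisection on lam in the scalarized (Lagrangian) objective
    h(lam) = max_w sum(w * (y - lam)); the bound is the root h(lam) = 0,
    and the inner max is attained at w_i = gamma if y_i > lam else 1/gamma.
    """
    lo, hi = y.min(), y.max()
    while hi - lo > tol:
        lam = 0.5 * (lo + hi)
        w = np.where(y > lam, gamma, 1.0 / gamma)
        if np.sum(w * (y - lam)) > 0:
            lo = lam  # bound is above lam
        else:
            hi = lam  # bound is at or below lam
    return 0.5 * (lo + hi)

# gamma = 1 means no unmeasured confounding: the bound is just the mean.
b1 = msm_upper_bound(np.array([0.0, 1.0]), gamma=1.0)  # ~0.5
# gamma = 3 lets weights tilt toward large outcomes: (3*1 + 1/3*0)/(3 + 1/3) = 0.9
b3 = msm_upper_bound(np.array([0.0, 1.0]), gamma=3.0)  # ~0.9
```

In an amortized setup, many such per-instance solves would be run offline to produce (dataset, bound) training pairs that the network then learns to predict in a single forward pass.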
- Selecting Feature Interactions for Generalized Additive Models by Distilling Foundation Models
  TabDistill distills feature interactions from tabular foundation models via post-hoc attribution and inserts them into GAMs, yielding consistent predictive gains.
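The one-line summary does not specify TabDistill's attribution method, so the following is only a generic sketch of the recipe it describes: score pairwise interactions in a black-box teacher, then hand the top-scoring pairs to a GAM as interaction terms. The second-order finite-difference score, the toy teacher, and all names here are illustrative assumptions.

```python
import numpy as np

def interaction_scores(f, X, eps=0.5):
    """Score each feature pair (i, j) by the mean absolute second-order
    finite difference f(x+e_i+e_j) - f(x+e_i) - f(x+e_j) + f(x).
    The score is exactly zero when f is additive in i and j, and grows
    with the strength of their interaction."""
    n, d = X.shape
    scores = np.zeros((d, d))
    base = f(X)
    for i in range(d):
        Xi = X.copy()
        Xi[:, i] += eps
        fi = f(Xi)
        for j in range(i + 1, d):
            Xj = X.copy()
            Xj[:, j] += eps
            Xij = Xi.copy()
            Xij[:, j] += eps
            scores[i, j] = np.abs(f(Xij) - fi - f(Xj) + base).mean()
    return scores

# Toy "teacher": additive in x0 and x1 except for an explicit x0*x1 term.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
teacher = lambda X: X[:, 0] + X[:, 1] ** 2 + 2 * X[:, 0] * X[:, 1]

S = interaction_scores(teacher, X)
top = tuple(int(k) for k in np.unravel_index(np.argmax(S), S.shape))
# top scoring pair is (0, 1), the true interaction; the selected pair
# would then be added to a GAM as a shape function over (x0, x1).
```

The additive terms cancel exactly in the second-order difference, which is why the score isolates genuine interactions rather than marginal effects.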