Venue: Nature
3 Pith papers cite this work. Polarity classification is still indexing.
Year: 2026 · 3 verdicts (all UNVERDICTED) · 3 representative citing papers
Citing papers:
- Multi-Fidelity Quantile Regression
  A model-agnostic two-stage estimator links high-fidelity quantiles to low-fidelity ones via a covariate-dependent level function, yielding faster convergence and better accuracy with limited high-fidelity data.
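The two-stage idea in this abstract can be sketched concretely: fit quantile models on abundant low-fidelity data across a grid of levels, then learn a covariate-dependent level function that maps the target high-fidelity quantile to the matching low-fidelity level. Everything below (synthetic data, the PIT-style level matching, gradient boosting as the base learner) is an illustrative assumption, not the paper's implementation.

```python
# Hedged sketch of a two-stage multi-fidelity quantile estimator.
# All modeling choices here are stand-ins for illustration only.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Synthetic setup: abundant low-fidelity (LF) data, scarce high-fidelity (HF).
x_lf = rng.uniform(0, 1, 2000)
y_lf = np.sin(3 * x_lf) + rng.normal(0, 0.3, x_lf.shape)
x_hf = rng.uniform(0, 1, 60)
y_hf = np.sin(3 * x_hf) + 0.2 * x_hf + rng.normal(0, 0.1, x_hf.shape)

tau = 0.9                                   # target HF quantile level
levels = np.linspace(0.05, 0.95, 19)

# Stage 1: LF quantile models across a grid of levels.
lf_models = {a: GradientBoostingRegressor(loss="quantile", alpha=a,
                                          n_estimators=100)
             .fit(x_lf[:, None], y_lf) for a in levels}

# Stage 2: for each HF point, find the LF level whose prediction is
# closest to the HF response (a PIT-style estimate), then fit a
# tau-quantile regression of that level on x: the covariate-dependent
# level function g(x), so Q_HF(tau|x) ~= Q_LF(g(x)|x).
preds = np.stack([lf_models[a].predict(x_hf[:, None]) for a in levels])
best = levels[np.abs(preds - y_hf).argmin(axis=0)]
g = GradientBoostingRegressor(loss="quantile", alpha=tau,
                              n_estimators=50).fit(x_hf[:, None], best)

def hf_quantile(x_new):
    """HF quantile estimate: evaluate the LF model at level g(x)."""
    a = np.clip(g.predict(x_new[:, None]), levels[0], levels[-1])
    # Snap to the nearest fitted grid level (a simplification).
    idx = np.abs(levels[:, None] - a).argmin(axis=0)
    return np.array([lf_models[levels[i]].predict(x_new[j:j + 1, None])[0]
                     for j, i in enumerate(idx)])

q_hat = hf_quantile(np.array([0.2, 0.8]))
```

The appeal of the two-stage split is that the many LF samples absorb the hard estimation work, while the scarce HF samples only need to pin down the comparatively smooth level function.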
- Active Tabular Augmentation via Policy-Guided Diffusion Inpainting
  TAP couples a learner-conditioned policy with diffusion inpainting to generate and selectively inject high-utility tabular augmentations, yielding up to 15.6 pp accuracy gains and 32% RMSE reduction on seven datasets under severe scarcity.
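The generate-then-selectively-inject loop can be sketched with stand-in components: TAP's generator is a diffusion inpainter and its policy is learner-conditioned, but below a simple jitter generator and a predictive-entropy utility take their places purely to show the control flow. Names and pseudo-labeling are assumptions, not TAP's design.

```python
# Minimal sketch of policy-guided augmentation injection for tabular data.
# The generator and utility here are illustrative stand-ins, not TAP's.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(40, 5))                 # scarce real training data
y = (X[:, 0] + X[:, 1] > 0).astype(int)

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

def generate_candidates(X, n=200):
    """Stand-in generator: jitter real rows (TAP uses diffusion inpainting)."""
    base = X[rng.integers(0, len(X), n)]
    return base + rng.normal(0, 0.2, base.shape)

def utility(clf, Xc):
    """Stand-in utility: predictive entropy of the current learner."""
    p = clf.predict_proba(Xc)
    return -(p * np.log(p + 1e-12)).sum(axis=1)

# Selective injection: keep only the top-utility candidates.
Xc = generate_candidates(X)
top = np.argsort(utility(clf, Xc))[-20:]
X_aug = np.vstack([X, Xc[top]])
y_aug = np.concatenate([y, clf.predict(Xc[top])])   # pseudo-labels (assumption)
clf_aug = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_aug, y_aug)
```

The key point the abstract makes is selectivity under scarcity: injecting only augmentations the current learner finds informative, rather than flooding the small training set with arbitrary synthetic rows.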
- ITBoost: Information-Theoretic Trust for Robust Boosting
  ITBoost uses MDL-based complexity of residual trajectories to assign trust weights, improving robustness to label noise in tabular boosting without sacrificing clean-data performance.
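The trust-weighting idea can be sketched end to end: record each sample's residual trajectory across boosting rounds, score its complexity, and refit with trust weights that down-weight erratic trajectories. The complexity proxy below (sign-flip rate plus normalized final residual) is an illustrative stand-in for ITBoost's MDL measure.

```python
# Hedged sketch of trust-weighted boosting from residual trajectories.
# The complexity proxy is an assumption, not ITBoost's MDL criterion.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 4))
y = X @ np.array([1.0, -2.0, 0.5, 0.0]) + rng.normal(0, 0.1, 200)
y[:10] += rng.normal(0, 5, 10)               # inject label noise

gbm = GradientBoostingRegressor(n_estimators=50).fit(X, y)

# Residual trajectory: y minus the staged predictions at every round.
stages = np.stack(list(gbm.staged_predict(X)))     # (rounds, n_samples)
resid = y[None, :] - stages

# Complexity proxy: erratic trajectories (frequent residual sign flips,
# large final residual) suggest noisy labels -> low trust.
flips = (np.diff(np.sign(resid), axis=0) != 0).mean(axis=0)
score = flips + np.abs(resid[-1]) / (np.abs(resid[-1]).max() + 1e-12)
trust = 1.0 / (1.0 + score)                        # complexity -> weight

# Refit with trust as per-sample weights.
gbm_trusted = GradientBoostingRegressor(n_estimators=50).fit(
    X, y, sample_weight=trust)
```

Because clean samples keep weights near 1, this kind of scheme can resist label noise without penalizing clean-data performance, which matches the abstract's claim.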