pith: machine review for the scientific record

Nature

3 Pith papers cite this work. Polarity classification is still indexing.

filters: years · 2026 (3) — verdicts · unverdicted (3)

representative citing papers

Multi-Fidelity Quantile Regression

stat.ME · 2026-05-11 · unverdicted · novelty 6.0

A model-agnostic two-stage estimator links high-fidelity quantiles to low-fidelity ones via a covariate-dependent level function for faster convergence and better accuracy with limited high-fidelity data.
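The two-stage idea can be sketched as follows. This is a minimal illustration under assumptions not taken from the paper: the data-generating process, the binned empirical quantile estimator in stage one, and the linear level-function model in stage two are all simplified stand-ins for the authors' model-agnostic estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed setup: abundant low-fidelity samples, scarce high-fidelity ones,
# with related conditional distributions (high-fidelity is shifted and less noisy).
n_lo, n_hi = 5000, 80
x_lo = rng.uniform(0.0, 1.0, n_lo)
y_lo = np.sin(2 * np.pi * x_lo) + rng.normal(0.0, 0.3, n_lo)
x_hi = rng.uniform(0.0, 1.0, n_hi)
y_hi = np.sin(2 * np.pi * x_hi) + 0.2 + rng.normal(0.0, 0.1, n_hi)

# Stage 1: empirical low-fidelity conditional quantile function,
# estimated crudely by binning the covariate.
edges = np.linspace(0.0, 1.0, 11)
bin_of = lambda x: int(np.clip(np.digitize(x, edges) - 1, 0, len(edges) - 2))
lo_bins = [np.sort(y_lo[[bin_of(v) for v in x_lo] == np.full(n_lo, b)]) for b in range(len(edges) - 1)]
lo_bins = [np.sort(y_lo[np.array([bin_of(v) for v in x_lo]) == b]) for b in range(len(edges) - 1)]

def q_lo(tau, x):
    # Low-fidelity quantile at level tau, conditional on x's bin.
    return float(np.quantile(lo_bins[bin_of(x)], tau))

def f_lo(y, x):
    # Empirical low-fidelity CDF at y, conditional on x's bin.
    return np.searchsorted(lo_bins[bin_of(x)], y) / len(lo_bins[bin_of(x)])

# Stage 2: learn a covariate-dependent level function tau(x).  Each
# high-fidelity response is mapped to its low-fidelity CDF level, and a
# simple linear fit of those levels stands in for the paper's estimator.
levels = np.array([f_lo(y, x) for x, y in zip(x_hi, y_hi)])
a, b = np.polyfit(x_hi, levels, 1)
tau_hat = lambda x: float(np.clip(a * x + b, 0.01, 0.99))

# Predicted high-fidelity median at a new covariate value: read off the
# low-fidelity quantile at the learned level.
x_new = 0.25
pred = q_lo(tau_hat(x_new), x_new)
```

The point of the sketch is the data flow: the scarce high-fidelity sample is only used to learn the low-dimensional level function, while all distributional shape comes from the abundant low-fidelity data.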

Active Tabular Augmentation via Policy-Guided Diffusion Inpainting

cs.LG · 2026-05-11 · unverdicted · novelty 6.0

TAP couples a learner-conditioned policy with diffusion inpainting to generate and selectively inject high-utility tabular augmentations, yielding up to 15.6 pp accuracy gains and 32% RMSE reduction on seven datasets under severe scarcity.
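The generate-score-inject loop can be illustrated with heavy simplifications: below, a Gaussian refill of one masked feature stands in for the diffusion inpainter, a ridge learner stands in for the downstream model, and a greedy accept-if-validation-improves rule stands in for the learned policy. None of these components are from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed toy task: small linear-regression dataset under data scarcity.
n, d = 40, 5
w_true = rng.normal(size=d)
X = rng.normal(size=(n, d))
y = X @ w_true + rng.normal(0.0, 0.1, n)
X_tr, y_tr, X_val, y_val = X[:25], y[:25], X[25:], y[25:]

def fit_ridge(X, y, lam=1e-2):
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

def val_mse(X_tr, y_tr):
    w = fit_ridge(X_tr, y_tr)
    return float(np.mean((X_val @ w - y_val) ** 2))

base = val_mse(X_tr, y_tr)
base0 = base

def make_candidate():
    # "Inpaint" a candidate row: copy a real row, mask one feature, refill
    # it from that feature's empirical distribution (the diffusion model's role).
    row = X_tr[rng.integers(len(X_tr))].copy()
    j = rng.integers(d)
    row[j] = rng.normal(X_tr[:, j].mean(), X_tr[:, j].std())
    return row

# Greedy policy: score each candidate by the validation-error change it
# induces and inject only the helpful ones.
injected = 0
for _ in range(30):
    c = make_candidate()
    y_c = float(c @ fit_ridge(X_tr, y_tr))   # pseudo-label from current learner
    mse = val_mse(np.vstack([X_tr, c]), np.append(y_tr, y_c))
    if mse < base:
        X_tr, y_tr = np.vstack([X_tr, c]), np.append(y_tr, y_c)
        base, injected = mse, injected + 1
```

The design point the sketch preserves is selectivity: augmentations are not injected wholesale but filtered by their measured utility to the current learner.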

ITBoost: Information-Theoretic Trust for Robust Boosting

cs.LG · 2026-05-06 · unverdicted · novelty 5.0

ITBoost uses MDL-based complexity of residual trajectories to assign trust weights, improving robustness to label noise in tabular boosting without sacrificing clean-data performance.
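The trust-weighting idea can be sketched with a deliberately crude complexity score: below, the rate of residual sign flips across boosting rounds stands in for the paper's MDL-based complexity of residual trajectories, and the stump learner, shrinkage, and weighting schedule are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Assumed toy task: 1-D classification-style regression with 15% label noise.
n = 200
x = rng.uniform(-1, 1, n)
y = np.where(x > 0, 1.0, -1.0)
noisy = rng.random(n) < 0.15
y[noisy] *= -1

def fit_stump(x, r, w):
    # Weighted least-squares regression stump on a single feature.
    best = (np.inf, 0.0, 0.0, 0.0)
    for t in np.linspace(-1, 1, 21):
        L = x <= t
        if w[L].sum() == 0 or w[~L].sum() == 0:
            continue
        cl = np.average(r[L], weights=w[L])
        cr = np.average(r[~L], weights=w[~L])
        err = np.sum(w * (r - np.where(L, cl, cr)) ** 2)
        if err < best[0]:
            best = (err, t, cl, cr)
    _, t, cl, cr = best
    return lambda xs: np.where(xs <= t, cl, cr)

F = np.zeros(n)          # boosted prediction
w = np.ones(n)           # per-sample trust weights
traj = []                # per-sample residual trajectories
for m in range(30):
    r = y - F
    traj.append(r.copy())
    F += 0.3 * fit_stump(x, r, w)(x)
    if m >= 5:
        # Complexity proxy: samples whose residual sign keeps flipping are
        # hard to compress, so they earn less trust in later rounds.
        flips = np.mean(np.diff(np.sign(np.array(traj)), axis=0) != 0, axis=0)
        w = 1.0 / (1.0 + 5.0 * flips)
```

The sketch keeps the structural idea: trust is derived from each sample's whole residual trajectory rather than its current residual alone, so persistently unstable (likely mislabeled) samples are downweighted without discarding them.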

citing papers explorer

Showing 3 of 3 citing papers.

  • Multi-Fidelity Quantile Regression stat.ME · 2026-05-11 · unverdicted · polarity none · ref 2

  • Active Tabular Augmentation via Policy-Guided Diffusion Inpainting cs.LG · 2026-05-11 · unverdicted · polarity none · ref 16

  • ITBoost: Information-Theoretic Trust for Robust Boosting cs.LG · 2026-05-06 · unverdicted · polarity none · ref 40