pith. machine review for the scientific record.

arxiv: 2605.07100 · v1 · submitted 2026-05-08 · 📊 stat.ML · cs.LG


TRACE: Transport Alignment Conformal Prediction via Diffusion and Flow Matching Models

Aixin Tan, Jian Huang, Zhenhan Fang

Pith reviewed 2026-05-11 01:10 UTC · model grok-4.3

classification 📊 stat.ML cs.LG
keywords conformal prediction · diffusion models · flow matching · nonconformity score · transport alignment · marginal coverage · generative models · multimodal distributions

The pith

Averaging errors along transport trajectories produces scalar scores for valid conformal prediction in diffusion and flow matching models.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

The paper introduces a nonconformity score for conformal prediction that measures how well a candidate output aligns with the dynamics of a learned generative model. Instead of computing likelihoods or imposing geometric conditions on the output space, it averages denoising or velocity-matching errors collected along random trajectories that transport the output back to noise. These scalar scores are then fed into standard split conformal prediction, which delivers finite-sample marginal coverage guarantees as long as the data are exchangeable. The construction matters for high-dimensional or multimodal outputs because it works directly on the generative process without requiring invertible mappings or explicit density estimation.
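To make the construction concrete, here is a minimal sketch of a trajectory-averaged score in a toy flow-matching setup. The velocity model, linear path parameterization, and time grid are illustrative stand-ins (the toy model is exact only for data concentrated at the origin), not the paper's architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_velocity(y_t, t):
    # Hypothetical stand-in for a trained flow-matching velocity field.
    # It is the exact conditional velocity when the data distribution
    # is a point mass at the origin.
    return -y_t / (1.0 - t)

def trace_score(y, n_paths=8, n_steps=20):
    """Sketch of a transport-alignment nonconformity score: average the
    squared velocity-matching error along random linear transport paths
    y_t = (1 - t) * z + t * y from Gaussian noise z to the candidate y."""
    errs = []
    for _ in range(n_paths):
        z = rng.standard_normal(y.shape)
        for t in np.linspace(0.05, 0.95, n_steps):
            y_t = (1.0 - t) * z + t * y
            target = y - z  # true conditional velocity along this path
            errs.append(np.mean((toy_velocity(y_t, t) - target) ** 2))
    return float(np.mean(errs))

# Candidates aligned with the model's dynamics score low;
# misaligned candidates accumulate large velocity-matching error.
s_in = trace_score(np.zeros(2))
s_out = trace_score(np.array([5.0, 5.0]))
```

The key point the sketch illustrates: the score is a scalar average over trajectories, so it plugs directly into standard split conformal prediction regardless of the output dimension.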

Core claim

TRACE defines nonconformity through transport alignment by averaging denoising or velocity-matching errors along stochastic transport trajectories in diffusion and flow matching models. The resulting scalar scores are calibrated with split conformal prediction to obtain valid marginal coverage under exchangeability, without explicit likelihood evaluation or additional geometric assumptions on the conditional distribution. Statistical properties of the scores are analyzed, including their behavior under limited computational budget, and experiments confirm that the induced prediction regions adapt to multimodal and non-convex supports.
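The calibration step itself is the standard split-conformal quantile rule; a sketch under generic assumptions (function name and inputs hypothetical, not the paper's notation):

```python
import numpy as np

def conformal_quantile(cal_scores, alpha=0.1):
    """Split-conformal threshold: the ceil((n+1)(1-alpha))-th smallest
    calibration score. The (n+1) correction is what yields the
    finite-sample marginal coverage guarantee under exchangeability."""
    n = len(cal_scores)
    k = int(np.ceil((n + 1) * (1 - alpha)))
    return float(np.sort(cal_scores)[k - 1])

# A candidate y enters the prediction region iff score(x, y) <= q_hat.
cal = np.arange(1, 100, dtype=float)       # 99 calibration scores 1..99
q_hat = conformal_quantile(cal, alpha=0.1)  # k = ceil(100 * 0.9) = 90
```

Because the TRACE scores are scalar, nothing about this step depends on the generative model; validity comes entirely from the rank argument on exchangeable scores.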

What carries the argument

Transport alignment nonconformity score formed by averaging denoising or velocity-matching errors collected along stochastic trajectories of the generative dynamics.

If this is right

  • The method supplies finite-sample marginal coverage for multi-dimensional outputs from generative models.
  • Prediction regions adapt automatically to multimodal and non-convex conditional distributions.
  • Score quality trades off against computational budget through the number of trajectory steps.
  • The same score construction applies uniformly to both diffusion and flow-matching formulations.
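The budget trade-off in the third bullet follows a generic Monte Carlo argument (a stand-in illustration, not the paper's experiment): averaging B i.i.d. per-step errors shrinks the score's standard deviation at the O(B^{-1/2}) rate, so quadrupling the trajectory budget roughly halves the score noise.

```python
import numpy as np

rng = np.random.default_rng(1)

def score_std(budget, n_repeats=2000):
    """Std. dev. of a Monte Carlo score that averages `budget` i.i.d.
    per-step errors (squared standard normals as a generic stand-in
    for denoising/velocity-matching errors)."""
    errors = rng.standard_normal((n_repeats, budget)) ** 2
    return errors.mean(axis=1).std()

# Spread of the averaged score shrinks roughly as budget ** -0.5.
stds = {B: score_std(B) for B in (16, 64, 256)}
```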

Where Pith is reading between the lines

These are editorial extensions of the paper, not claims the author makes directly.

  • The same trajectory-averaging idea could be tested on other generative processes that admit a denoising or velocity field, such as score-based models outside the diffusion family.
  • Because the scores remain scalar, they could be combined with existing conformal techniques for conditional or adaptive coverage without further modification.
  • If trajectory length is treated as a tunable hyperparameter, one could study whether optimal length depends on the degree of multimodality in the target distribution.

Load-bearing premise

Averaging denoising or velocity-matching errors along stochastic transport trajectories produces nonconformity scores whose calibration yields valid coverage without additional geometric assumptions or likelihood evaluation.
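Stated in standard split-conformal notation (symbols assumed here, not drawn from the paper's text), the premise amounts to:

```latex
% Calibration scores s_i = s(X_i, Y_i) for exchangeable pairs
% (X_1, Y_1), \dots, (X_{n+1}, Y_{n+1}); miscoverage level \alpha.
\hat{q} \;=\; s_{(\lceil (n+1)(1-\alpha) \rceil)}
\quad \text{(order statistic of } s_1, \dots, s_n\text{)},
\qquad
C(X_{n+1}) \;=\; \{\, y : s(X_{n+1}, y) \le \hat{q} \,\},
```
which under exchangeability gives
```latex
\Pr\bigl[\, Y_{n+1} \in C(X_{n+1}) \,\bigr] \;\ge\; 1 - \alpha .
```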

What would settle it

Empirical coverage falling below the nominal level on an exchangeable collection of samples drawn from a known multimodal conditional distribution when the trajectory length used for score computation is held fixed.

Figures

Figures reproduced from arXiv: 2605.07100 by Aixin Tan, Jian Huang, Zhenhan Fang.

Figure 1. Forward and reverse processes of (a) DDPM and (b) Flow Matching on a pinwheel distribution. ◦: trajectory origin; ♦: current position. The reverse trajectories of DDPM are curved due to iterative denoising, whereas those of Flow Matching are nearly straight due to the OT path structure. At t = 0 the path starts at the reference noise y0; at t = 1 it arrives at the data sample y.

Figure 2. Prediction regions on synthetic datasets for a representative test input. The leftmost panel shows the true conditional density; subsequent panels show the 90% conformal prediction region for each method. Gray dots are calibration samples; the red star marks the actual output y.

Figure 3. Prediction regions on the Taxi dataset overlaid on a map of New York City. The cyan dot marks the pick-up location; the red star indicates the actual drop-off for this input. (a) KDE reference density; (b)–(h) conformal prediction regions for each method.

Figure 4. Prediction region volume across 20 repeated experiments on two synthetic and two real-world datasets. Each dot represents one repeat; the horizontal bar indicates the mean.

Figure 5. Effect of Monte Carlo budget B = |T| × R on conformal region volume (top row) and the standard deviation of TRACE evaluated on calibration points (bottom row) for TRACE-Diff and TRACE-FM on Pinwheel (left) and Energy (right). Results are averaged over 20 repeats; solid lines denote the mean and shaded bands indicate ±1 standard deviation. The dashed gray line shows the O(B^{-1/2}) reference decay.
Original abstract

Constructing valid and informative conformal prediction regions for multi-dimensional outputs remains a fundamental challenge. While conformal prediction provides finite-sample, distribution-free coverage guarantees, its practical performance critically depends on the choice of nonconformity score. Existing approaches often rely on restrictive geometric assumptions or require explicit likelihood evaluation and invertible transformations, limiting their applicability in complex generative settings. In this work, we introduce TRACE (TRansport Alignment Conformal Estimation), a conformal prediction framework that defines nonconformity through transport alignment in diffusion and flow matching models. Rather than evaluating likelihoods, we measure how well a candidate output aligns with the learned generative dynamics by averaging denoising or velocity-matching errors along stochastic transport trajectories. The resulting transport-based scores are scalar-valued and can be calibrated using split conformal prediction, yielding valid marginal coverage under exchangeability. We further analyze the statistical properties of the proposed scores and their sensitivity to computational budget. Experiments on synthetic and real datasets demonstrate valid coverage and show that the resulting regions adapt naturally to multimodal and non-convex conditional distributions.

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it; the pith above is the substance, this is the friction.

Referee Report

0 major / 4 minor

Summary. The paper introduces TRACE, a conformal prediction framework for multi-dimensional outputs that defines nonconformity scores via transport alignment in diffusion and flow matching models. Scores are computed as averages of denoising or velocity-matching errors along stochastic transport trajectories from a fixed generative model; these scalar scores are then calibrated via split conformal prediction to obtain valid marginal coverage under exchangeability. The work further analyzes statistical properties of the scores (including sensitivity to computational budget) and reports experiments on synthetic and real datasets demonstrating adaptation to multimodal and non-convex conditionals.

Significance. If the central claim holds, the contribution is a practical, likelihood-free route to valid conformal regions in complex generative settings that avoids restrictive geometric assumptions or invertible maps. The approach reuses pre-trained diffusion/flow models to produce adaptive scores whose validity follows from the standard split-CP rank argument (conditional on the training data used to fit the generative model), which is a clean and useful observation.

minor comments (4)
  1. [Abstract] The abstract asserts valid marginal coverage and statistical analysis but provides no derivation sketch or reference to the precise exchangeability conditioning (training data vs. calibration/test points); a one-paragraph outline in §2 or §3 would clarify that the guarantee is the usual one and does not require extra geometric assumptions.
  2. [Method] The description of score computation (averaging along trajectories) is clear at a high level but lacks an explicit algorithmic box or pseudocode showing how the number of trajectory samples and denoising steps enter the final nonconformity value; this affects reproducibility of the reported efficiency results.
  3. [Experiments] Experiments are said to demonstrate valid coverage and adaptation to multimodal distributions, yet the abstract and summary give no details on number of Monte Carlo repetitions, exact coverage deviation observed, or baseline methods; adding a table with empirical coverage and interval lengths (with standard errors) would strengthen the efficiency claims.
  4. [Notation] Notation for the transport-based score (e.g., how the averaging operator is denoted and whether it is conditional on the candidate point) should be introduced once and used consistently; occasional shifts between the denoising-error and velocity-matching formulations are not always signposted.

Simulated Authors' Rebuttal

0 responses · 0 unresolved

We thank the referee for the positive assessment of our work and the recommendation for minor revision. The provided summary correctly identifies the core idea of TRACE: defining nonconformity scores via averaged denoising or velocity errors along stochastic transport paths in pre-trained diffusion and flow-matching models, followed by standard split conformal calibration to obtain marginal coverage guarantees. We appreciate the recognition that this yields a practical, likelihood-free approach without requiring invertible maps or restrictive geometric assumptions.

Circularity Check

0 steps flagged

No significant circularity

full rationale

The paper defines a nonconformity score by averaging denoising or velocity-matching errors along trajectories from a pre-trained diffusion or flow-matching model, then applies standard split conformal prediction to these scalar scores. The marginal coverage guarantee is obtained directly from the classical exchangeability-based rank argument of split CP and does not depend on the internal construction of the score or on any fitted parameter that is later renamed as a prediction. No self-definitional steps, fitted-input predictions, or load-bearing self-citations appear in the derivation chain; the statistical validity remains independent of the generative-model details.

Axiom & Free-Parameter Ledger

0 free parameters · 1 axiom · 0 invented entities

The framework rests on the standard exchangeability assumption of conformal prediction and the existence of learned generative dynamics in diffusion/flow models; no free parameters or new entities are introduced in the abstract.

axioms (1)
  • domain assumption Data points are exchangeable so that split conformal prediction yields marginal coverage guarantees.
    Explicitly invoked in the abstract for the calibration step.

pith-pipeline@v0.9.0 · 5474 in / 1252 out tokens · 43388 ms · 2026-05-11T01:10:27.968358+00:00 · methodology

discussion (0)


Lean theorems connected to this paper

Citations machine-checked in the Pith Canon. Every link opens the source theorem in the public Lean library.

Reference graph

Works this paper leans on

77 extracted references · 77 canonical work pages · 6 internal anchors
