pith. machine review for the scientific record.

On the Equivalence between Neyman Orthogonality and Pathwise Differentiability

2 Pith papers cite this work.
abstract

It has been frequently observed that Neyman orthogonality, the central device underlying double/debiased machine learning (Chernozhukov et al., 2018), and pathwise differentiability, a cornerstone concept from semiparametric theory, often lead to the same debiased estimators in practice. Despite the widespread adoption of both ideas, the precise nature of this equivalence has remained elusive, with the two concepts having been developed in largely separate traditions. In this work, we revisit the semiparametric framework of van der Laan and Robins (2003) and identify an implicit regularity assumption on the relationship between target and nuisance parameters -- a local product structure -- that allows us to establish a formal equivalence between Neyman orthogonality and pathwise differentiability. We also show that the two directions of this equivalence impose fundamentally different structural requirements. Finally, we illustrate the theory through three detailed examples of estimating the average treatment effect and expected density in a nonparametric model, as well as the slope in a partially linear model. This helps clarify the relationship between these two foundational frameworks and provides a useful reference for practitioners working at their intersection.
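The first worked example named in the abstract, the average treatment effect in a nonparametric model, is the canonical setting in which both frameworks lead to the same debiased estimator: the augmented IPW (AIPW) estimator, whose score is both Neyman orthogonal and equal to the efficient influence function. As a hedged illustration (a minimal numerical sketch on simulated data, not code from the paper; the function name and all data-generating choices are hypothetical):

```python
import numpy as np

def aipw_ate(y, a, mu1, mu0, e):
    """Augmented IPW (doubly robust) estimate of the ATE.

    mu1, mu0: outcome-regression predictions for treated/control;
    e: propensity scores. Returns the point estimate and its
    plug-in standard error based on the empirical influence values.
    """
    psi = mu1 - mu0 + a * (y - mu1) / e - (1 - a) * (y - mu0) / (1 - e)
    return psi.mean(), psi.std(ddof=1) / np.sqrt(len(y))

rng = np.random.default_rng(0)
n = 50_000
x = rng.normal(size=n)
e_true = 1.0 / (1.0 + np.exp(-x))          # propensity score
a = rng.binomial(1, e_true)
y = 2.0 * a + x + rng.normal(size=n)       # true ATE = 2

# With correct nuisances the AIPW score has mean equal to the ATE;
# Neyman orthogonality makes the estimate first-order insensitive
# to small errors in mu1, mu0, or e.
est, se = aipw_ate(y, a, mu1=2.0 + x, mu0=x,
                   e=np.clip(e_true, 0.01, 0.99))
```

In practice the nuisances would be estimated by flexible machine learning with cross-fitting, as in the double/debiased machine learning framework the abstract cites; the orthogonality of the score is what lets those slower-converging nuisance estimates leave the target estimate root-n consistent.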

years: 2026 (2) · verdicts: unverdicted (2)

representative citing papers

Doubly Robust Instrumented Difference-in-Differences

econ.EM · 2026-05-05 · unverdicted · novelty 7.0

Derives the efficient influence function and doubly robust estimators for the local average treatment effect on the treated in instrumented DiD designs with staggered exposure and covariates.

Calibeating Prediction-Powered Inference

stat.ML · 2026-04-23 · unverdicted · novelty 7.0

Post-hoc calibration of miscalibrated black-box predictions on a labeled sample improves efficiency of prediction-powered inference for semisupervised mean estimation.
