On the Equivalence between Neyman Orthogonality and Pathwise Differentiability
2 Pith papers cite this work.
Abstract
It has been frequently observed that Neyman orthogonality, the central device underlying double/debiased machine learning (Chernozhukov et al., 2018), and pathwise differentiability, a cornerstone concept from semiparametric theory, often lead to the same debiased estimators in practice. Despite the widespread adoption of both ideas, the precise nature of this equivalence has remained elusive, with the two concepts having been developed in largely separate traditions. In this work, we revisit the semiparametric framework of van der Laan and Robins (2003) and identify an implicit regularity assumption on the relationship between target and nuisance parameters -- a local product structure -- that allows us to establish a formal equivalence between Neyman orthogonality and pathwise differentiability. We also show that the two directions of this equivalence impose fundamentally different structural requirements. Finally, we illustrate the theory through three detailed examples of estimating the average treatment effect and expected density in a nonparametric model, as well as the slope in a partially linear model. This helps clarify the relationship between these two foundational frameworks and provides a useful reference for practitioners working at their intersection.
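As a pointer to what the abstract means by Neyman orthogonality, the following is the standard textbook formulation from the double/debiased machine learning literature (Chernozhukov et al., 2018), not a formula taken from this paper: a score $\psi(W;\theta,\eta)$ is Neyman orthogonal at $(\theta_0,\eta_0)$ if the moment condition is first-order insensitive to perturbations of the nuisance parameter $\eta$,

```latex
% Neyman orthogonality: the Gateaux derivative of the moment
% condition in the nuisance direction vanishes at the truth.
\partial_r \,
  \mathbb{E}\!\left[\psi\bigl(W;\theta_0,\,\eta_0 + r(\eta-\eta_0)\bigr)\right]
  \Big|_{r=0} = 0 .

% Example: the partially linear model Y = \theta_0 D + g_0(X) + \varepsilon,
% one of the three examples the abstract mentions. With nuisance
% \eta = (g, m), where m(X) = \mathbb{E}[D \mid X], the orthogonal
% (double-residual) score is
\psi(W;\theta,\eta)
  = \bigl(Y - \theta D - g(X)\bigr)\bigl(D - m(X)\bigr).
```

The pathwise-differentiability view from semiparametric theory derives the same score as the efficient influence function of $\theta_0$ in this model; the paper's contribution is to make the conditions under which these two routes coincide explicit.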
Year: 2026
Citing papers
- Doubly Robust Instrumented Difference-in-Differences: Derives the efficient influence function and doubly robust estimators for the local average treatment effect on the treated in instrumented DiD designs with staggered exposure and covariates.
- Calibeating Prediction-Powered Inference: Post-hoc calibration of miscalibrated black-box predictions on a labeled sample improves efficiency of prediction-powered inference for semisupervised mean estimation.