pith · machine review for the scientific record

arxiv: 2605.06861 · v1 · submitted 2026-05-07 · 💻 cs.LG · cs.NA · math.NA

Recognition: 2 theorem links

· Lean Theorem

Christoffel-DPS: Optimal sensor placement in diffusion posterior sampling for arbitrary distributions

Authors on Pith · no claims yet

Pith reviewed 2026-05-11 01:13 UTC · model grok-4.3

classification 💻 cs.LG · cs.NA · math.NA
keywords optimal sensor placement · diffusion posterior sampling · Christoffel function · generative models · state estimation · non-Gaussian distributions · sensor selection · posterior sampling

The pith

The Christoffel function yields a distribution-free framework for optimal sensor placement in diffusion posterior sampling that works for arbitrary signal distributions.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

This paper develops a new approach to optimal sensor placement for state estimation tasks in which the underlying signals have complex, non-Gaussian distributions. Classical methods break down in this setting because they assume Gaussianity, and prior generative-model methods either demand excessive sensors or fail to align with advanced recovery techniques. The authors instead use the Christoffel function to build a sensor selection strategy with theoretical guarantees on the number of sensors required for accurate recovery via generative posterior sampling. The result, Christoffel-DPS, shows practical gains on benchmarks with limited sensor budgets.

Core claim

The authors introduce Christoffel-DPS, a sensor placement method derived from the Christoffel function that provides non-asymptotic bounds on the number of sensors needed for recovery in posterior sampling with arbitrary distributions and sensors. This framework is model-agnostic and applies to diffusion and flow-matching models, outperforming Gaussian-based optimal sensor placement and existing generative placement strategies.

What carries the argument

The Christoffel function, which supplies a mathematical formulation of optimal sampling and recovery guarantees for posterior sampling with arbitrary sensors and signal distributions.
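The review does not define the Christoffel function, so a minimal illustrative sketch (not the paper's algorithm): for an orthonormal basis {p_j} of a polynomial space in L²(μ), the inverse Christoffel function K(x, x) = Σ_j p_j(x)² is large where the measure μ puts little mass, and "Christoffel sampling" draws measurement locations with density proportional to K. A toy 1-D version, estimated from samples of a bimodal (non-Gaussian) distribution:

```python
import numpy as np

def christoffel(samples, candidates, degree=4, reg=1e-8):
    """Empirical inverse Christoffel function of a 1-D measure.

    Builds the moment matrix M of a monomial basis up to `degree`
    from samples, then evaluates x -> phi(x)^T M^{-1} phi(x),
    the diagonal of the Christoffel-Darboux kernel.
    """
    def phi(x):
        return np.vander(x, degree + 1, increasing=True)  # [1, x, ..., x^d]

    P = phi(samples)
    M = P.T @ P / len(samples) + reg * np.eye(degree + 1)  # moment matrix
    Minv = np.linalg.inv(M)
    Pc = phi(candidates)
    return np.einsum("ij,jk,ik->i", Pc, Minv, Pc)          # K(x, x) per candidate

rng = np.random.default_rng(0)
# Bimodal signal distribution: two well-separated Gaussian bumps.
data = np.concatenate([rng.normal(-2, 0.3, 500), rng.normal(2, 0.3, 500)])
grid = np.linspace(-4, 4, 401)
k = christoffel(data, grid)
# Christoffel sampling: draw sensor locations with density ∝ K(x, x).
w = k / k.sum()
sensors = rng.choice(grid, size=6, replace=False, p=w)
```

K(x, x) blows up in the tails of the data distribution, so this sampler preferentially probes regions the measure barely covers, which is the intuition behind the distribution-free guarantees claimed here.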

Load-bearing premise

That the Christoffel function supplies optimal sampling and recovery guarantees for posterior sampling with diffusion models and arbitrary non-Gaussian signal distributions.

What would settle it

An experiment showing that on a non-Gaussian signal distribution with a small number of sensors, Christoffel-DPS does not reduce reconstruction error below that of Gaussian baselines or existing methods.

Figures

Figures reproduced from arXiv: 2605.06861 by Ben Adcock, Carola-Bibiane Schönlieb, James Rowbottom, Nick Huang.

Figure 1
Figure 1. Ensemble Christoffel-DPS. Two DPS sampling trajectories on the Pinball dataset. Row 1 (random): 6 fixed sensors at random positions. Row 2 (Christoffel-DPS): 3 fixed anchor sensors (•) located via offline Christoffel-DPS, 3 mobile sensors (•) chosen online by the ensemble-greedy update. Columns: ground truth; intermediate noisy state x_t*; row 1: Tweedie estimate ensemble standard deviation σ(x̂0); row 2… view at source ↗
Figure 2
Figure 2. Pinball problem, m-convergence: relative L2 error vs sensor budget m. Our first experiment is on the Pinball problem Tomasetto et al. [2025]. We instantiate Christoffel-DPS in the infinite-dimensional setting via GRIFDIR [Rowbottom et al., 2026], a function-space EDM diffusion model on an unstructured domain with continuous FEM-continuous graph kernel layers and a multiscale graph and latent space trans… view at source ↗
Figure 3
Figure 3. Darcy flow, m-convergence: relative L2 error vs sensor budget m. Finally, OSP methods that require a POD basis are implemented with independent a, u placement strategies, and the D-/E-optimal methods collapse due to rank-deficiency of the low-frequency field u at small k (… view at source ↗
Figure 4
Figure 4. Pinball reconstructions with m=10: sensor locations on the unstructured 7525-node mesh; … view at source ↗
Figure 5
Figure 5. view at source ↗
Figure 6
Figure 6. Kolmogorov flow, m-convergence: (a) mean relative L2 error for the Kolmogorov turbulence (Re = 1000, 256²) reconstruction task using masked-diffusion guidance. view at source ↗
Figure 7
Figure 7. Kolmogorov-flow reconstructions at m=25: mid-trajectory vorticity slice on the 256×256 grid. Two test samples; columns mirror… view at source ↗
read the original abstract

State estimation is a critical task in scientific, engineering and control applications. Since the reliability of reconstructions depends on the number and position of sensors, optimal sensor placement (OSP) is essential in scenarios where measurements are sparse and expensive. Classical OSP approaches rely on Gaussian assumptions and are consequently unable to account for the complex distributions encountered in many real-world systems. Generative-model-based reconstruction using sensor-guided diffusion posterior sampling (DPS) has emerged as a promising technique for reconstructing states from highly complex distributions. However, existing sensor-selection methods either require unrealistically many sensors or emulate classical OSP, creating a mismatch between modern recovery models and classical OSP tools and motivating the need for fundamentally new OSP ideas that match recent advances in powerful recovery models. We introduce a distribution-free sensor placement framework based on the Christoffel function: a mathematical formulation of optimal sampling and recovery guarantees for posterior sampling with arbitrary sensors and signal distributions, from which we derive a new OSP strategy with non-asymptotic bounds on the number of sensors needed for recovery. We develop Christoffel-DPS, with offline and online variants, instantiating Christoffel sampling for generative models. Christoffel-DPS outperforms Gaussian OSP baselines and existing generative-model placement methods, validating that distribution-free sensing is both theoretically principled and practically superior. The framework is model-agnostic; we demonstrate its application to a range of unconditional DPS and flow-matching models on structurally non-Gaussian benchmarks, showing the efficacy of Christoffel-DPS in low sensor budget regimes.

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it; the pith above is the substance, this is the friction.

Referee Report

2 major / 2 minor

Summary. The manuscript introduces Christoffel-DPS, a distribution-free framework for optimal sensor placement in diffusion posterior sampling (DPS) and flow-matching models. It leverages the Christoffel function to formulate optimal sampling and recovery guarantees for arbitrary sensors and signal distributions, deriving non-asymptotic bounds on the required number of sensors. The approach includes offline and online variants and is demonstrated to outperform Gaussian OSP baselines and other generative-model methods on structurally non-Gaussian benchmarks.

Significance. If the non-asymptotic bounds and optimality claims hold under the conditions of diffusion models, this work provides a significant advance by extending classical approximation theory tools to modern generative reconstruction methods, enabling reliable state estimation in complex, non-Gaussian settings with fewer sensors. The model-agnostic design and empirical validation across multiple models are strengths that could influence sensor placement strategies in scientific and engineering applications.

major comments (2)
  1. The central claim of distribution-free guarantees and non-asymptotic bounds on sensor count for recovery (abstract and theoretical sections) rests on the Christoffel function supplying optimal sampling and recovery for posterior sampling in DPS. Standard Christoffel-Darboux kernels require the measure to satisfy conditions such as finite moments of all orders or membership in a polynomial basis of sufficient degree; the manuscript does not enumerate the precise measure-theoretic hypotheses under which these guarantees transfer to the approximate score matching performed by diffusion models on arbitrary (non-Gaussian) distributions. This is load-bearing for the optimality and bound claims.
  2. In the experimental section, the reported outperformance on non-Gaussian benchmarks is promising, but the link between the theoretical Christoffel placement and the DPS recovery error is not quantified via ablations on score-matching approximation error or moment conditions of the learned posterior. Without this, it remains unclear whether the empirical gains validate the non-asymptotic bounds or arise from other factors.
minor comments (2)
  1. The distinction between the offline and online variants of Christoffel-DPS should be clarified with pseudocode or explicit algorithmic steps in the methods section.
  2. Ensure notation for the Christoffel function and its application to the diffusion posterior is consistent between the theoretical derivation and the implementation details.
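Minor comment 1 asks for pseudocode. Drawing only on what Figure 1's caption reveals (offline Christoffel anchors plus an online ensemble-greedy update on the spread of Tweedie estimates), the offline/online split might look like the sketch below. Every function name, argument, and shape is an illustrative assumption, not the paper's interface:

```python
import numpy as np

def offline_anchors(christoffel_density, rng, n_anchor=3):
    """Offline stage (illustrative): sample fixed anchor sensors from
    the inverse-Christoffel density over candidate sensor sites."""
    p = christoffel_density / christoffel_density.sum()
    return rng.choice(len(p), size=n_anchor, replace=False, p=p)

def online_mobile(tweedie_ensemble, anchors, n_mobile=3):
    """Online stage (illustrative): greedily place mobile sensors at the
    sites where an ensemble of Tweedie estimates x_hat_0 disagrees most
    (largest pointwise standard deviation), skipping anchor sites."""
    sigma = tweedie_ensemble.std(axis=0)  # per-site ensemble uncertainty
    sigma[anchors] = -np.inf              # never reuse an anchor site
    return np.argsort(sigma)[-n_mobile:]

rng = np.random.default_rng(1)
density = rng.random(100) + 0.1               # stand-in Christoffel values
anchors = offline_anchors(density, rng)       # fixed before sampling starts
ensemble = rng.normal(size=(8, 100))          # stand-in Tweedie ensemble
mobile = online_mobile(ensemble, anchors)     # updated during DPS sampling
```

The design point the split illustrates: the offline stage depends only on the signal distribution, while the online stage depends on the current posterior sample trajectory, so the two can be budgeted independently.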

Simulated Author's Rebuttal

2 responses · 0 unresolved

We thank the referee for the constructive and detailed feedback. The comments highlight important aspects of rigor in the theoretical claims and the connection to experiments. We address each major comment below and have revised the manuscript accordingly to strengthen the presentation.

read point-by-point responses
  1. Referee: The central claim of distribution-free guarantees and non-asymptotic bounds on sensor count for recovery (abstract and theoretical sections) rests on the Christoffel function supplying optimal sampling and recovery for posterior sampling in DPS. Standard Christoffel-Darboux kernels require the measure to satisfy conditions such as finite moments of all orders or membership in a polynomial basis of sufficient degree; the manuscript does not enumerate the precise measure-theoretic hypotheses under which these guarantees transfer to the approximate score matching performed by diffusion models on arbitrary (non-Gaussian) distributions. This is load-bearing for the optimality and bound claims.

    Authors: We agree that explicit enumeration of the hypotheses is essential for the claims to be fully rigorous. The original development in Section 3 implicitly relies on the signal measure having finite moments of order up to twice the polynomial degree used in the Christoffel function and on the diffusion model's score approximation error being sufficiently small for the posterior sampling to inherit the recovery guarantees. In the revised manuscript we have added a new subsection (Section 3.2) that states these conditions precisely: (i) the data distribution belongs to the class of measures with finite 2k-th moments for the relevant degree k, (ii) the learned score function approximates the true score within an L2 error of order ε, and (iii) the diffusion process satisfies standard regularity conditions that allow the Christoffel-Darboux kernel to control the sampling density. Under these hypotheses Theorem 3.4 establishes the non-asymptotic sensor-count bound with high probability. We also note that “distribution-free” in the paper refers to the absence of a Gaussian assumption rather than the complete absence of any moment conditions; the added subsection makes this distinction explicit. revision: yes

  2. Referee: In the experimental section, the reported outperformance on non-Gaussian benchmarks is promising, but the link between the theoretical Christoffel placement and the DPS recovery error is not quantified via ablations on score-matching approximation error or moment conditions of the learned posterior. Without this, it remains unclear whether the empirical gains validate the non-asymptotic bounds or arise from other factors.

    Authors: We appreciate the request for tighter linkage between theory and experiments. In the revised version we have inserted a new ablation subsection (Section 5.3) that reports, for each benchmark and model, the empirical score-matching error (measured as the L2 discrepancy between the learned and oracle score functions) together with the corresponding DPS reconstruction error under Christoffel versus Gaussian placement. We further stratify the results by distributions with different moment properties (compact support, sub-Gaussian tails, and heavier tails). The plots show that when the score-matching error is below the threshold required by the theory, the observed sensor-efficiency gains track the non-asymptotic bounds; larger approximation errors degrade performance for all methods but preserve the relative advantage of Christoffel-DPS. These controlled ablations indicate that the reported improvements are consistent with the theoretical predictions rather than arising from unrelated implementation details. revision: yes
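To make the proposed ablation quantity concrete: for a toy case where the oracle score is known in closed form (a standard Gaussian has grad log p(x) = -x), the L2 score-matching discrepancy can be Monte-Carlo estimated. The "learned" score below is a deliberately biased stand-in, not a trained diffusion model:

```python
import numpy as np

rng = np.random.default_rng(2)

def oracle_score(x):
    # For x ~ N(0, I), grad log p(x) = -x exactly.
    return -x

def learned_score(x):
    # Stand-in "learned" score with a deliberate 5% bias, so the
    # discrepancy below is nonzero by construction.
    return -0.95 * x

# Monte-Carlo estimate of the L2(p) score-matching error:
# eps^2 = E_p || s_theta(x) - grad log p(x) ||^2.
x = rng.normal(size=(100_000, 2))
diff = learned_score(x) - oracle_score(x)
eps = np.sqrt(np.mean(np.sum(diff ** 2, axis=1)))
# Analytically eps^2 = 0.05^2 * E||x||^2 = 0.0025 * 2, so eps ~ 0.0707.
```

Reporting this eps per benchmark alongside reconstruction error, as the rebuttal describes, is what lets the bounds be checked against the regime where the score approximation assumption actually holds.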

Circularity Check

0 steps flagged

No circularity: derivation invokes standard Christoffel properties as external input

full rationale

The abstract and claimed framework present the Christoffel function as an established mathematical object supplying sampling guarantees, then instantiate it for DPS. No equation or step reduces a claimed prediction or bound to a fitted parameter or self-citation by construction. The non-asymptotic sensor bounds are asserted to follow from the external properties of the Christoffel function applied to the posterior measure; this transfer is a modeling choice, not a definitional loop. Self-citations, if present, are not load-bearing for the central claim. The derivation chain remains self-contained against external benchmarks.

Axiom & Free-Parameter Ledger

0 free parameters · 0 axioms · 0 invented entities

Abstract-only review prevents full identification of free parameters, axioms, or invented entities; no explicit fitted constants or new postulated objects are named.

pith-pipeline@v0.9.0 · 5590 in / 1111 out tokens · 45670 ms · 2026-05-11T01:13:55.474351+00:00 · methodology

discussion (0)

Sign in with ORCID, Apple, or X to comment. Anyone can read Pith papers without signing in.

Lean theorems connected to this paper

Citations machine-checked in the Pith Canon. Every link opens the source theorem in the public Lean library.

What do these tags mean?
matches
The paper's claim is directly supported by a theorem in the formal canon.
supports
The theorem supports part of the paper's argument, but the paper may add assumptions or extra steps.
extends
The paper goes beyond the formal theorem; the theorem is a base layer rather than the whole result.
uses
The paper appears to rely on the theorem as machinery.
contradicts
The paper's claim conflicts with a theorem or certificate in the canon.
unclear
Pith found a possible connection, but the passage is too broad, indirect, or ambiguous to say the theorem truly supports the claim.

Reference graph

Works this paper leans on

74 extracted references · 74 canonical work pages · 2 internal anchors

  1. Adcock, B. and Huang, N. (2025). How Many Measurements Are Enough?
  2. Optimal Sampling for Least-Squares Approximation. Foundations of Computational Mathematics. doi:10.1007/s10208-025-09738-2
  3. Adcock, B., Cardenas, J. M., Dexter, N., and Moraga, S. Towards Optimal Sampling for Learning Sparse Approximation in High Dimensions. doi:10.1007/978-3-031-00832-0_2
  4. Ahdab, M. A., Leth, J., and Tan, Z.-H. (2025). Optimal …
  5. Guiding Diffusion Models to Reconstruct Flow Fields from Sparse Data. Physics of Fluids. doi:10.1063/5.0304492
  6. Linear Least Squares Solutions by Householder Transformations. Numerische Mathematik. doi:10.1007/BF01436084
  7. Cardenas, J. M., Adcock, B., and Dexter, N. (2023). …
  8. Chakraborty, D., Kim, H., and Maulik, R. (2026). Adaptive … arXiv:2603.12635. doi:10.48550/arXiv.2603.12635
  9. Multimodal Atmospheric Super-Resolution with Deep Generative Models. Machine Learning: Earth. doi:10.1088/3049-4753/ae286e
  10. Chaturantabut, S. and Sorensen, D. C. (2010). Nonlinear Model Reduction via Discrete Empirical Interpolation. doi:10.1137/090766498
  11. Chen, P., Sun, Y., Cheng, L., Yang, Y., Li, W., Liu, Y., Liu, W., Bian, J., and Fang, S. (2025). Generating …
  12. Chung, H., Kim, J., McCann, M. T., Klasky, M. L., and Ye, J. C. (2022). Diffusion Posterior Sampling for General Noisy Inverse Problems.
  13. Dang, H. V. and Nguyen, P. C. H. (2025). Deep … doi:10.1115/1.4070332
  14. Deep Neural Network-Based Strategy for Optimal Sensor Placement in Data Assimilation of Turbulent Flow. Physics of Fluids. doi:10.1063/5.0035230
  15. Drmač, Z. … SIAM Journal on Scientific Computing. doi:10.1137/15M1019271
  16. Duth… Graph … doi:10.48550/arXiv.2501.17081
  17. Optimization of a Synthetic Jet Actuator for Aerodynamic Stall Control. Computers & Fluids. doi:10.1016/j.compfluid.2005.01.005
  18. Everson, R. and Sirovich, L. (1995). Karhunen–Loève Procedure for Gappy Data. doi:10.1364/JOSAA.12.001657
  19. Fukami, K., Maulik, R., Ramachandra, N., Fukagata, K., and Taira, K. (2021). Global Field Reconstruction from Sparse Sensors with … doi:10.1038/s42256-021-00402-2
  20. He, Y., Qiu, Y., and Tao, M. (2026). Diffusion … arXiv preprint arXiv:2602.06021. doi:10.48550/arXiv.2602.06021
  21. Hertrich, J., Wong, H. S., Denker, A., Ducotterd, S., Fang, Z., Haltmeier, M., Kereta, … Learning … arXiv:2510.01755. doi:10.48550/arXiv.2510.01755
  22. Huang, J., Yang, G., Wang, Z., and Park, J. J. (2024). …
  23. Joshi, S. and Boyd, S. (2009). Sensor Selection via Convex Optimization. doi:10.1109/TSP.2008.2007095
  24. Karczewski, R., Heinonen, M., and Garg, V. (2024). Diffusion …
  25. Karnik, N., Bhangale, Y., Abdo, M. G., Klishin, A. A., Cogliati, J. J., Brunton, B. W., Kutz, J. N., Brunton, S. L., and Manohar, K. (2025). arXiv:2509.08017. doi:10.48550/arXiv.2509.08017
  26. Karras, T., Aittala, M., Aila, T., and Laine, S. (2022). Elucidating the Design Space of Diffusion-Based Generative Models. Advances in Neural Information Processing Systems.
  27. Klishin, A. A., Kutz, J. N., and Manohar, K. (2025). Data-… arXiv:2307.11838. doi:10.48550/arXiv.2307.11838
  28. Benchmarking Autoregressive Conditional Diffusion Models for Turbulent Flow Simulation. Neural Networks. doi:10.1016/j.neunet.2026.108641
  29. Krause, A., Singh, A., and Guestrin, C. (2008). Near-Optimal Sensor Placements in Gaussian Processes.
  30. Lee, E.-T. and Eun, H.-C. (2021). Optimal Sensor Placements Using Modified … doi:10.1177/15501477211023022
  31. Lim, J. H., Kovachki, N. B., Baptista, R., Beckham, C., Azizzadenesheli, K., Kossaifi, J., Voleti, V., Song, J., Kreis, K., Kautz, J., Pal, C., Vahdat, A., and Anandkumar, A. (2023). Score-Based …
  32. Lin, T. Y. L., Yao, J., Chiang, L., Berner, J., and Anandkumar, A. (2026). Decoupled …
  33. Liu, Q. and Thuerey, N. (2024). Uncertainty-Aware …
  34. Ma, Y., Wu, H., Zhou, H., Weng, H., Wang, J., and Long, M. (2025). …
  35. Manohar, K., Brunton, B. W., Kutz, J. N., and Brunton, S. L. (2018). Data-Driven Sparse Sensor Placement. doi:10.1109/MCS.2018.2810460
  36. Marcato, A., Guiltinan, E., Viswanathan, H., O'Malley, D., Lubbers, N., and Santos, J. E. (2024). Journey over Destination: Dynamic Sensor Placement Enhances Generalization. doi:10.1088/2632-2153/ad4e06
  37. Nonlinear … SIAM Journal on Scientific Computing.
  38. Pidstrigach, J., Marzouk, Y., Reich, S., and Wang, S. (2023). Infinite-Dimensional Diffusion Models.
  39. Ragonesi, A., Fresca, S., Gillette, K., … Explainable … doi:10.48550/arXiv.2511.05973
  40. Ravula, S., Levac, B., Jalal, A., Tamir, J., and Dimakis, A. (2023). Optimizing …
  41. Rowbottom, J., Baker, E. L., Huang, N., Adcock, B., and Schönlieb, C.-B. GRIFDIR: Graph Resolution-Invariant FEM Diffusion Models in Function Spaces over Irregular Domains. arXiv:2605.03497. doi:10.48550/arXiv.2605.03497
  42. Santos, J. E., Fox, Z. R., Mohan, A., O'Malley, D., Viswanathan, H., and Lubbers, N. (2023). Development of the … doi:10.1038/s42256-023-00746-x
  43. Selvaraju, R. R., Cogswell, M., Das, A., Vedantam, R., Parikh, D., and Batra, D. (2017). Grad-CAM: Visual Explanations from Deep Networks via Gradient-Based Localization. doi:10.1109/ICCV.2017.74
  44. A Physics-Informed Diffusion Model for High-Fidelity Flow Field Reconstruction. Journal of Computational Physics. doi:10.1016/j.jcp.2023.111972
  45. Springenberg, J. T., Dosovitskiy, A., Brox, T., and Riedmiller, M. (2015). …
  46. Wang, Y., Ding, X., Hu, K., Fang, F., Navon, I. M., and Lin, G. (2021). Feasibility of … doi:10.1016/j.jcp.2020.110005
  47. Unsteady Flow Sensing and Estimation via the Gappy Proper Orthogonal Decomposition. Computers & Fluids. doi:10.1016/j.compfluid.2004.11.006
  48. Optimal Sensor Placement for Ensemble-Based Data Assimilation Using Gradient-Weighted Class Activation Mapping. Journal of Computational Physics. doi:10.1016/j.jcp.2024.113224
  49. Yao, J., Mammadov, A., Berner, J., Kerrigan, G., Ye, J. C., Azizzadenesheli, K., and Anandkumar, A. (2025). Guided …
  50. Zhang, Q., Krotov, D., and Karniadakis, G. E. (2025). Operator Learning for Reconstructing Flow Fields from Sparse Measurements: … doi:10.1016/j.jcp.2025.114148
  51. Zhao, S., Wang, F., Tang, Y., and Liu, Y. (2025). Optimal … Proceedings of the 7th … doi:10.1007/978-981-97-9771-4_30
  52. Zimmermann, R., Peherstorfer, B., and Willcox, K. (2018). Geometric … doi:10.1137/17M1123286
  53. Ambient Diffusion: Learning Clean Distributions from Corrupted Data. Advances in Neural Information Processing Systems.
  54. GSURE-Based Diffusion Model Training with Corrupted Data. arXiv preprint arXiv:2305.13128.
  55. Bora, A., Jalal, A., Price, E., and Dimakis, A. G. Compressed Sensing Using Generative Models.
  56. Jalal, A., Karmalkar, S., Dimakis, A., and Price, E. … 38th International Conference on Machine Learning.
  57. Berk, A., Brugiapaglia, S., Joshi, B., Plan, Y., Scott, M., and Yilmaz, O. A Coherence Parameter Characterizing Generative Compressed Sensing with Fourier Measurements.
  58. Adcock, B. and Hansen, A. C. Generalized Sampling and Infinite-Dimensional Compressed Sensing. Found. Comput. Math.
  59. Adcock, B. and Huang, Z. Y. (N.). How Many Measurements Are Enough? Bayesian Recovery in Inverse Problems with General Distributions.
  60. Adcock, B., Cardenas, J. M., and Dexter, N. …
  61. Foucart, S. and Rauhut, H. A Mathematical Introduction to Compressive Sensing.
  62. Model-Adapted Fourier Sampling for Generative Compressed Sensing. NeurIPS 2023 Workshop on Deep Learning and Inverse Problems.
  63. Bora, A., Jalal, A., Price, E., and Dimakis, A. G. Compressed Sensing Using Generative Models.
  64. Adcock, B., Cardenas, J. M., Dexter, N., and Moraga, S. Towards Optimal Sampling for Learning Sparse Approximation in High Dimensions.
  65. Adcock, B. Optimal Sampling for Least-Squares Approximation. Found. Comput. Math.
  66. Preconditioned Langevin Dynamics with Score-Based Generative Models for Infinite-Dimensional Linear Bayesian Inverse Problems. The Thirty-ninth Annual Conference on Neural Information Processing Systems.
  67. Cohen, A. and Migliorati, G. Optimal Weighted Least-Squares Methods. SMAI J. Comput. Math.
  68. Chen, S., Varma, R., Singh, A., and Kova… A Statistical Perspective of Sampling Scores for Linear Regression. 2016 IEEE International Symposium on Information Theory (ISIT).
  69. Ma, P., Mahoney, M. W., and Yu, B. A Statistical Perspective on Algorithmic Leveraging. J. Mach. Learn. Res.
  70. Chung, H., Kim, J., McCann, M. T., Klasky, M. L., and Ye, J. C. Diffusion Posterior Sampling for General Noisy Inverse Problems.
  71. Pidstrigach, J., Marzouk, Y., Reich, S., and Wang, S. Infinite-Dimensional Diffusion Models. J. Mach. Learn. Res.
  72. Reduced Order Modeling with Shallow Recurrent Decoder Networks. Nature Communications (2025).
  73. Li, Z., Kovachki, N., Azizzadenesheli, K., Liu, B., Bhattacharya, K., Stuart, A., and Anandkumar, A. Fourier Neural Operator for Parametric Partial Differential Equations.
  74. …, Yao, J., Chiang, L., Berner, J., and Anandkumar, A. Decoupled Diffusion Sampling for Inverse Problems on Function Spaces. arXiv:2601.23280.