Robust Optimal Experimental Design Accounting for Sensor Failure
Pith reviewed 2026-05-10 09:38 UTC · model grok-4.3
The pith
Robust optimal designs for accelerometer placement outperform classical designs when sensors fail during vibration tests.
A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.
Core claim
Although robust and classical designs are similar for the structural dynamics problem of interest, robust designs outperform classical designs on average over the relevant failure scenarios. The work employs a relaxation-based approach with gradient-based optimization and a binary-inducing penalty to generate sensor designs that are robust to failures, evaluated via the log-determinant of the parameter covariance and via mean-squared errors.
What carries the argument
Relaxation-based robust optimal experimental design formulation with binary-inducing penalty applied to high-dimensional sensor placement in expensive finite-element models of structural dynamics.
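The relaxation-plus-penalty machinery can be sketched in a few lines. The following is a minimal illustration under stated assumptions, not the paper's implementation: sensor weights are relaxed from binary to w in [0,1], the D-criterion log det M(w) is ascended by projected gradient, and a penalty gamma*w*(1-w) nudges each weight toward 0 or 1. The matrix `A`, budget `k`, and all hyperparameters are hypothetical stand-ins.

```python
import numpy as np

def fisher(w, A, eps=1e-6):
    # Fisher information of the weighted design: M(w) = A^T diag(w) A + eps*I
    return A.T @ (w[:, None] * A) + eps * np.eye(A.shape[1])

def robust_relaxed_design(A, k, gamma=1.0, lam=2.0, lr=0.01, steps=500, seed=0):
    # Relaxed D-optimal placement: ascend log det M(w) over w in [0,1]^n,
    # with a binary-inducing penalty gamma*w*(1-w) and a soft budget
    # penalty lam*(sum(w) - k)^2, via projected gradient ascent.
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    w = np.clip(k / n + 0.01 * rng.standard_normal(n), 0.0, 1.0)
    for _ in range(steps):
        Minv = np.linalg.inv(fisher(w, A))
        # d/dw_i log det M(w) = a_i^T M^{-1} a_i
        grad = np.einsum('ij,jk,ik->i', A, Minv, A)
        grad -= gamma * (1.0 - 2.0 * w)        # pushes w toward {0, 1}
        grad -= 2.0 * lam * (w.sum() - k)      # keeps roughly k sensors
        w = np.clip(w + lr * grad, 0.0, 1.0)   # projection onto the box
    return w
```

Here `A` stacks one sensitivity row per candidate sensor and `k` is the sensor budget; both names, like the hyperparameters, are placeholders for the paper's finite-element quantities.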
If this is right
- Robust designs maintain better average parameter estimation accuracy across failure scenarios even when the nominal designs look similar.
- The relaxation plus penalty method produces usable binary sensor layouts without relying on post-optimization rounding.
- Metrics based on log-determinant of covariance and on mean-squared parameter or prediction errors can be used interchangeably to drive the placement.
- The same framework applies directly to other high-dimensional vibration problems where sensor loss is common.
- Classical designs remain adequate when failure probability is low, but the performance gap widens as failure likelihood increases.
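Averaging a design criterion over failure scenarios amounts to a Monte Carlo expectation over random dropout masks. A minimal sketch, assuming a linear-Gaussian model with unit-noise observation rows `A` and independent Bernoulli failures; the function name and parameters are illustrative, not the paper's:

```python
import numpy as np

def avg_logdet_cov(w, A, p_fail=0.1, n_mc=500, eps=1e-6, seed=0):
    # Monte Carlo estimate of E[log det Cov] under independent Bernoulli
    # sensor dropouts with failure probability p_fail (smaller is better).
    rng = np.random.default_rng(seed)
    d = A.shape[1]
    vals = []
    for _ in range(n_mc):
        alive = rng.random(A.shape[0]) > p_fail  # surviving sensors
        m = w * alive                            # effective design weights
        M = A.T @ (m[:, None] * A) + eps * np.eye(d)
        # log det Cov = -log det M for a linear-Gaussian model
        vals.append(-np.linalg.slogdet(M)[1])
    return float(np.mean(vals))
```

Evaluating a robust and a classical design with the same seed compares them on identical failure draws, which is the kind of paired comparison the average-outperformance claim rests on.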
Where Pith is reading between the lines
- Test engineers could pre-compute both a robust layout and a classical layout and switch to the robust one for high-risk experiments.
- The method could be extended to continuous failure probabilities rather than discrete scenarios to cover more realistic uncertainty.
- Similar robust formulations might improve sensor placement in other fields such as structural health monitoring or acoustic testing.
- Physical validation on a laboratory shaker table with deliberate sensor disconnections would provide the next concrete check.
Load-bearing premise
The specific failure scenarios examined are representative of actual sensor failures that occur during high-acceleration vibration experiments.
What would settle it
Running the same sensor-placement optimization on data from physical vibration experiments that actually experience random sensor failures and measuring whether the robust design still yields lower average estimation error than the classical design.
original abstract
Optimal experimental design (OED) provides a way of determining a priori the best locations at which to place accelerometers in vibrations analysis experiments. However, in practice, sensors often fail during experimentation due to high mechanical accelerations. There has been limited work exploring the use of robust OED in the context of vibrations analysis, where design spaces (i.e. candidate sensor locations and orientations) are high-dimensional and the finite-element models are expensive to compute. Therefore, this work considers the application of more general robust OED formulations to such a structural dynamics problem. We employ a relaxation-based approach that enables the use of efficient gradient-based optimization. Furthermore, we leverage a binary-inducing penalty during optimization to provide a binary sensor design as an alternative to leveraging post-optimization rounding heuristics. We consider performance metrics based on the log-determinant of the parameter covariance as well as those based on parameter and prediction mean-squared errors. We find that although robust and classical designs are similar for the structural dynamics problem of interest, robust designs outperform classical designs on average over relevant failure scenarios of interest.
Editorial analysis
A structured set of objections, weighed in public.
Referee Report
Summary. The paper develops a robust optimal experimental design (OED) method for placing accelerometers in structural vibration experiments that accounts for potential sensor failures due to high mechanical accelerations. It employs a relaxation-based formulation solved via gradient-based optimization, augmented with a binary-inducing penalty to directly yield binary sensor designs without post-hoc rounding. Performance is assessed using the log-determinant of the parameter covariance matrix as well as parameter and prediction mean-squared error criteria. For the structural dynamics test case, the resulting robust designs are similar to classical (non-robust) designs, yet the authors report that they outperform the classical designs on average when evaluated over the modeled failure scenarios.
Significance. If the modeled sensor-failure distribution accurately represents real high-acceleration vibration tests, the approach could improve the reliability of sensor placements in expensive structural-dynamics experiments. The use of an efficient relaxation scheme and a direct binary penalty are methodological strengths that avoid common rounding heuristics. However, the reported similarity between robust and classical designs implies that any advantage is modest and highly sensitive to the choice of failure model; without quantitative results, error bars, or validation against observed failure statistics, the practical impact remains limited.
major comments (2)
- [Abstract] The central claim that 'robust designs outperform classical designs on average over relevant failure scenarios of interest' is stated without any numerical values, confidence intervals, number of scenarios, or effect-size metrics. Because the designs are described as similar for the nominal problem, the magnitude and statistical reliability of the reported average advantage cannot be assessed from the information given.
- [§3] Failure model definition: the outperformance is computed exclusively with respect to the authors' chosen probabilistic failure scenarios (independent dropouts or worst-case subsets). No calibration, sensitivity study, or comparison to empirical failure statistics from high-acceleration tests is provided. Given that the robust and classical designs are similar, this modeling choice is load-bearing for the practical conclusion that robust designs should be preferred.
minor comments (2)
- [Abstract] The abstract and introduction would benefit from a brief statement of the dimensionality of the candidate sensor set and the computational cost of the finite-element model to contextualize the need for the relaxation approach.
- [§2] Notation for the binary penalty term and the relaxation parameter should be introduced once and used consistently; occasional switches between symbols for the same quantity reduce readability.
Simulated Author's Rebuttal
We thank the referee for their constructive and detailed feedback on our manuscript. We address each major comment below and will revise the manuscript accordingly to improve clarity and strengthen the presentation of results.
point-by-point responses
-
Referee: [Abstract] The central claim that 'robust designs outperform classical designs on average over relevant failure scenarios of interest' is stated without any numerical values, confidence intervals, number of scenarios, or effect-size metrics. Because the designs are described as similar for the nominal problem, the magnitude and statistical reliability of the reported average advantage cannot be assessed from the information given.
Authors: We agree that the abstract would benefit from quantitative details to support the outperformance claim. The manuscript includes Monte Carlo evaluations of the designs over failure scenarios, and we will revise the abstract to report specific metrics such as the average improvement in the log-determinant of the parameter covariance matrix, the number of scenarios sampled, and measures of variability (e.g., standard deviation across scenarios). This will enable readers to better assess the effect size and reliability of the reported advantage. revision: yes
-
Referee: [§3] Failure model definition: the outperformance is computed exclusively with respect to the authors' chosen probabilistic failure scenarios (independent dropouts or worst-case subsets). No calibration, sensitivity study, or comparison to empirical failure statistics from high-acceleration tests is provided. Given that the robust and classical designs are similar, this modeling choice is load-bearing for the practical conclusion that robust designs should be preferred.
Authors: The failure models (independent Bernoulli dropouts and worst-case subsets) were selected as plausible representations of sensor risks in high-acceleration environments, given the limited availability of empirical failure statistics in the literature. We acknowledge that this assumption is central to the conclusions. In the revision, we will add a sensitivity study in §3 by varying the dropout probability and report its impact on the relative performance of robust versus classical designs. We will also clarify that the overall framework is general and can accommodate any specified failure distribution, including empirical ones when available. revision: partial
- Out of scope: direct calibration or validation of the sensor failure model against empirical statistics from real high-acceleration vibration tests, as such data is not publicly available and would require dedicated experimental collaboration outside the scope of this work.
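The worst-case-subset failure model discussed in the exchange above can be checked by brute force for small designs: enumerate every subset of failed sensors and keep the worst D-criterion value. A sketch under the same linear-model assumption as the paper's log-determinant metric; the function name and parameters are hypothetical:

```python
import numpy as np
from itertools import combinations

def worst_case_logdet(w, A, n_fail=1, eps=1e-6):
    # Brute-force worst-case D-criterion: over every subset of n_fail
    # failed sensors, the smallest log det of the surviving Fisher matrix.
    active = np.flatnonzero(w > 0.5)
    d = A.shape[1]
    worst = np.inf
    for lost in combinations(active, n_fail):
        keep = np.setdiff1d(active, lost)
        M = A[keep].T @ A[keep] + eps * np.eye(d)
        worst = min(worst, np.linalg.slogdet(M)[1])
    return worst
```

Sweeping `n_fail` (or, for the Bernoulli model, the dropout probability) and re-evaluating both designs is one concrete form the promised sensitivity study could take.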
Circularity Check
No significant circularity in derivation chain
full rationale
The paper formulates a robust OED problem using a standard log-determinant covariance objective (augmented by a relaxation and binary penalty), optimizes designs for the structural dynamics model, and numerically evaluates both robust and classical designs on the same failure scenarios. This evaluation is a direct computational consequence of the optimization but does not redefine the metric in terms of itself, rename a fitted quantity as a prediction, or rely on self-citations for load-bearing uniqueness or ansatz justification. The central claim rests on independent numerical comparison rather than reducing to its inputs by construction. The paper is self-contained against external benchmarks with no evident circular steps.
Axiom & Free-Parameter Ledger
axioms (1)
- domain assumption: the finite-element model used to simulate sensor responses accurately captures the relevant structural dynamics.