pith. machine review for the scientific record. sign in

arxiv: 2604.25646 · v1 · submitted 2026-04-28 · 💻 cs.CV · cs.RO

Recognition: unknown

SAMe: A Semantic Anatomy Mapping Engine for Robotic Ultrasound

Authors on Pith no claims yet

Pith reviewed 2026-05-07 16:41 UTC · model grok-4.3

classification 💻 cs.CV cs.RO
keywords robotic ultrasoundanatomical mappingsemantic groundingprobe initializationpatient-specific anatomyorgan localization
0
0 comments X

The pith

SAMe turns one external body image into patient-specific organ locations and 6-DoF probe starting positions for robotic ultrasound from clinical complaints.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

The paper introduces SAMe to supply robotic ultrasound systems with an explicit anatomical prior so they can decide where to begin scanning without expert help. It processes an under-specified complaint by first identifying the target organ, then constructing a lightweight patient-specific anatomical model solely from one external photograph of the body surface, and finally mapping that model to concrete probe placement states. No preoperative CT or MRI registration is required at any step. Real-robot trials report organ-hit rates of 97.3 percent for liver initialization and 81.7 percent for kidney initialization across tested targets, with the centroid-only case still exceeding a surface-heuristic baseline. The resulting anatomical representation is designed to be fast to compute and directly usable by downstream controllers.

Core claim

SAMe implements a target-to-anatomy-to-action pipeline that grounds clinical complaints into structured organs, instantiates an explicit patient-specific anatomical representation from a single external body image, and translates the representation into control-facing 6-DoF probe initialization states, achieving 97.3 percent liver and 81.7 percent kidney hit rates in real-robot experiments without additional registration.

What carries the argument

The semantic anatomy mapping engine that creates an explicit, lightweight patient-specific anatomical representation from a single external body image and converts it into probe initialization states.

If this is right

  • Robotic ultrasound systems can initiate scans directly from patient complaints without expert intervention at the start.
  • The initialization pipeline operates without any preoperative CT or MRI input.
  • SAMe positions outperform surface-heuristic baselines even when targeting only organ centroids.
  • The explicit anatomical layer is compatible with further image-driven control for complete autonomous scanning pipelines.

Where Pith is reading between the lines

These are editorial extensions of the paper, not claims the author makes directly.

  • The single-image anatomy model could be tested for robustness across wider ranges of body habitus and patient positioning.
  • Combining SAMe initialization with existing local image optimization modules would create an end-to-end complaint-driven scanning system.
  • The approach suggests a path to reduce reliance on preoperative volumetric imaging for routine robotic ultrasound tasks.

Load-bearing premise

That a single external body image is sufficient to instantiate an accurate, patient-specific anatomical representation for the grounded targets without any additional registration using preoperative CT or MRI.

What would settle it

A real-robot initialization run on a new patient in which the probe placed by SAMe produces an ultrasound image that misses the target organ centroid by more than the acceptance tolerance, while a surface-heuristic placement on the same patient succeeds.

Figures

Figures reproduced from arXiv: 2604.25646 by Bo Du, Christoph F. Dietrich, Dacheng Tao, Duojie Chen, Jianxin Liu, Jing Zhang, Qinghong Zhao, Wentao Jiang, Xinwu Cui, Zihan Lou.

Figure 1
Figure 1. Figure 1: From manual expertise to autonomous robotic ultrasonography. view at source ↗
Figure 2
Figure 2. Figure 2: The SAMe system architecture bridging clinical semantics to robotic execution. view at source ↗
Figure 3
Figure 3. Figure 3: Overview of the SAMe semantic prior database and RAG performance across LLM view at source ↗
Figure 4
Figure 4. Figure 4: Clinical Semantics Grounding results. (a) System role of the semantic layer in SAMe: clinical complaint or report text is retrieved against the SAMe semantic prior and grounded into a structured target specification comprising target organ, related anatomy, prioritized location or ROI, and task-ready targets. (b) Baseline-versus-RAG grounding performance across evaluated model backends. (c) Output-quality … view at source ↗
Figure 5
Figure 5. Figure 5: Actionable Target Initialization in real robotic ultrasound. view at source ↗
Figure 6
Figure 6. Figure 6: Failure case (BMI 35.5), showing a superior offset in the predicted initialization re￾gion, with uncontrolled respira￾tion likely further increasing the anatomy–probe mismatch. servation beyond the learned prior distribution, and uncontrolled deep breathing introducing respiration-dependent liver motion not represented in the static, one-shot prior. More broadly, this failure highlights a fundamental bound… view at source ↗
Figure 7
Figure 7. Figure 7: Overview of the organ-layer modeling pipeline. Starting from CT-derived skin, skele view at source ↗
Figure 8
Figure 8. Figure 8: Skeleton-conditioned prior regression. A local joint subset in rest space is converted view at source ↗
Figure 9
Figure 9. Figure 9: Control-facing geometric interface. Anatomical targets are projected to candidate view at source ↗
read the original abstract

Robotic ultrasound has advanced local image-driven control, contact regulation, and view optimization, yet current systems lack the anatomical understanding needed to determine what to scan, where to begin, and how to adapt to individual patient anatomy. These gaps make systems still reliant on expert intervention to initiate scanning. Here we present SAMe, a semantic anatomy mapping engine that provides robotic ultrasound with an explicit anatomical prior layer. SAMe addresses scan initiation as a target-to-anatomy-to-action process: it grounds under-specified clinical complaints into structured target organs, instantiates a patient-specific anatomical representation for the grounded targets from a single external body image, and translates this representation into control-facing 6-DoF probe initialization states without any additional registration using preoperative CT or MRI. The anatomical representation maintained by SAMe is explicit, lightweight (single-organ inference in 0.08s), and compatible with downstream control by design. Across semantic grounding, anatomical instantiation, and real-robot evaluation, SAMe shows strong performance across the full initialization pipeline. In real-robot experiments, SAMe achieved overall organ-hit rates of 97.3% for liver initialization and 81.7% for kidney initialization across the evaluated target sets. Even when restricted to the centroid target, SAMe outperformed the surface-heuristic baseline for both liver and kidney initialization. These results establish an explicit anatomical prior layer that addresses scan initialization and is designed to support broader downstream autonomous scanning pipelines, providing the anatomical foundation for complaint-driven, anatomically informed robotic ultrasonography.

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it; the pith above is the substance, this is the friction.

Referee Report

2 major / 2 minor

Summary. The paper introduces SAMe, a semantic anatomy mapping engine for robotic ultrasound that grounds clinical complaints into target organs, instantiates an explicit patient-specific anatomical representation (including 6-DoF probe poses) from a single external RGB body image without preoperative CT/MRI registration, and translates this into control-compatible initialization states. It reports real-robot organ-hit rates of 97.3% for liver and 81.7% for kidney initialization, outperforming a surface-heuristic baseline even at centroid targets, while emphasizing the representation's lightness (0.08s inference) and downstream compatibility.

Significance. If the central claims hold, SAMe would supply a missing explicit anatomical prior layer for complaint-driven robotic ultrasonography, reducing expert intervention at scan initiation and enabling broader autonomous pipelines. The real-robot evaluation and design for control compatibility are concrete strengths; the lightweight single-organ inference time is a practical advantage. However, the absence of direct anatomical validation metrics limits the strength of the patient-specific claim.

major comments (2)
  1. [Experimental Evaluation] Experimental Evaluation (real-robot results): The reported organ-hit rates (97.3% liver, 81.7% kidney) rest on downstream success without an independent ground-truth comparison of the inferred 6-DoF organ poses against registered preoperative CT or MRI for the same subjects. This leaves the accuracy of the single-image patient-specific instantiation supported only indirectly, which is load-bearing for the claim that external appearance alone suffices for probe initialization within ultrasound footprint tolerance.
  2. [Methods] Methods (anatomical instantiation): The manuscript does not report dataset size, patient variability, error bars, or the precise implementation details of the surface-heuristic baseline used for comparison. These omissions make it impossible to assess whether the hit-rate differences are statistically robust or generalizable beyond the tested cohort.
minor comments (2)
  1. [Abstract] Abstract: The performance numbers are presented without any mention of cohort size, variability, or statistical measures, which reduces clarity for readers evaluating the strength of the real-robot claims.
  2. [Figures] Figure captions and notation: Some diagrams of the target-to-anatomy-to-action pipeline use abbreviations (e.g., for semantic grounding components) that are not expanded on first use, hindering quick comprehension.

Simulated Author's Rebuttal

2 responses · 1 unresolved

We thank the referee for the constructive and detailed review of our manuscript. We address the major comments point by point below, providing clarifications on our evaluation approach and committing to revisions where additional details can be supplied without misrepresenting the work.

read point-by-point responses
  1. Referee: [Experimental Evaluation] Experimental Evaluation (real-robot results): The reported organ-hit rates (97.3% liver, 81.7% kidney) rest on downstream success without an independent ground-truth comparison of the inferred 6-DoF organ poses against registered preoperative CT or MRI for the same subjects. This leaves the accuracy of the single-image patient-specific instantiation supported only indirectly, which is load-bearing for the claim that external appearance alone suffices for probe initialization within ultrasound footprint tolerance.

    Authors: Our evaluation prioritizes the functional outcome relevant to robotic ultrasound: whether the predicted 6-DoF probe pose results in the target organ being visible within the ultrasound image footprint. This downstream success metric directly tests the utility of the initialization for complaint-driven scanning without requiring preoperative CT or MRI, consistent with the CT-free design emphasized throughout the paper. We agree that this constitutes indirect rather than direct validation of the inferred anatomical poses. We will revise the manuscript to more clearly articulate this evaluation choice, its alignment with the target application, and the associated limitations on claims about pose accuracy. revision: partial

  2. Referee: [Methods] Methods (anatomical instantiation): The manuscript does not report dataset size, patient variability, error bars, or the precise implementation details of the surface-heuristic baseline used for comparison. These omissions make it impossible to assess whether the hit-rate differences are statistically robust or generalizable beyond the tested cohort.

    Authors: We agree that these methodological details are important for assessing robustness and should have been included. In the revised manuscript we will add the dataset size (number of subjects and total images), a description of patient variability (including body habitus and demographics where available), error bars or statistical measures on the organ-hit rates, and a precise description of the surface-heuristic baseline implementation, including how centroid and other target poses are derived from surface information. revision: yes

standing simulated objections not resolved
  • Direct independent ground-truth comparison of the inferred 6-DoF organ poses to registered preoperative CT or MRI, as no such imaging was acquired for the subjects in the real-robot experiments.

Circularity Check

0 steps flagged

No significant circularity detected in the derivation chain.

full rationale

The paper describes SAMe as a pipeline that performs semantic grounding of complaints to organs, instantiates a patient-specific anatomical model from one RGB body image, and outputs 6-DoF probe poses. The headline performance figures (97.3 % liver and 81.7 % kidney organ-hit rates) are reported as measured outcomes from real-robot trials on a physical cohort, not as quantities that are algebraically or statistically forced by the input image or by any fitted parameter. No equations, self-definitional mappings, or load-bearing self-citations are present in the abstract or described pipeline that would reduce the claimed anatomical prior or success rates to tautological restatements of the single-image input. The central claim therefore remains an empirical engineering result rather than a circular re-labeling of its own premises.

Axiom & Free-Parameter Ledger

0 free parameters · 0 axioms · 0 invented entities

Review performed on abstract only; no explicit free parameters, axioms, or invented entities are stated in the provided text.

pith-pipeline@v0.9.0 · 5596 in / 1141 out tokens · 54075 ms · 2026-05-07T16:41:25.687012+00:00 · methodology

discussion (0)

Sign in with ORCID, Apple, or X to comment. Anyone can read and Pith papers without signing in.

Reference graph

Works this paper leans on

71 extracted references · 4 canonical work pages

  1. [1]

    J.et al.Practice guidelines for performance of the routine mid-trimester fetal ultrasound scan.Ultrasound in Obstetrics & Gynecology37(2011)

    Salomon, L. J.et al.Practice guidelines for performance of the routine mid-trimester fetal ultrasound scan.Ultrasound in Obstetrics & Gynecology37(2011)

  2. [2]

    I.et al.Normative spatiotemporal fetal brain maturation with satisfactory development at 2 years.Nature623, 106–114 (2023)

    Namburete, A. I.et al.Normative spatiotemporal fetal brain maturation with satisfactory development at 2 years.Nature623, 106–114 (2023)

  3. [3]

    E.et al.Deep-learning-assisted analysis of echocardiographic videos im- proves predictions of all-cause mortality.Nature Biomedical Engineering5, 546–554 (2021)

    Ulloa Cerna, A. E.et al.Deep-learning-assisted analysis of echocardiographic videos im- proves predictions of all-cause mortality.Nature Biomedical Engineering5, 546–554 (2021)

  4. [4]

    Stein, J. H.et al.Use of carotid ultrasound to identify subclinical vascular disease and evaluate cardiovascular disease risk: a consensus statement from the american society of echocardiography carotid intima-media thickness task force endorsed by the society for vascular medicine.Journal of the American Society of echocardiography21, 93–111 (2008)

  5. [5]

    R., Buckley, A

    Tahmasebpour, H. R., Buckley, A. R., Cooperberg, P. L. & Fix, C. H. Sonographic exami- nation of the carotid arteries.Radiographics25, 1561–1575 (2005)

  6. [6]

    & Monteiro, L

    Ferraioli, G. & Monteiro, L. B. S. Ultrasound-based techniques for the diagnosis of liver steatosis.World journal of gastroenterology25, 6053 (2019)

  7. [7]

    Ferraioli, G.et al.Liver ultrasound elastography: an update to the world federation for ul- trasound in medicine and biology guidelines and recommendations.Ultrasound in medicine & biology44, 2419–2440 (2018)

  8. [8]

    Leenhardt, L.et al.2013 european thyroid association guidelines for cervical ultrasound scan and ultrasound-guided techniques in the postoperative management of patients with thyroid cancer.European thyroid journal2, 147–159 (2013)

  9. [9]

    Lin, M.et al.A fully integrated wearable ultrasound system to monitor deep tissues in moving subjects.Nature biotechnology42, 448–457 (2024)

  10. [10]

    Hu, H.et al.A wearable cardiac ultrasound imager.Nature613, 667–675 (2023). 25

  11. [11]

    Wang, C.et al.Monitoring of the central blood pressure waveform via a conformal ultrasonic device.Nature biomedical engineering2, 687–695 (2018)

  12. [12]

    & Scott, D

    Beales, L., Wolstenhulme, S., Evans, J., West, R. & Scott, D. Reproducibility of ultrasound measurement of the abdominal aorta.Journal of British Surgery98, 1517–1525 (2011)

  13. [13]

    Joakimsen, O., Bønaa, K. H. & Stensland-Bugge, E. Reproducibility of ultrasound assess- ment of carotid plaque occurrence, thickness, and morphology: the tromsø study.Stroke 28, 2201–2207 (1997)

  14. [14]

    Kojcev, R.et al.On the reproducibility of expert-operated and robotic ultrasound acqui- sitions.International journal of computer assisted radiology and surgery12, 1003–1011 (2017)

  15. [15]

    Journal of ultrasound in medicine43, 1289–1301 (2024)

    Won, D.et al.Sound the alarm: the sonographer shortage is echoing across healthcare. Journal of ultrasound in medicine43, 1289–1301 (2024)

  16. [16]

    2023 radiologic sciences workplace and staffing survey

    American Society of Radiologic Technologists. 2023 radiologic sciences workplace and staffing survey. Tech. Rep., American Society of Radiologic Technologists (2023). URLhttps://www.asrt.org/docs/default-source/research/staffing-surveys/ radiologic-sciences-workplace-and-staffing-survey-2023.pdf

  17. [17]

    Department of Labor

    Bureau of Labor Statistics, U.S. Department of Labor. Diagnostic medical sonographers. Occupational Outlook Handbook (2025). URLhttps://www.bls.gov/ooh/healthcare/ diagnostic-medical-sonographers.htm. Last modified August 28, 2025

  18. [18]

    & Moss, C

    McGregor, R., Pollard, K., Davidson, R. & Moss, C. Providing a sustainable sonographer workforce in australia: clinical training solutions.Sonography7, 141–147 (2020)

  19. [19]

    Addressing health human resource issues in diagnostic medical imag- ing – sonography: Pre-budget submission for the 2024–25 federal budget

    Sonography Canada. Addressing health human resource issues in diagnostic medical imag- ing – sonography: Pre-budget submission for the 2024–25 federal budget. Tech. Rep., Sonography Canada (2023). URLhttps://sonographycanada.ca/app/uploads/2023/ 10/Sonography-Canada-Pre-Budget-Final.pdf

  20. [20]

    Sonography vacancy rates have increased dramatically, sor ultrasound census reveals (2026)

    Society of Radiographers. Sonography vacancy rates have increased dramatically, sor ultrasound census reveals (2026). URLhttps://www.sor.org/news/ultrasound/ sonography-vacancy-rates-have-increased-dramatical. Accessed 14 Apr 2026

  21. [21]

    & Harrison, G

    Parker, P. & Harrison, G. Educating the future sonographic workforce: membership survey report from the british medical ultrasound society.Ultrasound23, 231–241 (2015)

  22. [22]

    & Strudwick, R

    Coleman, G., Hyde, E. & Strudwick, R. Exploring uk sonographers’ views on the use of professional supervision in clinical practice–stage one findings of a mixed method study. Radiography30, 252–256 (2024)

  23. [23]

    R., Cleary, K., Wilson, E., Azizi-Koutenaei, B

    Swerdlow, D. R., Cleary, K., Wilson, E., Azizi-Koutenaei, B. & Monfaredi, R. Robotic arm– assisted sonography: Review of technical developments and potential clinical applications. American Journal of Roentgenology208, 733–738 (2017)

  24. [24]

    Monfaredi, R.et al.Robot-assisted ultrasound imaging: Overview and development of a parallel telerobotic system.Minimally Invasive Therapy & Allied Technologies24, 54–62 (2015)

  25. [25]

    Huang, Q., Zhou, J. & Li, Z. Review of robot-assisted medical ultrasound imaging systems: Technology and clinical applications.Neurocomputing559, 126790 (2023). 26

  26. [26]

    The International Journal of Medical Robotics and Computer Assisted Surgery20, e2660 (2024)

    Du, H.et al.A review of robot-assisted ultrasound examination: Systems and technology. The International Journal of Medical Robotics and Computer Assisted Surgery20, e2660 (2024)

  27. [27]

    Su, K.et al.A fully autonomous robotic ultrasound system for thyroid scanning.Nature communications15, 4004 (2024)

  28. [28]

    Jiang, H.et al.Towards expert-level autonomous carotid ultrasonography with large-scale learning-based robotic system.Nature Communications16, 7893 (2025)

  29. [29]

    & Navab, N

    Bi, Y., Jiang, Z., Duelmer, F., Huang, D. & Navab, N. Machine learning in robotic ul- trasound imaging: Challenges and perspectives.Annual Review of Control, Robotics, and Autonomous Systems7(2024)

  30. [30]

    Liu, Y.et al.From screens to scenes: A survey of embodied ai in healthcare.Information Fusion119, 103033 (2025)

  31. [31]

    Merouche, S.et al.A robotic ultrasound scanner for automatic vessel tracking and three- dimensional reconstruction of b-mode images.IEEE transactions on ultrasonics, ferro- electrics, and frequency control63, 35–46 (2015)

  32. [32]

    In2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 3784–3791 (IEEE, Prague, Czech Republic, 2021)

    Akbari, M.et al.Robot-assisted breast ultrasound scanning using geometrical analysis of the seroma and image segmentation. In2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 3784–3791 (IEEE, Prague, Czech Republic, 2021)

  33. [33]

    Jiang, Z.et al.Autonomous robotic screening of tubular structures based only on real-time ultrasound imaging feedback.IEEE Transactions on Industrial Electronics69, 7064–7075 (2021)

  34. [34]

    Wang, Z.et al.Autonomous robotic system for carotid artery ultrasound scanning with visual servo navigation.IEEE Transactions on Medical Robotics and Bionics6, 1436–1447 (2024)

  35. [35]

    In2023 IEEE International Conference on Robotics and Automation (ICRA), 2690–2696 (IEEE, London, UK, 2023)

    Chen, M.et al.Fully robotized 3d ultrasound image acquisition for artery. In2023 IEEE International Conference on Robotics and Automation (ICRA), 2690–2696 (IEEE, London, UK, 2023)

  36. [36]

    Jiang, Z.et al.Skeleton graph-based ultrasound-ct non-rigid registration.IEEE Robotics and Automation Letters8, 4394–4401 (2023)

  37. [37]

    In2021 IEEE International Conference on Robotics and Automation (ICRA), 12494–12500 (IEEE, Xi’an, China, 2021)

    Jiang, Z.et al.Motion-aware robotic 3d ultrasound. In2021 IEEE International Conference on Robotics and Automation (ICRA), 12494–12500 (IEEE, Xi’an, China, 2021)

  38. [38]

    In2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 5534–5541 (IEEE, 2020)

    Hase, H.et al.Ultrasound-guided robotic navigation with deep reinforcement learning. In2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 5534–5541 (IEEE, 2020)

  39. [39]

    In2021 IEEE International Conference on Robotics and Automation (ICRA), 8302–8308 (IEEE, 2021)

    Li, K.et al.Autonomous navigation of an ultrasound probe towards standard scan planes with deep reinforcement learning. In2021 IEEE International Conference on Robotics and Automation (ICRA), 8302–8308 (IEEE, 2021)

  40. [40]

    & Liao, H

    Ning, G., Zhang, X. & Liao, H. Autonomic robotic ultrasound imaging system based on reinforcement learning.IEEE transactions on biomedical engineering68, 2787–2797 (2021)

  41. [41]

    Bi, Y.et al.Vesnet-rl: Simulation-based reinforcement learning for real-world us probe navigation.IEEE Robotics and Automation Letters7, 6638–6645 (2022). 27

  42. [42]

    Droste, R., Drukker, L., Papageorghiou, A. T. & Noble, J. A. Automatic probe movement guidance for freehand obstetric ultrasound. InInternational Conference on Medical Image Computing and Computer-Assisted Intervention, 583–592 (Springer, 2020)

  43. [43]

    Men, Q., Teng, C., Drukker, L., Papageorghiou, A. T. & Noble, J. A. Multimodal- guidenet: Gaze-probe bidirectional guidance in obstetric ultrasound scanning. InInter- national Conference on Medical Image Computing and Computer-Assisted Intervention, 94–103 (Springer, 2022)

  44. [44]

    Deng, X., Chen, Y., Chen, F. & Li, M. Learning robotic ultrasound scanning skills via human demonstrations and guided explorations. In2021 IEEE International Conference on Robotics and Biomimetics (ROBIO), 372–378 (IEEE, 2021)

  45. [45]

    & Navab, N

    Jiang, Z., Gao, Y., Xie, L. & Navab, N. Towards autonomous atlas-based ultrasound acquisitions in presence of articulated motion.IEEE Robotics and Automation Letters7, 7423–7430 (2022)

  46. [46]

    Hennersperger, C.et al.Towards mri-based autonomous robotic us acquisitions: a first feasibility study.IEEE transactions on medical imaging36, 538–548 (2016)

  47. [47]

    Jiang, Z.et al.Precise repositioning of robotic ultrasound: Improving registration-based motion compensation using ultrasound confidence optimization.IEEE Transactions on Instrumentation and Measurement71, 1–11 (2022)

  48. [48]

    InIntelligent Systems Conference, 610–625 (Springer, 2024)

    Long, J.et al.Localizing scan targets from human pose for autonomous lung ultrasound imaging. InIntelligent Systems Conference, 610–625 (Springer, 2024)

  49. [49]

    E.et al.Mimic-iv, a freely accessible electronic health record dataset.Scientific Data10, 1 (2023)

    Johnson, A. E.et al.Mimic-iv, a freely accessible electronic health record dataset.Scientific Data10, 1 (2023)

  50. [50]

    Wu, J.et al.Medical graph rag: Towards safe medical large language model via graph retrieval-augmented generation.arXiv preprint arXiv:2408.04187(2024)

  51. [51]

    mimic ex dataset.https://huggingface.co/datasets/morson/mimic_ex (2024)

    Morson. mimic ex dataset.https://huggingface.co/datasets/morson/mimic_ex (2024). Accessed: 2025-10-22

  52. [52]

    & J´ egou, H

    Johnson, J., Douze, M. & J´ egou, H. Billion-scale similarity search with GPUs.IEEE Transactions on Big Data7, 535–547 (2019)

  53. [53]

    Gutschmayer, S.et al.Whole-body [18f] fdg-pet/ct imaging of healthy controls: Test/retest data for systemic, multi-organ analysis.Scientific Data12, 1707 (2025)

  54. [54]

    URLhttps://doi.org/10

    Gutschmayer, S., Yu, J.et al.Whole-body [18f]fdg-pet/ct imaging of healthy controls: Test/retest data for systemic, multi-organ analysis (2025). URLhttps://doi.org/10. 5281/zenodo.16364694

  55. [55]

    Shetty, K.et al.BOSS: Bones, organs and skin shape model.Computers in Biology and Medicine165, 107383 (2023)

  56. [56]

    Kats, E.et al.Depth to anatomy: Learning internal organ locations from surface depth images.arXiv preprint arXiv:2601.18260(2026)

  57. [57]

    & Mathis-Ullrich, F

    Henrich, P. & Mathis-Ullrich, F. Looc: Localizing organs using occupancy networks and body surface depth images.IEEE Access(2025)

  58. [59]

    & Tao, D

    Xu, Y., Zhang, J., Zhang, Q. & Tao, D. Vitpose: Simple vision transformer baselines for human pose estimation.Advances in neural information processing systems35, 38571– 38584 (2022)

  59. [60]

    Yang, X.et al.Sam 3d body: Robust full-body human mesh recovery.arXiv preprint arXiv:2602.15989(2026)

  60. [61]

    H.et al.A scoping review of human digital twins in healthcare applications and usage patterns.npj Digital Medicine8, 587 (2025)

    Tudor, B. H.et al.A scoping review of human digital twins in healthcare applications and usage patterns.npj Digital Medicine8, 587 (2025)

  61. [62]

    Foundational research gaps and future directions for digital twins(Na- tional Academies Press Washington, DC, USA, 2023)

    Willcox, K.et al. Foundational research gaps and future directions for digital twins(Na- tional Academies Press Washington, DC, USA, 2023)

  62. [63]

    & Gonsard, A

    Drummond, D. & Gonsard, A. Definitions and characteristics of patient digital twins being developed for clinical use: scoping review.Journal of Medical Internet Research26, e58504 (2024)

  63. [64]

    R.et al.Low-dose ct images of healthy cohort (healthy- total-body-cts) (2023)

    Selfridge, A. R.et al.Low-dose ct images of healthy cohort (healthy- total-body-cts) (2023). URLhttps://www.cancerimagingarchive.net/collection/ healthy-total-body-cts/

  64. [65]

    Lorensen, W. E. & Cline, H. E. Marching cubes: A high resolution 3d surface construction algorithm. In Kaufman, A. (ed.)Seminal graphics: pioneering efforts that shaped the field, 347–353 (ACM Press, New York, NY, USA, 1998)

  65. [66]

    InProceedings IEEE Conf

    Pavlakos, G.et al.Expressive body capture: 3d hands, face, and body from a single image. InProceedings IEEE Conf. on Computer Vision and Pattern Recognition (CVPR)(2019)

  66. [67]

    & Black, M

    Romero, J., Tzionas, D. & Black, M. J. Embodied hands: Modeling and capturing hands and bodies together.ACM Transactions on Graphics, (Proc. SIGGRAPH Asia)36(2017)

  67. [68]

    & Black, M

    Loper, M., Mahmood, N., Romero, J., Pons-Moll, G. & Black, M. J. SMPL: A skinned multi-person linear model.ACM Transactions on Graphics, (Proc. SIGGRAPH Asia)34, 248:1–248:16 (2015)

  68. [69]

    Mhr: Momentum human rig.arXiv preprint arXiv:2511.15586, 2025

    Ferguson, A.et al.Mhr: Momentum human rig (2025). URLhttps://arxiv.org/abs/ 2511.15586.2511.15586

  69. [70]

    A solution for the best rotation to relate two sets of vectors.Acta Crystallo- graphica Section A32, 922–923 (1976)

    Kabsch, W. A solution for the best rotation to relate two sets of vectors.Acta Crystallo- graphica Section A32, 922–923 (1976)

  70. [71]

    InProceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 20814–20823 (2022)

    Guo, H.et al.Smpl-a: Modeling person-specific deformable anatomy. InProceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 20814–20823 (2022)

  71. [72]

    InACM ToG, Proc

    Keller, M.et al.From skin to skeleton: Towards biomechanically accurate 3d digital humans. InACM ToG, Proc. SIGGRAPH Asia, vol. 42 (2023). Acknowledgements This work was supported in part by the New Generation Artificial Intelligence-National Science and Technology Major Project under Grant No. 2025ZD0123602. 29 Author contributions J.Z. and D.C. conceive...