Recognition: unknown
Institutionalizing Best Practices in Research Computing: A Framework and Case Study for Improving User Onboarding
Pith reviewed 2026-05-08 12:45 UTC · model grok-4.3
The pith
A framework for institutionalizing best practices improves new-user onboarding in research computing centers.
A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.
Core claim
The paper claims that a framework designed to improve the new-user onboarding experience is empirically validated by its application within the Research Infrastructure Services at Washington University in St. Louis, demonstrating its effectiveness in helping users navigate complex resources.
What carries the argument
The framework for institutionalizing best practices in research computing user onboarding.
Load-bearing premise
The results from a single institution's application of the framework can be taken as validation for its use across research computing centers in general.
What would settle it
Implementing the framework at a different research computing center and finding no improvement in new user confusion or access success rates would falsify the central claim.
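To make that test concrete, the sketch below estimates how large such a replication would need to be. This is a rough illustration only: the effect size, significance level, and power are assumptions, not values reported by the paper.

    # A rough Python sketch of sizing the falsification test above: how many
    # new users a second center would need per cohort to detect a moderate
    # improvement in an onboarding metric. The effect size is an assumption,
    # not a value taken from the paper.
    from statsmodels.stats.power import TTestIndPower

    analysis = TTestIndPower()
    n_per_cohort = analysis.solve_power(
        effect_size=0.5,       # assumed Cohen's d for a "meaningful improvement"
        alpha=0.05,            # significance level
        power=0.8,             # desired power
        alternative="larger",  # one-sided: post-framework cohort does better
    )
    print(f"~{n_per_cohort:.0f} users per cohort to detect d = 0.5")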
Original abstract
Research computing centers around the world struggle with onboarding new users. Subject matter experts, researchers, and principal investigators are often overwhelmed by the complex infrastructure and software offerings designed to support diverse research domains at large academic and national institutions. As a result, users frequently fall into confusion and complexity to access these resources, despite the availability of documentation, tutorials, interactive trainings and other similar resources. Through this work, we present a framework designed to improve new-user onboarding experience. We also present an empirical validation through its application within the Research Infrastructure Services at Washington University in St. Louis.
Editorial analysis
A structured set of objections, weighed in public.
Referee Report
Summary. The manuscript proposes a framework for institutionalizing best practices to improve new-user onboarding in research computing centers, which often overwhelm users with complex infrastructure and software. It further claims an empirical validation of this framework through its application at the Research Infrastructure Services of Washington University in St. Louis.
Significance. A well-designed, generalizable onboarding framework with demonstrated measurable improvements could address a common pain point across academic and national research computing facilities. However, the current manuscript provides no quantitative evidence, evaluation design, or comparison to prior practices, so the claimed validation does not yet support broader adoption or impact.
major comments (2)
- Abstract: The assertion of 'empirical validation' through application at Washington University is unsupported because the text supplies no description of evaluation methods, outcome metrics, baseline comparisons, statistical tests, or pre/post results. Without these, the case study cannot distinguish framework effects from local factors or selection bias.
- Case study description (implied in the validation claim): A single-institution deployment cannot establish general effectiveness. The manuscript must provide independent outcome measures (e.g., time-to-first-job, user satisfaction scores, or retention rates) and controls or comparisons to prior onboarding processes to support the central claim that the framework improves onboarding; a minimal sketch of such a comparison follows this list.
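The following Python sketch illustrates what one such independent outcome measure and pre/post comparison could look like. The time-to-first-job values and the choice of a Mann-Whitney U test are assumptions for illustration, not data or methods from the manuscript.

    # Minimal sketch of a pre/post comparison on one candidate outcome measure,
    # time-to-first-job. All values are hypothetical; a real evaluation would
    # derive them from account-creation and scheduler logs.
    from scipy.stats import mannwhitneyu

    # Hours from account creation to first submitted job (hypothetical).
    pre_cohort = [72.0, 120.5, 96.0, 200.0, 48.0]   # before the framework
    post_cohort = [24.0, 18.5, 40.0, 30.0, 12.0]    # after the framework

    # One-sided nonparametric test: are pre-framework times stochastically larger?
    stat, p_value = mannwhitneyu(pre_cohort, post_cohort, alternative="greater")
    print(f"Mann-Whitney U = {stat:.1f}, one-sided p = {p_value:.4f}")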
Simulated Author's Rebuttal
We thank the referee for their constructive and detailed comments. We have reviewed the feedback carefully and provide point-by-point responses below. We agree that certain claims in the manuscript require clarification and have revised the text to better reflect the scope and limitations of the case study.
point-by-point responses
- Referee: Abstract: The assertion of 'empirical validation' through application at Washington University is unsupported because the text supplies no description of evaluation methods, outcome metrics, baseline comparisons, statistical tests, or pre/post results. Without these, the case study cannot distinguish framework effects from local factors or selection bias.
  Authors: We agree that the abstract's reference to 'empirical validation' is not supported by the level of detail provided in the manuscript. The case study section outlines the framework's implementation at Washington University and describes observed practical benefits, but it does not include formal evaluation methods, quantitative metrics, baseline data, or statistical analysis. We have revised the abstract to remove the 'empirical validation' language and instead characterize the contribution as a framework presented with an illustrative case study of its application. We have also added an explicit limitations subsection that acknowledges the absence of controlled comparisons and the potential influence of local factors. revision: yes
- Referee: Case study description (implied in the validation claim): A single-institution deployment cannot establish general effectiveness. The manuscript must provide independent outcome measures (e.g., time-to-first-job, user satisfaction scores, or retention rates) and controls or comparisons to prior onboarding processes to support the central claim that the framework improves onboarding.
  Authors: We accept that a single-institution case study cannot demonstrate general effectiveness or support broad claims of improvement. The manuscript presents the work as a framework accompanied by a case study intended to illustrate real-world application rather than to serve as a controlled evaluation. In the revised version we have added clarifying language in the introduction, case study, and conclusion sections to emphasize that the example is illustrative and does not include independent outcome measures, pre/post controls, or statistical comparisons. Where observational indicators from the Washington University deployment were available (such as qualitative user feedback), we have incorporated them while explicitly stating their limitations and the lack of rigorous comparative data. revision: partial
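As a concrete illustration of the comparative data still missing from the revision, the sketch below runs a two-proportion test on first-week access success rates before and after a framework rollout. The counts are invented placeholders, not figures from the Washington University deployment.

    # Hedged sketch: comparing the share of new users who successfully access
    # resources within their first week, before vs. after rollout. Counts are
    # hypothetical placeholders, not reported results.
    from statsmodels.stats.proportion import proportions_ztest

    successes = [31, 58]     # [pre-framework, post-framework] users succeeding
    cohort_sizes = [80, 85]  # total new users in each cohort

    # One-sided test that the pre-framework success rate is the smaller one.
    z_stat, p_value = proportions_ztest(successes, cohort_sizes, alternative="smaller")
    print(f"pre = {successes[0]/cohort_sizes[0]:.2f}, "
          f"post = {successes[1]/cohort_sizes[1]:.2f}, "
          f"z = {z_stat:.2f}, one-sided p = {p_value:.4f}")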
Circularity Check
Validation claim reduces to framework application by definition
specific steps
- self-definitional [Abstract]:
"We also present an empirical validation through its application within the Research Infrastructure Services at Washington University in St. Louis."
The paper defines its empirical validation as the application of the proposed framework at a single site. This makes any reported improvement equivalent to the framework's deployment by construction, without requiring separate outcome measures, statistical comparisons, or evidence that the framework (rather than local factors) produced the result.
full rationale
The paper's central contribution is a framework for onboarding plus an 'empirical validation' consisting solely of applying that same framework at one institution. No independent pre/post metrics, control conditions, or external benchmarks are described in the abstract or claimed structure; success is therefore equivalent to the act of deployment itself. This matches the self-definitional pattern: the reported validation is the input (application) renamed as output (evidence of effectiveness). The single-site case study supplies no falsifiable test against prior practice or other centers, rendering the effectiveness claim circular by construction.
Axiom & Free-Parameter Ledger
axioms (2)
- domain assumption: Research computing centers struggle to onboard new users despite available documentation and training
- domain assumption: A structured institutional framework can reduce user confusion more effectively than existing resources
invented entities (1)
- The onboarding framework (no independent evidence)
Reference graph
Works this paper leans on
- [1] Gladys Andino et al. 2024. Onboarding Research Computing and Data Professionals. In Practice and Experience in Advanced Research Computing 2024. doi:10.1145/3626203.3670582
- [3] Amy Apon et al. 2014. Assessing the effect of high performance computing capabilities on academic research output. Empirical Economics (2014). doi:10.1007/s00181-014-0833-7
- [4] Team Atriade. 2024. 7 Steps for Successful Proof of Concept in IT Infrastructure. White Paper. Atriade Consulting. https://atriade.com/7-proof-of-concept-steps/ Industry-standard framework for running PoCs, emphasizing that "Success Criteria" must be defined *before* the PoC starts.
- [5] Randall J Bateman, Tammie LS Benzinger, Scott Berry, David B Clifford, David M Holtzman, Mathias Jucker, John C Morris, et al. 2012. Clinical and biomarker changes in dominantly inherited Alzheimer's disease. New England Journal of Medicine 367, 9 (2012), 795–804. doi:10.1056/NEJMoa1202753
- [6] Mary Jo Bitner et al. 2008. Service Blueprinting: A Practical Technique for Service Innovation. California Management Review (2008). doi:10.2307/41166446
- [7] Dhruva Chakravorty et al. 2024. BRICCs: Building Pathways to Research Cyberinfrastructure at Under Resourced Institutions. In Practice and Experience in Advanced Research Computing 2024. doi:10.1145/3626203.3670535
- [8] Liuhua Chen and Haiying Shen. 2017. Considering resource demand misalignments to reduce resource over-provisioning. IEEE INFOCOM 2017 (2017). doi:10.1109/infocom.2017.8057084
- [9] Hashim Chunpir et al. 2017. User Experience (UX) of a Big Data Infrastructure. In Lecture Notes in Computer Science. doi:10.1007/978-3-319-58524-6_37
- [10] Alice E. Cook et al. 2021. Advantages, Challenges, and Success Factors in Implementing Information Technology Infrastructure Library. Issues In Information Systems (2021)
- [11] Ryan Danehy and Jaelyn Litzinger. 2025. Review of the Documentation Activities and Training for HPC at PNNL. In Practice and Experience in Advanced Research Computing 2025: The Power of Collaboration. doi:10.1145/3708035.3736005
- [12] Sandra Gesing and Katherine Lawrence. 2018. Science Gateways: Leveraging Modeling and Simulations in HPC Infrastructures via Increased Usability. In Proceedings of the 2018 IEEE International Conference on Systems, Man, and Cybernetics (SMC)
- [13] Ben Godfrey. 2024. Practical advice for the creation of effective HPC user documentation. In Practice and Experience in Advanced Research Computing 2024 (PEARC '24). doi:10.1145/3626203.3670621
- [16] JP Grotzinger, DY Sumner, LC Kah, K Stack, S Gupta, L Edgar, RE Arvidson, et al. 2014. A habitable fluvio-lacustrine environment at Yellowknife Bay, Gale Crater, Mars. Science 343, 6169 (2014), 1242777. doi:10.1126/science.1242777
- [17] David Y. Hancock et al. 2021. Jetstream2: Accelerating cloud computing via Jetstream. In Practice and Experience in Advanced Research Computing 2021. doi:10.1145/3437359.3465565
- [19] International Human Genome Sequencing Consortium. 2001. Initial sequencing and analysis of the human genome. Nature 409, 6822 (2001), 860–921. doi:10.1038/35057062
- [20] G. Kandaswamy et al. 2010. Software Engineering for Scientific Computing: The Role of PoCs and Prototyping. Computing in Science & Engineering (2010)
- [21] Shelley L. Knuth et al. 2019. An expansion of the user support services for the Research Computing group at the University of Colorado Boulder. In Practice and Experience in Advanced Research Computing 2019. doi:10.1145/3332186.3332229
- [22] Jonathan Koomey et al. 2007. A Simple Model for Determining True Total Cost of Ownership for Data Centers. Technical Report. Uptime Institute
- [23] Katherine A Lawrence et al. 2015. Science gateways: The long road to the "long tail" of science. Proceedings of the 11th IEEE International Conference on e-Science (2015)
- [24] J. Miller, K. Agrawal, and A. Panyala. 2025. Navigating Exascale Operational Data Analytics: From Inundation to Insight. IEEE Transactions on Parallel and Distributed Systems 36, 2 (2025), 210–225. doi:10.1109/TPDS.2025.10820578 Provides the MELT framework for HPC telemetry.
- [25] Mirghani S. Mohamed et al. 2008. The restructuring of the information technology infrastructure library (ITIL) implementation using knowledge management framework. VINE (2008). doi:10.1108/03055720810904835
- [26] National Institutes of Health. 2025. NIH Public Access Policy (2024 Update). https://grants.nih.gov/policy-and-compliance/policy-topics/public-access/nih-public-access-policy-overview Effective July 1, 2025; Accessed: 2026-03-25
- [27] Jakob Nielsen. 1994. Usability Engineering. Morgan Kaufmann
- [28] Sai Dikshit Pasham. 2018. Dynamic Resource Provisioning in Cloud Environments Using Predictive Analytics. International Journal of Engineering and Computer Science (2018)
- [29] Dylan Perkins et al. 2022. Challenges and Lessons Learned of Formalizing the Partnership Between Libraries and Research Computing Groups. In Practice and Experience in Advanced Research Computing 2022. doi:10.1145/3491418.3535165
- [30] Sebastian A. C. Perrig et al. 2024. Measurement practices in user experience (UX) research. Frontiers in Computer Science (2024). doi:10.3389/fcomp.2024.1368860
- [31] István Pintye et al. 2024. Enhancing Machine Learning-Based Autoscaling for Cloud Resource Orchestration. Journal of Grid Computing (2024). doi:10.1007/s10723-024-09783-1
- [32] Zebula Sampedro et al. 2018. Continuous Integration and Delivery for HPC. In Proceedings of PEARC '18. doi:10.1145/3219104.3219147
- [33] Jim Samuel, Margaret Brennan-Tonetta, Yana Samuel, Pradeep Subedi, and Jack Smith. 2022. Strategies for Democratization of Supercomputing: Availability, Accessibility and Usability of High Performance Computing for Education and Practice of Big Data Analytics. Journal of Big Data: Theory and Practice 1 (06 2022). doi:10.54116/jbdtp.v1i1.16
- [34] M. Schwartz. 2018. Data Archiving for the Enterprise. O'Reilly Media
- [35] Kanu Priya Singh. 2024. A systematic literature review of the application of user experience studies in cyberinfrastructure. Quality and User Experience (2024). doi:10.1007/s41233-024-00069-8
- [36] Peter J Turnbaugh, Ruth E Ley, Michael A Mahowald, Vincent Magrini, Elaine R Mardis, Vickie G Musick, and Jeffrey I Gordon. 2006. An obesity-associated gut microbiome with increased capacity for energy harvest. Nature 444, 7122 (2006), 1027–1031. doi:10.1038/nature05414
- [37] U.S. Congress. 1996. Health Insurance Portability and Accountability Act of 1996 (HIPAA). https://www.congress.gov/bill/104th-congress/house-bill/3103 Public Law 104-191
- [38] A. Wong et al. 2024. A Unified Framework for the Deployment and Access of HPC Applications as Services in Clouds. Journal of Grid Computing (2024)
- [39] Yunqi Zhang et al. 2016. History-Based Harvesting of Spare Cycles and Storage in Large-Scale Datacenters. In 12th USENIX Symposium on Operating Systems Design and Implementation (OSDI 16)
- [40] Ilya Zhukov and Jolanta Zjupa. 2025. Onboarding of HPC users: hands-on approach at Juelich Supercomputing Centre. In Practice and Experience in Advanced Research Computing 2025: The Power of Collaboration. doi:10.1145/3708035.3736077