Strategies for Collecting Multi-Institutional Data in Discipline-Based Education Research
Pith reviewed 2026-05-11 02:17 UTC · model grok-4.3
The pith
Multi-institutional DBER studies become feasible through targeted strategies for IRB navigation, recruitment, data standardization, and logistics.
A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.
Core claim
By following specific strategies for navigating Institutional Review Board procedures, recruiting participants from a range of institution types, standardizing data sources across institutions, and managing project logistics, researchers can successfully carry out multi-institutional DBER studies, as demonstrated by a national project that gathered comparable data from 31 introductory physics instructors at 28 United States institutions.
What carries the argument
The four-area strategy set covering IRB navigation, recruitment across institution types, data standardization, and logistics coordination, demonstrated through the national multi-site physics data collection project.
If this is right
- Researchers can design studies that test the conditions under which education findings hold across different institutional contexts.
- The field gains the ability to accumulate evidence that applies to wider ranges of students and settings rather than isolated classrooms.
- Project teams can anticipate and resolve coordination problems before they halt data collection.
- Small successful interventions can be scaled to national levels with comparable measurements.
Where Pith is reading between the lines
- The same strategies could be adapted for multi-institutional work in biology or chemistry education research to check their wider usefulness.
- Shared templates for IRB documents might reduce duplicated effort for future teams following this approach.
- Long-term studies using the framework could track how teaching practices and student outcomes evolve across many institutions over time.
Load-bearing premise
The strategies proposed will prove effective and transferable when used in other multi-institutional projects that differ in discipline, scale, or data types from the single national physics example.
What would settle it
A new multi-institutional DBER project that applies the four strategies yet still encounters insurmountable differences in IRB approvals or non-comparable data across sites would show the strategies are not sufficient.
original abstract
Multi-institutional studies are critical for advancing discipline-based education research (DBER) because they allow us to determine where and for whom research findings are applicable. Despite this benefit, such studies remain relatively rare due to the complexities of coordinating data collection across different institutions. In this paper, we describe key challenges and propose actionable strategies for implementing multi-institutional DBER studies. We focus on navigating Institutional Review Board procedures, recruiting participants from a range of institution types, standardizing data sources across institutions, and managing logistics. We also provide an applied example of these strategies from a national research project in which we collected concept inventory data, social network surveys, and classroom observations from 31 introductory physics instructors at 28 institutions in the United States.
Editorial analysis
A structured set of objections, weighed in public.
Referee Report
Summary. The paper claims that multi-institutional studies are essential for DBER to determine where and for whom findings apply, identifies key challenges in IRB navigation, participant recruitment across institution types, data standardization, and logistics, proposes actionable strategies for each, and illustrates them via an applied example of collecting concept inventory, survey, and observation data from 31 introductory physics instructors at 28 U.S. institutions.
Significance. If the strategies hold up under broader use, the work could reduce barriers to multi-institutional DBER, enabling more generalizable research; it directly addresses the rarity of such studies by sharing concrete tactics from a large-scale physics project.
major comments (1)
- The central claim that the proposed strategies are actionable and effective rests entirely on the single applied example (31 instructors, 28 institutions, concept inventories + surveys + observations). No quantitative metrics of strategy impact (e.g., recruitment yields, data completeness rates, time/cost savings) or comparisons to alternative approaches are provided, leaving generalizability to other disciplines, smaller projects, or non-U.S. settings unexamined and limiting the strength of the advice.
minor comments (1)
- A summary table listing each challenge alongside the corresponding strategy would improve readability and quick reference for practitioners.
Simulated Author's Rebuttal
We thank the referee for their positive recommendation of minor revision and for highlighting an important limitation in how the strategies are supported. We address the major comment below and describe the revisions we will make.
point-by-point responses
- Referee: The central claim that the proposed strategies are actionable and effective rests entirely on the single applied example (31 instructors, 28 institutions, concept inventories + surveys + observations). No quantitative metrics of strategy impact (e.g., recruitment yields, data completeness rates, time/cost savings) or comparisons to alternative approaches are provided, leaving generalizability to other disciplines, smaller projects, or non-U.S. settings unexamined and limiting the strength of the advice.
Authors: We agree that the manuscript relies on a single illustrative case without quantitative metrics of impact or comparisons to other methods. The paper's purpose is to describe challenges encountered in multi-institutional DBER and to propose practical strategies that proved workable in our project, rather than to evaluate those strategies empirically or benchmark them against alternatives. In the revised version we will add a dedicated limitations subsection that reports available project-specific details (e.g., institutions contacted versus those that participated, overall data completeness rates for the concept inventories, surveys, and observations) and that explicitly cautions readers about the U.S.-physics context, the scale of the effort, and the absence of cross-disciplinary or international evidence. This framing will make the advice more appropriately qualified while preserving its utility as a starting point for other researchers.
revision: partial
Circularity Check
No circularity: purely descriptive strategies paper with no derivations
full rationale
The paper presents challenges and actionable strategies for multi-institutional DBER studies, illustrated via one national physics project example (31 instructors, 28 institutions). It contains no equations, no predictions, no first-principles derivations, and no load-bearing self-citations or uniqueness claims that reduce any result to its inputs by construction. All content is advisory and experiential; the central claims about strategy effectiveness rest on direct description rather than any self-referential chain or fitted input renamed as output. This is the expected non-finding for a methods-oriented descriptive manuscript.
Axiom & Free-Parameter Ledger
axioms (1)
- domain assumption: Multi-institutional studies allow determination of where and for whom research findings are applicable.