UI Placement as a Critical Design Factor for Augmented Reality During Locomotion
Pith reviewed 2026-05-10 20:11 UTC · model grok-4.3
The pith
Physical movement affects AR interaction through the spatial placement of the UI relative to the user.
A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.
Core claim
The paper establishes that the impact of physical movement on AR interaction is not direct but mediated by UI placement, defined as the spatial relationship between the user and the interface. Current research overlooks this by isolating interaction techniques, and the authors advocate reconceptualizing placement beyond traditional anchoring, designing techniques for specific placements during locomotion, and evaluating placement rigorously as an independent variable in studies. Centering analysis on relative movement between user and interface is presented as the path to more effective on-the-go AR.
What carries the argument
UI placement as the mediator of locomotion effects, specifically the spatial relationship between user and interface that shapes interaction performance.
If this is right
- Interaction techniques must be developed specifically for chosen UI placements rather than applied universally during locomotion.
- Experimental studies should isolate and vary UI placement to measure its effects on interaction outcomes.
- Designers should explore novel techniques that account for the relative motion between the user and the placed interface.
- AR systems for mobile activities would prioritize placement decisions early in the design process.
Where Pith is reading between the lines
- Systems could detect user locomotion and automatically adjust or suggest UI placements to maintain usable interaction.
- This view raises questions about how placement choices affect real-world safety and attention during walking or jogging.
- The mediation idea could extend to other wearable contexts where body motion and digital overlays interact, such as in sports or navigation aids.
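The automatic-adjustment idea in the first bullet above can be sketched as a small placement policy driven by estimated walking speed. This is a hypothetical illustration, not a mechanism from the paper: the placement labels and speed thresholds are invented for the example.

```python
# Hypothetical sketch: choose a UI placement mode from estimated walking
# speed. Placement labels and speed thresholds are illustrative only.

def choose_placement(speed_m_s: float) -> str:
    """Map locomotion speed to a UI placement strategy."""
    if speed_m_s < 0.2:
        # Effectively stationary: a world-fixed UI stays stable.
        return "world-fixed"
    if speed_m_s < 1.8:
        # Walking: a lazy-follow (tag-along) UI limits relative motion
        # between user and interface without rigid head-locking.
        return "tag-along"
    # Jogging: keep the UI body-fixed and minimal to reduce visual load.
    return "body-fixed"

print(choose_placement(0.0))   # world-fixed
print(choose_placement(1.4))   # tag-along
print(choose_placement(2.5))   # body-fixed
```

A real system would of course smooth the speed estimate and hysterese the mode switches; the point is only that placement becomes an explicit, locomotion-dependent decision rather than a fixed anchoring choice.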
Load-bearing premise
Reconceptualizing UI placement beyond traditional anchoring and evaluating it as an independent variable will unlock more effective AR interaction during locomotion.
What would settle it
An experiment that measures AR task performance while users walk at consistent speeds, comparing multiple UI placements: finding no measurable differences in error rate, completion time, or user effort attributable to placement would undermine the mediation claim, while systematic placement effects would support it.
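One minimal shape such an analysis could take, treating UI placement as a within-subjects independent variable, is sketched below. The placement conditions and error rates are fabricated example data, not results from any study.

```python
# Illustrative sketch (not from the paper): UI placement as an
# independent variable in a within-subjects walking study.
# Placement labels and error rates are made-up example data.
from statistics import mean

# Error rate per participant, per placement condition.
errors = {
    "world-fixed": [0.22, 0.31, 0.27, 0.25],
    "head-fixed":  [0.18, 0.21, 0.19, 0.24],
    "body-fixed":  [0.12, 0.15, 0.11, 0.16],
}

# Per-condition means: the quantity a placement-as-IV analysis compares.
means = {placement: mean(vals) for placement, vals in errors.items()}
for placement, m in sorted(means.items(), key=lambda kv: kv[1]):
    print(f"{placement}: mean error rate {m:.3f}")

# Per-participant paired differences between two placements: the input
# to a paired test (e.g. paired t-test or Wilcoxon) of a placement effect.
diffs = [w - b for w, b in zip(errors["world-fixed"], errors["body-fixed"])]
print("world-fixed minus body-fixed, per participant:", diffs)
```

The design choice being illustrated is the paper's methodological call: placement is varied and analyzed directly, rather than left implicit in the choice of interaction technique.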
Original abstract
Wearable augmented reality (AR) represents the next interface to all things computing, extending what smartphones and laptops can do. This involves providing access to digital information during activities like walking or jogging. In this work we argue that the impact of physical movement on AR interaction is not direct, but mediated by UI placement - the spatial relationship between the user and the interface. Current research often treats interaction techniques in isolation, overlooking how their performance is fundamentally linked to where the UI is placed. This position paper highlights the need to reconceptualize UI placement beyond traditional anchoring views, explore novel interaction techniques designed for specific UI placements during locomotion, and rigorously evaluate UI placement as an independent variable in experimental studies. By centering the analysis on the relative movement between user and interface, we can unlock more effective on-the-go AR interaction.
Editorial analysis
A structured set of objections, weighed in public.
Referee Report
Summary. This position paper claims that the impact of physical movement on AR interaction during locomotion is not direct but mediated by UI placement—the spatial relationship between the user and the interface. It argues that current research often treats interaction techniques in isolation, overlooking this fundamental link, and calls for reconceptualizing UI placement beyond traditional anchoring views, exploring novel interaction techniques designed for specific UI placements during locomotion, and rigorously evaluating UI placement as an independent variable in experimental studies. By centering analysis on the relative movement between user and interface, the paper posits that more effective on-the-go AR interaction can be unlocked.
Significance. If this mediation framing and call to treat UI placement as a first-class design factor are adopted, the work could shift design paradigms in wearable AR and mobile HCI toward locomotion-aware interfaces that prioritize spatial user-interface relationships. This has prospective significance for improving usability in everyday activities involving movement, as it highlights an under-emphasized lens that could inspire new techniques and evaluation methods. As a conceptual position without empirical results or formal models, however, its impact would depend on subsequent research validating or operationalizing the ideas.
major comments (1)
- Abstract: The central claim that 'by centering the analysis on the relative movement between user and interface, we can unlock more effective on-the-go AR interaction' is asserted without any concrete examples of reconceptualized placements, novel techniques, or references to how existing anchoring approaches have failed to account for locomotion mediation, making the reframing difficult to evaluate or build upon.
minor comments (1)
- The manuscript would benefit from explicit section headings (e.g., Related Work, Proposed Framework) to better organize the position and allow readers to trace how the mediation argument extends prior AR locomotion studies.
Simulated Author's Rebuttal
We thank the referee for their constructive feedback, which helps us better communicate the core ideas of this position paper. We agree that the abstract requires strengthening to make the central claim more concrete and actionable.
Point-by-point responses
- Referee: Abstract: The central claim that 'by centering the analysis on the relative movement between user and interface, we can unlock more effective on-the-go AR interaction' is asserted without any concrete examples of reconceptualized placements, novel techniques, or references to how existing anchoring approaches have failed to account for locomotion mediation, making the reframing difficult to evaluate or build upon.
Authors: We acknowledge the validity of this observation regarding the abstract. The manuscript body does discuss limitations of traditional anchoring (e.g., world-fixed UIs becoming unstable during gait-induced motion and body-fixed UIs causing visual fatigue from constant relative movement) and proposes reconceptualizations such as gait-phase-adaptive placements and novel techniques like velocity-matched UI sliding. However, these were not distilled into the abstract. We will revise the abstract to include one or two brief, specific examples of reconceptualized placements and references to prior work on anchoring that overlooks relative locomotion effects. This change will improve evaluability while preserving the paper's position-paper character. revision: yes
Circularity Check
No significant circularity; position paper with no derivations
Full rationale
The manuscript is a position paper that advances a conceptual reframing: physical movement effects on AR interaction are mediated by UI placement rather than direct. No equations, formal models, fitted parameters, or derivations appear in the provided text or abstract. The central claim is an argumentative hypothesis recommending that placement be reconceptualized and evaluated as an independent variable, without any self-referential reasoning, load-bearing self-citation, or renaming of known results as new predictions. The argument is self-contained as a call for future work and does not rely on its own outputs as inputs.
Axiom & Free-Parameter Ledger
axioms (1)
- Domain assumption: UI placement mediates the effects of locomotion on AR interaction performance.
Reference graph
Works this paper leans on
- [1] João Marcelo Evangelista Belo, Mathias N. Lystbæk, Anna Maria Feit, Ken Pfeuffer, Peter Kán, Antti Oulasvirta, and Kaj Grønbæk. 2022. AUIT – the Adaptive User Interfaces Toolkit for Designing XR Applications. In Proceedings of the 35th Annual ACM Symposium on User Interface Software and Technology (UIST ’22). Association for Computing Machinery, New York, ...
- [2] M. Billinghurst, J. Bowskill, M. Jessop, and J. Morphett. 1998. A Wearable Spatial Conferencing Space. In Digest of Papers. Second International Symposium on Wearable Computers (Cat. No.98EX215). 76–83. doi:10.1109/ISWC.1998.729532
- [3] Olivier Borg, Remy Casanova, and Reinoud J. Bootsma. 2015. Reading from a Head-Fixed Display during Walking: Adverse Effects of Gaze Stabilization Mechanisms. PLOS ONE 10, 6 (June 2015), e0129902. doi:10.1371/journal.pone.0129902
- [4] Doug Bowman, Ernst Kruijff, Joseph LaViola Jr, and Ivan Poupyrev. 2004. 3D User Interfaces: Theory and Practice (1st ed.). Addison-Wesley Professional.
- [5] Chiao-Ju Chang, Yu Lun Hsu, Wei Tian Mireille Tan, Yu-Cheng Chang, Pin Chun Lu, Yu Chen, Yi-Han Wang, and Mike Y. Chen. 2024. Exploring Augmented Reality Interface Designs for Virtual Meetings in Real-world Walking Contexts. In Proceedings of the 2024 ACM Designing Interactive Systems Conference (DIS ’24). Association for Computing Machinery, New York, NY, USA, 391–408. doi:10.1145/3643834.3661538
- [6] Josephine Y. Chau, Hidde P. van der Ploeg, Jannique G. Z. van Uffelen, Jason Wong, Ingrid Riphagen, Genevieve N. Healy, Nicholas D. Gilson, David W. Dunstan, Adrian E. Bauman, Neville Owen, and Wendy J. Brown. 2010. Are Workplace Interventions to Reduce Sitting Effective? A Systematic Review. Preventive Medicine 51, 5 (Nov. 2010), 352–356. doi:10.1016/j.ypm...
- [7] Yi Fei Cheng, Ari Carden, Hyunsung Cho, Catarina G. Fidalgo, Jonathan Wieland, and David Lindlbauer. 2025. Augmented Reality Productivity In-the-Wild: A Diary Study of Usage Patterns and Experiences of Working With AR Laptops in Real-World Settings. IEEE Transactions on Visualization and Computer Graphics 31, 10 (Oct. 2025), 9195–9212. doi:10.1109/TVCG.2025.3592962
- [8] Steven Feiner, Blair MacIntyre, Marcus Haupt, and Eliot Solomon. 1993. Windows on the World: 2D Windows for 3D Augmented Reality. In Proceedings of the 6th Annual ACM Symposium on User Interface Software and Technology (Atlanta, Georgia, USA) (UIST ’93). Association for Computing Machinery, New York, NY, USA, 145–155. doi:10.1145/168642.168657
- [9] Elisa Maria Klose, Nils Adrian Mack, Jens Hegenberg, and Ludger Schmidt. 2019. Text Presentation for Augmented Reality Applications in Dual-Task Situations. In 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR). 636–644. doi:10.1109/VR.2019.8797992
- [10] Wallace Lages and Doug Bowman. 2019. Adjustable Adaptation for Spatial Augmented Reality Workspaces. In Symposium on Spatial User Interaction (SUI ’19). Association for Computing Machinery, New York, NY, USA, 1–2. doi:10.1145/3357251.3358755
- [11] Wallace S. Lages and Doug A. Bowman. 2019. Walking with Adaptive Augmented Reality Workspaces: Design and Usage Patterns. In Proceedings of the 24th International Conference on Intelligent User Interfaces (IUI ’19). Association for Computing Machinery, New York, NY, USA, 356–366. doi:10.1145/3301275.3302278
- [12] Yang Li, Juan Liu, Jin Huang, Yang Zhang, Xiaolan Peng, Yulong Bian, and Feng Tian. 2024. Evaluating the Effects of User Motion and Viewing Mode on Target Selection in Augmented Reality. International Journal of Human-Computer Studies (July 2024), 103327. doi:10.1016/j.ijhcs.2024.103327
- [13] Yujun Lu, BoYu Gao, Huawei Tu, Huiyue Wu, Weiqiang Xin, Hui Cui, Weiqi Luo, and Henry Been-Lirn Duh. 2022. Effects of Physical Walking on Eyes-Engaged Target Selection with Ray-Casting Pointing in Virtual Reality. Virtual Reality (Aug. 2022). doi:10.1007/s10055-022-00677-9
- [14] Pavel Manakhov. 2025. 3D UI Placement for Interaction on the Go. Ph.D. Dissertation. Aarhus University. https://pure.au.dk/portal/en/publications/3d-ui-placement-for-interaction-on-the-go/
- [15] Pavel Manakhov, Ludwig Sidenmark, Ken Pfeuffer, and Hans Gellersen. 2024. Filtering on the Go: Effect of Filters on Gaze Pointing Accuracy During Physical Locomotion in Extended Reality. IEEE Transactions on Visualization and Computer Graphics 30, 11 (Nov. 2024), 7234–7244. doi:10.1109/TVCG.2024.3456153
- [16] Pavel Manakhov, Ludwig Sidenmark, Ken Pfeuffer, and Hans Gellersen. 2024. Gaze on the Go: Effect of Spatial Reference Frame on Visual Target Acquisition During Physical Locomotion in Extended Reality. In Proceedings of the CHI Conference on Human Factors in Computing Systems (CHI ’24). Association for Computing Machinery, New York, NY, USA, 1–16. doi:10.11...
- [17] Steve Mann and James Fung. 2001. Videoorbits on Eye Tap Devices for Deliberately Diminished Reality or Altering the Visual Perception of Rigid Planar Patches of a Real World Scene. International Symposium on Mixed Reality, 2001 (2001), 48–55.
- [18] Microsoft. 2021. Comfort - Mixed Reality: Heads-Up Displays. https://learn.microsoft.com/en-us/windows/mixed-reality/design/comfort#heads-up-displays
- [19] Microsoft. 2021, Nov 30. Billboarding and Tag-along - Mixed Reality. https://web.archive.org/web/20250604055920/https://learn.microsoft.com/en-us/windows/mixed-reality/design/billboarding-and-tag-along
- [20] Guanghan Zhao, Jason Orlosky, Joseph Gabbard, and Kiyoshi Kiyokawa. 2024. HazARdSnap: Gazed-Based Augmentation Delivery for Safe Information Access While Cycling. IEEE Transactions on Visualization and Computer Graphics 30, 9 (Sept. 2024), 6378–6389. doi:10.1109/TVCG.2023.3333336

Received 12 February 2026; accepted 15 March 2026