pith. machine review for the scientific record.

arxiv: 2604.23562 · v1 · submitted 2026-04-26 · 🧬 q-bio.NC · cs.AI · cs.HC

Recognition: unknown

EyeBrain: Left and Right Brain Lateralization Activity Classification Through Pupil Diameter and Fixation Duration

Andreas Dengel, Ko Watanabe, Nicolas Großmann, Pooja Pol, Shoya Ishimaru

Authors on Pith: no claims yet

Pith reviewed 2026-05-08 04:48 UTC · model grok-4.3

classification 🧬 q-bio.NC · cs.AI · cs.HC
keywords eye-tracking · brain lateralization · pupil diameter · fixation duration · classification · cognitive monitoring · neurorehabilitation · hemisphere activity

The pith

Pupil diameter and fixation duration can classify left versus right brain hemisphere activity with an F1 score of 0.894.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

The paper establishes that eye-tracking data on pupil size and fixation duration reliably distinguish activity dominated by the left brain hemisphere from that dominated by the right. Left-hemisphere tasks like language and arithmetic produce different ocular patterns than right-hemisphere tasks like drawing or music perception. The authors train a classifier on these metrics during controlled tasks and report strong performance. A reader would care because this points to a low-cost, non-invasive way to monitor which side of the brain is engaged without brain imaging. The work positions these eye signals as practical indicators for applications in cognitive tracking and recovery.

Core claim

The paper demonstrates that pupil diameter and fixation duration can effectively classify left- and right-hemisphere activity, reporting a high classification performance with an F1 score of 0.894. The results suggest that ocular metrics are robust indicators of lateralized brain activity and can be applied in cognitive monitoring and neurorehabilitation.

What carries the argument

Binary classification model trained on pupil diameter and fixation duration extracted from eye-tracking recordings during tasks that selectively engage one hemisphere.
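The paper's exact feature set and model are not reproduced in the text available here. As a hypothetical sketch of the pipeline shape, each task window is reduced to summary eye metrics and fed to a binary classifier; a nearest-centroid model stands in below for the gradient-boosted models (e.g., XGBoost) the paper actually evaluates, and the feature names are illustrative assumptions:

```python
# Hypothetical sketch of the described pipeline: summarize each task window
# into eye-metric features, then classify left vs. right hemisphere tasks.
# Feature choices and the nearest-centroid model are illustrative stand-ins;
# the paper reports results for models such as XGBoost.

def window_features(pupil_samples, fixation_durations):
    """One task window -> (mean pupil diameter, mean fixation duration)."""
    mean_pupil = sum(pupil_samples) / len(pupil_samples)
    mean_fixation = sum(fixation_durations) / len(fixation_durations)
    return (mean_pupil, mean_fixation)

def fit_centroids(features, labels):
    """Per-class mean feature vector for 'left' and 'right' windows."""
    centroids = {}
    for label in set(labels):
        rows = [f for f, lab in zip(features, labels) if lab == label]
        centroids[label] = tuple(sum(col) / len(rows) for col in zip(*rows))
    return centroids

def predict(centroids, feature_vector):
    """Assign the label whose centroid is nearest in feature space."""
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(centroids, key=lambda lab: dist2(centroids[lab], feature_vector))
```

The two-function split mirrors the claim structure: everything the classifier sees is an ocular summary statistic, so any confound (arousal, difficulty) that moves those statistics moves the prediction.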

If this is right

  • Supports non-invasive monitoring of brain lateralization during cognitive tasks.
  • Opens use in neurorehabilitation settings to track recovery of hemispheric function.
  • Enables integration into real-time applications for ongoing cognitive assessment.
  • Extends to broader domains of cognitive and neurological monitoring without specialized equipment.

Where Pith is reading between the lines

These are editorial extensions of the paper, not claims the author makes directly.

  • Wearable eye trackers could one day provide continuous at-home feedback on hemispheric balance.
  • The same signals might help detect imbalances in conditions that disrupt typical lateralization.
  • Further experiments could test whether these metrics also track shifts in attention or fatigue within the same hemisphere.

Load-bearing premise

The chosen tasks activate only one hemisphere at a time and the eye metrics directly reflect that lateralization instead of task difficulty, effort, or general arousal.

What would settle it

Re-running the classification on a new dataset in which left- and right-hemisphere tasks are matched for difficulty and arousal: if model accuracy falls to near chance under that control, the eye metrics were tracking those confounds rather than lateralization.
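One concrete form of that test: hold the model's predictions fixed and ask how far the observed F1 sits above a label-shuffled chance distribution. A minimal permutation sketch (pure Python; the function and variable names are illustrative, not from the paper):

```python
import random

def binary_f1(y_true, y_pred, pos="left"):
    """F1 for the positive class from paired true/predicted labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == pos and p == pos)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != pos and p == pos)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == pos and p != pos)
    if tp == 0:
        return 0.0
    precision, recall = tp / (tp + fp), tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

def permutation_p_value(y_true, y_pred, n_perm=200, seed=0):
    """Fraction of label shuffles whose F1 matches or beats the observed F1."""
    rng = random.Random(seed)
    observed = binary_f1(y_true, y_pred)
    shuffled = list(y_pred)
    exceed = 0
    for _ in range(n_perm):
        rng.shuffle(shuffled)
        if binary_f1(y_true, shuffled) >= observed:
            exceed += 1
    return observed, exceed / n_perm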

Figures

Figures reproduced from arXiv: 2604.23562 by Andreas Dengel, Ko Watanabe, Nicolas Großmann, Pooja Pol, Shoya Ishimaru.

Figure 1. Concept design of the EyeBrain project. Eye-tracking technology makes statistical information on brain lateralization accessible and easy to understand; users can follow brain activity as they would an activity-monitoring application.
Figure 2. Experiment flow: procedure with step-wise screenshots of the computer during a data-collection session.
Figure 3. Variation of raw and preprocessed pupil diameter signal over time.
Figure 4. Variation in weighted-average F1 score with activity duration for machine-learning models.
Figure 5. Confusion matrix of leave-one-participant-out cross-validation (LOPOCV) for the best-performing window size of 90 seconds.
Figure 6. Confusion matrix of leave-one-activity-out cross-validation (LOAOCV) for three window sizes, visualized for the XGBoost classifier.
Figure 7. Top ten most important features according to XGBoost classification score.
read the original abstract

The relationship between brain lateralization and cognitive functions is well-documented. The left hemisphere primarily handles tasks such as language and arithmetic, while the right hemisphere is involved in creative activities like drawing and music perception. Eye-tracking technology has shown the potential to reveal cognitive states by measuring ocular metrics such as pupil diameter and fixation duration. However, the ability to distinguish lateralized brain activity using these ocular metrics remains underexplored. Here, we demonstrate that pupil diameter and fixation duration can effectively classify left and right brain hemisphere activities. We obtained a considerably high classification performance, with an F1 score of 0.894. The results suggest that ocular metrics are robust indicators of lateralized brain activity and can be applied in cognitive monitoring and neurorehabilitation. Our future work expands on this by integrating these methods into real-time applications EyeBrain, potentially broadening their use across various cognitive and neurological domains.

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it; the pith above is the substance, and this is the friction.

Referee Report

2 major / 0 minor

Summary. The manuscript claims that pupil diameter and fixation duration measured via eye-tracking can classify left-hemisphere (language/arithmetic) versus right-hemisphere (drawing/music) brain activity, achieving an F1 score of 0.894, and positions these ocular metrics as robust indicators suitable for cognitive monitoring and neurorehabilitation applications.

Significance. If the central mapping from eye metrics to selective hemispheric activation holds after proper controls, the work could enable low-cost, non-invasive monitoring of lateralized cognitive function with potential utility in neurorehabilitation and real-time applications. The absence of methodological detail prevents assessment of whether this potential is realized.

major comments (2)
  1. [Abstract] The reported F1 score of 0.894 is presented with zero information on participant numbers, task design details, cross-validation procedure, baseline comparisons, or statistical controls for confounds (e.g., arousal, difficulty, cognitive load). Without these, it is impossible to determine whether classification performance tracks hemispheric lateralization or shared non-lateralized factors.
  2. [Abstract] The ground-truth labeling of tasks as selectively left- or right-lateralized rests on unvalidated assumptions; no prior fMRI/EEG validation, difficulty-matched controls, or ablation experiments are described to show that performance collapses when non-lateralized variables are equalized.
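For reference on the first point: an F1 of 0.894 is only interpretable against the underlying confusion-matrix counts and a chance baseline, neither of which is quoted here. F1 itself is the harmonic mean of precision and recall; the counts below are illustrative, not the paper's:

```python
def f1_from_confusion(tp, fp, fn):
    """F1 = harmonic mean of precision (tp/(tp+fp)) and recall (tp/(tp+fn))."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Illustrative counts, not taken from the paper:
# 8 true positives, 2 false positives, 2 false negatives -> F1 = 0.8
```

The same F1 can arise from very different error patterns, which is why the referee asks for the full confusion matrix and baselines.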

Simulated Author's Rebuttal

2 responses · 1 unresolved

We thank the referee for the detailed and constructive review. We address each major comment below and have revised the manuscript accordingly to improve clarity and rigor.

read point-by-point responses
  1. Referee: [Abstract] The reported F1 score of 0.894 is presented with zero information on participant numbers, task design details, cross-validation procedure, baseline comparisons, or statistical controls for confounds (e.g., arousal, difficulty, cognitive load). Without these, it is impossible to determine whether classification performance tracks hemispheric lateralization or shared non-lateralized factors.

    Authors: We agree that the original abstract omitted critical methodological details required to evaluate the results. In the revised manuscript we have expanded the abstract to report participant numbers, task design (language/arithmetic vs. drawing/music), cross-validation procedure, baseline comparisons, and controls for confounds such as arousal, difficulty, and cognitive load. A new Methods section now provides the full experimental protocol, statistical analysis pipeline, and explicit discussion of how non-lateralized factors were addressed. revision: yes

  2. Referee: [Abstract] The ground-truth labeling of tasks as selectively left- or right-lateralized rests on unvalidated assumptions; no prior fMRI/EEG validation, difficulty-matched controls, or ablation experiments are described to show that performance collapses when non-lateralized variables are equalized.

    Authors: Task selection follows well-established findings on hemispheric specialization, but we acknowledge the original text did not sufficiently document supporting evidence or controls. The revision adds citations to prior fMRI and EEG literature validating lateralization of the chosen tasks, describes how tasks were matched for difficulty and cognitive load, and includes additional analyses examining the unique contribution of pupil diameter and fixation duration. Dedicated ablation experiments or new within-subject fMRI/EEG validation, however, would require fresh data collection and are noted as future work. revision: partial

standing simulated objections not resolved
  • New fMRI/EEG validation or full ablation experiments that would require additional participant recruitment and neuroimaging sessions beyond the existing dataset.

Circularity Check

0 steps flagged

No circularity: empirical classification with independent task labels and standard ML evaluation

full rationale

The paper reports a supervised classification experiment (pupil diameter + fixation duration → left/right hemisphere label) that yields F1=0.894. Labels are assigned by task type (language/arithmetic vs. drawing/music) rather than derived from the eye metrics themselves. No equations, fitted parameters renamed as predictions, self-citations used as uniqueness theorems, or ansatzes appear in the provided text. The derivation chain is therefore a conventional train/test pipeline whose output is not forced by construction from its inputs. The validity of the task-to-hemisphere mapping is an external empirical assumption, not a self-referential reduction.
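The cross-validation named in Figures 5 and 6 (leave-one-participant-out) is part of what keeps that pipeline non-circular: no participant's windows appear in both train and test folds. A minimal sketch of such a splitter, as an assumed structure since the paper's implementation is not shown here:

```python
def lopocv_splits(participants):
    """Leave-one-participant-out CV: yield (held-out id, train idx, test idx).

    `participants` lists the participant id for each data window, so every
    participant's windows form exactly one test fold. Illustrative sketch,
    not the paper's code.
    """
    for held_out in sorted(set(participants)):
        train = [i for i, p in enumerate(participants) if p != held_out]
        test = [i for i, p in enumerate(participants) if p == held_out]
        yield held_out, train, test
```

Grouping by participant (rather than shuffling windows) is what prevents the classifier from scoring well by memorizing individual baseline pupil sizes.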

Axiom & Free-Parameter Ledger

0 free parameters · 0 axioms · 0 invented entities

Only the abstract is available; no methods, equations, or data details are provided, so the ledger cannot be populated with concrete free parameters, axioms, or invented entities from the paper.

pith-pipeline@v0.9.0 · 5472 in / 1050 out tokens · 50974 ms · 2026-05-08T04:48:58.230185+00:00 · methodology

discussion (0)

