pith. machine review for the scientific record

arxiv: 2604.02629 · v1 · submitted 2026-04-03 · 💻 cs.HC · cs.AI

Recognition: no theorem link

Toys that listen, talk, and play: Understanding Children's Sensemaking and Interactions with AI Toys

Authors on Pith: no claims yet

Pith reviewed 2026-05-13 19:17 UTC · model grok-4.3

classification 💻 cs.HC cs.AI
keywords: AI toys · children's sensemaking · participatory design · adversarial play · interaction breakdowns · generative AI · social profiling

The pith

Children profile AI toys as social beings but interaction breakdowns lead to adversarial play.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

The paper examines how children ages 6-11 make sense of AI toys that listen, talk, simulate emotions, and recall prior interactions. In two participatory design sessions, eight children played with three such toys and reflected on the experience, showing genuine curiosity and treating the toys as social entities. Yet frequent interaction breakdowns, and mismatches between the toys' apparent intelligence and their toy-like physical form, disrupted expected play patterns and prompted children to shift toward adversarial behaviors. The work matters because generative AI is moving into everyday play objects, raising questions about boundaries, agency, and relationships that designers must address.

Core claim

Children approached the AI toys with curiosity and profiled them as social beings capable of ongoing connection, but frequent interaction breakdowns and mismatches between apparent intelligence and toy-like form disrupted play expectations and produced adversarial play. The authors conclude with implications and design provocations for more transparent, developmentally appropriate, and responsible encounters.

What carries the argument

Participatory design sessions in which children shift between play, experimentation, and reflection with AI toys, exposing how social profiling collides with capability mismatches to generate adversarial responses.

If this is right

  • Designers should build clearer signals of toy limitations so children can adjust expectations before breakdowns occur.
  • Toys need mechanisms that preserve play flow even when responses fail to match prior interactions or simulated emotions.
  • Developmentally appropriate transparency features can reduce the shift from curiosity to adversarial play.
  • Responsible AI toy design requires explicit attention to how children form and revise mental models of agency and relationship.

Where Pith is reading between the lines

These are editorial extensions of the paper, not claims the authors make directly.

  • Widespread adversarial play with AI toys could alter how children distinguish between responsive objects and people in other contexts.
  • Short sessions may understate long-term effects on children's play habits if breakdowns persist across repeated use.
  • The pattern suggests a need to test whether simple explanations of AI capabilities before play reduce adversarial shifts.

Load-bearing premise

Observations from eight children in two sessions can stand in for how broader groups of children will interact with AI toys.

What would settle it

A larger study in which children ages 6-11 interact with similar AI toys yet show no increase in adversarial play despite the same breakdowns and form-intelligence mismatches.

Figures

Figures reproduced from arXiv: 2604.02629 by Aayushi Dangol, Daeun Yoo, Franziska Roesner, Jason Yip, Julie A. Kientz, Meghna Gupta, Robert Wolfe.

Figure 1: Examples of emerging AI toys (Grem, Grok, and Gabbo) from https://heycurio.com/
Figure 2: Overview of design sessions with children: (a) circle time, (b) free play, (c) stickies from likes, dislikes, and design ideas activity, and (d) comicboarding
Figure 3: Overview of the comic panels that were used in the
Figure 4: Photos from the free-play activity across the two design sessions, showing children interacting with the AI toys.
Figure 5: Overview of comics from the comicboarding activity.
Figure 6: Unpredictability Scenario: After a child recalls a
Figure 7: Shared Secrets Scenario: A child confides a personal
Figure 8: Emotional Misalignment Scenario: An AI toy re
Figure 9: Embodied Continuity Scenario: An AI toy is physi
Figure 12: Mimicry Scenario: The AI toy begins mimicking,
Figure 13: Paywalled Memory Scenario: The AI toy forgets
read the original abstract

Generative AI (genAI) is increasingly being integrated into children's everyday lives, not only through screens but also through so-called "screen-free" AI toys. These toys can simulate emotions, personalize responses, and recall prior interactions, creating the illusion of an ongoing social connection. Such capabilities raise important questions about how children understand boundaries, agency, and relationships when interacting with AI toys. To investigate this, we conducted two participatory design sessions with eight children ages 6-11 where they engaged with three different AI toys, shifting between play, experimentation, and reflection. Our findings reveal that children approached AI toys with genuine curiosity, profiling them as social beings. However, frequent interaction breakdowns and mismatches between apparent intelligence and toy-like form disrupted expectations around play and led to adversarial play. We conclude with implications and design provocations to navigate children's encounters with AI toys in more transparent, developmentally appropriate, and responsible ways.

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it; the pith above is the substance, this is the friction.

Referee Report

2 major / 2 minor

Summary. The manuscript reports findings from two participatory design sessions with eight children (ages 6-11) who interacted with three AI toys. It claims that children approached the toys with genuine curiosity and profiled them as social beings, but that interaction breakdowns and mismatches between apparent intelligence and toy-like form disrupted play expectations and prompted adversarial play. The authors draw implications and design provocations for more transparent, developmentally appropriate AI toys.

Significance. If the observations hold, the work contributes exploratory qualitative evidence on children's sensemaking with generative-AI toys, a timely topic in HCI. The participatory sessions provide direct child perspectives that can usefully inform design guidelines for child-AI interaction, particularly around managing expectations and breakdowns.

major comments (2)
  1. [Methods] The description of how raw session data were analyzed to produce the reported themes (curiosity, social profiling, adversarial play) lacks detail on the coding process, inter-observer agreement, or validation steps. This under-specification is load-bearing for the reliability of the central empirical claims.
  2. [Findings] Concrete session excerpts or behavioral examples that directly illustrate how interaction breakdowns and form-intelligence mismatches produced adversarial play are needed; without them the causal link between observed behaviors and the claimed sensemaking patterns remains under-supported.
minor comments (2)
  1. [Abstract] A one-sentence note on the qualitative analysis approach would help readers assess the strength of the reported patterns.
  2. [Discussion] The design provocations would be stronger if each were explicitly tied to a specific observation or quote from the sessions rather than stated at a general level.

Simulated Author's Rebuttal

2 responses · 0 unresolved

Thank you for the constructive feedback on our manuscript. We have carefully considered the referee's comments and revised the paper to address the concerns regarding methods and findings. Below, we provide point-by-point responses.

read point-by-point responses
  1. Referee: [Methods] The description of how raw session data were analyzed to produce the reported themes (curiosity, social profiling, adversarial play) lacks detail on the coding process, inter-observer agreement, or validation steps. This under-specification is load-bearing for the reliability of the central empirical claims.

    Authors: We agree that additional detail on the analysis process would strengthen the manuscript. In the revised version, we have expanded the Methods section to describe the thematic analysis approach in more detail, including how the two researchers iteratively coded the session videos and transcripts, developed the themes through discussion, and validated them against the raw data. While we did not compute formal inter-rater reliability metrics due to the small sample and exploratory nature of the participatory design sessions, we have added information on the collaborative validation steps taken. revision: yes

  2. Referee: [Findings] Concrete session excerpts or behavioral examples that directly illustrate how interaction breakdowns and form-intelligence mismatches produced adversarial play are needed; without them the causal link between observed behaviors and the claimed sensemaking patterns remains under-supported.

    Authors: We thank the referee for this suggestion. We have incorporated specific excerpts from the participatory design sessions and detailed behavioral observations into the Findings section. These examples now explicitly link the interaction breakdowns and mismatches to the emergence of adversarial play, providing stronger support for our interpretations of children's sensemaking. revision: yes
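The formal inter-rater reliability metric the rebuttal declines to compute is typically Cohen's kappa: observed agreement between two coders, corrected for the agreement expected by chance from each coder's label frequencies. A minimal sketch, assuming two coders independently label the same excerpts with the paper's three themes (the function and labels below are illustrative, not from the paper):

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders labeling the same items."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    # Fraction of items where both coders assigned the same label.
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Agreement expected by chance, from each coder's marginal label frequencies.
    freq_a = Counter(coder_a)
    freq_b = Counter(coder_b)
    expected = sum(freq_a[label] * freq_b[label] for label in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical labels for ten session excerpts, using the paper's three themes.
a = ["curiosity", "social", "adversarial", "curiosity", "social",
     "adversarial", "curiosity", "social", "social", "adversarial"]
b = ["curiosity", "social", "adversarial", "social", "social",
     "adversarial", "curiosity", "curiosity", "social", "adversarial"]
print(round(cohens_kappa(a, b), 3))  # → 0.697
```

Here the coders agree on 8 of 10 excerpts (0.8 observed), but with chance agreement of 0.34 the kappa drops to about 0.70, which is why raw percent agreement alone can overstate reliability.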

Circularity Check

0 steps flagged

No circularity: empirical qualitative observations from participatory sessions

full rationale

The paper reports findings from two participatory design sessions with eight children, describing observed behaviors such as curiosity, social profiling, interaction breakdowns, and adversarial play. These are presented as direct interpretations of session data with no mathematical derivations, fitted parameters, predictions, equations, or self-citation chains invoked to justify core claims. The analysis relies on qualitative sensemaking from the sessions themselves rather than reducing any result to prior inputs by construction. This is a standard empirical HCI study with no load-bearing self-referential steps.

Axiom & Free-Parameter Ledger

0 free parameters · 0 axioms · 0 invented entities

This is an empirical qualitative study relying on observational data from participatory sessions rather than formal axioms or parameters.

pith-pipeline@v0.9.0 · 5480 in / 858 out tokens · 25608 ms · 2026-05-13T19:17:25.482763+00:00 · methodology

discussion (0)

