ID: 218
/ MCI-Demo Session: 1
MCI: Demos: Interaktive Systeme oder Demonstratoren
Mensch-Computer-Interaktion: Entertainment Computing, Gamification und Serious Games
Stichworte: Narrative game controls, video games, immersion
Morigami: Exploring Control Schemes in Narrative-driven Gaming Contexts
Marlien Hauser, Johann Hauser, Martin Kocur, Michael Lankes
University of Applied Sciences Upper Austria, Austria
We developed Morigami, a 2.5D game that won an Austrian game award and is being prepared for publication on Steam. Our game demonstrates how control schemes that emphasize the connection between the player and the characters through the direct impact of player inputs can extend beyond gameplay functionality to become crucial elements of the narrative. We present how we developed narrative controls as mechanisms that link player inputs with the game narrative, enhancing emotional involvement and engagement. Morigami employs these controls to reflect character development and story progression. A qualitative analysis with players shows that narrative controls can enhance the gaming experience, making narratives more compelling and immersive. This demo paper presents the game, suggests pathways for further investigation into narrative controls across different gaming contexts, and underscores their potential for enhancing storytelling in interactive media.
ID: 248
/ MCI-Demo Session: 2
MCI: Demos: Interaktive Systeme oder Demonstratoren
Mensch-Computer-Interaktion: Interaktion mit eingebetteten und ambienten Systemen, Virtuelle, gemischte und erweiterte Realitäten, Mobile und ubiquitäre Interaktion, Multimodale Schnittstellen, Be-greifbare Interaktion, Interaktionstechniken, Anwendungsfelder, Reflexion und Perspektiven: Individuum und Gesellschaft
Stichworte: Generative AI, Diffusion Algorithms, Image Diffusion, Tangible Interactions, Speculative Design, Co-Creation, Interaction Design, Co-Ideation
Transferscope – Making Multi-Modal Conditioning for Image Diffusion Models Tangible
Christopher Pietsch, Aeneas Stankowski
University of Design Schwäbisch Gmünd, Germany
The significance of artificial intelligence (AI) for designers is steadily growing, especially within the domain of human-computer interaction. For design students, a foundational comprehension of machine learning (ML) algorithms is indispensable to navigate and utilize this technology in both theoretical and applied contexts - in order to leverage it within design proposals, and also within the design process.
Generative AI tools have rapidly entered the creative processes of designers and artists alike, and have been widely adopted by lay people. They have been praised for democratizing high-quality image creation. However, there are still concerns about the limited artistic control and steerability they provide, especially for professional creatives. This raises questions about how well these tools can be integrated into carefully developed creative workflows, given the constraints on composition and detail.
Additionally, text-to-image algorithms are highly competitive with more manual creation and visualisation techniques in terms of speed and fidelity, while lacking opportunities for deliberation and fine-grained control.
As a physical artifact, Transferscope attempts to tangibly introduce professional designers and students to generative-AI-powered workflows that facilitate creative control, while maintaining the option to leverage the serendipity-driven iteration uniquely made possible by the instant availability provided by image generation models like Stable Diffusion. Transferscope serves an educational purpose within an experiential teaching approach, and has been designed to work within exhibition and classroom settings alike.
ID: 284
/ MCI-Demo Session: 3
MCI: Demos: Interaktive Systeme oder Demonstratoren
Mensch-Computer-Interaktion: Be-greifbare Interaktion, Inklusion
Stichworte: co-design, musical interfaces, tangible user interfaces, closeness over a distance
ConMusiCo: A TUI Connecting Children through Shared Music Making
Holger Klapperich, Bernhard Wohlmacher, Tom Seiffert, Mareike Focken, Sabrina Großkopp, Alina Huldtgren
Department of Media, Hochschule Düsseldorf, Germany
ConMusiCo is a tangible user interface (TUI) that connects two children in a shared musical activity over a distance. Relatedness, a core human need, was key in the development of the TUI. In a co-design process with children, we explored how to express different emotions and how a TUI needs to be designed to support shared musical activities. The work combines research on supporting relatedness through technology with insights on the positive effects of music making on pro-social behavior.
ID: 303
/ MCI-Demo Session: 4
MCI: Demos: Interaktive Systeme oder Demonstratoren
Mensch-Computer-Interaktion: Virtuelle, gemischte und erweiterte Realitäten, Inklusion
Stichworte: Access Barriers, Disability, Virtual Reality
Leveraging Virtual Reality Simulation to Engage Non-Disabled People in Reflection on Access Barriers for Disabled People
Timo Brogle, Andrej Vladimirovic Ermoshkin, Konstantin Vakhutinskiy, Sven Priewe, Claas Wittig, Anna-Lena Meiners, Kathrin Gerling, Dmitry Alexandrovsky
KIT, Germany
Disabled people experience many barriers in daily life, but non-disabled people rarely pause to reflect and engage in joint action to advocate for access.
In this demo, we explore the potential of Virtual Reality (VR) to sensitize non-disabled people to barriers in the built environment. We contribute a VR simulation of a major traffic hub in Karlsruhe, Germany, and we employ visual embellishments and animations to showcase barriers and potential removal strategies.
Through our work, we seek to engage users in conversation on what kind of environment is accessible to whom, and what equitable participation in society requires. Additionally, we aim to expand the understanding of how VR technology can promote reflection through interactive exploration.
ID: 305
/ MCI-Demo Session: 5
MCI: Demos: Interaktive Systeme oder Demonstratoren
Mensch-Computer-Interaktion: User Experience Design, Digital Humanities und UX, Affekt, Ästhetik, Emotion
Stichworte: pictographs, generative AI, LLM agent, inpainting, visualization, image generation, presentations
PictographAI: Interactive Generation of Stylized Pictographs for Presentations
Sarah Makarem, Tobias Röddiger, Till Riedel, Michael Beigl
Karlsruhe Institute of Technology, Germany
In today’s data-driven world, effective data visualization is crucial for communication. Recent studies have shown that meaningful and relevant visual embellishments and decorations can significantly enhance data visualization memorability and comprehension. Hence, we introduce PictographAI, a generative tool integrated into presentation software to transform traditional bar charts into pictographic visualizations. Utilizing a multimodal AI pipeline, PictographAI processes text, images, and raw data from presentation slides to automatically generate contextually appropriate pictographs. Our pipeline uses a large language model (LLM) agent, a text-guided image-inpainting model, and algorithmic post-processing to make sense of the slide contents and generate pictographs. As users update their presentation slides, the AI pipeline automates the generation of new pictographs that represent the respective contents. In this work, we demonstrate the concept and working principle that motivates the system architecture and the generative AI pipeline on a bar chart generation use case that integrates into a presentation slide creation workflow.
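The core idea of a pictographic bar chart, mapping each bar's value to a count of repeated icons, can be illustrated with a short sketch. The function name and the unit heuristic are hypothetical stand-ins; the actual PictographAI pipeline uses an LLM agent and text-guided inpainting rather than this simple rule:

```python
import math

def bars_to_icon_counts(values, unit=None):
    """Map bar-chart values to repeated-icon counts (classic pictograph).

    If no unit is given, choose one so the largest bar needs about 10 icons.
    Hypothetical helper for illustration only.
    """
    if unit is None:
        unit = max(values) / 10
    # Round up so even small non-zero values get at least one icon.
    return [math.ceil(v / unit) for v in values]

# Example: quarterly values from a slide's bar chart
sales = [120, 45, 80, 30]
print(bars_to_icon_counts(sales))  # unit = 12.0 -> [10, 4, 7, 3]
```

The inpainting stage would then render the chosen icon the computed number of times per bar.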
ID: 317
/ MCI-Demo Session: 6
MCI: Demos: Interaktive Systeme oder Demonstratoren
Mensch-Computer-Interaktion: Prototyping und Interaktionsmodellierung, Interaktion mit eingebetteten und ambienten Systemen, Brain-Computer Interfaces, Mobile und ubiquitäre Interaktion, Wearable und Nomadic Computing, Eyetracking und Gaze Interaction, Health und Wellbeing
Stichworte: earables, hearables, open-source hardware, OSHW, open wearables
OpenEarable Suite: Open-Source Hardware to Sense 30+ Phenomena on the Ears
Tobias Röddiger, Michael Knierim, Philipp Lepold, Tobias King, Michael Beigl
Karlsruhe Institute of Technology, Germany
In this demo, we showcase the OpenEarable Suite, a comprehensive collection of ear-worn devices designed to sense and analyze over 30 different phenomena. The collection includes three distinct devices: OpenEarable 2.0, OpenEarable ExG, and OpenEarable ExG Headphones. "OpenEarable 2.0" integrates advanced sensors, such as ultrasound-capable microphones, a 9-axis inertial measurement unit, a pulse oximeter, an optical temperature sensor, and an ear canal pressure sensor, enabling extensive health monitoring, activity tracking, and human-computer interaction. "OpenEarable ExG" is an open-source platform focused on measuring biopotentials like EEG, ECG, and EMG, using up to four sensing channels, and validated for detecting eye movements, brain activity, and muscle contractions. "OpenEarable ExG Headphones" combine electrophysiological sensing with high-quality audio, utilizing OpenBCI biosignal amplification and a 3D-printed over-ear design for reliable EEG, EOG, ECG, and EMG measurements. The OpenEarable Suite aims to democratize earable research by providing accessible, open-source tools in different form factors that follow best practices in hardware and software development, facilitating diverse applications across various domains from medical to HCI.
ID: 318
/ MCI-Demo Session: 7
MCI: Demos: Interaktive Systeme oder Demonstratoren
Mensch-Computer-Interaktion: User Experience Design, Evaluationsverfahren für Alle, Virtuelle, gemischte und erweiterte Realitäten, Methodische Aspekte und Modellierung, Automotive User Interfaces, Mensch-Roboter Interaktion
Stichworte: Social Robots, V2X, Virtual Reality, Traffic, Vulnerable Road User
Experiencing Social Robots for Traffic Guidance using Virtual Reality Videos
Maximilian Schrapel, Manuel Bied, Barbara Bruno, Alexey Vinel
Karlsruhe Institute of Technology, Germany
Integrating autonomous vehicles and smart infrastructure into urban traffic systems is a crucial component of the development of future cities. Effective public communication and early citizen involvement are therefore essential to align expectations with the capabilities of novel technology. We propose using point-of-view 360-degree videos in Virtual Reality to present potential technologies to stakeholders at an early stage, accelerate design processes, and measure physiological responses. We demonstrate our proposed method with the use case of social robots using V2X communication and arm gestures for pedestrian traffic guidance at unsignalized intersections. Initial video recordings with the robot in traffic in a Wizard-of-Oz setting showed curiosity among pedestrians about the robot's use case.
ID: 319
/ MCI-Demo Session: 8
MCI: Demos: Interaktive Systeme oder Demonstratoren
Mensch-Computer-Interaktion: User Experience Design, Prototyping und Interaktionsmodellierung, Interaktion mit eingebetteten und ambienten Systemen, Multimodale Schnittstellen, Be-greifbare Interaktion, Interaktionstechniken, Affekt, Ästhetik, Emotion
Stichworte: climate change visualization, artistic visualizations, motion capturing
Do not touch! - An artistic climate data visualization using motion capturing and 3D computer graphics
Alexander Doudkin, Martin Christof Kindsmüller
Brandenburg University of Applied Sciences
This demo explores an innovative artistic installation that creatively visualizes global temperature data using graphical visualization and motion capture technologies. By combining video-based posture capturing of nearby individuals with a dynamically rendered 3D model of the planet Earth, this installation offers an interactive and immersive experience. The goal is to transform climate change data into an engaging visual format, making it more accessible and impactful for a wide range of audiences.
ID: 322
/ MCI-Demo Session: 9
MCI: Demos: Interaktive Systeme oder Demonstratoren
Mensch-Computer-Interaktion: Learning Technology
Stichworte: Machine Learning Education, Artificial Intelligence
Demonstrating SandwichNet—A Playful Tool for Teaching the Basics of Neural Networks
Felix Sewing, Rahel Flechtner
Hochschule für Gestaltung Schwäbisch Gmünd, Germany
Artificial intelligence (AI) is increasingly crucial for designers, particularly in the field of human–computer interaction. A basic understanding of machine learning (ML) algorithms is essential for design students to effectively engage with this technology both conceptually and practically. However, teaching this complex content in disciplines outside of computer science requires a different approach. To address this, we introduce SandwichNet, an interactive web-based tool to teach the basic principles of neural networks. SandwichNet provides a playful interactive visualization of a neural net whose parameters can be manipulated to recreate and understand the learning behavior of the network. We introduce the tool and outline how it is integrated into our fundamental teaching of AI technologies for designers.
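The cause-and-effect relation that SandwichNet lets students explore, changing a parameter and watching the network's output change, rests on the standard weighted-sum-plus-activation forward pass. A minimal single-neuron illustration (not SandwichNet's actual code):

```python
def forward(x, weights, bias):
    """One neuron: weighted sum of inputs plus bias, through a ReLU activation."""
    s = sum(w * xi for w, xi in zip(weights, x)) + bias
    return max(0.0, s)  # ReLU clips negative sums to zero

# Manipulating the weights or bias changes the output - the kind of
# cause-and-effect a learner can probe interactively.
print(forward([1.0, 2.0], [0.5, -0.25], 0.1))  # 0.5 - 0.5 + 0.1 = 0.1
```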
ID: 326
/ MCI-Demo Session: 10
MCI: Demos: Interaktive Systeme oder Demonstratoren
Mensch-Computer-Interaktion: User Experience Design, Prototyping und Interaktionsmodellierung, Haptik, Touch und Gestik, Be-greifbare Interaktion, Inklusion, Health und Wellbeing, Assistive Technologien, Universal Design
Stichworte: Demenz, Psychosoziale Interventionen, Tangible User Interface, Reminiszenztherapie, Musiktherapie
memTUI - Tangible memories. Digital Support for Dementia.
Matthia Leyendecker, Jurek Breuninger
IU Internationale Hochschule, Erfurt, Germany
With demographic change, not only the share of older people in society is growing, but also the share of those with physical and/or cognitive impairments such as dementia, who depend on support, for example in nursing homes. This also poses a major challenge for caregivers, particularly with regard to communication and social interaction with those affected. Music and reminiscence therapy have proven to be effective approaches in this context. While digital media are so far rarely used in practice, research already points to ways in which such therapies can be supported through their use. This work describes how tangible user interfaces can be designed and deployed to promote active participation and self-determined use by people with dementia.
ID: 330
/ MCI-Demo Session: 11
MCI: Demos: Interaktive Systeme oder Demonstratoren
Mensch-Computer-Interaktion: User Experience Design, Virtuelle, gemischte und erweiterte Realitäten, Interaktionstechniken, Learning Technology
Stichworte: Virtual Reality Learning Environments, User Centered Design
Photo Studio - A VR learning environment for Design Students.
Holger Reckter, Thomas Lüttich
University of Applied Sciences Mainz, Germany
This demo paper presents "Photo Studio - A learning environment for design students".
Photo Studio is a Virtual Reality (VR) application that focuses on important learning aspects
for design students in various university courses in fine art, such as communication design and media design, as well as in photography and architecture programs. Photo Studio is based on our experiences with VRoom, which we evaluated and developed further. The results of the evaluation are given in the chapter Motivation and were the reason to develop Photo Studio, which is used to learn lighting and pre-production for photography sessions in a real photographic studio. This VRLE has higher standards for interaction and user interface, such as spatial manipulation of objects for different use cases. Some of our solutions regarding user interface and interaction are presented in more detail in the chapters Design and Results.
The demo paper is supplemented by a video which briefly presents the user interface as well as one use case.
ID: 337
/ MCI-Demo Session: 12
MCI: Demos: Interaktive Systeme oder Demonstratoren
Mensch-Computer-Interaktion: Prototyping und Interaktionsmodellierung, Be-greifbare Interaktion
Stichworte: Data visualization, Data physicalisation, Energy consumption, Household energy usage
Shining a Light on Energy Use: Combining AR and Physicalisation to Represent Household Energy Consumption
Melek Sungur, Negar Rahnamae, Konstantina Marra, Rosa van Koningsbruggen, Eva Hornecker
Bauhaus-Universität Weimar, Germany
Typically, energy consumption is recorded through a series of measurements, and while its potential impacts are discussed, the interconnection between causes and effects is often not clearly illustrated. Our project focuses on representing household energy consumption through data physicalisation and Augmented Reality. The demo consists of a studio maquette containing four household electronic devices: a laptop, washing machine, fridge, and lamp. The energy consumption of these devices is represented through the brightness of the device. To enhance the viewer’s experience, we use AR storytelling to show the impact of this data on the climate, motivating the user to make positive improvements to their household energy consumption behaviour.
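The central mapping of the installation, a device's power draw driving its brightness, can be sketched in a few lines. The function name, the linear mapping, and the 2500 W ceiling are assumptions for illustration; the demo description does not specify the actual mapping:

```python
def watts_to_brightness(watts, max_watts=2500, levels=255):
    """Map a device's power draw to an LED brightness level (0..levels).

    Hypothetical linear mapping, clamped to the valid range.
    """
    frac = min(max(watts / max_watts, 0.0), 1.0)
    return round(frac * levels)

# e.g. laptop 50 W, washing machine 2000 W, fridge 150 W, lamp 10 W
for w in (50, 2000, 150, 10):
    print(w, "W ->", watts_to_brightness(w))
```

A logarithmic mapping could be substituted if the devices' consumption spans several orders of magnitude, so that the lamp remains visibly lit next to the washing machine.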
ID: 366
/ MCI-Demo Session: 13
MCI: Demos: Interaktive Systeme oder Demonstratoren
Mensch-Computer-Interaktion: Be-greifbare Interaktion
Stichworte: bioplastics, bacterial cellulose, biomaterials, do-it-yourself, DIY, sustainability
Biomaterials for Prototyping in HCI
Madalina Nicolae1,2,3, Vivien Roussel2, Marion Koelle4, Claire Lefez2, Anne Roudaut5, Aditya Shekhar Nittala6, Samuel Huron3, Marc Teyssier2, Jürgen Steimle1
1Saarland University, Saarland Informatics Campus, Saarbrücken, Germany; 2Léonard de Vinci Pôle Universitaire, Research Center Paris La Défense, France; 3Télécom Paris, CNRS i3 (UMR 9217) Institut Polytechnique de Paris Palaiseau, France; 4OFFIS – Institute for Information Technology, Oldenburg, Germany; 5University of Bristol, Bristol, United Kingdom; 6University of Calgary, Calgary, Canada
While prototyping is a widespread practice among researchers, creating sustainable, functional devices remains challenging due to the limited range of available tools and materials. We present several approaches to sustainable prototyping of functional devices. Our methods range from using bio-based and biodegradable materials as sustainable alternatives to biologically growing electronic substrates. These methods enable a new class of interactive devices that integrate electronic components with sustainable materials. Our research on Interactive Bioplastics [1] introduces a DIY approach for producing conductive bioplastics that are compatible with digital fabrication techniques. Furthermore, [2] introduces an integrated fabrication framework for sustainable soft shape-changing interfaces made of bioplastics. Finally, our work on Biohybrid Devices [3] showcases how the biological growth of living biomaterials, such as bacterial cellulose, can be used as an assembly and embedding process for electronics. In addition to presenting various artifacts, we highlight the processes introduced by our fabrication frameworks [1-3] and engage the audience in discussions about the life-cycle phases of producing artifacts, promoting a critical reflection on sustainable practices in prototyping.
ID: 369
/ MCI-Demo Session: 14
MCI: Demos: Interaktive Systeme oder Demonstratoren
Mensch-Computer-Interaktion: Health und Wellbeing
Stichworte: Menstrual Cycle Tracking, Menstrual Health Technology, User Needs, Digital Female Health, Feminist HCI, Feminist Intersectionality in Digital Health, Women’s Health, Health Literacy
CyMe [si:][mi:]: Personalized and Seamless Menstrual Health Tracking
Marinja Principe1,4, Debs Stebler1,4, Davinny Sou2,3,4, Tobias Kowatsch1,2,3,4, Marcia Nißen2,4
1University of Zurich, Switzerland; 2University of St.Gallen, Switzerland; 3ETH Zürich, Switzerland; 4Centre for Digital Health Interventions, Switzerland
Menstruating individuals experience various physiological and psychological changes throughout their reproductive lives and each menstrual cycle. Although numerous menstrual health tracking apps exist, there is limited research on the impact of customization options and personalized visualizations of menstrual cycle data and associated symptoms on user experiences. Furthermore, evidence on the long-term effects of these features on menstrual health awareness and literacy is sparse. This demo contribution presents “CyMe” [siː][miː], a menstrual health tracking prototype designed to support the different needs of individuals while focusing on a simple yet effective reporting approach integrating smartphone and smartwatch data. CyMe focuses on customization options for seamless data reporting and personalized visualizations, enabling users to focus on their individual menstrual health challenges. CyMe also aims to enhance the user experience by giving users agency over the look and feel of the application, allowing them to adjust self-reporting options, colors, and reminders, and to connect different sensor devices. This approach is intended to make the wide range of needs of menstruating individuals visible and to give users actionable insights into their menstrual health.
ID: 371
/ MCI-Demo Session: 15
MCI: Demos: Interaktive Systeme oder Demonstratoren
Mensch-Computer-Interaktion: Be-greifbare Interaktion, Reflexion und Perspektiven: Individuum und Gesellschaft, Affekt, Ästhetik, Emotion
Stichworte: Inflatables, pneumatics, data physicalization
Exploring Emotion Physicalization Through Soft Robotics
Hanna Danilishyna, Alisa Popp, Rosa van Koningsbruggen, Eva Hornecker
Bauhaus-Universität Weimar, Germany
It has become increasingly possible to track and represent our emotions to gain a better understanding of them. However, whereas our emotions are ephemeral and fluctuate, common representations of emotions look true, permanent, and scientific. To explore how we can represent emotions in a way that is closer to how we feel and experience them, Emotion Bouquet was created. Emotion Bouquet consists of four pneumatic flowers that represent the four emotions of anger, sadness, calmness, and happiness through the shape, color, and movement of each flower. Pneumatics were used to create breathing patterns that replicate our breathing during the respective emotions. For the color mappings, we relied on existing literature. Emotion Bouquet was deployed during a two-day exhibition. With this work, we aim to show the potential of soft robotic, physical data representations for the representation of emotions.
ID: 374
/ MCI-Demo Session: 16
MCI: Demos: Interaktive Systeme oder Demonstratoren
Mensch-Computer-Interaktion: Prototyping und Interaktionsmodellierung, Inklusion
Stichworte: Human Computer Interaction, Prototyping, Inklusion
Kubinos — Interaktionsprototypen zur Entwicklung von crossmedialen Interaktionen im Kita-Alltag
Antonia Schäfer, Felix Sewing, Alexander Müller-Rakow
HTW Berlin, Germany
ENGLISCH – Educational professionals in daycare centres are confronted with a variety of challenges, including unanswered questions from children, language barriers and different behaviours. The PIIQUE research project has the objective of promoting the professionalisation of these professionals. In preliminary studies, the challenges in the working and learning process of the professionals were identified, and the development of small support objects for everyday daycare centre work was initiated. In this context, five interaction prototypes, designated as Kubinos, were developed, each equipped with a variety of sensors and actuators. The aforementioned prototypes were employed in workshops with professionals, during which they collaborated to develop cross-media objects designed to facilitate the day-to-day operations of the daycare centre. This indicated that functions such as RFID and visual pictogram displays were particularly desired. The results demonstrate the potential of the Kubinos to enrich the day-to-day work of the daycare centre and support the professionals in their work. The insights gained allow for the targeted further development of the Kubinos in order to further improve their practical suitability and educational effectiveness.
DEUTSCH – Pädagogische Fachkräfte in Kindertagesstätten (Kitas) sind mit einer Vielzahl von Herausforderungen konfrontiert, darunter unbeantwortete Fragen der Kinder, Sprachbarrieren und unterschiedliche Verhaltensweisen. Das Forschungsprojekt PIIQUE zielt darauf ab, die Professionalisierung dieser Fachkräfte zu fördern. In Vorstudien wurden die Herausforderungen im Arbeits- und Lernprozess der Fachkräfte identifiziert. Auf deren Grundlage sollen kleine Unterstützungsobjekte entwickelt werden. In diesem Zusammenhang wurden fünf Interaktionsprototypen, sogenannte Kubinos, entwickelt, die mit verschiedenen Sensoren und Aktoren ausgestattet sind. Diese Prototypen wurden in Workshops mit Fachkräften eingesetzt, um gemeinsam crossmediale Objekte zur Unterstützung im Kita-Alltag zu entwickeln. Dabei zeigte sich, dass insbesondere Funktionen wie RFID und visuelle Piktogrammdarstellungen gewünscht werden. Die Ergebnisse zeigen das Potenzial der Kubinos, den Kita-Alltag pädagogisch zu bereichern und die Fachkräfte in ihrer Arbeit zu unterstützen. Die gewonnenen Erkenntnisse ermöglichen eine gezielte Weiterentwicklung der Kubinos, um ihre Praxistauglichkeit und pädagogische Wirksamkeit weiter zu verbessern.
ID: 378
/ MCI-Demo Session: 17
MCI: Demos: Interaktive Systeme oder Demonstratoren
Mensch-Computer-Interaktion: Mensch-Roboter Interaktion, Health und Wellbeing
Stichworte: Voice Assistant, Embodied, Digital Health Interventions
Demonstrating GRACE: Our Embodied Voice Assistant Providing Cognitive Interventions
Rasita Vinay1, Nora C. Tommila2, Mathias Schlögl3, Stefan Klöppel4, Nikola Biller-Andorno5, Tobias Kowatsch6
1Institute of Biomedical Ethics and History of Medicine, University of Zurich; School of Medicine, University of St. Gallen, Switzerland; 2Department of Management, Technology and Economics, ETH Zurich, Switzerland; 3Department of Geriatric Medicine, Clinic Barmelweid, Switzerland; 4University Psychiatric Services Bern, University Hospital of Old Age Psychiatry and Psychotherapy, Switzerland; 5Institute of Biomedical Ethics and History of Medicine, University of Zurich, Switzerland; 6Institute for Implementation Science in Health Care, University of Zurich; School of Medicine, University of St. Gallen; Centre for Digital Health Interventions, Department of Management, Technology, and Economics at ETH Zurich, Switzerland
Our demo describes the development of and interaction with our first prototype of GRACE, an embodied voice assistant. GRACE was designed to provide voice-based cognitive interventions to its users and was first piloted with healthy adults in its current version. The 3D-printed body of GRACE encases the internal components, such as the Raspberry Pi, a reSpeaker microphone, an AMOLED screen, and a Bluetooth speaker. We utilized an open-source robotics platform and its simulation software for executing the script and commands for the voice interactions. A text-to-speech voice from ElevenLabs was used as the voice of GRACE. We developed and designed four activities, including two cognitive interventions based on cognitive stimulation therapy. The interaction consisted of an introductory warm-up activity, two cognitive interventions, and a concluding activity.
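The four-activity session structure described above (warm-up, two interventions, closing) can be sketched as a simple sequential runner. All names and prompts below are hypothetical placeholders; the real GRACE prototype speaks via TTS and listens via microphone rather than collecting strings:

```python
def run_session(activities):
    """Run a GRACE-style session: each activity is a (name, prompt) pair.

    Stand-in for the real voice loop: prompts are collected in order
    instead of being spoken aloud and answered by the user.
    """
    transcript = []
    for name, prompt in activities:
        transcript.append(f"[{name}] {prompt}")
    return transcript

# Hypothetical session mirroring the described activity sequence
session = [
    ("warm-up", "Hello! Shall we start with a short warm-up?"),
    ("intervention-1", "Let's name as many animals as we can."),
    ("intervention-2", "Which word does not fit: apple, pear, hammer?"),
    ("closing", "Well done! See you next time."),
]
for line in run_session(session):
    print(line)
```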