ID: 310 / I3: 1
Innovation Area Activity Proposal
Keywords: STEM, accessibility
POSTER (15'): The STEM Motivation & Accessibility (STEMMA) project: a Preliminary Study on Awareness of the STEM Gender Gap
B. Leporini1,2
1University of Pisa, Italy; 2ISTI-CNR
The STEMMA (Science, Technology, Engineering, Mathematics, Motivation and Accessibility) project aims to reduce gender- and disability-related gaps in access to STEM curricula and careers by empowering every person to pursue a scientific career. For this purpose, the project investigates the perceived difficulties and obstacles that may prevent both women and visually impaired people from accessing educational and career paths in STEM areas.
The literature reports numerous difficulties in accessing STEM content and careers; consequently, many people may be deterred from pursuing STEM studies. However, the reasons may also lie in a lack of awareness that, with the right tools and educational approaches, STEM careers are within reach. Motivational and persuasive tools and methodologies could be more widely proposed and disseminated, both in the literature and among students who must decide on their educational path. We believe this area is still underdeveloped, and the STEMMA project aims to contribute in this direction. Starting with an exploratory survey on STEM subjects and possible role models in the field, in the first phase of the project we intend to investigate citizens' awareness of these issues. For this purpose, we designed an interactive eBook and set up a survey questionnaire on STEM topics to be disseminated among the population. To this end, an accessible online questionnaire has been distributed in Italy via social networks and communities, and will now be distributed across the EU in order to detect differences in perspective that may also derive from cultural factors.
Bibliography
1. Battaglini C., Bottari D., Gnecco G. and Leporini B. (2023). Game accessibility for visually impaired people: a review.
2. Hersh M.A., Leporini B., Buzzi (2024). A Comparative Study of Disabled People's Experiences with the Video Conferencing Tools Zoom, MS Teams, Google Meet and Skype. In Behaviour & Information Technology, Taylor & Francis.
3. Battaglini C., Biancalani F., Bottari D., Camurri A., Gnecco G., Leporini B. (2023). Increasing accessibility of online board games to visually impaired people via machine learning and textual/audio feedback: the case of "Quantik". In the 14th EAI International Conference on Intelligent Technologies for Interactive Entertainment (INTETAIN 2023), November 27, 2023, Springer.
4. Buzzi M., Di Bella D., Gazzarrini C., Leporini B. (2023). Experience of Visually-Impaired Children in the Enjoyment of Cartoons. In the 9th EAI International Conference on Smart Objects and Technologies for Social Good (GoodTechs), October 18-20, 2023, Springer.
5. Vozna A., Galesi G. and Leporini B. (2023). Investigating the use of the Thunkable end-user framework to Develop Haptic-based Assistive Aids in the Orientation of Blind People. In Proc. of the 19th International Conference on Web Information Systems and Technologies, WEBIST (Rome, Italy, November 15-17, 2023).
6. Buzzi M.C., Buzzi M., Della Penna G., Leporini B. and Ricci F. (2023). Accessibility of e-government Websites in Italy: the User Experience of People with Disabilities. In Proc. of the 19th International Conference on Web Information Systems and Technologies, WEBIST (Rome, Italy, November 15-17, 2023).
ID: 305 / I3: 2
Innovation Area Activity Proposal
Keywords: Augmentative and alternative communication, Autism spectrum disorder, iPad, PVSM Model, Social communication
POSTER (15'): Social Communication Intervention Integrating iPad for a Young Adult with Severe Autism Spectrum Disorder
Y.-C. Wang
Division of Special Education, PhD Program of Education, National Dong Hwa University, Taiwan, R. O. C.
The study investigated the effectiveness of using the iPad app "Listen and Talk", integrated with an augmentative and alternative communication (AAC) intervention, to improve the social communication participation of a young adult with severe autism spectrum disorder (ASD). The intervention approach combined preparation, functional vocabulary, sentence structure and use in a milieu model (the PVSM Model). The study used a case study method. Data were collected over 14 weekly sessions and included the user's social communication performance during the intervention as well as interviews with his communication partner. The results showed that the intervention enhanced the user's participation in a taxi-taking activity and that his social communication performance generalised to interactions with his communication partner. In conclusion, the study showed that a young adult with severe ASD can benefit from iPad technology combined with an AAC intervention to increase social communication participation.
ID: 299 / I3: 3
Innovation Area Activity Proposal
Keywords: Ambient and Assisted Living (AAL), gaze-based Accessibility Technology, Mobile Devices, Smart Home
DEMO (60'): A Gaze-based personalisable Environment Control for mobile Android Devices
T. Ableitner, P. Gersbacher, G. Zimmermann
Stuttgart Media University, Germany
In Germany alone, there are around 209,000 people with at least one arm missing or limited in its function, and a further 295,000 people with musculoskeletal impairments as a result of an organic brain psychosyndrome (Federal Statistical Office, statistics on severely disabled people, 2022). For those affected, operating digital devices by touch, as is common on smartphones and tablets, can become a challenge or even impossible. To overcome these and other barriers, operating systems such as iOS and Android provide accessibility services. For people with motor impairments of the arms or hands, Android offers the Voice Access and Switch Access accessibility services. With Voice Access, a smartphone or tablet can be operated using voice commands: clickable elements are labelled, or a grid is displayed. With Switch Access, the clickable elements can be controlled using various scanning methods; both physical buttons and eye and facial gestures, detected by the Android device's front camera, can serve as switches. Both accessibility services give people with motor impairments access to mobile Android devices. However, they are not suitable per se for all application contexts and affected persons. Voice Access, for example, requires good speech skills to ensure smooth and reliable operation. Particularly for people whose motor impairments stem from a neurological disease (e.g. ALS), the ability to speak is often impaired as the disease progresses. Furthermore, background noise (e.g. video playback) can impair speech recognition in Voice Access, whereas Switch Access works more reliably. However, compared to conventional input methods (touch and mouse), Switch Access can be slow and require more interaction steps. An alternative to the aforementioned accessibility services is a mouse that the affected person controls either with a mouth stick guided by the tongue or via camera-based head tracking.
However, both alternatives have in common that additional hardware is required, the acquisition costs of which quickly exceed those of the mobile device and stigmatisation of the user cannot be ruled out.
As part of the "FourWays" research project, funded by the German Federal Ministry of Education and Research, we have developed a pilot Android app for environment control for people with motor impairments that only requires the built-in front camera. It can be operated via manual/automatic scanning or a pointer. When scanning, there is also a choice between linear scanning and scanning in 4 directions. The user controls the scanning or the pointer with facial gestures. We use the ML Kit from Google to detect these gestures. We optimise the sensor values obtained from this with our own algorithms in order to keep the rate of false detections and incorrect entries as low as possible in line with the individual abilities of the user. Specifically, the environment control utilises the following facial gestures:
- Head tilt (up, down, left, right)
- Head rotation (left, right)
- Blinking (distinction between left and right eye)
- Smile
- Open mouth
When developing the environment control, we placed great emphasis on personalisation. For the input methods, this means that each user can set which facial gestures they use (example configuration for manual linear scanning: previous element = turn head to the left, next element = turn head to the right, select element = blink). In addition, the sensitivity at which an input event is triggered can be set for each facial gesture.
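The per-gesture bindings and sensitivity thresholds described above can be sketched as follows. This is a minimal illustration in Python, not the project's actual Android/ML Kit code; all names, scores and threshold values are hypothetical, standing in for the smoothed per-frame gesture confidences a face-tracking library could provide:

```python
from dataclasses import dataclass, field

@dataclass
class GestureConfig:
    """Hypothetical per-user configuration: which facial gesture triggers
    which scanning action, and the sensitivity threshold (0..1) per gesture."""
    bindings: dict = field(default_factory=lambda: {
        "turn_head_left": "previous_element",   # example config from the text
        "turn_head_right": "next_element",
        "blink": "select_element",
    })
    thresholds: dict = field(default_factory=lambda: {
        "turn_head_left": 0.6,
        "turn_head_right": 0.6,
        "blink": 0.8,
    })

def scan_events(config: GestureConfig, gesture_scores: dict) -> list:
    """Map detected gesture confidences to scanning actions.

    An action fires only when the gesture's confidence reaches the user's
    individually configured sensitivity threshold, which keeps false
    detections low for users with less pronounced gestures.
    """
    events = []
    for gesture, action in config.bindings.items():
        score = gesture_scores.get(gesture, 0.0)
        if score >= config.thresholds.get(gesture, 1.0):
            events.append(action)
    return events

config = GestureConfig()
# A frame where the user turns the head right decisively and blinks only faintly:
print(scan_events(config, {"turn_head_right": 0.9, "blink": 0.3}))
# -> ['next_element']
```

Raising a gesture's threshold makes it harder to trigger; lowering it helps users who can only produce weak gestures, at the cost of more false positives.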
In addition to the input methods, the app's tile-based user interface can also be personalised (e.g. number of tiles per screen, font size). For example, very large tiles enable touch operation for people with limited motor skills. The tiles are arranged in a hierarchical menu, and each tile is linked to an action (e.g. switching on a socket). The menu is defined via an XML file and can therefore be customised to the application context. Our environment control currently supports thermostats, sockets and lamps from various manufacturers, as well as the playback of multimedia content on an external device with a large screen. Thanks to a plug-in concept, third parties can integrate additional devices at any time without having access to the source code of the environment control.
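An XML-defined hierarchical tile menu of the kind described above might look like the following. This is a hypothetical sketch: the actual FourWays schema, element names and action identifiers are not published here and may differ.

```xml
<!-- Hypothetical tile-menu definition; element and action names are
     illustrative, not the project's actual schema. -->
<menu title="Home">
  <tile label="Living room">
    <tile label="Floor lamp" action="lamp.toggle" device="lamp-01"/>
    <tile label="Socket" action="socket.on" device="socket-03"/>
  </tile>
  <tile label="Heating">
    <tile label="Warmer" action="thermostat.increase" device="thermo-01"/>
  </tile>
  <tile label="Media" action="media.play" device="screen-01"/>
</menu>
```

Nesting `tile` elements yields the hierarchical menu; a tile without children simply triggers its action, so the same file controls both navigation structure and device bindings.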
In the future, we plan to expand the environment control with a function for assisted communication and to supplement the configuration with an import and export function for profiles in order to simplify switching between different mobile devices and / or application contexts.
In the demo session at ICCHP 2024, we would like to start with a short presentation (15 min) explaining the concept of our gaze-based app for environment control. Afterwards, attendees can either try out the app themselves or watch a demo (45 min). We hope that the demo session will provide us with feedback on the current state of development and contacts with potential future users.
Bibliography Ableitner, T., Heilemann, F., Schilling, A., Soekadar, S., & Zimmermann, G. (2022, July). Hands-Free Interaction Methods for Smart Home Control with Google Glass. In International Conference on Computers Helping People with Special Needs (pp. 121-129). Cham: Springer International Publishing.
Ableitner, T., Soekadar, S., Schilling, A., Strobbe, C., & Zimmermann, G. (2019). User acceptance of augmented reality glasses in comparison to other interaction methods for controlling a hand exoskeleton.
Ableitner, T., Soekadar, S., Strobbe, C., Schilling, A., & Zimmermann, G. (2018). Interaction techniques for a neural-guided hand exoskeleton. Procedia Computer Science, 141, 442-446.
Koch, S., Ableitner, T., & Zimmermann, G. (2022, July). Comparison of Guidelines for the Accessible Design of Augmented Reality Applications. In International Conference on Computers Helping People with Special Needs (pp. 89-98). Cham: Springer International Publishing.
Schneider, R., Ableitner, T., & Zimmermann, G. (2022, July). Layered audio descriptions for videos. In International Conference on Computers Helping People with Special Needs (pp. 51-63). Cham: Springer International Publishing.
Koch, S., Ableitner, T., & Zimmermann, G. (2022, July). Implementation and Evaluation of a Control System for a Hand Exoskeleton on Mobile Devices. In International Conference on Computers Helping People with Special Needs (pp. 411-419). Cham: Springer International Publishing.
Ngo, E., Ableitner, T., Koch, S., & Zimmermann, G. (2022, November). Virtualization of a Smart Home Lab: Design, Implementation and Evaluation. In Online-Labs in Education (pp. 79-98). Nomos Verlagsgesellschaft mbH & Co. KG.
Reuter, B., Zimmermann, G., Ableitner, T., & Koch, S. (2022, November). OpenAPETutorial–A Problem-Based Learning Unit for the Personalization of Smart Home Applications. In Online-Labs in Education (pp. 181-200). Nomos Verlagsgesellschaft mbH & Co. KG.
Reuter, B., Zimmermann, G., Ableitner, T., & Koch, S. (2022, November). Universal Design & Personalization for Smart Homes–Implementation. In Online-Labs in Education (pp. 289-306). Nomos Verlagsgesellschaft mbH & Co. KG.
Hayat, Y., Ableitner, T., Zimmermann, G., & Koch, S. (2022, November). Universal Design & Personalization for Smart Homes–Concepts. In Online-Labs in Education (pp. 265-288). Nomos Verlagsgesellschaft mbH & Co. KG.
Kollotzek, G., Zimmermann, G., Ableitner, T., & Nebe, A. M. (2021). Comparison of Manual Evaluation Methods for Assessing the Accessibility of Websites based on EN 301 549. In CHIRA (pp. 24-35).
Zimmermann, G., Ableitner, T., & Strobbe, C. (2017, June). User needs and wishes in smart homes: what can artificial intelligence contribute? In 2017 14th International Symposium on Pervasive Systems, Algorithms and Networks & 2017 11th International Conference on Frontier of Computer Science and Technology & 2017 Third International Symposium of Creative Computing (ISPAN-FCST-ISCC) (pp. 449-453). IEEE.