Conference Agenda

Overview and details of the sessions of this conference.

 
Session Overview
STS 2: Making Entertainment Content More Inclusive
Time: Wednesday, 10 July 2024, 1:30pm - 3:30pm

Session Chair: Deborah Fels, Toronto Metropolitan University
Session Chair: Rumi Hiraga, Tsukuba University of Technology
Session Chair: Yuhki Shiraishi, Tsukuba University of Technology
Location: Track 2

Ceremony Room B, Uni-Center, 1st floor; 118 seats (145); cinema/theater-style seating with a gallery. https://www.jku.at/en/campus/the-jku-campus/buildings/uni-center-university-cafeteria/

Presentations
ID: 112 / STS 2: 1
LNCS submission
Topics: STS Making Entertainment Content More Inclusive
Keywords: accessible darts, audio-tactile displays

TARGET: Tactile-Audio daRts for Guiding Enabled Throwing

D. Fels1, M. Kobayashi2

1Toronto Metropolitan University, Canada; 2National University Corporation Tsukuba University of Technology, Japan

The game of darts is a popular social game that is simple to learn yet difficult to master. The TARGET tool was developed to give people who are Blind/Low Vision access to a BLE-enabled electronic dartboard. Its functionality includes aurally announcing the score, the position of the current throw, and the remaining score needed to win. Human researchers tapped a metal rod on the dartboard to assist with aiming the dart and advised on which areas of the board should be targeted. The tactile displays consisted of a small-diameter replica of the dartboard that could be held in one hand, and a second dartboard on a horizontal surface that reproduced the dart positions so they could be felt by the user. A user study was conducted in which five blind participants learned the game rules, practiced throwing darts, and then played an actual game. Results indicated that the blind players enjoyed the game and were willing to play again, experienced flow, and developed a moderately high degree of perceived competency over the course of the study. Future work involves using machine learning to improve aiming support.
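The announcement functionality the abstract describes can be made concrete with a small sketch. The following is a minimal illustration, not the authors' implementation: it assumes a 501-style countdown game and a BLE dartboard that reports each hit as a (ring, sector) pair; all function names and the hit encoding are hypothetical.

```python
# Minimal sketch of TARGET-style aural feedback (assumptions, not the
# paper's code): turn one reported dart hit into a spoken announcement of
# the points scored and the score remaining in a 501-style countdown game.
RING_MULTIPLIER = {"single": 1, "double": 2, "triple": 3}

def hit_points(ring: str, sector: int) -> int:
    """Points for one dart; sector 25 is the bull (50 for the inner bull)."""
    if sector == 25:
        return 50 if ring == "double" else 25
    return RING_MULTIPLIER[ring] * sector

def announce(ring: str, sector: int, remaining: int) -> tuple[int, str]:
    """Return the new remaining score and the text to speak aloud.
    (Double-out finishing rules are omitted for brevity.)"""
    points = hit_points(ring, sector)
    if points > remaining:  # bust: the score stands
        return remaining, f"Bust. Still {remaining} to win."
    remaining -= points
    if remaining == 0:
        return 0, f"{ring} {sector} for {points}. Game won!"
    return remaining, f"{ring} {sector}, {points} points. {remaining} to win."

remaining, message = announce("triple", 20, 180)
print(message)  # -> "triple 20, 60 points. 120 to win."
```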



ID: 157 / STS 2: 2
LNCS submission
Topics: STS Making Entertainment Content More Inclusive
Keywords: Upper Extremity Motor Impairment, Video Game Accessibility, Mobile Device Accessibility, Head Gestures

Personalized Facial Gesture Recognition for Accessible Mobile Gaming

D. Ahmetovic

Università degli Studi di Milano, Italy

For people with Upper Extremity Motor Impairments (UEMI), interaction with mobile devices is challenging because it relies on touchscreen input. Assistive technologies that replace touchscreen interactions with sequences of simpler, more accessible ones have been proposed. However, these sequential interactions are slower and therefore not suitable for time-constrained interaction (e.g., games). One-to-one remapping of touchscreen interactions to alternative inputs has been proposed as a way to make existing games accessible. In this context, external switches and vocal sounds have been used with promising results. However, for people with UEMI who cannot use external switches and who have a speech impairment (e.g., anarthria), these interactions remain inaccessible.
We propose a new one-to-one interaction substitution method based on personalized Facial Gesture (FG) recognition to account for the specific needs of different users with UEMI. Our approach relies on few-shot learning to enable custom definition of personalized FGs, which are then mapped to the required game interactions. In this work we describe the FG recognition pipeline; in particular, we detail the processes of feature selection, few-shot learning, result aggregation, and their fine-tuning. A preliminary experimental evaluation indicates a classification accuracy of 96.99% and the ability to process 8.26 ± 1.55 frames per second on a commodity Android device.
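The abstract names few-shot learning without specifying the algorithm; as a rough illustration, one common few-shot baseline for this kind of personalization is nearest-centroid matching over per-frame facial feature vectors (e.g., blendshape scores from a face tracker). The sketch below is an assumption-laden stand-in, not the authors' pipeline; the feature vectors, threshold, and gesture-to-action mapping are all hypothetical.

```python
import numpy as np

class FewShotFGClassifier:
    """Nearest-centroid few-shot classifier over facial feature vectors."""

    def __init__(self, threshold: float = 0.5):
        self.centroids: dict[str, np.ndarray] = {}
        self.threshold = threshold  # reject frames far from every prototype

    def enroll(self, gesture: str, examples: list[np.ndarray]) -> None:
        """Average the user's few recorded examples into one prototype."""
        self.centroids[gesture] = np.mean(examples, axis=0)

    def classify(self, features: np.ndarray) -> str | None:
        """Return the closest enrolled gesture, or None if none is close."""
        if not self.centroids:
            return None
        gesture, dist = min(
            ((g, float(np.linalg.norm(features - c)))
             for g, c in self.centroids.items()),
            key=lambda gc: gc[1],
        )
        return gesture if dist <= self.threshold else None

# Enroll two personalized FGs from a few example frames each, then map each
# recognized gesture one-to-one to the touchscreen interaction it replaces.
clf = FewShotFGClassifier()
clf.enroll("eyebrow_raise", [np.array([0.9, 0.1]), np.array([0.8, 0.2])])
clf.enroll("mouth_open", [np.array([0.1, 0.9]), np.array([0.2, 0.8])])
action = {"eyebrow_raise": "tap", "mouth_open": "swipe_left"}
gesture = clf.classify(np.array([0.85, 0.15]))
if gesture is not None:
    print(action[gesture])  # -> "tap"
```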



ID: 195 / STS 2: 3
LNCS submission
Topics: STS Making Entertainment Content More Inclusive
Keywords: Visually impaired, para e-sports, spatial cognition, audio-tactile effects, falling block puzzle games

Tactris: Inclusive Falling Block Puzzle Game with Audio-Tactile Effects for Visually Impaired People

M. Matsuo1, D. Erdenesambuu1, J. Onishi1, T. Miura2

1Tsukuba University of Technology, Japan; 2AIST, Japan

The purpose of this study is to develop an action puzzle game that is easy for visually impaired people to play, and thereby to expand the number of visually impaired people participating in e-sports. To this end, we developed a prototype of Tactris, a falling block puzzle game that can be played using auditory and tactile information. We studied the presentation method, operation method, and game rules needed to realize a falling block puzzle game playable by visually impaired people, aiming for a game that conveys the in-game situation through a combination of voice, sound effects, and tactile illustrations. Because the interface uses dynamically changing tactile diagrams, it is also expected to serve as content for efficiently promoting the understanding of tactile perception in education for the visually impaired, and to help users improve their skill at reading dynamic tactile diagrams.
Six visually impaired people evaluated the ease of use and playability of the game; although ease of use remained an issue, participants expressed a strong desire to play the game repeatedly.
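To make the audio-tactile presentation concrete: a falling-block board can be rendered as an on/off bitmap for a refreshable pin display while the active piece drives a positional sound cue. The sketch below is a minimal illustration under assumed hardware (a pin display at one pin per cell, stereo audio), not the Tactris implementation.

```python
# Minimal sketch (assumptions, not the Tactris code): render the board for a
# pin display and derive an audio cue from the falling piece's position.

def board_to_pins(board: list[list[int]]) -> list[list[bool]]:
    """Occupied cell -> raised pin, empty cell -> lowered pin (1 pin/cell)."""
    return [[cell != 0 for cell in row] for row in board]

def piece_audio_cue(col: int, row: int, width: int, height: int) -> dict:
    """Column -> stereo pan (-1 left .. +1 right); depth -> falling pitch."""
    return {
        "pan": 2 * col / (width - 1) - 1,
        "pitch_hz": 880 - (880 - 220) * row / (height - 1),
    }

board = [
    [0, 0, 0, 0],
    [0, 1, 0, 0],
    [1, 1, 1, 0],
]
pins = board_to_pins(board)  # bitmap refreshed on the tactile display
cue = piece_audio_cue(col=1, row=1, width=4, height=3)
# cue == {"pan": -0.33..., "pitch_hz": 550.0}: slightly left, halfway down
```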



ID: 175 / STS 2: 4
LNCS submission
Keywords: Design for All and Universal Design, inclusive game, fighting game, auditory cues, empirical study

Inclusive Fighting with Mind’s Eye: Case Study of a Fighting Game Playing with Only Auditory Cues for Sighted and Blind Gamers

M. Matsuo1, J. Onishi1, T. Miura2

1Tsukuba University of Technology; 2National Institute of Advanced Industrial Science and Technology (AIST)

Computer games have diversified due to advances in hardware performance and software capabilities. However, visually impaired people frequently find it difficult to enjoy most of these state-of-the-art games, despite the vast amount of accessibility research on content and interfaces. Meanwhile, a growing number of games playable regardless of visual impairment have been released, yet there are still few games that enable blind and sighted players to compete against each other. Street Fighter 6® (Capcom Co., Ltd.), released in 2023, introduced a sound accessibility feature for the first time in a commercial fighting action game. Clarifying the requirements for sighted and visually impaired players to compete smoothly using this feature would not only let players participate in the game regardless of disability status, but could also inform new accessible interfaces built on the same technology. The goal of this study is therefore to evaluate the playability of a fighting game with sound accessibility features for visually impaired and sighted groups. This paper reports the evaluation results on the usability and user experience of the sound accessibility features implemented in Street Fighter 6 for the two groups. In this extended abstract we report only the results for the sighted players that we have analyzed so far; if the study is accepted, the full article will also report the results for the visually impaired players.



ID: 184 / STS 2: 5
LNCS submission
Topics: STS Making Entertainment Content More Inclusive
Keywords: Deaf and Hard of Hearing, Music, Vibration, VIBES

Towards Improving the Correct Lyrics Detection by Deaf and Hard of Hearing People

H. Yamamoto, R. Hiraga, K. Yasu

Tsukuba University of Technology

Access to music for people who are Deaf/Hard of Hearing (D/HoH) includes not only the instrumental portion but also the lyrics. While closed captioning can provide the lyrics in text form, it does not necessarily convey the accurate timing of the lyrics relative to the instrumental portion. The purpose of this study is to clarify how vibrotactile stimuli affect D/HoH people's understanding of the onset timing of song lyrics. To achieve this goal, we developed VIBES (VIBrotactile Engagement for Songs), an iPhone app that plays music and vibration simultaneously. Unlike other vibrotactile systems for music, which focus primarily on the percussion/beat or the frequencies of the instrumental portions, VIBES presents vibrations at the timing of the vocal utterances, syllable by syllable. We conducted a study with 10 participants to determine the effectiveness of the system. Although no statistically significant effect was found, the mean correct-timing score increased after using the system. The subjective evaluations also suggested VIBES was effective for one participant, who rated their listening comprehension as very poor before using VIBES and very positive afterwards.
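The core of such playback is a schedule that fires one vibration per vocal syllable, in sync with the audio. The sketch below shows only that timing logic, with hand-labeled onset times as stand-in data; it is not the VIBES code, and in the actual iPhone app each event would be realized as a haptic transient.

```python
from dataclasses import dataclass

@dataclass
class HapticEvent:
    time_s: float     # syllable onset, relative to the start of the song
    intensity: float  # 0..1; e.g., stronger for stressed syllables

def schedule_from_syllables(onsets: list[float],
                            stressed: set[int]) -> list[HapticEvent]:
    """One transient vibration per syllable, emphasizing stressed ones."""
    return [
        HapticEvent(time_s=t, intensity=1.0 if i in stressed else 0.6)
        for i, t in enumerate(onsets)
    ]

# "Hap-py birth-day": four syllables; the first syllable of each word is
# stressed (onset times here are illustrative, not measured).
events = schedule_from_syllables([0.00, 0.35, 0.80, 1.20], stressed={0, 2})
```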



ID: 128 / STS 2: 6
LNCS submission
Topics: STS Making Entertainment Content More Inclusive
Keywords: Information Accessibility, Information Support, Information Sharing, Deaf and Hard of Hearing, Blind and Low Vision

Enhancing Accessibility in Sports and Cultural Live Events: A Web Application for Deaf, Hard of Hearing, Blind, Low Vision, and All Individuals

Y. Shiraishi, R. Hiraga, M. Kobayashi, Y. Zhong

Tsukuba University of Technology, Japan

This paper addresses the problem that at live events, especially sports viewing, individuals who are deaf or hard of hearing (DHH) and those who are blind or have low vision (BLV) struggle to access sufficient information. Traditionally, information accessibility has relied on specific professionals or volunteers, which is often inadequate. To tackle this challenge, we propose a mechanism that facilitates information sharing among all individuals, regardless of their abilities, and we have developed an inclusive web application tailored to these needs. The application benefits not only DHH and BLV individuals but all audiences. Additionally, the system's applicability extends to other domains, such as museum visits. This paper details the newly developed web application and presents the outcomes of pilot studies conducted in sports-viewing and museum settings with DHH and BLV participants. The results of these experiments are analyzed to assess the system's effectiveness and to identify areas for future improvement.



 