Conference Agenda

Overview and details of the sessions of this conference. Select a date or location to show only the sessions on that day or at that location. Select a single session for the detailed view (with abstracts and downloads, if available).

Session Overview
Session
STS 3B: Blind and Low Vision: Orientation and Mobility
Time:
Thursday, 11/July/2024:
3:30pm - 5:00pm

Session Chair: Alireza Darvishy, Zurich University of Applied Sciences
Location: Track 4

Meeting Room 6, Uni-Center, 1st floor (capacity: 140 people)
https://www.jku.at/en/campus/the-jku-campus/buildings/uni-center-university-cafeteria/

Presentations
ID: 142 / STS 3B: 1
OAC Submission
Topics: No STS - I prefer to be allocated to a session by Keyword(s)
Keywords: blind and/or vision impaired, thermal-tactile biofeedback, spatial navigation, wearable

Orientation Aid For Blind And Visually Impaired People By Using Thermal-Tactile Biofeedback At Lumbar Region For Hazard Prevention: A User Survey

V. Frank

University of Applied Sciences Upper Austria, Austria

Blind and visually impaired people rely on acoustic information when using navigation guides, yet the auditory sense is also essential for orientation in road traffic. Because this sensory channel can quickly become overloaded, it is natural to consider another sensory channel, such as the haptic one, for navigation commands. Thermal biofeedback shows high potential for presenting messages without demanding special attention and therefore imposes a low cognitive workload. Further investigation is needed into how thermal information is perceived by this target group and how far it can support users' navigation. Because cold stimuli are perceived more quickly, cold perception was chosen for stop signals. The prototype consists of 2 heating modules for spatial instructions and 1 cooling module that displays a stop command at the lumbar region. A camera worn at head height detects obstacles. In Trial 0, each user's absolute perception threshold was estimated. In Trials 1 and 2, an anti-collision walking experiment was performed with 2 different transmission signals. Reaction behavior and collisions were observed, user feedback and data were collected, and the NASA-TLX index was later determined. 8 BVI users participated. The stop signal triggered a reliable reaction. Considering thermal biofeedback perception alone, all participants were positively surprised.



ID: 248 / STS 3B: 2
LNCS submission
Topics: STS Accessibility and Usability of Mobile Platforms for People with Disabilities and Elderly Persons: Design, Development and Engineering
Keywords: People who are blind, orientation and mobility, customization, mobile devices

Empowering Orientation and Mobility Instructors: Digital Tools for Enhancing Navigation Skills in People Who Are Blind

W. Viana

Federal University of Ceará, Brazil

Spatial navigation can present challenges for a Person Who is Blind (PWB), impeding their ability to effectively determine locations, navigate, and interact with their environment. This study proposes an innovative orientation and mobility (OM) virtual environment. Our research encompasses the development of a map editor and an audio-haptic OM training environment for mobile devices, empowering OM instructors to create customizable maps for training PWBs, thus enhancing learner autonomy and independence. We evaluated it through an extensive assessment involving 25 human-computer interaction (HCI) specialists, 24 OM instructors, and 10 PWBs. The evaluation aimed to gauge the system's effectiveness in improving the navigation and wayfinding skills of PWBs. Findings suggest that our approach significantly aids PWBs in creating mental maps, facilitating navigation and spatial awareness. Feedback from HCI experts, OM instructors, and PWB participants was instrumental in identifying critical areas for further refinement, particularly in enhancing the intuitiveness of the user interface and the accuracy of the audio-haptic feedback. By enabling OM instructors to tailor learning materials to the specific needs of their students, this tool has the potential to make a meaningful impact on the autonomy and mobility of PWBs. Further research and development are warranted to refine the system and explore its full potential in real-world applications.



ID: 221 / STS 3B: 3
LNCS submission
Topics: STS Blindness Gain or New Approaches to Artwork Perception and ICT Mediation
Keywords: (e)Accessibility, User Centered Design and User Participation, Assistive Technology (AT)

Designing an Inclusive Audiovisual Tool for Museum Exploration

M. Erdemli

Carleton University, Canada

This study focuses on the formative design of an audio map for individuals with blindness and low vision (BLV) for accessible wayfinding at museums. The first step entails examining participants' preferred accessibility features of wayfinding tools. This research aims to explore design elements of an audio map and evaluate interaction strategies for an accessible wayfinding tool that enhances wayfinding and spatial awareness for users with BLV. It contributes a co-design approach to an ability-based audio map through surveys, interviews, and discussions with individuals with blindness, low vision, and hearing impairments, and through collaboration with accessibility specialists.



ID: 126 / STS 3B: 4
LNCS submission
Topics: No STS - I prefer to be allocated to a session by Keyword(s)
Keywords: blindness, wayfinding, Sensor Technology

Step Length Estimation for Blind Walkers

R. Manduchi

UC Santa Cruz, United States of America

Independent wayfinding can be challenging for people who are blind. While GPS localization can be very helpful in outdoor environments, GPS cannot be used inside buildings, so other mechanisms for localization and wayfinding are needed. In this work we focus on inertial sensing for localization. Inertial sensors (accelerometers and gyroscopes) are built into every standard smartphone. Data from these sensors can be used by pedestrian dead-reckoning (PDR) algorithms to estimate the user's location given a known starting point.

Standard PDR algorithms use inertial data to count the number of steps taken by the user, and to determine the walking direction. By multiplying the number of steps by each step's length, the distance traversed in a certain period of time can be determined. This approach, however, requires knowledge of the length of each step taken by the walker.
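The step-count-times-step-length idea above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: it assumes a precomputed accelerometer-magnitude signal, a simple threshold-based peak detector for step counting, and a fixed step length; all names and numeric values are illustrative.

```python
def count_steps(accel_mag, threshold=10.5, min_gap=10):
    """Count steps as local peaks in the accelerometer-magnitude
    signal (m/s^2). A peak counts as a step when it exceeds
    `threshold` and lies at least `min_gap` samples after the
    previous detected step (to suppress double counting)."""
    steps, last_step = 0, -min_gap
    for i in range(1, len(accel_mag) - 1):
        if (accel_mag[i] > threshold
                and accel_mag[i] >= accel_mag[i - 1]
                and accel_mag[i] >= accel_mag[i + 1]
                and i - last_step >= min_gap):
            steps += 1
            last_step = i
    return steps

def pdr_distance(accel_mag, step_length_m):
    """Distance walked = detected steps x assumed step length."""
    return count_steps(accel_mag) * step_length_m

# Synthetic signal: gravity baseline (~9.8 m/s^2) with three
# impact peaks standing in for heel strikes.
signal = [9.8] * 70
for i in (15, 35, 55):
    signal[i] = 12.0
print(count_steps(signal))        # 3 detected steps
print(pdr_distance(signal, 0.7))  # ~2.1 m walked
```

In practice, real PDR pipelines filter the raw signal and fuse gyroscope data for heading, but the core distance estimate reduces to this multiplication, which is why the per-step length matters so much.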

In this study, we tested a machine learning algorithm for step length estimation on inertial data from 7 blind walkers (5 using a long cane and 2 using a dog guide). Note that the gait of a blind walker using a cane is typically different from that of a sighted walker, and also from that of a walker using a dog guide. It is thus important that the step length prediction system be tested with data from walkers from the same communities the wayfinding system is designed for.
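The abstract does not specify the learned model, but the underlying idea of regressing step length on gait features can be illustrated with a one-feature least-squares fit. Everything below (the choice of step duration as the feature, the calibration numbers) is a hypothetical sketch, not the paper's method.

```python
def fit_step_length(features, lengths):
    """One-feature least-squares fit: step_length = a * feature + b.
    `features` could be per-step durations (s) from the inertial
    data; `lengths` are ground-truth step lengths (m) collected
    during a calibration walk."""
    n = len(features)
    mean_x = sum(features) / n
    mean_y = sum(lengths) / n
    cov = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(features, lengths))
    var = sum((x - mean_x) ** 2 for x in features)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

# Toy calibration data: longer step durations <-> longer steps.
durations = [0.4, 0.5, 0.6]  # seconds per step
lengths = [0.6, 0.7, 0.8]    # metres per step
a, b = fit_step_length(durations, lengths)
predicted = a * 0.55 + b     # predict length of a 0.55 s step
```

A real system would use richer features and a nonlinear model trained per community of walkers, which is exactly why the abstract stresses testing with data from cane and dog-guide users rather than sighted walkers.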



 
Contact and Legal Notice · Contact Address:
Privacy Statement · Conference: ICCHP 2024
Conference Software: ConfTool Pro 2.8.102+TC+CC
© 2001–2024 by Dr. H. Weinreich, Hamburg, Germany