Conference Agenda

Overview and details of the sessions of this conference.

Session Overview
Session
STS 3A: Blind and Low Vision: Orientation and Mobility
Time:
Thursday, 11/July/2024:
8:30am - 10:00am

Session Chair: Alireza Darvishy, Zurich University of Applied Sciences
Location: Track 4

Meeting Room 6, Uni-Center, 1st floor (140 people)
https://www.jku.at/en/campus/the-jku-campus/buildings/uni-center-university-cafeteria/

Presentations
ID: 160 / STS 3A: 1
LNCS submission
Topics: No STS - I prefer to be allocated to a session by Keyword(s)
Keywords: Orientation and Mobility, Visually impaired, New technology and applications

Use of Technology and Applications in Orientation and Mobility of Visually Impaired Persons in Bulgaria: A Contemporary Overview

M. V. Tomova

Sofia University "St. Kliment Ohridski", Bulgaria

This paper examines the use and popularity of technological solutions and applications for orientation and mobility among visually impaired persons in Bulgaria. Using questionnaires, it reports on the knowledge and use of technology among visually impaired students taught in special schools, adults with visual impairments receiving orientation and mobility training in rehabilitation centers, and participants from the country's Union of the Blind. Drawing on a small number of representatives from a variety of organizations for the visually impaired, the paper sketches a picture of the current situation and of future trends in the use of technological devices and applications for orientation and mobility among visually impaired individuals in Bulgaria.



ID: 119 / STS 3A: 2
LNCS submission
Topics: STS Innovation and Change in the Delivery of Future Assistive Technology Services
Keywords: visual impairment, Assistive Technology (AT), orientation, navigation, virtual simulation

Advancing Mobility for the Visually Impaired: A Virtual Sound-Based Navigation Simulator Interface

D. Erdenesambuu (1), M. Matsuo (1), T. Miura (2), J. Onishi (1)

(1) Tsukuba University of Technology, Japan; (2) National Institute of Advanced Industrial Science and Technology

This study explores the development and evaluation of a navigation system designed specifically for visually impaired users. The research primarily focuses on enhancing the clarity and accuracy of voice and sound guidance, leveraging technology adapted from autonomous driving navigation systems. The investigation begins by examining how visually impaired individuals comprehend and use guidance systems, and how long they need to become proficient with them.
A key component of the study is the construction of a speech and sound guidance model. This model integrates auditory beacons to improve the effectiveness of the guidance provided. Seven visually impaired participants evaluated the model; their feedback and performance were critical in assessing the practicality and efficiency of the developed system.
The results demonstrate that all participants were able to reach their destinations effectively by following the guidance of the voice guide model. This indicates a significant improvement in the navigational aid provided to visually impaired users. The study's findings underscore the potential of incorporating advanced auditory signals in enhancing the mobility and independence of visually impaired individuals.
Overall, this research contributes to the growing field of assistive technologies for the visually impaired, offering insights into the design and implementation of more effective and user-friendly navigation systems.
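As an illustration only (the paper's actual guidance model is not published here), the core idea of an auditory beacon can be sketched as constant-power stereo panning driven by the angle between the user's heading and the bearing to the destination. All function names and parameter values below are assumptions for the sketch, not details from the study.

```python
import math

def beacon_pan(user_heading_deg, bearing_to_target_deg):
    """Map the angle between the user's heading and the bearing to the
    target onto constant-power stereo gains (left, right) for a beacon
    sound. A target to the user's right pans right, and vice versa."""
    # Relative angle in (-180, 180]; positive means target is to the right.
    rel = (bearing_to_target_deg - user_heading_deg + 180) % 360 - 180
    # Clamp to +/-90 degrees and map to a pan position in [-1, 1].
    pan = max(-90.0, min(90.0, rel)) / 90.0
    # Constant-power panning keeps perceived loudness roughly stable
    # as the sound moves between the ears.
    theta = (pan + 1) * math.pi / 4  # 0 (full left) .. pi/2 (full right)
    return math.cos(theta), math.sin(theta)  # (left_gain, right_gain)
```

With this mapping, a beacon straight ahead plays equally in both ears, and the image hardens toward one ear as the user turns away from the target, giving a continuous steering cue.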



ID: 229 / STS 3A: 3
LNCS submission
Topics: STS Accessibility and Usability of Mobile Platforms for People with Disabilities and Elderly Persons: Design, Development and Engineering
Keywords: (e)Accessibility, Assistive Technology (AT), People with Disabilities, Visually Impaired, Indoor Wayfinding

FindMyWay: Toward Developing an Accessible Wayfinding Application for Visually Impaired in an Indoor Space

U. Das, B. Hong

The College of New Jersey, United States of America

Due to the limitations of satellite-based navigation systems such as GPS, indoor wayfinding has long been challenging for people with disabilities. Unfamiliarity with an indoor space and the complexity of its infrastructure can place an additional burden on people with various types of disabilities (visual, mobility, hearing, etc.) during wayfinding. An accessible wayfinding application could be of great help in navigating such spaces. To meet the indoor wayfinding needs of visually impaired persons, this work developed an iOS application named “FindMyWay” based on Bluetooth Low Energy (BLE) beacons. FindMyWay provides both exploration and navigation features for indoor wayfinding, implemented with a parallel processing approach. The exploration feature provides customized information about a point of interest (PoI) based on user preference, while the navigation feature guides the user to a destination in a multi-floor environment. Preliminary results indicate that the app performs effectively while navigating an indoor space.
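A common way to drive a BLE-beacon exploration feature like the one the abstract describes is to rank beacons by estimated distance from received signal strength (RSSI) and announce the PoI of the nearest one. The sketch below is an assumption-laden illustration, not FindMyWay's implementation; the beacon registry, calibration constant, and path-loss exponent are all hypothetical.

```python
# Hypothetical registry mapping beacon IDs to points of interest (PoI).
POI = {
    "b1": "Main entrance, floor 1",
    "b2": "Elevator, floor 1",
    "b3": "Room 204, floor 2",
}

def rssi_to_distance(rssi_dbm, tx_power_dbm=-59, path_loss_exp=2.0):
    """Log-distance path-loss model: estimate distance in meters from RSSI.
    tx_power_dbm is the calibrated RSSI at 1 m (a per-beacon constant);
    path_loss_exp ~2 models free space, higher values model cluttered rooms."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))

def nearest_poi(scans):
    """scans: {beacon_id: rssi_dbm} from one BLE scan window.
    Return the PoI of the beacon estimated to be closest."""
    best = min(scans, key=lambda b: rssi_to_distance(scans[b]))
    return POI[best]
```

In practice RSSI is noisy, so real systems smooth readings over several scan windows before switching the announced PoI; that filtering is omitted here for brevity.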



ID: 180 / STS 3A: 4
LNCS submission
Topics: STS Advanced Technologies for Innovating Inclusion and Participation in Labour, Education, and Everyday Life
Keywords: Independent mobility, Blindness, Visual impairment, Electronic travel aids, Artificial intelligence, Intelligent system, Neural network, Cane mountable, Embedded device

An Embedded AI System for Predicting and Correcting the Sensor-Orientation of an Electronic Travel Aid During Use by a Visually Impaired Person

P. Chanana

Indian Institute of Technology Delhi, India

We have developed an AI-based embedded system that mounts on an Electronic Travel Aid (ETA), detects the orientation of the ETA's sensors in real time using an inertial measurement unit (IMU), and guides the user to self-correct an incorrect sensor orientation through intuitive audio-vibratory feedback. The system aims to minimize dependence on trainers for learning to use an ETA effectively and to promote self-learning, especially in resource-constrained regions.
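The orientation-correction loop the abstract describes can be illustrated with a minimal sketch: estimate the sensor's pitch from the gravity vector reported by the IMU's accelerometer, compare it against an acceptable range, and emit a corrective cue. This is an assumption-based illustration, not the paper's system; the target angle, tolerance, and feedback labels are hypothetical.

```python
import math

def pitch_from_accel(ax, ay, az):
    """Estimate sensor pitch in degrees from a (roughly static) accelerometer
    reading of gravity, in g units; 0 degrees means the sensing axis is level."""
    return math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))

def orientation_feedback(pitch_deg, target_deg=0.0, tolerance_deg=10.0):
    """Return a corrective cue for the user. In a real device, 'ok' might map
    to silence, and the tilt cues to distinct audio or vibration patterns."""
    error = pitch_deg - target_deg
    if abs(error) <= tolerance_deg:
        return "ok"
    return "tilt up" if error < 0 else "tilt down"
```

A deployed system would fuse accelerometer and gyroscope data (e.g. with a complementary filter) to keep the pitch estimate stable while the cane is swept, but the feedback logic stays essentially this simple threshold check.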



 
Conference: ICCHP 2024
Conference Software: ConfTool Pro 2.8.102+TC+CC
© 2001–2024 by Dr. H. Weinreich, Hamburg, Germany