ICCHP 2024 Conference Agenda

Overview and details of the sessions of this conference. Select a date or location to show only the sessions on that day or at that location. Select a single session for a detailed view (with abstracts and downloads, if available).

 
Session Overview
STS 6A: STS Tactile Graphics and 3D Models for Blind People and Shape Recognition by Touch
Time: Friday, 12/July/2024, 8:00am - 9:30am

Session Chair: Yoshinori Teshima, Chiba Institute of Technology
Session Chair: Tetsuya Watanabe, Niigata University
Session Chair: Kazunori Minatani, National Center for University Entrance Examinations
Location: Track 1

Ceremony Room A, Uni-Center, 1st floor. 210 seats (253). Cinema/theater-style seating with a gallery. https://www.jku.at/en/campus/the-jku-campus/buildings/uni-center-university-cafeteria/

Presentations
ID: 139 / STS 6A: 1
LNCS submission
Topics: STS Tactile Graphics and 3D Models for Blind People and Shape Recognition by Touch
Keywords: Design for All and Universal Design, Assistive Technology (AT), (e)Accessibility, Tactile Relief

Designing an Inclusive Tactile Panoramic Relief of the City of Graz

A. Reichinger

VRVis Zentrum für Virtual Reality und Visualisierung Forschungs-GmbH, Austria

Tactile models are an important tool for blind and visually impaired (BVI) people, helping them understand objects and situations that are difficult to perceive without the visual channel.

In this paper, we report on the practical experience of designing a tactile representation of the view over a city and landscape from the elevated position of a hill, which was digitally created from a combination of various geographic data sources and photographs. The tactile relief is part of a permanent museum exhibition and is mounted on an outdoor balcony enabling all museum visitors to experience the breathtaking view. Unlike existing works, we intended to create a faithful tactile representation of the view, mimicking many aspects of how the human eye perceives the world. This includes correct panoramic projection, three-dimensional representation of all buildings with realistic surface textures, correct depth layering, and plausible foreshortening not only in image space but also in depth. As perceived by the human eye, near objects are not only larger, but also have a more pronounced depth.
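
The abstract does not give the exact distance-to-depth mapping, so the following Python sketch is purely illustrative: it uses an inverse-distance (disparity-like) compression, one plausible way to let near objects occupy more relief depth than distant ones, as described above. All parameter values and names are assumptions, not the authors' implementation.

    # Illustrative only: map real viewing distance to relief depth so that
    # nearby scenery gets more pronounced depth, as the abstract describes.
    def relief_depth(d: float, d_near: float = 300.0, d_far: float = 15000.0,
                     z_max: float = 20.0) -> float:
        """Map a viewing distance d (meters) to a relief depth (millimeters).

        Inverse distance behaves like stereo disparity: a 100 m depth step
        near the viewer consumes far more relief depth than the same step
        at the horizon.
        """
        d = max(d_near, min(d, d_far))        # clamp to the modeled range
        lo, hi = 1.0 / d_far, 1.0 / d_near    # disparity range to normalize
        return z_max * (1.0 / d - lo) / (hi - lo)

    print(relief_depth(300.0))   # 20.0 mm: front edge of the relief
    print(relief_depth(400.0))   # ~14.9 mm: first 100 m uses ~5 mm of depth
    print(relief_depth(5000.0))  # ~0.8 mm: distant hills stay nearly flat

Under these assumed parameters, the foreground is strongly exaggerated in depth relative to the horizon, matching the more pronounced depth of near objects described above.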

We describe the entire process: design considerations, data acquisition, the depth-aware projection mapping algorithm, texture generation, production, mounting, and the inclusion of markers and a legend pointing out important buildings. We provide preliminary user feedback and will conduct a formal evaluation in time for the final publication.



ID: 177 / STS 6A: 2
LNCS submission
Topics: STS Tactile Graphics and 3D Models for Blind People and Shape Recognition by Touch
Keywords: Additive Manufacturing, Anatomical 3D Model, Simplified Shape, Tactile 3D Model, Tactile Teaching Material, Tactile Learning

Improvements of Tabletop Model of Internal Organs for Anatomy Learning of the Visually Impaired

Y. Teshima

Chiba Institute of Technology, Japan

This study established an enhanced tabletop model of internal organs designed for anatomy learning by students with visual impairments. We implemented several modifications to the model presented in 2022, including adjusting the color scheme of the respiratory system components, correcting the positional relationship between the pancreas and stomach, and modifying the shape of the large intestine. Unlike the previous model, which did not incorporate magnets to connect or fixate the organs, the updated model integrates this feature. As a result of these modifications, we developed three variations of the model: Model-A, which does not use magnets to connect or fixate organs; Model-B, which employs magnets solely for fixation and not for connection; and Model-C, which uses magnets for both organ connection and fixation. The evaluation experiment revealed that Model-B is the superior instructional tool in terms of operability.



ID: 209 / STS 6A: 3
LNCS submission
Topics: STS Tactile Graphics and 3D Models for Blind People and Shape Recognition by Touch
Keywords: (e)Accessibility, Assistive Technology, Sensor Technology, Tactile Graphics, Audio Labeling

Accessible Point-and-Tap Interaction for Acquiring Detailed Information about Tactile Graphics and 3D Models

A. Narcisi1, D. Ahmetovic1, J. Coughlan2

1University of Milan, Italy; 2Smith-Kettlewell Eye Research Institute, United States of America

We have devised a novel “Point-and-Tap” interface that enables people who are blind or visually impaired (BVI) to easily acquire multiple levels of information about tactile graphics and 3D models. The interface uses an iPhone’s depth and color cameras to track the user’s hands while they interact with a model. To get basic information about a feature of interest on the model read aloud, the user points to the feature with their index finger. For additional information, the user lifts their index finger and taps the feature again. This process can be repeated multiple times to access additional levels of information. For instance, tapping once on a region in a tactile map could trigger the name of the region, with subsequent taps eliciting the population, area, climate, etc. No audio labels are triggered unless the user makes a pointing gesture, which allows the user to explore the model freely with one or both hands. In addition, multiple taps can be issued in rapid succession to skip through to the desired information (an utterance in progress is halted whenever the fingertip is lifted off the feature), which is much faster than having to listen to all levels of information being played aloud in succession to reach the desired level. Experiments with BVI participants demonstrate that the approach is practical, easy to learn and effective.
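
As an illustration of the interaction logic only, a minimal Python sketch of the level-cycling behavior might look as follows. The authors' implementation tracks the hand with an iPhone's depth and color cameras; here, hand tracking and text-to-speech are abstracted into callbacks, and all identifiers are hypothetical.

    # Hypothetical sketch of the tap-counting logic described in the abstract.
    from typing import Callable, Dict, List, Optional

    class PointAndTap:
        def __init__(self, labels: Dict[str, List[str]],
                     speak: Callable[[str], None],
                     stop_speech: Callable[[], None]) -> None:
            self.labels = labels            # feature id -> ordered info levels
            self.speak = speak
            self.stop_speech = stop_speech
            self.current: Optional[str] = None
            self.level = -1

        def on_tap(self, feature: str) -> None:
            """Index fingertip touched down on a feature of the model."""
            if feature != self.current:     # new feature: restart at level 0
                self.current, self.level = feature, -1
            levels = self.labels.get(feature, [])
            if not levels:
                return
            self.level = (self.level + 1) % len(levels)
            self.speak(levels[self.level])

        def on_lift(self) -> None:
            """Lifting the fingertip halts any utterance in progress, so
            rapid taps skip straight to the desired level."""
            self.stop_speech()

    # Two rapid taps skip past the region name and read the second info level.
    ui = PointAndTap({"region": ["Region name", "Population: ...", "Area: ..."]},
                     speak=print, stop_speech=lambda: None)
    ui.on_tap("region"); ui.on_lift(); ui.on_tap("region")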



ID: 230 / STS 6A: 4
LNCS submission
Topics: STS Tactile Graphics and 3D Models for Blind People and Shape Recognition by Touch
Keywords: tactile graphics, handwriting, blind

Automatic Generation of Tactile Graphics of Characters to Support Handwriting of Blind Individuals

S. Sonobe, A. Fujiyoshi

Ibaraki University, Japan

This study develops a tool for the automatic generation of tactile graphics of characters. The tool is built on the tactile graphics production system BPLOT. It generates character images from fonts installed on Windows, extracts the contours of these images, and outputs a figure-drawing program for BPLOT from the coordinates of those contours. BPLOT then produces the final tactile graphics on a braille plotter.
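
A minimal Python sketch of such a pipeline is shown below, assuming Pillow for glyph rendering and scikit-image for contour tracing; the emitted moveto/lineto lines are placeholder syntax, not BPLOT's actual figure-drawing language.

    # Illustrative pipeline: render a glyph from an installed font, trace its
    # contours, and emit a figure-drawing program from the coordinates.
    import numpy as np
    from PIL import Image, ImageDraw, ImageFont
    from skimage import measure  # pip install scikit-image

    def glyph_contours(char: str, font_path: str, size: int = 256):
        font = ImageFont.truetype(font_path, size)   # e.g. a Mincho/Gothic TTF
        img = Image.new("L", (size * 2, size * 2), 0)
        ImageDraw.Draw(img).text((size // 2, size // 2), char,
                                 fill=255, font=font)
        # Each contour is an (N, 2) array of (row, col) outline points.
        return measure.find_contours(np.asarray(img) / 255.0, 0.5)

    def emit_drawing_program(contours) -> str:
        lines = []
        for contour in contours:
            (r0, c0), rest = contour[0], contour[1:]
            lines.append(f"moveto {c0:.1f} {r0:.1f}")    # placeholder syntax,
            for r, c in rest[::4]:                       # subsampled points;
                lines.append(f"lineto {c:.1f} {r:.1f}")  # not BPLOT commands
        return "\n".join(lines)

    # Example (font path is an assumption; any installed TrueType font works):
    # print(emit_drawing_program(glyph_contours("A", "C:/Windows/Fonts/msmincho.ttc")))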

Users of the tool can choose the typeface of the tactile character graphics from three options: Mincho (serif), Gothic (sans-serif), and Kaisho (regular script). In addition to choosing character spacing and arrangement, users can also select whether thinning processing is applied to the character images.

The evaluation of the tactile character graphics was conducted with four blind university students as participants. Four types of tactile graphics were prepared as experimental materials: (1) Mincho typeface, (2) Gothic typeface, (3) Kaisho typeface, and (4) Kaisho typeface with thinning processing applied. Although material (1) received the lowest score, all of the tactile graphics were rated as legible, and all participants rated material (4) as the most readable.



ID: 265 / STS 6A: 5
LNCS submission
Topics: STS Tactile Graphics and 3D Models for Blind People and Shape Recognition by Touch
Keywords: Blind and Visually Impaired People, User Centered Design and User Participation, Perception of Spatial Information, Accessibility, Usability

Exploring Space: User Interfaces for Blind and Visually Impaired People for Spatial and Non-verbal Information

R. Koutny

JKU, Austria

Meetings play an important role in today’s work environment. Unfortunately, blind and visually impaired people often encounter difficulties in participating fully and equally. The main reasons are twofold: Firstly, visual aids such as whiteboards, flipcharts, or projectors are frequently used, which present information in a 2D format. Secondly, nonverbal communication plays an essential role, with frequent use of deictic gestures, like pointing gestures, incorporating spatial information into the conversation. These factors lead to a disadvantage for blind and visually impaired individuals in the workplace.

As part of the research project [removed for blind review], an accessible brainstorming tool for meetings has been developed that can store nonverbal and spatial information and connect to various devices such as computers, smartphones, and smartwatches. This provides a solid foundation for addressing these issues through innovative user-interaction concepts that make spatial information accessible and understandable for blind and visually impaired individuals. Prototypes of these user interface concepts have been developed and tested with the target group in a staged and iterative manner, with implementations running in the browser, on a smartphone, and on a smartwatch. This paper outlines the development and testing procedures, as well as the corresponding test results.
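
The abstract does not publish the tool's data model; as a purely hypothetical sketch, storing spatial positions and pointing gestures alongside brainstorming items, and rendering a position non-visually, might look like this in Python.

    # Hypothetical data model: all names and fields are assumptions, not the
    # project's actual design.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class PointingGesture:
        speaker: str
        target_item_id: str     # board item the deictic gesture refers to
        timestamp_s: float

    @dataclass
    class BoardItem:
        item_id: str
        text: str
        x: float                # 2D whiteboard position, so a speech or
        y: float                # smartwatch client can convey spatial layout
        gestures: List[PointingGesture] = field(default_factory=list)

    def describe_position(item: BoardItem, width: float, height: float) -> str:
        """Render the 2D position non-visually, e.g. for speech output."""
        col = ("left" if item.x < width / 3
               else "right" if item.x > 2 * width / 3 else "center")
        row = ("top" if item.y < height / 3
               else "bottom" if item.y > 2 * height / 3 else "middle")
        return f"'{item.text}' is in the {row} {col} of the board"

    print(describe_position(BoardItem("i1", "budget", 120, 80), 900, 600))
    # -> 'budget' is in the top left of the board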



 