RIMMA2025 - International Conference on
Forecasting, Preparedness, Warning and Response
Visualization, Communication and Information Management
28 - 30 January 2025, Excursions 31 January 2025
University of Bern, Switzerland
Conference Agenda
Overview and details of the sessions of this conference.

Session Overview
Location: A-119 Lecture Hall, UniS, Schanzeneckstrasse 1, 3012 Bern / Basement level 1, Places: 32, Seating: not fixed
Date: Wednesday, 29/Jan/2025
2:00pm - 3:15pm | Table Top Drought I: Swiss Civil Protection Service Tabletop Exercise on Extreme Drought: Insights for the Insurance Sector
Location: A-119 Lecture Hall
Session Chair: Astrid Björnsen
Part II of the workshop will take place on Wednesday, 29 January 2025, from 3:45 pm to 5:00 pm in Room A-119. Please register for this workshop by writing an email to astrid.bjoernsen@wsl.ch
Swiss Civil Protection Service Tabletop Exercise on Extreme Drought: Insights for the Insurance Sector
Swiss Federal Research Institute WSL, Switzerland
3:45pm - 5:00pm | Table Top Drought II: Swiss Civil Protection Service Tabletop Exercise on Extreme Drought: Insights for the Insurance Sector
Location: A-119 Lecture Hall
Session Chair: Astrid Björnsen
Part I of the workshop will take place on Wednesday, 29 January 2025, from 2:00 pm to 3:15 pm in Room A-119. Please register for this workshop by writing an email to astrid.bjoernsen@wsl.ch
Date: Thursday, 30/Jan/2025
10:45am - 12:30pm | Contemporary Visualization: Contemporary Visualization and Extended Reality Approaches to Hazard Preparedness and in-situ Emergency & Rescue Response – Current state of user-centered technology, automation and AI
Location: A-119 Lecture Hall
Session Chair: Arzu Çöltekin
Both sessions will cover visualization and extended (i.e., virtual, augmented or mixed) reality-related research and applications relating to the conference themes (i.e., these presentations and discussions will be directed at work that intersects the common phases of crisis management) and will specifically touch upon user-centered technologies (user experience, empirical studies) as well as the latest technology and science breakthroughs in the automation of visualization, 3D modelling and other related processes through, e.g., generative AI and other solutions.
Contemporary Visualization and Extended Reality Approaches to Hazard Preparedness and in-situ Emergency & Rescue Response – Current state of user-centered technology, automation and AI
University of Applied Sciences and Arts Northwestern Switzerland (FHNW), Switzerland
We propose a double session in which the first session consists of presentations of scientific and applied projects, and the second session hosts an interactive panel (a forum with experts, with audience participation). In the conference session we will invite topics (or include papers already submitted to the conference) that focus on current technologies, both from a scientific perspective demonstrating innovation and discoveries, and from applied projects demonstrating case studies. In both scientific and applied contributions, we expect a reflection that outlines the relevance, strengths and limitations of the visualization or XR solution that is tested, used or proposed. In the panel + forum session, we will facilitate two levels of dialogue: 1) among experts, to frame and provide an overview of the current understanding and future directions of the covered topics; 2) between the audience and the experts, based on current bottlenecks, pain points, challenges and exciting developments that might offer solutions.
Introduction to a Voxel-based Urban Digital Twin for Emergency Response Information Systems (ERISs)
1 Institute of Cartography and Geoinformatics, Leibniz University, Hannover; 2 Information Systems Institute, Leibniz University, Hannover
As disasters increase, effective emergency response information systems (ERISs) are vital for mitigating impacts on communities. Traditional methods often prove inadequate in providing the timely, detailed information needed for decision-making. To address this, a voxel-based urban digital twin for ERISs is introduced. The proposed framework is designed to integrate real-time data from various sensors into a georeferenced voxel grid. This approach intends to provide continuous updates to ensure an accurate 3D representation of, and interaction with, the urban environment during emergency operations. To this end, an exemplary firefighting-operation scenario is developed for the research. The produced voxel-based urban digital twin is rendered through a web application. Constructed from a high-resolution classified point cloud, the 3D voxel model incorporates elements crucial for fire emergency response, such as hydrants, smoke sensors, and their associated attributive information. The system's key functionalities include multiple exploration modes, dynamic rendering, a focus+context visualization technique, and the use of transparency as a visual variable to highlight critical information and ensure clear, efficient communication of high-priority data to the user. The web application provides specialized tools for navigation and interaction, aimed at enhancing situational awareness and the efficiency of firefighting operations. Feedback from firefighters highlights significant improvements in the web application over traditional methods, while also identifying areas for usability enhancement. The research demonstrates the potential of a voxel-based digital twin as a more interactive and immersive tool for emergency management than conventional 2D and basic 3D visualizations.

User Experience with Geodashboards Visualizing Preparedness and Response to Natural Hazards
1 University of Warsaw, Faculty of Geography and Regional Studies, Department of Cartography, Poland; 2 University of Applied Sciences and Arts Northwestern Switzerland, School of Engineering, Institute of Interactive Technologies, Switzerland; 3 Department of Geography, Faculty of Social and Educational Sciences, Norwegian University of Science and Technology NTNU, Dragvoll, NO-7491 Trondheim, Norway
Management of natural hazards and associated risks requires access to multivariate information. Access to rich spatiotemporal data containing information on all aspects of a hazardous event — e.g., factors that led to the event, what was affected by the event, and the impact of any previous (or planned) interventions — should support proper understanding as well as informed decision-making for current and future actions (Gołębiowska et al., 2023). However, studying multiple variables and the interactions between them is cognitively demanding, and when it is not done right, it can impair human comprehension and decision-making rather than improve it (Keskin et al., 2023; Cheng et al., 2024). In this context, we examine geodashboards that contain multiple linked visualizations, which offer opportunities for exploration and communication of spatiotemporal data from many perspectives through, e.g., maps, plots, graphs, spreadsheets, networks, etc. (Golebiowska et al., 2017, 2020), though their complexity could lead to high levels of cognitive load (Nadj et al., 2020).
We conducted several user experiments in which participants were given natural-hazard-related sense-making and decision-making tasks with such complex dashboards as described above, and we measured their performance as well as their eye movements, from which we can surmise their cognitive load to some degree (Ke et al., 2023). Specifically, we investigated user experience and layout-design challenges; i.e., inexperienced participants' process while learning the complex interface, their process of information retrieval from multiple-view tools, and the effect of different geodashboard layouts. Combining usability performance metrics (efficiency, effectiveness and satisfaction) and eye-tracking data (Çöltekin et al., 2009), we gain insights into the users' reasoning and cognitive processes. The tested geovisualizations present data on preparedness, i.e., vulnerability and exposure to natural hazards (floods, landslides, storms), as well as consequences of natural hazards in the form of insurance compensation due to natural hazards (storms, floods, landslides, storm surge, water intrusion). Participants were asked to carry out various task types using the presented geovisualization tools. Our results broadly suggest that, despite the visual complexity of the tools, even inexperienced participants find them convenient and helpful in exploring large sets of spatiotemporal data. We thus posit that properly designed geodashboards can be effective tools that support users and enable access to complex data.

Designing AR Viewer for QField: Towards Supporting Handling Geospatial Data In Situ For Emergency Response Situations
1 University of Applied Sciences and Arts Northwestern Switzerland, Switzerland; 2 OPENGIS.ch GmbH
GIS tools ubiquitously employ maps to aid visualization of geographically referenced information (geo-data) across diverse disciplines, including civil engineering, forestry, geology, ecology, and archaeology. In this applied science project, we collaborated with OpenGIS.ch, the company that developed QField (https://qfield.org), an award-winning mobile tool to collect, manage, and edit geo-data in situ, tailored to the needs of GIS fieldwork. Beyond traditional uses in civil engineering, such as construction, urban planning, and infrastructure work, QField has also been employed to facilitate disaster management and recovery tasks. For example, it has been used in mapping flooding damage to houses, infrastructure and vehicles in Canton Ticino, Switzerland [4], assessing flooding damage to croplands in Fiji [7], and monitoring the (agricultural) recovery of lands and infrastructure damaged by a volcanic eruption in Tonga [6, 13]. However, despite its interoperability (QField is based on the popular open-source QGIS project, https://qgis.org), several challenges remain in rendering and interacting with geospatial data in situ. Specifically, interactions using the current mobile/tablet app are constrained to the manipulation of 2D data points on the map interface, which can often cause issues such as overplotting and occlusion [1], or are prone to difficulties in spatial interpretation and decision-making processes [12]. To address the above-mentioned challenges of user perception and interaction, and to take advantage of the strengths of both 2D and 3D visualizations, we propose an Augmented Reality (AR) viewer for QField. By implementing AR, we enable the placement and rendering of 3D geo-data in situ. Previous research shows mixed evidence regarding the usefulness and usability of 3D visualizations in AR for understanding statistical data, local topography, and reading maps [2, 3, 5, 9, 11].
We believe that, in this case, the AR viewer will facilitate the efficiency of decision-making in the field by visualizing relevant geo-data in the immediate real-world environment, supporting various field tasks from planning underground utilities beneath the surface [8, 10] to virtual demarcation of forecasted flood territories. We design and develop an AR viewer for QField for both handheld and wearable AR experiences to support a broad range of tasks and to enhance interactivity with geo-data and real-world immersion, thus improving spatial understanding and decision-making in situ. A specific strength of AR in this case is displaying relevant information in its spatial context, which we hypothesize should facilitate quicker comprehension of the situation, as it offers an experience-based approach rather than a strictly analytical one. The contribution of our work-in-progress is threefold: (1) we elicit the AR needs of field workers when it comes to in-situ interactivity with geospatial data; based on these needs, (2) we design and develop an interactive prototype of the AR viewer for QField; and through continuous user evaluations, (3) we examine how data points in AR can be represented across different form factors, such as handheld and wearable AR, by referencing scholarly discussions on the visualization of 2D vs. 3D data for both experiential and analytical tasks [2, 5, 9, 11].

Does Extended Reality Work for Skills Training?
University of Applied Sciences and Arts Northwestern Switzerland (FHNW), Switzerland
In this brief position paper, we outline some key arguments for why extended (i.e., virtual, augmented and mixed) reality, i.e., XR, might work well for skills training, specifically in the context of emergency preparedness. XR offers a wide variety of benefits in skills training and experience-based learning (Çöltekin et al., 2020a, 2020b). Not only can we fully control and simulate all imaginable scenarios in XR, but we can do so safely, enabling learning from past experiences and preparing for future events. XR might not fully replace traditional training in emergency preparedness, but it can greatly improve it, most pronouncedly for high-severity, low-frequency events. While the implied benefits are numerous, in our view XR technologies for emergency-preparedness training have only recently become accessible from cost- and user-centric perspectives. XR has long been proposed in emergency-related fields and has been used since the 2010s, with training dominating much of the discourse (Zhu et al., 2021; Khanal et al., 2022). Of the different types of XR, VR excels in fully controlled experiments and the simulation of rare events, where full immersion leads to memorable experiences (e.g., Lokka & Çöltekin 2026, Lokka et al., 2018), which is key to learning. In contrast, AR and MR excel at augmenting existing training methods with relevant information embedded in situ, offering the most value in field exercises and during interventions. Furthermore, AR/MR enables information push as needed or on demand, which can potentially lower responders' task (and thus cognitive) load (Elmasllari, 2018). Previous work has shown that XR can effectively transfer knowledge into skills (e.g., del Amo et al., 2018). Skills training is experience-based; i.e., arguably, it does not require much conscious thought (e.g., del Amo et al., 2018; Rasmussen 1983). As XR enables unlimited repetitions and variations for even the rarest scenarios, such experience-based learning, i.e., effectively converting knowledge into skills, is possible. Furthermore, by recording scenarios from the participant's point of view, XR allows for a much better post-training briefing than would be possible from the participant's recollection alone (Forondo et al., 2016). Also importantly, XR can enable team-based simulation and training to address collaboration and coordination needs during emergency response (Reed et al., 2017). With current technologies, shared XR exercises can be organized by single units or even single responders wanting to learn together. Small teams could participate (virtually) in shared large-scale XR exercises for which travel time in the real world would have been a limiting factor (Khanal et al., 2022). In conclusion, based on the above arguments, we surmise that because XR can respond to needs of safety, cost-effectiveness, logistics, and collaboration, it can meet a large part of the requirements of a training environment, and is thus a strong candidate in most emergency-related scenarios, especially in low-frequency, high-impact cases. With the current developments in artificial intelligence (AI), we anticipate that AI-generated sound, video, behaviors, etc. will strengthen XR even further and foster new opportunities for experiential learning.
2:00pm - 3:30pm | Emergency and Crises Management: Emergency and Crises Management as Core Aspects of HEIs Curricula and Infrastructures: Enhancement of their Resilience and in Support of Secure Societies
Location: A-119 Lecture Hall
Session Chair: Aikaterini POUSTOURLI
Session Chair: Horst Kremers
Speakers:
Civil Protection and especially Emergency Management curricula and administrative services within universities (HEIs)
International Hellenic University (IHU), Greece
HEIs must frequently adapt broad, varied emergency management policies to deal with the scope of emergencies and disasters that can occur in on-campus settings. Fires, earthquakes, and floods, some of the most common natural disasters, possess the capacity for losses of life and property, with the potential to seriously disrupt and damage a university community. Man-made crises, such as cybersecurity threats, CBRN hazards, protestors and campus shootings, among others, also pose a serious threat to life and property; to preemptively reduce or prevent the severity of emergencies, universities must coordinate and implement policies to effectively eliminate unnecessary risks and decrease potential losses. Incidents vary among continents, and it is worthwhile to examine the threats perceived by European, American, Japanese and other universities and to consider the steps these institutions may take to protect their communities from harm. HEIs need a well-designed plan of procedures to respond to emergencies. These response plans provide the entire campus with specific guidelines to properly prepare for, respond to, and recover from emergencies. The university as an organization, including faculty members, students, staff, and suppliers, should be familiar with the plan's procedures and use it as a quick reference for effective action.