09. Assessment, Evaluation, Testing and Measurement
Symposium
ICT and Education: Perspectives from ICILS and PIRLS
Chair: Mojca Rozman (IEA)
Discussant: Wolfram Schulz (ACER)
Information and communication technology (ICT) plays an increasingly integral role in shaping how we work, learn, and connect with others. UNESCO, in particular, describes ICT as a “social necessity” for ensuring education as a basic human right, especially in times of crisis (UNESCO, 2023).
International large-scale assessments (ILSAs), such as those conducted by the International Association for the Evaluation of Educational Achievement (IEA), provide a lens through which to monitor the evolving role of ICT in education and its connections to student outcomes. Furthermore, ILSAs provide extensive contextual data, enabling comprehensive analyses of various aspects of ICT, such as access to ICT resources, students’ attitudes towards ICT, or teacher preparedness for the use of ICT in the classroom.
The goal of this symposium is to demonstrate how different ILSAs can be used to address a wide range of research questions related to ICT in education and to inform research, policy and practice. We focus on two IEA studies: the International Computer and Information Literacy Study (ICILS) and the Progress in International Reading Literacy Study (PIRLS).
ICILS aims to answer the question: how well are students prepared for study, work, and life in a digital world? (Fraillon & Rožman, 2023). It examines eighth-grade students’ computer and information literacy and, as an optional module, their computational thinking. The 2023 cycle of ICILS marked 10 years of the study. PIRLS measures fourth-grade students’ reading achievement. Its latest cycle, conducted in 2021, is the only ILSA that successfully collected data during the COVID-19 pandemic, providing a rich data source on the impact of the pandemic on reading achievement (Mullis et al., 2023). Although the two studies pose different research questions and focus on different content domains, both ICILS and PIRLS provide valuable information on ICT in education across a diverse range of educational systems.
Four symposium papers provide different perspectives on how ICILS and PIRLS can be used to study the role of ICT in education. The first paper gives an overview of IEA studies on ICT in education and then examines their representation in the academic literature. Its main goal is to map the evolution of publications and to describe the type of research that has been conducted.
The second paper uses ICILS 2018 data to explore digital application usage among foreign language teachers. Specifically, it aims to identify distinct classes of digital application usage as well as factors related to the identified classes.
Using PIRLS 2021 data, the third paper examines the implementation of remote learning during the COVID-19 pandemic in the Dinaric region. In particular, it examines the different response measures implemented as well as the preparedness for digital remote learning.
The fourth paper evaluates two question formats used to assess teaching beliefs in the field trial of ICILS 2023. The two formats are compared on multiple criteria of data quality, providing insights into the use of alternative question formats in digital context questionnaires.
The presenting authors will focus on the main findings of their studies, highlighting the different ways in which ICILS and PIRLS data can be used. The discussant of the symposium will offer remarks about the presentations, reflecting on the evolving role of ICT in education and how ILSAs can help us study this topic from different thematic and methodological perspectives.
References:
Fraillon, J., & Rožman, M. (2023). International Computer and Information Literacy Study 2023. Assessment Framework. Amsterdam: International Association for the Evaluation of Educational Achievement (IEA). https://www.iea.nl/sites/default/files/2023-12/20231221%20ICILS2023_Assessment_Framework__Final_0.pdf
Mullis, I. V. S., von Davier, M., Foy, P., Fishbein, B., Reynolds, K. A., & Wry, E. (2023). PIRLS 2021 International Results in Reading. Boston College, TIMSS & PIRLS International Study Center. https://doi.org/10.6017/lse.tpisc.tr2103.kb5342
UNESCO. (2023). Digital Education: What You Need to Know. https://www.unesco.org/en/digital-education/need-know
Presentations of the Symposium
The Use of IEA Studies in Research: A Systematic Review of Comped, SITES, and ICILS Related Research
Ana María Mejía-Rodríguez (IEA), Mojca Rozman (IEA), Rolf Strietholt (IEA)
Over the last decades, information and communication technology (ICT) has become an important part of our lives, including education. The International Association for the Evaluation of Educational Achievement (IEA) took an interest in this topic as early as 1989, when it launched its first study on the introduction and use of computers in education (Pelgrum & Plomp, 1993). With over 30 years of studies about ICT in education, the IEA continues to investigate how technologies are used in schools and classrooms, and how prepared students are for a digital world, through the International Computer and Information Literacy Study (ICILS).
While the international reports of ICILS and its predecessors offer a broad range of information, they only scratch the surface of what can be done with the available data. Additional, and highly relevant, insights come from external publications. Following the reviews of Hopfenbeck et al. (2018) and Lenkeit et al. (2015), the present study is a systematic review of English-language peer-reviewed articles related to three IEA studies about ICT in education: Computers in Education (Comped), the Second Information Technology in Education Study (SITES) and ICILS.
The main goal of this review is to map the evolution of publications based on these studies and to describe the type of research that has been conducted, both in terms of research topics and methodological approaches. Through this, we aim not only to identify crucial literature to be used by any established or newcomer researcher in the field but also to provide guidance on topics for future research. An additional goal is to encourage the use of ICILS in secondary research.
The studies included in the review were identified through an electronic search conducted across five different channels, including multiple electronic databases and targeted searches in journals focused on international large-scale assessments or on ICT in education. After screening, a total of 91 publications were deemed relevant for the review. Results map the frequency of publications over years, journals, and countries. Further results summarize the major topics studied within the four types of publications identified: descriptive studies, effectiveness studies, critiques or scale evaluations, and case studies.
References:
Hopfenbeck, T. N., Lenkeit, J., Masri, Y. E., Cantrell, K., Ryan, J., & Baird, J.-A. (2018). Lessons Learned from PISA: A Systematic Review of Peer-Reviewed Articles on the Programme for International Student Assessment. Scandinavian Journal of Educational Research, 62(3), 333–353. https://doi.org/10.1080/00313831.2016.1258726
Lenkeit, J., Chan, J., Hopfenbeck, T. N., & Baird, J.-A. (2015). A review of the representation of PIRLS related research in scientific journals. Educational Research Review, 16, 102–115. https://doi.org/10.1016/j.edurev.2015.10.002
Pelgrum, W. J., & Plomp, T. (1993). The IEA study of computers in education: Implementation of an innovation in 21 education systems. Pergamon.
Latent Classes of Digital Application Usage in Foreign Language Teaching in Germany and Related Determinants – Secondary Analyses Based on ICILS
Jan Niemann (Paderborn University), Birgit Eickelmann (Paderborn University), Kerstin Drossel (Paderborn University)
International comparative school performance studies, such as the IEA study ICILS (Fraillon & Rožman, 2023), offer insights into educational practices across Europe and the world. The methodological design of ICILS enables sub-samples to be formed, allowing for the examination of specific groups and the generation of knowledge that can be used to improve school systems. This contribution uses this methodological possibility to identify classes of digital application usage by foreign language teachers and related determinants. Previous non-subject-specific studies, such as Graves and Bowers (2018), identified four usage patterns (evaders, assessors, presenters, dexterous). Additionally, factors influencing ICT use, such as teachers' self-efficacy, are well studied across subjects (Gerick, Eickelmann & Bos, 2017). However, specific digital application usage classes and their determinants in foreign language teaching remain unexplored, despite possible subject-subcultural influences. This contribution aims to answer two research questions:
1. To what extent can different digital application usage classes be identified for foreign language teachers compared to non-foreign language teachers in Germany?
2. To what extent is there a connection between identified digital application usage classes and determinants of ICT use for both groups?
The study employs ICILS 2018 teacher data from Germany (n=2328; Eickelmann et al., 2019), taking into account data weighting (Tieck & Meinck, 2020).
To answer the first RQ, a latent class analysis is conducted using Mplus 8. The class solution is selected on the basis of statistical information criteria (e.g., smallest BIC; Eshima, 2022). The analysis identifies three usage classes for both the foreign language (a) and non-foreign language (b) teacher groups: avoiders (a: 80.4%; b: 78.1%), selective users (a: 17.4%; b: 19.7%), and multiple users (a: 2.2%; b: 2.2%) of digital applications.
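The information-criterion comparison behind such a class solution can be sketched in a few lines. The log-likelihoods and parameter counts below are purely illustrative toy values (not the ICILS results); only the sample size mirrors the German teacher data.

```python
import math

def bic(log_likelihood: float, n_params: int, n_obs: int) -> float:
    """Bayesian Information Criterion: penalizes model fit by complexity.
    Lower values indicate a better fit-parsimony trade-off."""
    return -2.0 * log_likelihood + n_params * math.log(n_obs)

# Hypothetical fit results for candidate 1- to 4-class solutions:
# {number_of_classes: (log-likelihood, number_of_free_parameters)}
n_teachers = 2328
candidates = {
    1: (-14500.0, 10),
    2: (-13900.0, 21),
    3: (-13650.0, 32),
    4: (-13630.0, 43),
}

bics = {k: bic(ll, p, n_teachers) for k, (ll, p) in candidates.items()}
best_k = min(bics, key=bics.get)  # class solution with the smallest BIC
print(best_k)
```

In this toy example the three-class solution wins: the fourth class improves the log-likelihood only slightly, so the complexity penalty outweighs the gain.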
For the second RQ, a hierarchical regression analysis was conducted using the IDB Analyzer to explore the connections between usage classes and determinants.
The analysis, grounded in theoretical considerations, employs five regression models. Results highlight significant correlations, including foreign language teachers' affiliation with the multiple users class being linked to positive attitudes towards ICT (Model V; β = .27, adjusted R² = .15). Correlations vary across usage classes and teacher groups.
The findings contribute to the understanding of the integration of digital applications in language teaching. This provides valuable insights for researchers and policymakers, particularly in Europe. Potential explanations, such as subject-subcultural influence on digital application usage, related determinants and alternative methodological approaches are discussed.
References:
Eickelmann, B., Bos, W., Gerick, J., Goldhammer, F., Schaumburg, H., Schwippert, K., et al. (Eds.). (2019). ICILS 2018 #Deutschland. Computer- und informationsbezogene Kompetenzen von Schülerinnen und Schülern im zweiten internationalen Vergleich und Kompetenzen im Bereich Computational Thinking [Students' computer and information literacy in the second international comparison, and computational thinking competencies]. Münster: Waxmann.
Eshima, N. (2022). An Introduction to Latent Class Analysis. Singapore: Springer Singapore. https://doi.org/10.1007/978-981-19-0972-6
Fraillon, J. & Rožman, M. (2023). International Computer and Information Literacy Study 2023. Assessment Framework. Amsterdam: International Association for the Evaluation of Educational Achievement (IEA). https://www.iea.nl/sites/default/files/2023-12/20231221%20ICILS2023_Assessment_Framework__Final_0.pdf
Gerick, J., Eickelmann, B., & Bos, W. (2017). School-level predictors for the use of ICT in schools and students’ CIL in international comparison. Large-scale Assessments in Education, 5(1), 1–13. https://doi.org/10.1186/s40536-017-0037-7
Graves, K. E., & Bowers, A. J. (2018). Toward a Typology of Technology-Using Teachers in the “New Digital Divide”: A Latent Class Analysis of the NCES Fast Response Survey System Teachers’ Use of Educational Technology in U.S. Public Schools, 2009 (FRSS 95). Teachers College Record, 120(8), 1–42. http://www.tcrecord.org/Content.asp?contentid=22277
Tieck, S., & Meinck, S. (2020). Weights and variance estimation for ICILS 2018. In E. Mikheeva & S. Meyer (Eds.), IEA International Computer and Information Literacy Study 2018 – User Guide for the International Database. Amsterdam: International Association for the Evaluation of Educational Achievement (IEA).
Dinaric Region During the COVID-19 Disruption: Schools’ Response Measures and Digital Preparedness
Marina Radović (Examination Centre of Montenegro), Dijana Vučković (University of Montenegro), Jelena Radišić (University of Oslo)
The COVID-19 pandemic caused severe challenges for education systems and schooling worldwide, with the Dinaric region being no exception. Although the demand for digital competence among teachers and for the use of digital tools and devices in teaching and learning has been present in the region for over a decade, existing practices could not fully meet the difficulties associated with the pandemic disruption. Centering on the Dinaric region (i.e., Albania, Croatia, Kosovo, Montenegro, North Macedonia, Serbia, and Slovenia), the paper sheds light on the disruption and the response measures taken in the region to meet the demand for remote instruction during COVID-19. It examines the diverse response measures and how they were conveyed to different stakeholders, coupled with previously established practices and ease of access to digital devices and their use in teaching and learning.
Data collected during the PIRLS 2021 cycle from students, teachers, school principals, and parents, together with analyses of the PIRLS 2021 Encyclopedia (Reynolds et al., 2022), serve as the primary sources for the analyses. Both national reports and responses from school principals indicate that the level of disruption varied at different times of school operation, prompting different types of responses from schools, often dependent on school location and the overall country response to the pandemic. Results also show certain common patterns across the Dinaric region in the systems’ wide range of activities to meet the challenge. National Ministries of Education coordinated technical and overall resource support across most Dinaric countries. Access to different digital resources for students and teachers varied somewhat. Internet-based resources dominated among distance learning resources. Sharing devices within the class was the leading established practice. In some cases, according to student reports, the availability of smartphones outstripped that of one’s own or a shared computer (tablet). Teachers’ professional development across the board focused more on instruction related to digital literacies than on integrating technologies into reading instruction. Parents’ perceptions of whether their child’s learning progress had been adversely affected during the COVID-19 disruption varied between and within countries.
References:
Reynolds, K.A., Wry, E., Mullis, I.V.S., & von Davier, M. (2022). PIRLS 2021 Encyclopedia: Education Policy and Curriculum in Reading. Retrieved from Boston College, TIMSS & PIRLS International Study Center website: https://pirls2021.org/encyclopedia
Rating or Ranking: Assessing Teaching Beliefs in an International Online Survey Experiment
Mojca Rozman (IEA), Andrés Christiansen (IEA), Rolf Strietholt (IEA)
International large-scale assessments (ILSAs) administer context questionnaires to students, teachers, and principals to collect information about school, classroom, and learning conditions. These questionnaires usually consist of a series of rating-type items, which often face issues such as social desirability, self-presentation, and acquiescence bias (e.g., Lelkes & Weiss, 2015; Schaeffer & Dykema, 2020). There are alternatives to rating scales, such as forced-choice items, rankings, anchoring vignettes, or situational judgement tasks, and these can address some of the issues found with rating items. Ranking, in particular, has been found to reduce response styles and improve data quality (Krosnick & Alwin, 1988). Furthermore, computer-based surveys enable the administration of items or response scales that are difficult to implement on paper, offering functions such as sliders, drag-and-drop, or drop-down menus.
In the field trial of the International Computer and Information Literacy Study (ICILS) 2023, Q-sort was introduced as an alternative question type to assess teaching beliefs of secondary school teachers. Q-sort is a technique that was initially developed for clinical interviews, requiring respondents to arrange and rank a series of cards according to their preference. In this paper, we investigate the feasibility of using the Q-sort (ranking) format when collecting data about teaching beliefs in an international survey and explore and compare the quality and usefulness of the data gathered by two question types, ranking and rating.
We use teacher data from 28 countries participating in the ICILS 2023 field trial to investigate the effect of the question format using multiple criteria of data quality. The two question types were randomly distributed across participating teachers within countries. We compare the two versions in terms of the amount of missing data, the distribution of responses, item and scale means, and the correlations between scale scores and teacher characteristics.
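As an illustration of one such criterion, the difference in missing-data rates between two randomly assigned questionnaire versions can be checked with a two-proportion z-test. The counts below are invented for the sketch and are not the field-trial figures.

```python
import math

def two_proportion_ztest(miss_a: int, n_a: int, miss_b: int, n_b: int) -> float:
    """z statistic for the difference in missing-data rates between two
    randomly assigned questionnaire versions (pooled-variance version)."""
    p_a, p_b = miss_a / n_a, miss_b / n_b
    p_pool = (miss_a + miss_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Illustrative counts only: the ranking version (a) shows 16% missing
# responses versus 10% for the rating version (b).
z = two_proportion_ztest(miss_a=240, n_a=1500, miss_b=150, n_b=1500)
print(round(z, 2))  # well above the 1.96 threshold at the 5% level
```

With counts of this magnitude the difference is clearly significant; in operational analyses of clustered ILSA samples, design weights and replication-based variance estimation would replace this simple pooled formula.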
A higher proportion of missing values was observed for the ranking format, because sorting a total of 18 items in parallel imposes a higher cognitive load than answering rating items individually. In addition, we observed more variance in the responses to the ranking version than to the rating version. Ranking removes the possibility that respondents agree equally with all statements and can thus reduce acquiescence bias. Although some advantages were found for the ranking format, we cannot recommend implementing the current version in further data collections because of the high amount of missing data observed.
References:
Krosnick, J. A., & Alwin, D. F. (1988). A test of the form-resistant correlation hypothesis: Ratings, rankings, and the measurement of values. Public Opinion Quarterly, 52(4), 526–538.
Lelkes, Y., & Weiss, R. (2015). Much ado about acquiescence: The relative validity and reliability of construct-specific and agree–disagree questions. Research & Politics, 2(3), 2053168015604173.
Schaeffer, N. C., & Dykema, J. (2020). Advances in the science of asking questions. Annual Review of Sociology, 46, 37–60.