Conference Agenda

Overview and details of the sessions of this conference.

Please note that all times are shown in the time zone of the conference.
Session Overview
Session: 09 SES 05.5 A: General Poster Session

Time: Wednesday, 28 Aug 2024, 12:45 - 13:30

Location: Anastasios G. Leventis Building Ground Floor / Outside Area and Basement Level / Open Area (ECER Poster Exhibition Area)

Presentations
09. Assessment, Evaluation, Testing and Measurement
Poster

Science Anxiety in Times of a Pandemic: Can Mindfulness Training Ease the School Transition Experience?

Michael Hast

IU International University of Applied Sciences, Germany

Presenting Author: Hast, Michael

Although the severity of the COVID-19 pandemic has somewhat subsided, its aftermath remains evident. While schools adapted quickly to the changes, learning progress has nonetheless slowed down. Reading skills of German primary school students now show a deficit of up to one-third of a school year, which has been directly attributed to the pandemic experiences (Ludewig et al., 2022). Similar declines have been observed in other areas such as mathematics (Schult et al., 2022). These delays are critical, as children need to catch up on existing skills while also continuing to acquire new skills and knowledge.

The transition to a new school is often associated with anxiety due to new teachers, requirements, and social contacts (Tay & Hast, 2022). Following the transition, German students first encounter science as a distinct school subject, which builds on skills that have been critically delayed by the pandemic, such as reading and numerical proficiency. Science anxiety, defined as “a debilitating combination of fearful negative emotion and cognition in the context of science learning” (Bryant et al., 2013, p. 432), not only hampers general participation in science lessons but also directly affects performance, success (Ucak & Say, 2019), and knowledge acquisition (Theobald et al., 2022). Preventing a domino effect that begins with insufficient preparation thus gains importance. Science anxiety is comparatively underexplored but is distinct from test anxiety and generalised anxiety (Megreya et al., 2021).

To address the challenge of these three elements meeting – pandemic, transition and science anxiety – the current project aims to investigate science anxiety among German fifth-graders. The project’s goals include adapting an existing science anxiety rating scale (Megreya et al., 2021) for use with German students. In addition, mindfulness training sessions have been shown to alleviate test anxiety and to positively impact knowledge acquisition (Theobald et al., 2022), reducing stress by improving attention and emotion regulation (Lam & Seiden, 2020). The project therefore additionally aims to examine whether the implementation of a mindfulness program can successfully reduce science anxiety among German fifth-graders.


Methodology, Methods, Research Instruments or Sources Used
The study employed a case study approach involving one regular public school in the north of Germany whose students transitioned from primary to secondary school in the summer of 2023. The sample consisted of three classes of fifth-graders (N = 67). A quasi-experimental intervention with a pretest-intervention-posttest design was implemented. Children in all three classes completed a translated version of the abbreviated science anxiety rating scale at the start of the school year; the results of this first administration constituted the baseline measure. Each of the three participating classes was then allocated a different sequence of tasks. Groups 1 and 2 received six weeks of mindfulness activity at the start of each science lesson: each lesson began with the Silent 60 exercise, followed by a different mindfulness exercise lasting around 3 to 4 minutes. Group 3, acting as the control group, continued their lessons in the usual way. After six weeks, all students again completed the science anxiety rating scale. Subsequently, to assess potential delayed effects, Group 1 continued with the mindfulness activities for another six weeks, whereas Group 2 did not. At the end of the second six-week period, all three groups completed the rating scale once more.
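For illustration only, the sketch below shows one way the pretest and posttest ratings of the three groups could be compared (a baseline check across groups plus within-group pre-post comparisons). The abstract does not specify the statistical analysis, so the tests, group sizes and placeholder values here are assumptions rather than the authors' method.

```python
# Purely illustrative sketch: one way the pretest/posttest ratings of the three
# groups could be compared. The abstract does not specify the statistical tests;
# the tests, group sizes and placeholder values below are assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Placeholder scores on the abbreviated science anxiety scale (higher = more anxious).
pre = {g: rng.normal(3.0, 0.5, 22) for g in ("group1", "group2", "control")}
post = {
    "group1": pre["group1"] - rng.normal(0.4, 0.2, 22),   # mindfulness, weeks 1-12
    "group2": pre["group2"] - rng.normal(0.4, 0.2, 22),   # mindfulness, weeks 1-6 only
    "control": pre["control"] - rng.normal(0.0, 0.2, 22), # lessons as usual
}

# Baseline check: do the three groups differ before the intervention?
f_stat, p_baseline = stats.f_oneway(pre["group1"], pre["group2"], pre["control"])
print(f"Baseline ANOVA: F = {f_stat:.2f}, p = {p_baseline:.3f}")

# Within-group change from pre-test to post-test (paired t-test per group).
for g in pre:
    t_stat, p_change = stats.ttest_rel(pre[g], post[g])
    print(f"{g}: pre vs post t = {t_stat:.2f}, p = {p_change:.3f}")
```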
Conclusions, Expected Outcomes or Findings
At the pre-test level, the three groups did not differ significantly in their mean anxiety rating scores, and these scores suggest that moderate levels of science anxiety are present in German fifth-graders immediately after the transition to secondary school. At the post-test level, Groups 1 and 2 both showed significantly reduced rating scores, indicating reduced anxiety levels, whereas the control group’s scores did not differ significantly from baseline. Science anxiety can thus be successfully reduced within six weeks through the implementation of a mindfulness program. Analysis of the final set of ratings is currently outstanding, but continued improvement in Group 1 compared to Group 2 could indicate a longer-term need for the mindfulness program. Improvement in Group 2, on the other hand, may suggest an incubation effect. Improvements in the control group could indicate more general delayed improvements, such as increased familiarity with science lessons.
References
Bryant, F. B., Kastrup, H., Udo, M., Hislop, N., Shefner, R., & Mallow, J. (2013). Science anxiety, science attitudes, and constructivism: A binational study. Journal of Science Education and Technology, 22, 432-448.
Lam, K., & Seiden, D. (2020). Effects of a brief mindfulness curriculum on self-reported executive functioning and emotion regulation in Hong Kong adolescents. Mindfulness, 11(3), 627-642.
Ludewig, U., Kleinkorres, R., Schaufelberger, R., Schlitter, T., Lorenz, R., König, C., ... & McElvany, N. (2022). COVID-19 pandemic and student reading achievement: Findings from a school panel study. Frontiers in Psychology, 13, 876485.
Megreya, A. M., Szűcs, D., & Moustafa, A. A. (2021). The Abbreviated Science Anxiety Scale: Psychometric properties, gender differences and associations with test anxiety, general anxiety and science achievement. PLoS ONE, 16(2), e0245200.
Schult, J., Mahler, N., Fauth, B., & Lindner, M. A. (2022). Did students learn less during the COVID-19 pandemic? Reading and mathematics competencies before and after the first pandemic wave. School Effectiveness and School Improvement, 33(4), 544-563.
Tay, V., & Hast, M. (2022). Standing on your own two feet: An examination of Singaporean trainee teachers’ perceptions of the primary-to-secondary school transition. Asia Pacific Journal of Educational Research, 5(2), 1-22.
Theobald, M., Breitwieser, J., & Brod, G. (2022). Test anxiety does not predict exam performance when knowledge is controlled for: Strong evidence against the interference hypothesis of test anxiety. Psychological Science, 33(12), 2073-2083.
Ucak, E., & Say, S. (2019). Analyzing the secondary school students’ anxiety towards science course in terms of a number of variables. European Journal of Educational Research, 8(1), 63-71.


09. Assessment, Evaluation, Testing and Measurement
Poster

Understanding Response Rates in International Large-Scale Assessments

Sylvia Denner, Brenda Donohue

Educational Research Centre, Ireland

Presenting Author: Denner, Sylvia

Survey data from International Large-Scale Assessments (ILSA) provide valuable information for governments, institutions, and the general public. High response rates are an important indicator of the reliability and quality of a survey; conversely, low response rates in ILSAs can threaten the inferential value of the survey method. ILSA data are highly valued by the Ministries of Education of participating countries as a guide to inform policy-making.

An important ILSA is the Programme for International Student Assessment (PISA), which assesses the performance of 15-year-old students in reading, mathematics, and science. First administered in 2000, PISA has been implemented every three years since. Meeting the response rate thresholds specified for a low-stakes test such as PISA has often proven to be a challenge for many participating countries (Ferraro et al., 2009). In the PISA 2022 cycle, an elevated number of countries were required to undertake a Non-Response Bias Analysis (NRBA) due to low response rates (OECD, 2023). Ireland has participated in PISA since the first cycle in 2000 and had consistently met the response rate standards at both student and school level until 2022, when it failed to meet the student response rate standard. This leads to the main research question: ‘Why was there a change in the student response rate between PISA 2018 and 2022 in Ireland?’
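For illustration, a weighted student response rate has the general shape shown in the sketch below: the sum of the sampling weights of assessed students divided by the sum of the weights of all eligible sampled students. The exact PISA formulas (which also account for school-level participation and replacement schools) are set out in the OECD technical documentation; the names and values used here are hypothetical.

```python
# Illustrative sketch of the general shape of a weighted student response rate:
# sum of sampling weights of assessed students over the sum of weights of all
# eligible sampled students. The exact PISA formulas (set out in the OECD
# technical report) are more involved; all names and values here are hypothetical.

def weighted_response_rate(assessed, eligible):
    """Return the weighted response rate for (student_id, weight) pairs."""
    return sum(w for _, w in assessed) / sum(w for _, w in eligible)

# Hypothetical sampled students and their sampling weights.
eligible = [("s1", 1.2), ("s2", 1.2), ("s3", 0.9), ("s4", 1.5), ("s5", 1.1)]
assessed = [("s1", 1.2), ("s3", 0.9), ("s5", 1.1)]  # s2 and s4 did not take part

print(f"Weighted student response rate: {weighted_response_rate(assessed, eligible):.1%}")
```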

Two major differences were observed between 2018 and 2022: a move from spring to autumn testing, and the COVID-19 pandemic. In 2018, the PISA Main Study in Ireland took place in the spring (March/April) and was followed by a Feasibility Study in the autumn (October/November). The purpose of the Feasibility Study was to evaluate the possibility of moving testing in Ireland to the autumn, and in 2022, for the first time in PISA, testing took place in the autumn. Secondly, while school restrictions were no longer in place in Ireland during testing in 2022, there was still a level of disruption associated with the COVID-19 pandemic in the school environment.

Various theories have been proposed to understand response rates and why some people participate in surveys while others do not. For example, the theory of cognitive dissonance, according to Festinger (cited in Miller, Clark, & Jehle, 2015), suggests that reducing the lack of agreement between people is an important factor in whether a person will respond to a survey. Alternatively, the theory of commitment or involvement suggests that the nature of the first request in the ‘foot in the door’ technique may have a significant effect upon participation (Freedman & Fraser, 1966). However, Self-Determination Theory (SDT) may provide a theoretical framework for examining the role that motivation (extrinsic/intrinsic) may play in determining response rates. According to Deci and Ryan (1985), three factors that support motivation in SDT are competence, autonomy and relatedness; these are seen as essential psychological needs that guide behaviour. Wenemark et al. (2011) used SDT to redesign a health-related survey in an effort to improve response rates. In a similar vein, this study will use SDT to examine the change in the student response rate between PISA 2018 and 2022 in Ireland.

While the focus of this poster is on the changing response rates in PISA in Ireland, the implications of the findings will also assist other countries participating in similar ILSAs. With an unprecedented number of countries experiencing lower response rates in PISA 2022, it is urgently important that countries begin to understand and address the complex reasons behind falling response rates in order to maintain the reliability and quality of these kinds of studies.


Methodology, Methods, Research Instruments or Sources Used
A case study of Ireland’s procedures in administering ILSAs such as PISA will be undertaken to examine the research question ‘Why was there a change in the student response rate between PISA 2018 and 2022 in Ireland?’ The research will draw on Ireland’s participation in four separate ILSA administrations: PISA 2018 (spring and autumn), PISA 2022 (autumn), and the Trends in International Mathematics and Science Study (TIMSS) 2023. The inclusion of TIMSS 2023 allows us to consider a second post-COVID reference point.
The adoption of a case study as a research strategy allows for several data collection techniques, such as the study of documents used (e.g. letters, manuals, webinars), logs of procedures and communications from the initial contact with schools to the day of testing, as well as conversations with ILSA project managers. The case study will be descriptive, in describing the processes employed, and explanatory, in attempting to explain why there was a change in response rates.
The analysis will be two-fold. The first step of analysis will consider operational issues such as the changed circumstances brought about by the COVID-19 school closures, the introduction of data protection legislation, and the switch to autumn testing. Changes in procedures and processes between the four ILSA administrations will be recorded, categorised, and then evaluated.
In the second step, the recorded and categorised processes will be analysed in relation to motivational theory. The various constructs of motivational theory, such as extrinsic/intrinsic motivation, will be applied, and the factors that influence motivation (competence, autonomy and relatedness) will also be considered. This two-fold process will give rise to insights not only into important operational changes (in the first step) but will also shed light on the motivations of students, school staff and test administrators (in the second step).
Ultimately, conducting the analysis in this manner will assist in an understanding of possible links between motivation and participation. Furthermore, this methodology may allow for the development of useful strategies that could assist future administrations of ILSAs in meeting the specified response rates.

Conclusions, Expected Outcomes or Findings
The initial results highlight a number of differences between the four ILSA administrations at the empirical level. In PISA 2022, a higher rate of absence was recorded amongst students, more test dates needed to be rescheduled due to scheduling conflicts within schools, and a higher rate of parental refusal was observed. These observations will be further examined using motivational theory.
Examining processes and procedures using motivation theory has already gone some way towards explaining the change in response rates between 2018 and 2022. For example, a theme identified in a thematic analysis of semi-structured interviews with principals in the PISA 2018 autumn study indicated that, if there were more ‘buy-in’ from teachers, students and parents, there would not be an issue with response rates. This ‘buy-in’ is an indication of a person being motivated to take on a task, in this case participating in an ILSA.
On foot of this initial analysis, we consider the change in response rates to be attributable to a combination of logistical and motivational factors. We consider motivation theory to be a valuable tool in the analysis of participation, given that ILSAs are low-stakes tests at the student level (though the stakes are higher at system level). In an effort to maintain response rates at the required levels, project managers could consider employing strategies that not only address logistical factors but also give due consideration to the part that motivational factors may play in response rates. Such strategies may ultimately provide a useful tool for project managers in administering ILSAs.

References
Deci, E. L., & Ryan, R. M. (1985). Intrinsic motivation and self-determination in human behavior. New York: Springer.
Ferraro, D., Kali, J., & Williams, T. (2009). Program for International Student Assessment (PISA) 2003: U.S. nonresponse bias analysis (NCES 2009-088). National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education, Washington, DC. Retrieved from https://nces.ed.gov/pubs2009/2009088.pdf
Freedman, J. L., & Fraser, S. C. (1966). Compliance without pressure: The foot-in-the-door technique. Journal of Personality and Social Psychology, 4(2), 195-202.
Miller, M. K., Clark, J. D., & Jehle, A. (2015). Cognitive dissonance theory (Festinger). The Blackwell encyclopedia of sociology, 1, 543-549.
OECD (2023). PISA 2022 technical report. Paris: OECD Publishing. https://www.oecd.org/pisa/data/pisa2022technicalreport/
Wenemark, M., Persson, A., Brage, H. N., Svensson, T., & Kristenson, M. (2011). Applying motivation theory to achieve increased response rates, respondent satisfaction and data quality. Journal of Official Statistics, 27(2), 393–414.


 