Conference Agenda

Please note that all times are shown in the time zone of the conference.
Session Overview
Session: P25.P6.DU: Paper Session
Time: Thursday, 11/Jan/2024, 9:00am - 10:30am
Location: Rm 4035, Trinity College Dublin Arts Building (capacity 30)

Presentations
ID: 179 / P25.P6.DU: 1
Data Use Network
Individual Paper
Orientation of proposal: This contribution is mainly an academic research contribution.
ICSEI Congress Sub-theme: Leading schools and education systems that promote equity, inclusion, belonging, diversity, social justice, global citizenship and/or environmental sustainability

To Grade or Not to Grade? Mapping Students’ Progress Outside the Formal Curriculum.

Torbjörn Ott1, Giulia Messina Dahlberg1, Julia Eskilsson2, Pernilla Schagerlind2, Amanda Terlevic2

1University of Gothenburg, Sweden; 2Bräckegymnasiet, Sweden

Objectives

This paper reports on development work carried out in cooperation between a school and a university at an introductory programme in Sweden. The introductory programme is an alternative educational path that supports students who do not qualify for the national upper-secondary programmes. These students, for instance migrant students or students with intellectual disabilities, belong to at-risk groups that have been marginalised in different ways during their schooling. An important challenge that we focus on is that a majority of the students do not reach the goals set in the national curriculum for eligibility for a national programme or for establishment in the labour market. More specifically, a narrow focus on eligibility for the national programmes implies that eligibility is measured in grades. Many students attend the programme for several years without achieving grades, even though they develop competences that are relevant and may facilitate transitions to future educational and professional paths.

This paper aims to investigate and shed light on alternative methods to map and assess students’ progress in ways that are not constrained by the formal curriculum.

The research questions are:

1. From a practice perspective, is it possible to develop alternative assessment categories?

2. If so, what categories were put into practice?

3. Why were these categories selected?

Perspective

Grades can constitute external motivation for students to engage in short-term learning (Stan, 2012). However, when students run the risk of not achieving grades, they often stop trying. Therefore, there is a need to develop methods that both motivate students who experience failure to reach grades and make their progress visible. Beyond assessing student achievement, data can support the development of teaching practices (Datnow & Park, 2018). This study investigates what such data could be and how teachers develop awareness and use of it.

Methods

This paper draws on engaged scholarship (Van de Ven, 2007) and collaboration between practitioners in schools and researchers. Different techniques were tested in the study to create matrices and templates that support teachers and students in their reflective work on learning and development. These tools were created through an iterative process of discussion and testing. Interviews were conducted with teachers and students to document this process. The data were compiled from continuous notes and documents generated during the process.

Findings

Preliminary findings show that the process of designing, testing, and implementing templates for assessment is long and complex. It requires teachers to come to a shared understanding of relevant data that reflect students’ successes and not only their failures. An important consequence is that the process facilitated teachers’ and students’ engagement in deeper conversations about learning and about the factors that may enable successful transitions for this student group. One challenge is that the process is time-demanding, requires planning, and needs a focused investment by school leadership in order to be implemented. Hopefully, the results of this study can provide an example for others who struggle to promote equity, inclusion and citizenship in their education.

References
Datnow, A., & Park, V. (2018). Opening or closing doors for students? Equity and data use in schools. Journal of Educational Change, 19, 131-152.


Stan, E. (2012). The role of grades in motivating students to learn. Procedia-Social and Behavioral Sciences, 69, 1998-2003.

Van de Ven, A. H. (2007). Engaged scholarship: A guide for organizational and social research. Oxford University Press.


ID: 347 / P25.P6.DU: 2
Data Use Network
Individual Paper
Orientation of proposal: This contribution is mainly an academic research contribution.
ICSEI Congress Sub-theme: Leveraging research and data for inquiry, insight, innovation and professional learning

Defining ‘Comparable Schools’. A Delphi-Study On Meaningful And Valid Comparisons Of School Performance Feedback

Glen Molenberghs, Roos Van Gasse, Sven De Maeyer, Jan Vanhoof

Universiteit Antwerpen, Belgium

Formal (i.e. systematically collected) school performance feedback (SPF) (such as test scores from central tests) can be a powerful tool for data-based decision making (Schildkamp, 2019). The assumption here is that school leaders and teachers (can) use SPF as a mirror to identify strengths and weaknesses in order to inform and drive teaching and school improvement (Coburn & Turner, 2011; Hulpia & Valcke, 2004; Schildkamp & Teddlie, 2008). However, research has found that both school leaders and teachers mostly fail to translate this data into meaningful information (Goffin, Janssen, & Vanhoof, 2023; Mandinach & Gummer, 2016; van der Kleij & Eggen, 2013). As a result, critical signals may not be picked up, or worse, incorrect or invalid inferences may lead to wrong decisions.

Sensemaking is a crucial stage in the cycle of data use (Mandinach & Schildkamp, 2021; Schildkamp, 2019), in which school leaders and teachers try to understand what the data means for the school or the classroom (Datnow, Park, & Kennedy-Lewis, 2012; Spillane, 2012). In this, comparing a school’s performance against a well-chosen comparator, such as the performance of other (comparable) schools (i.e. norm comparisons), can support users in understanding what the SPF means (Neumann, Trautwein, & Nagy, 2011; Schildkamp, Rekers-Mombarg, & Harms, 2012; Vanhoof, Mahieu, & Van Petegem, 2009). After all, only after SPF is interpreted (and analysed) is it transformed into information that can be used as a basis for data-based decision making (Mandinach, Honey, Light, & Brunner, 2008; Schildkamp & Poortman, 2015). Although education professionals commonly compare their school’s performance to that of a reference group (Goffin et al., 2023), the literature shows that interpreting such norm-oriented information in SPF presents challenges (Hellrung & Hartig, 2013). In addition, it appears that data use processes are hampered when the norm-oriented comparator for SPF is not perceived as 'fair' (Vanhoof, Verhaeghe, & Van Petegem, 2009).

Given these considerations, in this study we explore the importance of comparing SPF in sensemaking (RQ1) and describe, from a user perspective, appropriate indicators and operationalisations for valid norm comparisons (RQ2). To this end, we conducted a Delphi study with two systematic rounds of consultations with both (future) feedback users and educational scientists. Eighteen informants participated in the study. Their responses were initially (round 1) analysed inductively according to the principles of grounded theory (Glaser & Strauss, 1967) and subsequently (round 2) deductively according to the principles of framework analysis (Miles & Huberman, 1984). Preliminary results suggest that norm comparisons mainly help users to understand the broader context of a school’s performance. In addition, the importance of comparing both 'gross' and 'net' school test scores appears to be endorsed. Finally, in pursuit of valid comparisons of SPF, informants underline the importance of taking (mainly) pupil intake characteristics into account. As school feedback from central tests is only relevant when the SPF is interpreted (and used) meaningfully and validly, these findings give rise to a discussion of theoretical lessons and policy implications.

References
Coburn, C. E., & Turner, E. O. (2011). Research on data use: A framework and analysis. Measurement: Interdisciplinary Research & Perspective, 9(4), 173-206.
Datnow, A., Park, V., & Kennedy-Lewis, B. (2012). High school teachers' use of data to inform instruction. Journal of Education for Students placed at Risk 17(4), 247-265.
Glaser, B., & Strauss, A. (1967). The Discovery of Grounded Theory. NY: Aldine Publishing Company.
Goffin, E., Janssen, R., & Vanhoof, J. (2023). Principals’ and Teachers’ Comprehension of School Performance Feedback Reports. Exploring Misconceptions from a User Validity Perspective. Pedagogische Studien, 100(1), 67-97.
Hellrung, K., & Hartig, J. (2013). Understanding and using feedback–A review of empirical studies concerning feedback from external evaluations to teachers. Educational Research Review, 9, 174-190.
Hulpia, H., & Valcke, M. (2004). The use of performance indicators in a school improvement policy: The theoretical and empirical context. Evaluation & Research in Education, 18(1-2), 102-119.
Mandinach, E. B., & Gummer, E. S. (2016). What does it mean for teachers to be data literate: Laying out the skills, knowledge, and dispositions. Teaching and Teacher Education, 60, 366-376. doi:10.1016/j.tate.2016.07.011
Mandinach, E. B., Honey, M., Light, D., & Brunner, C. (2008). A Conceptual Framework for Data-Driven Decision-Making. In E. Mandinach & M. Honey (Eds.), Data-Driven School Improvement: Linking Data and Learning. New York, NY: Teachers College Press.
Mandinach, E. B., & Schildkamp, K. (2021). Misconceptions about data-based decision making in education: An exploration of the literature. Studies in Educational Evaluation, 69, 100842.
Miles, M. B., & Huberman, A. M. (1984). Qualitative data analysis: An expanded sourcebook. London: Sage.
Neumann, M., Trautwein, U., & Nagy, G. (2011). Do central examinations lead to greater grading comparability? A study of frame-of-reference effects on the University entrance qualification in Germany. Studies in Educational Evaluation, 37(4), 206-217.
Schildkamp, K. (2019). Data-based decision-making for school improvement: Research insights and gaps. Educational Research, 61(3), 257-273. doi:10.1080/00131881.2019.1625716
Schildkamp, K., & Poortman, C. (2015). Factors influencing the functioning of data teams. Teachers College Record, 117, 1-42.
Schildkamp, K., Rekers-Mombarg, L. T. M., & Harms, T. J. (2012). Student group differences in examination results and utilization for policy and school development. School Effectiveness and School Improvement, 23(2), 229-255. Retrieved from https://www.tandfonline.com/doi/pdf/10.1080/09243453.2011.652123?needAccess=true
Schildkamp, K., & Teddlie, C. (2008). School performance feedback systems in the USA and in The Netherlands: A comparison. Educational Research and Evaluation, 14(3), 255-282.
Spillane, J. P. (2012). Data in practice: Conceptualizing the data-based decision-making phenomena. American Journal of Education, 118(2), 113-141.
van der Kleij, F. M., & Eggen, T. J. (2013). Interpretation of the score reports from the Computer Program LOVS by teachers, internal support teachers and principals. Studies in Educational Evaluation, 39(3), 144-152.
Vanhoof, J., Mahieu, P., & Van Petegem, P. (2009). Geïnformeerde schoolontwikkeling: van een nieuw gegeven naar een beleidsinstrument. Kwaliteitszorg in het onderwijs, 22, 17-51.
Vanhoof, J., Verhaeghe, G., & Van Petegem, P. (2009). Verschillen in het gebruik van schoolfeedback: een verkenning van verklaringsgronden. Tijdschrift voor onderwijsrecht en onderwijsbeleid, (4), 306-322.


ID: 427 / P25.P6.DU: 3
Data Use Network
Individual Paper
Orientation of proposal: This contribution is mainly an academic research contribution.
ICSEI Congress Sub-theme: Leveraging research and data for inquiry, insight, innovation and professional learning

"Is This my Clasroom?... ": Revealing Social Network Information for Better Student Outcomes

Tarang Tripathi1, Chandraditya Raj2, Palaash Bhargava3, Christoforos Mamas1, Smriti Sharma2

1University of California, San Diego, United States of America; 2Aawaaz Foundation, India; 3Columbia University

Introduction

Over time it has become evident that peers and networks of students play a key role in their social and academic development (Bhargava et al. [2022], Calvó-Armengol et al. [2009]). Hence, analyzing social networks in classrooms is important to understand and leverage the benefits of these connections (Zárate [2023]). However, gauging and collating this information can be daunting for teachers, especially when they have to manage large class sizes and multiple administrative responsibilities in schools (King and Nomikou, 2018). Hence, we ask a dual question: first, what are the social networks of students in Indian classrooms? Second, how do teachers engage with the network data of their classrooms?

Context

This study involved 24 classrooms and their teachers from three schools, with a total of 584 students. Participating classrooms ranged from grades 1-10. The schools, from the states of Rajasthan and Uttar Pradesh, were selected to represent three different economic brackets (low, middle, and high income).

Methods and Analysis

The study utilized a mixed-method approach to answer the research questions. Quantitatively, we used the method of social network analysis (Jackson et al. [2023]) to unpack and visualize these networks.

Additionally, we conducted semi-structured interviews with teachers as they made sense of the visualizations of their classroom network data. The qualitative data were analyzed using a constant comparative method (Glaser, 1965) to surface the different themes in the ways that teachers engage with their classroom data and connect it to daily practices.

Findings

Through our network analysis, we found considerable dispersion in degrees across several dimensions. At least 20% of students in each classroom were nominated by only one other individual as a very good friend. At the other end of the spectrum, 20% of the students within each classroom were nominated by more than six individuals as very good friends. Similar levels of dispersion are seen in the academic help and recess networks.
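
To illustrate the kind of degree analysis described above, the sketch below (illustrative only, not the authors' code) shows how "very good friend" nomination counts could be computed with the Python library networkx, assuming the nominations are recorded as directed (nominator, nominee) pairs.

```python
# Illustrative sketch (not the authors' code): in-degree distribution of a
# "very good friend" nomination network, assuming directed nomination pairs.
import networkx as nx

# Hypothetical nominations: (nominator, nominee)
nominations = [
    ("A", "B"), ("C", "B"), ("D", "B"),  # B is nominated by three classmates
    ("B", "A"), ("D", "C"),
]

G = nx.DiGraph(nominations)

# In-degree = number of classmates who nominate a student as a very good friend.
in_degrees = dict(G.in_degree())
print(in_degrees)  # {'A': 1, 'B': 3, 'C': 1, 'D': 0}

# Dispersion of the kind reported above: share of students nominated by
# exactly one classmate versus by more than six classmates.
n = G.number_of_nodes()
share_one = sum(1 for d in in_degrees.values() if d == 1) / n
share_many = sum(1 for d in in_degrees.values() if d > 6) / n
print(f"nominated by exactly one: {share_one:.0%}; by more than six: {share_many:.0%}")
```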

In our interviews, we saw, first, that teachers were surprised by the students who were isolated in their classrooms: while they had an idea of the “popular” students, they were less aware of the students at the margins. Second, teachers made tangible connections between the data and their future practices. These prospective practices ranged from making seating plans according to the data to nominating isolated students to take a more active part in school events.

Implications and Connection to ICSEI Theme

Our network analysis makes it evident that there are students in classrooms who feel isolated and need more support. Additionally, our engagements with teachers show that they value data that is specific to their own classrooms. We see teachers making nuanced inferences from the data and relating them back to tangible practices. This study connects to the overarching theme of the conference by pointing to professional support and training for teachers in utilizing data. Additionally, it directly connects to the sub-theme of leveraging research and data for inquiry, insight, innovation, and professional learning.

References
Bhargava, P., Chen, D. L., Sutter, M., & Terrier, C. (2023). Homophily and transmission of behavioral traits in social networks.

Calvó-Armengol, A., Patacchini, E., & Zenou, Y. (2009). Peer effects and social networks in education. The Review of Economic Studies, 76(4), 1239-1267.

Glaser, B. G. (1965). The constant comparative method of qualitative analysis. Social Problems, 12(4), 436-445.

Jackson, M. O., Nei, S. M., Snowberg, E., & Yariv, L. (2023). The Dynamics of Networks and Homophily (No. w30815). National Bureau of Economic Research.

King, H., & Nomikou, E. (2018). Fostering critical teacher agency: The impact of a science capital pedagogical approach. Pedagogy, Culture & Society, 26(1), 87-103.

Zárate, R. A. (2023). Uncovering peer effects in social and academic skills. American Economic Journal: Applied Economics, 15(3), 35-79.


ID: 247 / P25.P6.DU: 4
Data Use Network
Individual Paper
Orientation of proposal: This contribution is mainly an academic research contribution.
ICSEI Congress Sub-theme: Leading improvement collaboratively and sustainably

Promoting Data Use In Schools: How Local And Regional School Administrators Can Support Schools - A German Perspective

Ruth Anna Hejtmanek, Esther Dominique Klein

Technical University Dortmund, Germany

To improve successfully, schools should engage in data-informed school improvement, and research shows that principals are key players when it comes to such data use in schools (e.g. Datnow & Hubbard, 2016; Demski & Racherbäumer, 2015). However, only few schools engage in data-informed improvement systematically (e.g. Demski, 2019), and the implementation of such practices seems to be contingent on several factors (e.g. Altrichter & Maag Merki, 2016; Bremm et al., 2017). The question therefore is how principals (and their schools) can be supported and trained so that their practice becomes more data-informed.

So far, the discourse on school improvement in Germany has focused on the responsibility that individual schools have for their own improvement, whereas the tasks and responsibilities of superordinate actors in the local and regional school administration (LRSA) have hardly been addressed (Authors 1). Under German law, LRSA have supervisory responsibility for schools (Ackeren et al., 2015, p. 97; Authors 2). However, there is no coherent strategy or common understanding of the role and tasks of LRSA with regard to school improvement (e.g. Brüsemeister & Gromala, 2020; Authors 1). Instead, there is a lack of concrete guidance at the policy and legal level on how to shape the role of LRSA, e.g. with regard to their support services for schools (Authors 1). Their self-perception primarily involves a controlling, possibly advisory function, but no responsibility for enabling schools to improve (Authors 2). And although “‘data-based’ governance has become a large part of contemporary school supervision” (Dabisch, 2023, p. 65), the support practices of LRSA with regard to data use are highly heterogeneous (Dabisch, 2023; Huber et al., 2020). Moreover, there is little theoretical input or research on how LRSA can and should be trained if they are to function as a supporting agency for school improvement (Authors 1). Consequently, most schools are currently on their own when it comes to building school improvement capacities, such as data-informed school improvement (ibid.).

This paper therefore aims to explore what LRSA in Germany can or should do to support schools in developing a data-informed approach to improvement. To do so, we conducted guided interviews (Mayring, 2015) with principals and teachers of successful schools to understand the prerequisites of data-informed practices in schools. We moreover used observation protocols of expert workshops with experts from administration, practice, and research, in which the participants discussed, based on the interview data, what structures, resources, and professional support schools need to develop data-informed practices.

Initial findings point to the importance of school-wide, and possibly cross-organizational, structures, networking between schools as well as between different actors in the field, and collaboration as relevant organizational structures that need to be supported by LRSA. Moreover, LRSA must cultivate positive attitudes toward data and improvement in principals and teachers, which requires LRSA actors to model that attitude themselves. Since LRSA are often not specifically trained for this task, it is also vital that they first acquire knowledge about data (use) as well as about structures that support effective data use.

References
Ackeren, I. van, Klemm, K., & Kühn, S. M. (2015). Entstehung, Struktur und Steuerung des deutschen Schulsystems. Eine Einführung (3rd ed.). Springer VS.
Altrichter, H., & Maag Merki, K. (2016). Handbuch Neue Steuerung im Schulsystem (2nd ed.). Springer VS.
Bremm, N., Eiden, S., Neumann, C., Webs, T., Ackeren, I. van, & Holtappels, H. G. (2017). Evidenzorientierter Schulentwicklungsansatz für Schulen in herausfordernden Lagen. Zum Potenzial der Integration von praxisbezogener Forschung und Entwicklung am Beispiel des Projekts „Potenziale entwickeln - Schulen Stärken“. In V. Manitius & P. Dobbelstein (Eds.), Schulentwicklungsarbeit in herausfordernden Lagen (pp. 140-158). Waxmann. https://doi.org/10.25656/01:20629
Brüsemeister, T., & Gromala, L. (2020). Konstellationen zwischen Schulleitung und Schulaufsicht. In E. D. Klein & N. Bremm (Eds.), Unterstützung - Kooperation - Kontrolle. Zum Verhältnis von Schulaufsicht und Schulleitung in der Schulentwicklung (Vol. 48, pp. 125-135). Springer VS. https://doi.org/10.1007/978-3-658-28177-9_7
Dabisch, V. (2023). The practices of data-based governance: German school supervision, professionalism and datafied structurations. Tertium Comparationis (TC), 29(1), 48-72. https://elibrary.utb.de/doi/abs/10.31244/tc.2023.01.03
Datnow, A., & Hubbard, L. (2016). Teacher capacity for and beliefs about data-driven decision making: A literature review of international research. Journal of Educational Change, 17(1), 7-28. https://doi.org/10.1007/s10833-015-9264-2
Demski, D. (2019). Und was kommt in der Praxis an? In J. Zuber, H. Altrichter, & M. Heinrich (Eds.), Bildungsstandards zwischen Politik und schulischem Alltag (pp. 129-152). Springer Fachmedien Wiesbaden. https://doi.org/10.1007/978-3-658-22241-3_6
Demski, D., & Racherbäumer, K. (2015). Principals’ evidence-based practice – findings from German schools. International Journal of Educational Management, 29(6), 735-748. https://doi.org/10.1108/IJEM-06-2014-0086
Huber, S. G., Arnz, S., & Klieme, T. (2020). Schulaufsicht im Wandel. Rollen und Aufgaben neu denken. Dr. Josef Raabe Verlags-GmbH.
Mayring, P. (2015). Qualitative Inhaltsanalyse. Beltz Verlagsgruppe.


 