Conference Agenda

Session Overview

Session: 22 SES 02 A: Students' Assessment and Feedback
Time: Tuesday, 27 August 2024, 15:15–16:45
Session Chair: Jani Ursin
Location: Room 039 in ΘΕE 01 (Faculty of Pure & Applied Sciences [FST01]), Ground Floor
Capacity: 70
Session Type: Paper Session

Presentations
22. Research in Higher Education
Paper

Re-considering Authentic Assessment Through the Lenses of Sustainability, Diversity and Partnership

Patrick Baughan

The University of Law, United Kingdom

Presenting Author: Baughan, Patrick

This paper will re-examine the widely used term ‘authentic assessment’ and argue that certain notions and assumptions about it might usefully be re-considered in view of shifting expectations and priorities in higher education. The paper links to the theme of the conference in suggesting that our conceptions of authentic assessment, and its applications to pedagogic practice, need to shift to account for broader changes, most pointedly the social, economic and environmental issues categorised under the heading of sustainability and education for sustainable development (ESD). The paper seeks to offer some new perspectives in response to the following questions:

  1. How is authentic assessment understood and used in contemporary higher education?
  2. Would there be greater value in re-pivoting our use of authentic assessment, so that it reflects current societal and educational pressures and priorities, such as sustainability, diversity and partnership?

Whilst authentic assessment is a valuable term in that it provides a tool for educating about assessment, it has also become one that is somewhat generalised. It is often used as an explanatory mechanism to promote better practice in assessment; similarly, it is often tied to employability agendas, with the assumption that assessment should focus on preparing learners for the world of work.

But should authentic assessment be about more than these things? In this re-conceptualisation, I will argue that, during their assessment journeys and over the duration of their studies, students should be exposed to wider issues by way of their assessment experiences, through alternative, contemporary lenses. These lenses include:

Authenticity as sustainability - Higher education needs to engage more deeply and urgently with sustainability. Students should graduate as ‘sustainable beings’, which means that environmental and social aspects of sustainability should be embedded in curricula, teaching and assessment. These points are supported in student-based research – see, for example, work by ‘Students Organising for Sustainability’ (www.sos-uk.org).

Authenticity as student-staff partnership and student experience - Students should be actively involved in their assessment processes as a matter of normal practice. We need to collaborate with students to provide a more student-centred experience of assessment.

Authenticity as equality, diversity and inclusion - Assessment cannot be authentic if some learners are disadvantaged. Authentic assessment is that which is fair to all, inclusive of all, and takes steps to mitigate unconscious bias.

Whilst other factors also need to be addressed in our efforts to develop authenticity, such as the rise and influence of Artificial Intelligence (AI), the above three will be focused on here. Further, given the recognised ‘urgency of sustainability’, the paper will concentrate especially on the first of these.

The paper will draw on a range of contemporary literature, including work on assessment design (Sambell, 2013; Brown and Sambell, 2020); authentic assessment and feedback (Wald and Harland, 2017; Carless et al., 2020; McArthur, 2023); education for sustainable development (UNESCO, 2017; QAA and Advance HE, 2021; Smith, 2023); equality, inclusivity and unconscious bias (Agarwal, 2020; Mercer-Mapstone and Bovill, 2020; Tai et al., 2023); and student-staff partnership (Cook-Sather, Bovill and Felten, 2014; Students Organising for Sustainability – SOS – www.sos-uk.org).

Finally, examples will be given of how authentic assessment has been applied in specific contexts using the lenses advocated.


Methodology, Methods, Research Instruments or Sources Used
The key points and arguments to be presented are formed from a literature review, which draws on themes including assessment design, authentic assessment, education for sustainable development, equality, diversity and inclusion, and student-staff partnership. It applies and discusses publications and policy documents including those identified in the previous section, although these are examples and additional literature will be utilised. Further, the work makes use of other secondary sources, namely informal conversations and notes from the author’s direct involvement in assessment policy and pedagogy at several institutions. Specific application will be made to several key sources and guideline documents on education for sustainable development, including a forthcoming text to which the author of this paper is contributing, entitled ‘Education for Sustainable Development in Universities: Nurturing Graduates for Our Shared Future’ (Routledge, 2024). Finally, a reflective element based on Moon’s (2005) theory of reflection has been used to guide the approach and analysis. Whilst this is a UK-based work, it will, through its use of literature and in its discussion, consider European and international contributions and frameworks, particularly through its focus on education for sustainable development (ESD).
Conclusions, Expected Outcomes or Findings
In sum, this paper, which reports on ongoing work, argues that to ensure authentic assessment remains a term with currency in higher education, and to ensure that authentic assessment is itself practised authentically, we need to connect it to wider, contemporary issues and challenges – through different lenses at different times – such as sustainability, wellbeing, equality, diversity and inclusion, and collaboration and partnership. In essence, authentic assessment should no longer be seen as a static term to be applied in the same way to every learning, teaching, and assessment context, but as a more fluid and flexible entity. By adopting such an approach, we are more likely to achieve our goal of sustaining authenticity in assessment in the long term, as a central part of the learning and teaching process.
References
Agarwal, P. (2020). Sway: Unravelling Unconscious Bias. Bloomsbury.

Baughan, P. (2021). Reflecting on significant moments: how our own assessment journeys guide us in assessing and providing feedback to others. Invited paper, Teaching and Learning Event (online), Autonomous University of Barcelona, 4th June.

Brown, S. and Sambell, K. (2020). The changing landscape of assessment: possible replacements for unseen, time constrained, face-to-face invigilated exams. Retrieved 10.8.23 from: https://www.seda.ac.uk/wp-content/uploads/2021/04/Paper-3-The-changing-landscape-of-assessment-some-possible-replacements-for-unseen-time-constrained-face-to-face-invigilated-exams-4.pdf

Carless, D. (2020). Feedback in online learning environments, in Baughan, P, Carless, D, Moody, J, and Stoakes, G. Moving Assessment and Feedback On-Line: Key Principles for Inclusion, Pedagogy and Practice. Retrieved 1 June 2021 from: https://connect.advance-he.ac.uk/networks/events/33587 [member access only].

Cook-Sather, A., Bovill, C. and Felten. P. (2014). Engaging Students as Partners in Learning and Teaching: A Guide for Faculty. CA, Jossey-Bass.

Dawson, P., Carless, D., & Lee, P. P. W. (2020). Authentic feedback: Supporting learners to engage in disciplinary feedback practices. Assessment and Evaluation in Higher Education. https://doi.org/10.1080/02602938.2020.1769022

JISC (2015) https://www.jisc.ac.uk/guides/transforming-assessment-and-feedback/inclusive-assessment

McArthur, J. (2023). Rethinking authentic assessment: work, well-being, and society. Higher Education, 85(1), 85–101. https://doi.org/10.1007/s10734-022-00822-y

Mercer-Mapstone, L., & Bovill, C. (2020). Equity and diversity in institutional approaches to student–staff partnership schemes in higher education. Studies in Higher Education, 45(12), 2541–2557. https://doi.org/10.1080/03075079.2019.1620721

McCune, V. and Hounsell, D. (2005). The development of students’ ways of thinking and practising in three final year biology courses. Higher Education, 49(3), 255–289.

Wald, N. and Harland, T. (2017). A framework for authenticity in designing a research-based curriculum. Teaching in Higher Education, 22(7), 751–765. https://doi.org/10.1080/13562517.2017.1289509

Quality Assurance Agency (QAA) (2023). Resources for implementing Education for Sustainability. Gloucester: QAA. https://www.qaa.ac.uk/news-events/news/collection-of-resources-for-implementing-education-for-sustainability-now-available

Quality Assurance Agency (QAA) and Advance HE (2021). Education for Sustainable Development Guidance. Gloucester: QAA. https://www.advance-he.ac.uk/teaching-and-learning/education-sustainable-development-higher-education

Sambell, K. (2013). Assessment for Learning in Higher Education. London, Routledge.

United Nations Educational, Scientific and Cultural Organization (UNESCO) (2017). Education for Sustainable Development Goals: Learning Objectives. Paris, France. https://unesdoc.unesco.org/ark:/48223/pf0000247444

Smith, J. (2023). Climate Change and Student Mental Health – Report. Student Minds / UPP. https://www.studentminds.org.uk/uploads/3/7/8/4/3784584/climate_change_and_student_mental_health.pdf

Tai, J., Ajjawi, R., Bearman, M., Boud, D., Dawson, P., & Jorre de St Jorre, T. (2023). Assessment for inclusion: rethinking contemporary strategies in assessment design. Higher Education Research & Development, 42(2), 483–497. https://doi.org/10.1080/07294360.2022.2057451


22. Research in Higher Education
Paper

Students’ Voices In Co-designing Internal Feedback Research: First Methodological Steps

Maite Fernández-Ferrer1, Ana Remesal Ortiz2

1Universitat Oberta de Catalunya, Spain; 2Universitat de Barcelona, Spain

Presenting Author: Fernández-Ferrer, Maite; Remesal Ortiz, Ana

Recently, there has been considerable research on formative assessment, as evidenced by extensive scientific literature and systematic reviews. Particularly relevant are reviews exploring the evolving relationship between formative assessment (FA) and student self-regulated learning (SRL) (Winstone et al., 2017).

However, the process of self-assessment, which involves internalizing standards to regulate learning effectively, remains somewhat opaque (Lui & Andrade, 2022). Thus, there is a pertinent need to investigate the factors influencing this process, the student processes involved in interpreting and applying feedback, and how they contribute to self-regulated learning.

Feedback, understood here as the process through which students make sense of information to improve tasks and learning (Carless & Boud, 2018), requires feedback literacy on the part of both students and teachers. Our contribution is part of a new project on the internal feedback processes of higher education students, which builds upon two main axes: (A) self-regulation and (B) self-assessment toward internal feedback (Nicol, 2020). Student self-regulation involves appropriating assessment criteria, seeking feedback, and engaging in personal reflection (Yan & Brown, 2017). Advanced self-regulation strategies enhance long-term learning prospects and transferability beyond academia. Yet, a lack of evaluative judgment and self-assessment experience may impede desired learning outcomes.

Students’ production and seeking of internal feedback to bridge performance gaps are crucial for engagement and learning efficacy (To & Panadero, 2019). Understanding students’ cognitive and affective responses to feedback, as well as the mechanisms of feedback processing, is essential for effective feedback utilization (Lui & Andrade, 2022).

Research often focuses on formal feedback experiences, neglecting informal feedback's potential for learning. Investigating how students transform external feedback into internalized feedback and their cognitive processes is imperative. This necessitates a shift towards holistic, transformative theoretical frameworks to comprehend feedback phenomena.

In summary, while various factors contributing to more efficient and higher-quality feedback have been identified – such as active student engagement in the learning and assessment process, feedback literacy, anonymous assessment roles, qualitative formats over numerical ones, provision of examples for comparison, and the integration of technology (Carless, 2019; Henderson et al., 2018; Panadero & Alqassab, 2019) – the processes underlying feedback mechanisms remain elusive (Lui & Andrade, 2022).

Moreover, while existing research has explored students' perceptions, emotions, and behavioral responses to feedback, understanding students' internal processes as they receive and internalize feedback is crucial. This entails investigating the role of various factors in students' decision-making and behavioral responses to feedback, as well as examining additional elements such as interpretations, significance, and evaluative judgment capacity (Yan & Brown, 2017; Winstone et al., 2017).

Furthermore, there is a need to broaden the scope of feedback research beyond comments provided by evaluators to encompass comparisons with other sources, thus unlocking its full potential for learning. This necessitates an exploration of strategies through which students convert natural comparisons into formal, explicit ones, enabling them to articulate and reflect upon these comparisons independently (Nicol, 2020). Overall, advancing our understanding of how students transform external feedback into internalized feedback, along with its implications for self-regulated learning, could inform the design of effective pedagogical practices and foster improvements in students' academic performance and self-regulatory skills.

In this context, the objective of this paper is to investigate students' value attribution to different feedback processes and to share our exploration of the formal and informal feedback processes utilized by students, identifying mechanisms of information assimilation from external to internal feedback. Students’ personal voices are of the utmost importance here, so our presentation will focus on the process of co-designing the environment and strategy for data collection during natural learning processes.


Methodology, Methods, Research Instruments or Sources Used
The project proposes an initial collaborative co-creation process with students and international experts to develop a technologically supported environment for the longitudinal qualitative collection of reflections. These reflections aim to reveal the processes by which students internalize received information and eventually make the transition from external to internalized feedback during natural teaching and learning processes over one academic year.

From here, a mixed-method study will be conducted. Initially, a more quantitative approach will be adopted to identify, on the one hand, the internal factors involved in processing and interpreting this feedback, based on three variables: (a) previous self-regulation profiles, (b) evaluative beliefs – particularly about feedback – and (c) self-efficacy. For this purpose, specific questionnaires will first be administered for each of these constructs. On the other hand, other intervening variables will be controlled, such as field of knowledge, academic year, type of task, feedback sources, and feedback characteristics coded according to an ad hoc guideline.

Secondly, a qualitative study is proposed to intensively monitor the evolution of students' ability to generate internal feedback throughout an academic year. This will allow: (a) identification of the sources of information that students find most relevant for each type of task, (b) understanding of the value they attribute to deliberate practices and of the other natural sources they employ, (c) comprehension, by reclaiming their voices, of the actions they orchestrate as a consequence of the received information, and (d) identification of the change intentions generated by this process. In this part of the research, a series of three in-depth interview protocols has been designed, based on the literature, to gather students’ subjective experiences; the interviews are strategically placed at three different moments of the task-resolution/learning process: at the starting point, once task demands are known; before delivering the student’s end-product; and after receiving the teacher’s feedback. Information will also be collected using tools co-designed with students, drawing on examples from think-aloud protocols, reflective journals, and other strategies involving metacognition.

Conclusions, Expected Outcomes or Findings
There is a pressing need to delve into the cognitive, metacognitive, affective, and social processes at stake when students receive and interpret feedback. This understanding could lead to the development of tailored support structures, guidelines for teachers, and directives for students to enhance their evaluative competence, particularly in refining their evaluative judgment.

At this stage of the project we are carrying out the co-design process with 20 students from a variety of disciplinary areas. Qualitative data are being gathered on their preferences and suggestions for establishing a technological environment and a procedure of close accompaniment during a whole semester in natural teaching and learning settings. The resulting design will, in turn, be implemented in the second phase of the study with new participating students.

References
Broadbent, J., Sharman, S., Panadero, E., & Fuller-Tyszkiewicz, M. (2021). How does self-regulated learning influence formative assessment and summative grade? Comparing online and blended learners. The Internet and Higher Education, 50(March), 100805. https://doi.org/10.1016/j.iheduc.2021.100805
Carless, D. (2019). Feedback loops and the longer-term: Towards feedback spirals. Assessment and Evaluation in Higher Education, 44(5), 705-714. https://doi.org/10.1080/02602938.2018.1531108
Henderson, M., Boud, D., Molloy, E., Dawson P., Phillips, M., Ryan, T., & Mahoney, P. (2018). Feedback for Learning: Closing the Assessment Loop – Final Report. Canberra: Australian Government Department of Education and Training
Lui, A. M., & Andrade, H. L. (2022). The Next Black Box of Formative Assessment: A Model of the Internal Mechanisms of Feedback Processing. Frontiers in Education, 7, 751548. https://doi.org/10.3389/feduc.2022.751548
Nicol, D. (2020). The power of internal feedback: Exploiting natural comparison processes. Assessment and Evaluation in Higher Education, 46(5), 756-778. https://doi.org/10.1080/02602938.2020.1823314
Panadero, E., & Alqassab, M. (2019). An empirical review of anonymity effects in peer assessment, peer feedback, peer review, peer evaluation and peer grading. Assessment and Evaluation in Higher Education, 44(8), 1253-1278. https://doi.org/10.1080/02602938.2019.1600186
Panadero, E., Lipnevich, A., & Broadbent, J. (2019). Turning Self-Assessment into Self-Feedback. In M. Henderson, R. Ajjawi, D. Boud, & E. Molloy (Eds.), The Impact of Feedback in Higher Education (pp. 147-163). Springer International Publishing. https://doi.org/10.1007/978-3-030-25112-3_9
To, J., & Panadero, E. (2019). Peer assessment effects on the self-assessment process of first-year undergraduates. Assessment and Evaluation in Higher Education, 44(6), 920-932. https://doi.org/10.1080/02602938.2018.1548559
Winstone, N. E., Nash, R. A., Parker, M., & Rowntree, J. (2017). Supporting Learners’ Agentic Engagement With Feedback: A Systematic Review and a Taxonomy of Recipience Processes. Educational Psychologist, 52(1), 17-37. https://doi.org/10.1080/00461520.2016.1207538
Yan, Z., & Brown, G. T. L. (2017). A cyclical self-assessment process: Towards a model of how students engage in self-assessment. Assessment and Evaluation in Higher Education, 42(8), 1247-1262. https://doi.org/10.1080/02602938.2016.1260091


22. Research in Higher Education
Paper

Development of a Scale to Assess Students’ Needs-based Study Crafting: Evidence from a Pilot Study Among Japanese University Students

Hiroyuki Toyama1, Jumpei Yajima2, Katja Upadyaya1, Lauri Hietajärvi1, Katariina Salmela-Aro1

1University of Helsinki, Finland; 2Beppu University, Japan

Presenting Author: Toyama, Hiroyuki

In the realm of academic pursuit, the quest for effective learning strategies is perpetual. Among the evolving methodologies, study crafting emerges as a novel paradigm, adapted from the concept of job crafting in occupational health psychology (Tims & Bakker, 2010). Defined as students’ proactive adaptation of their studies to optimize learning experiences, study crafting represents a transformative departure from conventional strategies centered on reactive adjustments to external demands (Körner et al., 2021). By empowering learners to curate their educational journey, study crafting imbues a sense of ownership, fostering personalized and engaging learning trajectories.

The significance of this proactive approach reverberates profoundly in academic circles, with implications spanning beyond mere scholastic achievements. Extant literature underscores its role in cultivating deeper comprehension, enhancing motivation, and fortifying resilience amidst academic challenges and adversities (Körner et al., 2023; Körner et al., 2021; Mülder et al., 2022). However, despite its potential, the conceptualization and empirical investigation of study crafting remain in nascent stages, warranting a comprehensive framework to elucidate its underpinnings.

In this context, the Integrative Needs Model of Crafting (de Bloom et al., 2020) was recently proposed as a theoretical framework that integrates crafting research. Rooted in the understanding that psychological needs play a pivotal role in the crafting process, this model provides a comprehensive lens through which to explore why and how individuals engage in crafting across various life domains. While extensively applied in occupational health research, the integration of this model into educational discourse remains conspicuously absent. Notably, the prevailing study crafting model (Körner et al., 2021) adopts a demands-resources-based approach, departing from the needs-centric perspective espoused by the Integrative Needs Model of Crafting.

Bridging this gap, the aim of this study is to extend the Integrative Needs Model of Crafting to the student context and to develop an instrument to assess students’ needs-based study crafting, which we define as students’ proactive and self-initiated changes to their studies in order to achieve psychological needs satisfaction.


Methodology, Methods, Research Instruments or Sources Used
A new scale to assess six dimensions of needs-based study crafting (i.e., crafting for detachment from study, relaxation, autonomy, mastery, meaning, and affiliation) was created, drawing on the Needs-Based Job Crafting Scale (Tušl et al., 2024). To rigorously evaluate the psychometric properties of this instrument, we conducted a pilot study among university students. Drawing participants from a local university in Japan, we conducted a cross-sectional survey. The survey booklet, administered during class sessions, included the Needs-Based Study Crafting Scale alongside established measures assessing JD-R-based study crafting, proactive personality, DRAMMA needs satisfaction, study engagement, subjective vitality, and school life satisfaction.
The Needs-Based Study Crafting Scale was scored on a 5-point Likert scale, ranging from 1 (never) to 5 (very often). JD-R-based study crafting was measured using the instrument used in Mülder et al. (2022); the items were scored on a 5-point Likert scale, ranging from 1 (not true at all) to 5 (totally true). Proactive personality was assessed using four items from the Proactive Personality Scale (Bateman & Crant, 1993), rated on a 5-point Likert scale ranging from 1 (not at all true) to 5 (very true). DRAMMA needs satisfaction was assessed using the Recovery Experience Questionnaire for detachment and relaxation (Sonnentag & Fritz, 2007), the Basic Psychological Need Satisfaction Scale for autonomy, mastery, and affiliation (Chen et al., 2015), and the Meaning in Life Questionnaire (Steger et al., 2006) for meaning; all items were scored on a 5-point scale, ranging from 1 (do not agree at all) to 5 (fully agree). Study engagement was assessed using the 9-item version of the Work Engagement Scale for Students (Tayama et al., 2019), rated on a 7-point Likert scale ranging from 0 (never) to 6 (always). Subjective vitality was assessed using the Subjective Vitality Scale (Ryan & Frederick, 1997), rated on a 5-point Likert scale ranging from 1 (very rarely or never) to 5 (very often or all the time). Finally, school life satisfaction was measured using a single item adapted from Van den Broeck et al. (2010): “How satisfied have you been with your school life over the past month?”, scored on a 10-point scale ranging from 1 (very dissatisfied) to 10 (very satisfied).

Conclusions, Expected Outcomes or Findings
The data showed high internal consistency for the scale (α = .96 for the global scale; α = .95 for crafting for detachment from study, α = .97 for crafting for relaxation, α = .86 for crafting for autonomy, α = .91 for crafting for mastery, α = .91 for crafting for meaning, and α = .94 for crafting for affiliation). The results of a CFA confirmed the proposed six-factor structure of the scale. Correlation analysis revealed that the scale is meaningfully associated with theoretically relevant constructs, including JD-R-based study crafting, proactive personality, study engagement, vitality, and school life satisfaction. Furthermore, the scale showed incremental validity in explaining variance in DRAMMA needs fulfillment, study engagement, vitality, and school life satisfaction over and above needs-based off-job crafting. Collectively, the results presented here suggest the scientific utility of the developed scale and advocate for its continued exploration and use in practical contexts. Once fully validated, the scale will enable researchers to evaluate students’ needs-based study crafting and encourage new research efforts to gain novel insight into the construct.
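For readers less familiar with these psychometric checks, the short sketch below illustrates how internal consistency (Cronbach’s alpha) and an incremental-validity comparison of nested regression models can be computed. It is a minimal illustration on simulated Likert-type data, not the authors’ analysis code; all variable names and values in it are hypothetical, and the six-factor structure itself would typically be tested with a dedicated confirmatory factor analysis (SEM) package, which is not shown here.

    # Minimal sketch (not the authors' code): Cronbach's alpha and an
    # incremental-validity check on simulated 5-point Likert data.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(0)

    # Simulate responses to a hypothetical 6-item subscale (1-5 Likert), n = 200 students.
    n, k = 200, 6
    latent = rng.normal(size=n)
    items = pd.DataFrame(
        {f"item{i}": np.clip(np.round(3 + latent + rng.normal(scale=0.7, size=n)), 1, 5)
         for i in range(1, k + 1)}
    )

    def cronbach_alpha(df: pd.DataFrame) -> float:
        """alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
        k = df.shape[1]
        item_vars = df.var(axis=0, ddof=1)
        total_var = df.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    print("alpha:", round(cronbach_alpha(items), 2))

    # Incremental validity: does the new scale explain variance in an outcome
    # (e.g. engagement) over and above an existing predictor? Compare nested OLS models.
    off_job_crafting = rng.normal(size=n)          # existing predictor (hypothetical)
    study_crafting = items.mean(axis=1)            # new scale score (subscale mean)
    engagement = 0.4 * study_crafting + 0.2 * off_job_crafting + rng.normal(size=n)

    base = sm.OLS(engagement, sm.add_constant(
        pd.DataFrame({"off_job": off_job_crafting}))).fit()
    full = sm.OLS(engagement, sm.add_constant(
        pd.DataFrame({"off_job": off_job_crafting,
                      "study_crafting": study_crafting}))).fit()
    print("Delta R^2:", round(full.rsquared - base.rsquared, 3))

The change in R-squared between the nested models is what a claim of incremental validity rests on; in the reported pilot study this kind of comparison underlies the statement that the new scale explains variance over and above needs-based off-job crafting.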
References
de Bloom, J., Vaziri, H., Tay, L., & Kujanpää, M. (2020). An identity-based integrative needs model of crafting: Crafting within and across life domains. Journal of Applied Psychology, 105(12), 1423–1446. https://doi.org/10.1037/apl0000495
Körner, L. S., Mülder, L. M., Bruno, L., Janneck, M., Dettmers, J., & Rigotti, T. (2022). Fostering study crafting to increase engagement and reduce exhaustion among higher education students: A randomized controlled trial of the study coach online intervention. Applied Psychology: Health and Well-Being. Advance online publication. https://doi.org/10.1111/aphw.12410
Körner, L. S., Rigotti, T., & Rieder, K. (2021). Study crafting and self-undermining in higher education students: A weekly diary study on the antecedents. International Journal of Environmental Research and Public Health, 18(13), 7090. https://doi.org/10.3390/ijerph18137090
Mülder, L. M., Schimek, S., Werner, A. M., Reichel, J. L., Heller, S., Tibubos, A. N., Schäfer, M., Dietz, P., Letzel, S., Beutel, M. E., Stark, B., Simon, P., & Rigotti, T. (2022). Distinct patterns of university students study crafting and the relationships to exhaustion, well-being, and engagement. Frontiers in Psychology, 13:895930. https://doi.org/10.3389/fpsyg.2022.895930
Tims, M., & Bakker, A. B. (2010). Job crafting: Towards a new model of individual job redesign. SA Journal of Industrial Psychology, 36(2), a841. https://doi.org/10.4102/sajip.v36i2.841
Tušl, M., Bauer, G. F., Kujanpää, M., Toyama, H., Shimazu, A., & de Bloom, J. (in press). Needs-based job crafting: Validation of a new scale based on psychological needs. Journal of Occupational Health Psychology.


 