Conference Agenda


Session Overview
Session
02 SES 02 C: Assessment and Feedback in VET
Time:
Tuesday, 22/Aug/2023:
3:15pm - 4:45pm

Session Chair: Ann Karin Sandal
Location: Boyd Orr, Lecture Theatre 2 [Floor 2]

Capacity: 250 persons

Paper Session

Presentations
02. Vocational Education and Training (VETNET)
Paper

Zooming in on Assessment: an Analytical Model for Investigating Changes in Assessment in VET

Karin Luomi-Messerer, Monika Auzinger

3s, Austria

Presenting Author: Luomi-Messerer, Karin; Auzinger, Monika

As education systems in general and VET systems in particular are shaped by external drivers as well as policy and ideological considerations, assessment practices are likewise influenced by a range of social, demographic, economic, environmental, and technological trends and developments. Assessment is particularly influenced by changes in educational principles and practices. Factors that potentially shape the evolution of assessment in VET include the broadening of the skills and competence base of VET, with a strengthened emphasis on general subjects and an increased focus on transversal skills and competences, as well as changes in the organisation and delivery of VET (such as an increased focus on work-based learning). The growing emphasis on accountability can also influence how assessment is organised and shaped. Additional contributing factors include rapid technological development and digitalisation, as well as the upskilling and reskilling needs of adults, which are gradually driving authorities and providers to open up to new groups of learners.
Examining assessment approaches in VET in different countries and how they have developed over time can provide important insights into how learners’ competences and achievement of intended learning outcomes are determined, how evidence on an individual’s progress and achievement of learning goals is collected and judged and for what purposes the results are used.

The key research question underpinning this paper is therefore as follows: What are the prevalent assessment forms applied in initial VET in Europe, how have they evolved over the past 25 years, and what future trends can be identified?

This research applies an analytical framework that builds on Cedefop's (2020) ‘Three Perspectives Model for VET’, which comprises an epistemological and pedagogical perspective, an education system perspective and a socioeconomic perspective. This model includes diachronic (referring to changes over history within a country) and synchronic (comparisons between countries) analyses of VET systems and the development of corresponding patterns or profiles based on the interplay of characteristics.

While the original model includes assessment as one of the features of the epistemological and pedagogical perspective, a more detailed analysis of assessment approaches and their evolution requires further differentiation of this dimension. This paper therefore ‘zooms in on assessment’ and identifies the following key areas to be explored for gaining insights into changes in assessment: (a) main purposes and functions of assessment; (b) scope, focus and content of assessment; (c) reference points and criteria for assessments; (d) methods, tools and context of assessment and stakeholders involved; (e) alignment between intended learning outcomes, delivery of programmes and assessment criteria; and (f) key technical characteristics ensuring the quality of assessment.


Methodology, Methods, Research Instruments or Sources Used
To answer the research questions, several research methods and datasets were used. Desk research supported a comprehensive literature review for refining the analytical framework, specifying the key areas to be explored, and identifying changes in assessment over time as well as future trends. Input from various experts across Europe and the results of a survey among European VET providers were also used. The main sources of information, however, were seven thematic case studies in seven countries (each focusing on specific features of assessment and related change processes), conducted through desk research and interviews with relevant key stakeholders. The countries featured in the case studies are Austria, Croatia, Estonia, Finland, Lithuania, the Netherlands, and Poland. The research was conducted in 2021 and 2022.
The analytical framework provided the basis for both the design of the research instruments and the analysis of the data collected. Although the analytical model has some limitations (e.g. some of the dimensions refer to dichotomous characteristics and variants while others do not, and the model applies an artificial separation and differentiation of some dimensions that are actually closely related), the approach used in this study generally allowed for the identification of changes and trends in assessment.

Conclusions, Expected Outcomes or Findings
The research findings show that assessment approaches in VET are continuously being reformed in many countries, indicating their importance in improving the quality and value of VET. The way in which assessment has evolved during the period of study is closely related to changes in the way qualifications and curricula are described and organised. The shift towards learning outcomes and the increased focus on flexible learning pathways has led to the introduction of new approaches to assessment. Closer links to the labour market and employer involvement in all aspects of VET can be seen as driving the introduction of assessment methods that are also closely related to the labour market (in terms of locations, tasks to be solved or stakeholders involved in the assessment).
The developments that can be observed in European countries often do not follow a linear process. In some cases, an improved approach is strived for and repeatedly modified, while opposing tendencies emerge at the same time; a kind of pendulum effect can be observed. For example, there has traditionally been a strong emphasis on summative assessment, while overall an expansion of assessment functions, including formative assessment, can be observed. The latter aims at supporting learning and seems to be strengthened in several countries as a countermeasure to the strong focus on summative assessment. There is also an increasing focus on standardised and external assessments, which are often used to ensure a high level of reliability, alongside an increasing use of workplace demonstrations of competence, which can ensure authenticity and validity. However, trying to achieve different goals with assessment at the same time can lead to tensions (e.g. between accountability and reliability on the one hand and validity and authenticity on the other).

References
Cedefop (2020). Vocational education and training in Europe, 1995-2035: Scenarios for European vocational education and training in the 21st century. Luxembourg: Publications Office. Cedefop reference series, No. 114.
Cedefop (2022). The future of vocational education and training in Europe: volume 3:  the influence of assessments on vocational learning. Luxembourg: Publications Office. Cedefop research paper, No 90.
Coates, H. (2018). Assessing learning outcomes in vocational education. In: S. McGrath et al. (eds.), Handbook of vocational education and training: developments in the changing world of work, pp. 1-17.
Field, S. (2021). A world without maps? Assessment in technical education: a report to the Gatsby Foundation.
Gulikers, J.T.M., et al. (2018). An assessment innovation as flywheel for changing
teaching and learning. Journal of Vocational Education and Training, Vol. 70, No 2, pp. 212-231.
Michaelis, C. and Seeber, S. (2019). Competence-based tests: measurement challenges of competence development in vocational education and training. In: Handbook of vocational education and training: developments in the changing world of work, pp. 1-20.
Nyanjom, J. et al. (2020). Integrating authentic assessment tasks in work integrated learning hospitality internships. Journal of Vocational Education and Training, pp. 1-23.
OECD (2013). Synergies for better learning: an international perspective on evaluation and assessment. Paris: OECD Publishing.
Panadero, E. et al. (2018). Self-assessment for learning in vocational education and training. In: Handbook of vocational education and training: developments in the changing world of work, pp. 1-12.
Pellegrino, J.W. et al. (2016). A framework for conceptualizing and evaluating the validity of instructionally relevant assessments. Educational Psychologist.
Psifidou, I. (2014). Redesigning curricula across Europe: implications for learners’ assessment in vocational education and training. In: Empires, post-coloniality and interculturality. Brill Sense, pp. 135-150.
Räisänen, A. and Räkköläinen, M. (2014). Assessment of learning outcomes in Finnish vocational education and training. Assessment in Education, Vol. 21, No 1, pp. 109-124.
Siarova, H. et al. (2017). Assessment practices for 21st century learning: review of evidence: analytic report. NESET II report, Luxembourg: Publications Office.
Tveit, S. (2018). Ambitious and ambiguous: shifting purposes of national testing in the legitimation of assessment policies in Norway and Sweden (2000-2017). Assessment in Education, Vol. 25, No 3, pp. 327-350.
Winch, C. (2016). Assessing professional know‐how. Journal of Philosophy of Education, Vol. 50, No 4, pp. 554-572.


02. Vocational Education and Training (VETNET)
Paper

A Tool for the Self-Assessment of Informal Learning for Workers in the Metal and Electrical Industry

Martin Fischer

Karlsruhe Institute of Technology, Germany

Presenting Author: Fischer, Martin

One of the earliest practical attempts in Germany towards the recognition of informally acquired competences was a sector-specific one in the metal and electrical industry of the German state of Baden-Württemberg. Although the relevance of the formal VET system in Germany has often been highlighted, a considerable number of employees have no formal qualification and cannot prove their competences (13.65 % of all employees in Baden-Württemberg in 2020).

It is a generally shared assumption that many competences - and especially work-relevant competences - are acquired through learning in the process of work (Boreham et al. 2002; Fischer et al. 2004). However, informally, non-formally and formally acquired competences are difficult to distinguish from each other. The same competence, e.g. mastery of a certain machine, may have been acquired informally (e.g. by "copying" from colleagues), non-formally (e.g. in a course) or formally (e.g. in initial vocational training). Competences are also acquired informally in formal settings, e.g. teamwork skills when learning in groups, or foreign language skills in specialised training. Formal, informal and non-formal learning are therefore - in contrast to many definitions (CEDEFOP 2009; OECD 2006; BMBF 2020) - not discrete categories that can be completely separated from each other, but attributes that have been assigned to learning by the state and society (Colley et al. 2003). It follows that if one wants to record informally acquired competences, one cannot exclude anything from the spectrum of relevant competences from the outset.

The task of recording competences of employees in the metal and electrical industry was posed in several projects funded by the state of Baden-Württemberg and supported by the social partners. Against this background, an online tool was to be developed and then made available to any person interested in taking stock of their competences. Under such a premise, it quickly became apparent in an empirical study involving ten companies that employees in the metal and electrical industry have little use for the competence designations discussed in vocational education (Fischer et al. 2014) - at least as a means of self-assessment.

What people can give more information about are the tasks they perform, or have performed and mastered, in gainful employment. However, these tasks are described differently from person to person: what for one person means "maintaining machines and equipment" is for another "repair" and for a third "keeping the machines running". Therefore, task descriptions as a structure for classification and a framework for self-reflection must be presented in a generally understandable form, so that people can express themselves in terms of "I can" or "I can't" and these statements can then be compared. This is done through so-called task inventories (Frieling et al. 2000), whereby work tasks in a professional field of activity are described and presented in a structured form. The presentation will describe how such task inventories were developed and transposed into an online tool, which is now freely available and translated into five languages. Advantages and disadvantages of using such a tool for the recognition of prior learning will be discussed in a final conclusion.


Methodology, Methods, Research Instruments or Sources Used
The instrument for making informally acquired competences visible was developed and tested in a participatory technology development process (cf. Fischer 2000, p. 249 et seq.) with the involvement of social partners, personnel managers, works councils, chambers, employment agencies, scientists and those affected.
In order to identify work tasks in the metal and electrical industry, an interview study and workplace observations were carried out. The project involved 75 interviews in 10 companies. The analysis of work activities focused mainly on semi-skilled and unskilled workers, but also on skilled workers. Technical supervisors were also interviewed where they could provide information on the nature, extent and systematisation of the work involved. It was important to include in the study people who were actively involved in the work activities, i.e. who either carried them out themselves or were supervisors of those who carried them out. Human resources staff, managers and works councils were also involved, as these are the people involved in personnel decisions.
Regardless of whether competences have been acquired through learning in the process of work or in institutional learning environments, it always requires a separate reflection on what kind of skills have emerged from the respective learning process. If it is and should be the subjects themselves (as in our projects) who document and make visible their skills, then a framework must be provided that simultaneously offers a stimulus for reflection and a structure for classification. This framework is provided by a task inventory.
However, such an inventory of tasks did not exist for the metal and electrical industry. We had to develop it first. All kinds of information were used for this purpose (training regulations, framework curricula, (company) qualification profiles, job advertisements, German Industrial Standards and the collective wage agreement).
Expert surveys were conducted to validate the task inventory. Furthermore, interested workers were able to test the competence tool for the recognition of informally acquired competences in a participatory pre-test phase and thus help to optimise the usability of the tool. These tests were accompanied scientifically and served to further develop, test acceptance and validate the task inventory.

Conclusions, Expected Outcomes or Findings
The developed tool, AiKomPass (www.aikompass.de), is available online in five languages (German, English, French, Italian and Swedish) and can be used free of charge. With AiKomPass, users select from a structured list those tasks that they are able to perform and/or are currently still performing. The tasks come from the fields of work preparation, production, maintenance, and production and warehouse logistics in the metal and electrical industry; so-called digital competences have been added in the meantime. This specialist task inventory is complemented by activities that are important in users' free time, as well as the option to store a CV including references, certificates, etc. The result is an individual overall profile that can be used for personal and professional development and in the validation of informally acquired competences. However, the validation of self-assessed competences requires further analysis and interpretation by relevant experts.
The test phase and subsequent pilots have shown that interested people (even without a formal vocational qualification) can use AiKomPass and create an individual competence profile for themselves. The problem with this type of competence diagnostics via self-assessment, however, is that the scope of mastered work tasks alone says little about the quality of task performance. But at least information is provided on the scope of an individual competence profile, and this scope can be compared with the scope required in a training occupation's profile. By contrast, all known procedures of vocational competence diagnostics in Germany attempt to derive statements about the overall vocational qualification from a more or less small set of competences tested in greater depth (cf. Fischer 2018).
For future research and development, the question therefore arises as to how self-assessment procedures can interact with "objective" procedures of competence diagnostics.
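As a schematic illustration of the comparison described above, the following sketch computes how much of a training occupation's task inventory a self-assessed profile covers. This is a hypothetical illustration only: AiKomPass's actual data model is not reproduced here, and all field and task names are invented.

```python
# Hypothetical sketch, not the AiKomPass implementation.
# A task inventory: work tasks grouped by field of activity (structure assumed).
OCCUPATION_PROFILE = {
    "work preparation": {"read technical drawings", "plan machining sequence"},
    "production": {"set up milling machine", "operate lathe", "deburr parts"},
    "maintenance": {"lubricate machines", "replace wear parts"},
}

def coverage(self_assessed: set[str]) -> dict[str, float]:
    """Share of the occupational profile's tasks, per field of activity,
    that the user reports being able to perform ("I can" statements)."""
    return {
        field: len(tasks & self_assessed) / len(tasks)
        for field, tasks in OCCUPATION_PROFILE.items()
    }

# A self-assessed profile; tasks outside the inventory are simply ignored.
profile = {"set up milling machine", "deburr parts", "lubricate machines",
           "replace wear parts", "keep machines running"}
print(coverage(profile))
```

Such a comparison yields only the scope of overlap per field, which matches the abstract's caveat that scope alone says little about the quality of task performance.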

References
BMBF - Bundesministerium für Bildung und Forschung (2020): Weiterbildungsverhalten in Deutschland 2020. Ergebnisse des Adult Education Survey — AES-Trendbericht. https://www.bmbf.de/SharedDocs/Publikationen/de/bmbf/1/31690_AES-Trendbericht_2020.pdf.
Boreham, N. C., Samurcay, R., & Fischer, M. (eds.) (2002). Work Process Knowledge. London, New York: Routledge.
CEDEFOP – European Centre for the Development of Vocational Training (2009). European guidelines for validating non-formal and informal learning. Luxembourg. http://www.cedefop.europa.eu/en/Files/4054_EN.PDF.
Colley, H., Hodkinson, P., & Malcolm, J. (2003). Informality and Formality in Learning. Learning and Skills Research Centre, University of Leeds. http://www.uk.ecorys.com/europeaninventory/publications/concept/lsrc_informality_formality_learning.pdf.
Fischer, M. (2000). Von der Arbeitserfahrung zum Arbeitsprozeßwissen. Rechnergestützte Facharbeit im Kontext beruflichen Lernens. Opladen: Leske + Budrich, unchanged new edition: Berlin et al.: Springer.
Fischer, M., Boreham, N. C., & Nyhan, B. (eds.) (2004). European Perspectives on Learning at Work: The Acquisition of Work Process Knowledge. Cedefop Reference Series. Luxembourg: Office for Official Publications for the European Communities.
Fischer, M., Huber, K., Mann, E. & Röben, P. (2014): Informelles Lernen und dessen Anerkennung aus der Lernendenperspektive – Ergebnisse eines Projekts zur Anerkennung informell erworbener Kompetenzen in Baden-Württemberg. In: bwp@ Berufs- und Wirtschaftspädagogik – online, Ausgabe 26, pp. 1–21.
Fischer, M. (2018): Verfahren der Messung beruflicher Kompetenzen/ Kompetenzdiagnostik. In: R. Arnold/A. Lipsmeier/M. Rohs (Hrsg.): Handbuch Berufsbildung. Springer Reference Sozialwissenschaften. Wiesbaden: Springer VS, S. 263–277. DOI https://doi.org/10.1007/978-3-658-19372-0_22-1.
Frieling, E./Kauffeld, S./Grote, S. (2000): Fachlaufbahnen für Ingenieure – Ein Vorgehen zur systematischen Kompetenzentwicklung. In: Zeitschrift für Arbeitswissenschaft, 54, pp. 165–174.
OECD – Organisation for Economic Co-operation and Development (2006). New OECD Activity on Recognition of non-formal and informal Learning. Guidelines for Country Participation. http://www.oecd.org/edu/skills-beyond-school/recognitionofnon-formalandinformallearning-home.htm.


02. Vocational Education and Training (VETNET)
Paper

Self-regulation and Formative Feedback in Vocational Education and Training

Ann Karin Sandal, Kjersti Hovland

Western Norway University of Applied Sciences, Norway

Presenting Author: Sandal, Ann Karin; Hovland, Kjersti

The renewal of the Norwegian curricula in 2020, named Knowledge Promotion 2020, aims to strengthen the relevance of school subjects through specific priorities and consequently to "prepare the students for the future" (Norwegian Directorate for Education and Training, 2020). The Overall part of the curricula presents overarching aims related to student agency and especially highlights students' in-depth learning, critical thinking, and learning strategies as a foundation for life-long learning (Norwegian Directorate for Education and Training, 2020). Similar skills, such as cognitive and meta-cognitive skills, critical and creative thinking, learning-to-learn and self-regulation, are presented in the OECD Learning Framework 2030 (2018) and the Conceptual Learning Framework (OECD, 2019), and are reflected in the Norwegian curricula. Reflecting on learning, learning to formulate questions, seeking answers, and expressing one's understanding lay the ground for student agency and learning strategies (Norwegian Directorate for Education and Training, 2020). The overarching aims, values and principles formulated in the Overall part of the curricula apply to primary school and upper secondary education, including vocational education and training (VET). VET in Norway is part of the formal upper secondary education system (age 16-19). In the main model, VET is organised as two years in school (including practice placement periods) and two years in apprenticeship, also referred to as the 2+2 model.

In line with the educational policies and aims for future education in many countries, inspired by the OECD, there is an extensive body of research on self-regulation (SR) and learning strategies. Self-regulation, as a trait related to motivation and assessment for learning (Smith et al., 2016), concerns how learners set goals and learn to monitor, regulate, and control cognition, motivation, and behavior to reach those goals (Andrade, 2010; Smith et al., 2016). Zimmerman (2000) emphasises feedback from the self (self-assessment) and from significant others as important for the development of SR skills, and an elaboration of the interplay and dialogue between self-regulation and feedback is also found in Hattie and Timperley's work (2007). In their feedback model, students' agency and SR are stimulated and supported by dialogues related to learning goals and by specific and timely feedback during the learning process, including information about the next step. Feedback can thus be defined as information provided to the learner about performance, aiming to promote further learning (Hattie & Timperley, 2007; Shute, 2008). Such feedback functions formatively if the student can use it (Black & Wiliam, 1998; Wiliam, 2011).

Formative feedback has been included in the Norwegian assessment regulations since 2006 and has been implemented at all levels in the school system. In the renewal of the curricula in 2020, student agency in assessment is emphasized and related to self-regulation and learning strategies.

However, the concepts of SR and formative feedback have not been contextualized to a substantial extent in VET (Panadero, 2017; Panadero et al., 2018). In Norwegian VET, the interpretation and implementation of the overarching aims in the curricula and of the assessment regulations in upper secondary school are often dominated by the traditions and methods of the general study subjects, including assessment practices (Sandal, 2021). The study this paper reports on was therefore established to investigate how VET students develop SR through formative assessment. The research question is: How do VET students perceive and experience formative feedback as promoting self-regulation skills? An underlying premise of the study is that formative feedback is an approach to, and has potential for, stimulating SR skills.


Methodology, Methods, Research Instruments or Sources Used
Data were collected through three qualitative focus group interviews (Liamputtong, 2011) with VET students (N=11) in the VET program Health and Youth Development in an upper secondary school in Norway. The students were recruited from three classes through their teachers, who had participated in a professional development program in assessment. We used a semi-structured interview guide with the following themes: the feed-up phase/discussions of learning goals and criteria, formative feedback practices, self-assessment and peer assessment, and feed forward/discussions of the next step in the learning process, drawing on these concepts in the literature (e.g., Andrade, 2010; Black & Wiliam, 1998; Hattie & Timperley, 2007; Zimmerman, 2000). The learning context for the themes was mainly the school-based part of the vocational education, that is, teaching in classrooms and workshops/laboratories. The interviews were recorded, transcribed, and analyzed using NVivo (QSR International). The initial coding of the transcripts formed the basis for meaning condensation and categories (Kvale & Brinkmann, 2009). However, since the research question addresses the relation between formative feedback and SR skills, the analysis took a deductive approach when establishing categories. The study was conducted according to ethical guidelines in research (NSD – Norwegian centre for research data).
Conclusions, Expected Outcomes or Findings
The preliminary findings show a structure and teaching design in the classrooms and workshops that enable formative feedback at all stages of the learning processes, although to varying degrees. The students report that the teachers put emphasis on discussing the learning goals with them, for example what to learn and why and how they are going to work, as well as assessment criteria and success criteria. However, the learning goals are decided by the teachers, and the students are not invited into processes of formulating individual learning goals. During the learning processes, the students both receive and seek feedback from the teacher and peers. They report that they receive detailed and specific feedback, and in some classes they are given time to practice and try again, for example taking a test twice. At the same time, we find that the students are preoccupied with grades and summative assessments and do not by themselves use summative feedback to learn further. Student agency is found, for example, in teachers engaging the students in discussions of methods, cooperation on work tasks, and the design of assignment questions and work tasks. Stimulating student agency and SR is also found in the frameworks for self-assessment, where teachers invite the students to assess their learning achievement, effort, and results, and to articulate their need for support and feedback.
By analyzing the data through a conceptual framework of formative feedback, we find indications of formative assessment activities supporting students' development of SR skills. However, this perspective has several weaknesses. The teachers' intentions with formative feedback are not clearly expressed, nor is it clear whether the development of SR skills is embedded in their reasoning and practice. There is also a need for in-depth analysis of the learning methods, especially in the workshops, to understand SR in VET.

References
Andrade, H.L. (2010). Students as the definitive source of formative assessment: Academic self-assessment and the self-regulation of learning. In H. Andrade & G. J. Cizek (Eds.), Handbook of formative assessment (pp. 90–105). New York, NY: Routledge
Black, P. & Wiliam, D. (1998). Inside the Black Box: Raising standards through classroom assessment. Phi Delta Kappan, 80(2), 139-148.
Hattie, J. & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77(1), 81–112.
Kvale, S. & Brinkmann, S. (2009). Interview: Introduktion til et håndværk (2. udgave). København: Hans Reitzels Forlag.
Liamputtong, P. (2011). Focus Group Methodology. Principles and Practice. London: Sage Publications.
Norwegian Directorate for Education and Training (2020). Core curriculum – values and principles for primary and secondary education, chapter 2.4. Oslo.
OECD (2018). The future of education and skills. Education 2030. http://www.oecd.org/education/2030/E2030%20Position%20Paper%20(05.04.2018).pdf. Retrieved 26.02.2019.
OECD (2019). Future of Education and Skills 2030. Conceptual learning Framework Skills for 2030. https://static1.squarespace.com/static/5e26d2d6fcf7d67bbd37a92e/t/5e411f365af4111d703b7f91/1581326153625/Education-and-AI.pdf
Panadero, E. (2017). A Review of Self-regulated Learning: Six Models and Four Directions for Research. REVIEW article. Frontiers in Psychology, 28 April 2017. https://doi.org/10.3389/fpsyg.2017.00422
Panadero, E., Garcia, D., & Fraile, J. (2018). Self-Assessment for Learning in Vocational Education and Training. In S. McGrath, M. Mulder, J. Papier, & R. Suart (Eds.), Handbook of Vocational Education and Training: Developments in the Changing World of Work (pp. 1-12). Cham: Springer International Publishing
QSR International. What is NVivo? Software that supports qualitative and mixed methods research. http://www.qsrinternational.com/nvivo/what-is-nvivo.
Sandal, A.K. (2021). Vocational teachers` professional development in assessment for learning, Journal of Vocational Education & Training
Shute, V. (2008). Focus on formative feedback. Review of Educational Research, 78(1), 153-189.
Smith, K., Gamlem, S.M., Sandal, A.K. & Engelsen, K.S. (2016). Educating for the future: A conceptual framework of responsive pedagogy. Cogent Education, 3(1), 1-12.
The Norwegian Directorate for Education (2018). Fagfornyelsen. https://www.udir.no/laring-og-trivsel/lareplanverket/fagfornyelsen/.
Wiliam, D. (2011). What is assessment for learning? Studies in Educational Evaluation, 37(1), 3-14.
Zimmerman, B.J. (2000). Attaining self-regulation: A social cognitive perspective. In M. Boekaerts, P.R. Pintrich & M. Zeider (Eds.), Handbook of self-regulation. New York: Academic


 
Conference: ECER 2023