Conference Agenda

All times are shown in the time zone of the conference.

Session Overview
06 SES 02 A: Focussing Media Literacies and Competencies: Data Privacy, Fake News and Algorithms
Time: Tuesday, 22/Aug/2023, 3:15pm - 4:45pm

Session Chair: Theo Hug
Location: Gilbert Scott, G466 LT [Floor 4]

Capacity: 114 persons

Paper Session

Presentations
06. Open Learning: Media, Environments and Cultures
Paper

Preparing Elderly Adult Communities for the Digital Culture: Understanding the Role of a Mediator for Data Privacy

Tobias Hölterhof, Daniela Thomas

Catholic University of Applied Science, Germany

Presenting Author: Hölterhof, Tobias; Thomas, Daniela

Everyday life is becoming increasingly interwoven with digitally shaped practices. Digitalisation meanders through every part of society, connecting people, building networks and offering opportunities for social participation (Klenk et al., 2020). To prepare society for fruitful and safe digital practices, the development of digital competences must be complemented with issues of data privacy and security. Elderly people in particular are considered vulnerable in this regard, as they are often untrained and show security-related reservations (Rathgeb et al., 2022). This vulnerability reduces social participation and quality of life and increases isolation and loneliness (Rathgeb et al., 2022; Chopik, 2016). In light of other demographic developments, such as the growing number of chronic diseases and increasing singularisation, engaging with digital vulnerability becomes ever more relevant. Yet typical educational offerings for data privacy are not focused on the particular demands of elderly adult communities (Doh et al., 2018).

A few educational design projects, mostly in Germany, address the digital competences of elderly adult communities with an emphasis on peer learning. Projects like “FUTA” (Doh et al., 2015), “KommmIT” (Doh et al., 2021), “QuartiersNetz” (Stiel, 2021) and “Gemeinsam in die digitale Welt” (Barczik, 2020) are all characterised by an indirect, mediator- or peer-oriented approach, but without a deep reflection on the particular role of mediators or peers in this community. These projects use metaphors like ambassadors or companions to describe the role of persons mediating aspects of digital culture in elderly communities. Among other activities, they provide and evaluate workshops that prepare mediators to act as agents of a digital culture in their communities. Interestingly, the workshops are often complemented with textbooks or handbooks. Furthermore, the underlying concept of cultural ambassadors or educational companions traces back to the educational psychology of peer learning, or to the methodology used in the project “Medienscouts NRW” (Kerres et al., 2012), one of the first informal educational projects to address digital competences in Germany, though focused on school children rather than elderly persons. What it means for an elderly peer to be a mediator of aspects of digital culture, however, often remains uncertain and imprecise. This conceptual gap matters because, beside developing digital practices, these projects also aim at preparing elderly adults to be mediators.

To introduce elderly adults into a digital culture and prepare them for secure digital practices, an understanding of what being a mediator means within that specific community is essential. To develop a deeper understanding of that role, the current study follows both a conceptual and an empirical approach and is embedded in the project “CrossComITS”, funded by the German Federal Ministry of Education and Research. The project aims at training mediators for the data privacy of elderly adults and develops a digital platform for their mediating practices in their communities. Beside drawing on similar projects, the conceptual part consists of an analysis of media-philosophical and educational-theoretical concepts of mediators. Here, Sybille Krämer's messenger model (Krämer, 2016) has been combined with the perspective of educational psychology, especially Kolb's theory of experiential learning (Kolb, 2015). The empirical part consists of an analysis of four interviews with adults acting as mediators in different contexts. The analysis aims at identifying typical characteristics of being a mediator in order to derive design principles for workshops that prepare elderly adults to act as mediators for secure digital practices in their communities.


Methodology, Methods, Research Instruments or Sources Used
The research question focuses on developing a conceptual and evidence-based understanding of mediation with regard to an educational design project for the data privacy of elderly adult communities. The methodology therefore comprises 1. a conceptual analysis of mediation, 2. an analysis of similar educational projects and 3. guideline-based interviews with elderly mediators in different contexts about their motivations and understandings of the role of mediation. We analysed four similar projects and have conducted four interviews so far. The findings are incorporated into the development of a pedagogical concept. Furthermore, they lead to a deeper understanding of a theoretical framework on peer education and the notion of a mediator.

The conceptual analysis (1.) draws on concepts from educational psychology and media philosophy. Krämer's messenger model and Kolb's multi-contextual understanding of the role of the educator were analysed. Krämer classifies media as heteronomous and aims to shed light on the conditions and contexts of media and the phenomenon of transmission (Krämer, 2016). Kolb's holistic model of education discusses various roles of educators in everyday life as well as their community aspects (Kolb, 2015). Both emphasise, among other things, trustworthiness and the ability to bridge differences by adapting to an individual's abilities as key messenger characteristics.

The literature review of similar projects (2.) revealed four projects in Germany. "FUTA" (Doh et al., 2015) and "KommmIT" (Doh et al., 2021) focus on elderly people as users or potential users of digital devices, with peer learning in private communities. "QuartiersNetz" (Stiel, 2021) aims at describing a profile of media-savvy volunteers and developing media training for elderly communities. "Gemeinsam in die digitale Welt" (Barczik, 2020) aims at the mediation and enhancement of digital competences of older people through peer-to-peer learning. The literature provides insights into the content structure of mediator training, different mediator metaphors (technology ambassador, facilitator) and descriptions of mediator characteristics such as trustworthiness, credibility and the ability to adapt input to the individual needs of the recipient. A deeper reflection on mediator and peer-education approaches is missing.

Interviewees for the guideline-based interviews (3.) were selected among adults with experience of volunteering as mediators in informal adult learning communities. Following an interpretative approach to unveil the mediator role and the experiences and understandings of mediating in informal contexts, the results sketch interviewees' understandings of being successful, of disseminating innovations into a community, and their motives for volunteering. Further interviews are planned.

Conclusions, Expected Outcomes or Findings
Peer education seems, in principle, to be a suitable format for developing digital competences and minimising digital vulnerability in communities of older adults. However, the analysis of similar projects shows a lack of general conceptual understanding of, and reflection on, the role of mediators and peer education in the field. Peer education still seems to be an umbrella term, as noted by Shiner (1999).
For an in-depth engagement with the concept of the mediator, reflection from a media-philosophical (Krämer) and a pedagogical (Kolb) perspective has proven supportive. It aligns the empirical findings (the analysed projects and the interpreted interviews) with theoretical concepts and thereby establishes a deeper understanding. This leads to the conclusion that mediators take on different roles while mediating. To increase digital literacy in elderly communities, it is necessary to be responsive to individual needs and to be seen as trustworthy and reliable. This, in turn, enables the mediator to create a learning environment that meets individuals' daily needs and to moderate, facilitate and empower older adults in order to introduce them to a digital culture and prepare them for safe digital practices.

References
Barczik, K. (2020). Stärkung der digitalen Medienkompetenz bei Älteren im ländlichen Raum: Qualifizierung von Technikbotschaftern und Anwendung der Peer- to-Peer Didaktik Bericht zum Projekt „Gemeinsam in die digitale Welt“ an der Volkshochschule Zwickau (Projektbericht Heft 14; Edition VHS Aktuell - Beiträge zur Weiterbildung). VHS Sachsen. https://vhs-sachsen.de/wp-content/uploads/2022/08/Barzcik_Zwickau_Projektbericht_gesamt.pdf

Chopik, W. J. (2016). The Benefits of Social Technology Use Among Older Adults Are Mediated by Reduced Loneliness. Cyberpsychology, Behavior, and Social Networking, 19(9), 551–556. https://doi.org/10.1089/cyber.2016.0151

Doh, M., Schmidt, L., Herbolsheimer, F., Jokisch, M. R., Schoch, J., Dutte, A. J., Rupprecht, F., & Wahl, H.-W. (2015). Neue Technologien im Alter. Ergebnisbericht zum Forschungsprojekt „FUTA“: Förderliche und hinderliche Faktoren im Umgang mit neuen Informations- und Kommunikations-Technologien im Alter.

Doh, M., Jokisch, M. R., Rupprecht, F., Schmidt, L., & Wahl, H.-W. (2018). Förderliche und hinderliche Faktoren im Umgang mit neuen Informations- und Kommunikations-Technologien im Alter. In: C. Kuttner & C. Schwender (Hrsg.), Mediale Lehr-Lern-Kulturen im höheren Erwachsenenalter (Bd. 12, S. 223–242). kopaed.

Doh, M., Jokisch, M. R., Jäkh, S., Scheling, L., & Wahl, H.-W. (2021). KommmIT. Kommunikation mit intelligenter Technik. Ergebnisbericht der wissenschaftlichen Begleitung. https://www.lfk.de/medienkompetenz/seniorinnen-und-senioren/kommmit

Kerres, M., Rohs, M., & Heinen, R. (2012). Evaluationsbericht Medienscouts NRW (Band 46/Online; LfM-Dokumentation, S. 54). https://www.medienscouts-nrw.de/wp-content/uploads/2018/02/L131_Medienscouts_Evaluation980472252.pdf

Klenk, T., Nullmeier, F., & Wewer, G. (2020). Auf dem Weg zum Digitalen Staat?: Stand und Perspektiven der Digitalisierung in Staat und Verwaltung. In: T. Klenk, F. Nullmeier, & G. Wewer (Hrsg.), Handbuch Digitalisierung in Staat und Verwaltung (S. 3–23). Springer Fachmedien Wiesbaden.

Kolb, D. A. (2015). Experiential learning: Experience as the source of learning and development (Second edition). Pearson Education, Inc.

Krämer, S. (2016). The Messenger as a Model in Media Theory. Reflections on the Philosophical Dimensions of Theorizing Media. In N. Friesen (Hrsg.), Media Transatlantic: Developments in Media and Communication Studies between North American and German-speaking Europe (S. 197–213). Springer International Publishing. https://link.springer.com/content/pdf/10.1007/978-3-319-28489-7.pdf?pdf=button

Rathgeb, T., Doh, M., Tremmel, F., Jokisch, M. R., & Groß, A.-K. (2022). SIM-Studie 2021: Senior*innen, Information, Medien. Basisuntersuchung zum Medienumgang älterer Personen ab 60 Jahren (Medienpädagogischer Forschungsverbund Südwest (mpfs), Hrsg.). https://www.mpfs.de/fileadmin/files/Studien/SIM/2021/Web_SIM-Studie2021_final_barrierefrei.pdf

Stiel, J., Brandt, M., & Bubolz-Lutz, E. (2018). Technikbotschafter*in für Ältere werden. Lernformate im freiwilligen Engagement ,Technikbegleitung’. In: C. Kuttner & C. Schwender (Hrsg.), Mediale Lehr-Lern-Kulturen im höheren Erwachsenenalter (S. 201–221). kopaed.

Shiner, M. (1999). Defining peer education. Journal of Adolescence, 22, 555–566.


06. Open Learning: Media, Environments and Cultures
Paper

“But Wait, That Isn’t Real”: Evaluating ‘Project Real’, a Co-created Intervention Which Helps Young People to Spot Fake News Online

Yvonne Skipper1, Daniel Jolley2, Joe Reddington3

1University of Glasgow, United Kingdom; 2University of Nottingham; 3eQuality Time

Presenting Author: Skipper, Yvonne

Fake news is intentionally fabricated news content that is verifiably false and could mislead its audience (Tandoc et al., 2018). The World Economic Forum (2013) ranked the spread of misinformation as one of the top risks facing the world today. The “fake news pandemic” (Rajan, 2020) impacts public views on topics as varied as climate change and vaccines, reducing the perceived seriousness of these issues and undermining both science and society (Lewandowsky et al., 2017; van der Linden et al., 2017). Fake news spreads six times faster online than the truth and can therefore reach more people more quickly (Vosoughi et al., 2018). Furthermore, people believe fake news around 75% of the time (Silverman & Singer-Vine, 2016), meaning that many millions of people may have been fooled by it (Allcott & Gentzkow, 2017). Indeed, YouGov (2017) found that while many people believe they can tell the difference between real and fake news, only 4% of those surveyed could systematically differentiate the two. People across Europe are concerned about misinformation in their information environment (Hameleers, Brosius, & De Vreese, 2021). Fake news impacts not only people's views but also their behaviour: Gunther and colleagues (2018) found that fake news affected how individuals voted in the 2016 US presidential election. It is therefore vital that we take steps to develop people's confidence and skills in recognising fake news, and that we help young people develop these skills early.

Whilst great strides are being made in the fight against online misinformation, much of the research on fake news focuses on adults, and less is known about young people. This is a notable blind spot, as many young people seek out news via social media; around 54% say they get their news from social media (Common Sense Media, 2019). Young people report using social media as a source of news because they find traditional news boring and difficult to understand (Marchi, 2012). However, social media is notorious for spreading fake news: for example, Facebook leads to referrals to untrustworthy news sources over 15% of the time, compared to 6% for authoritative news sources (Guess, Nyhan, & Reifler, 2020). As more than 71% of adolescents have a social media profile (Ofcom, 2019) and more than 60% of 12-15-year-olds report that they do not think about the credibility of news stories on social media (Ofcom, 2018), it has been suggested that digital media literacy should be a pillar of education (Select Committee on Communications, 2017). In fact, the Commission on Fake News and Critical Literacy in Schools (National Literacy Trust, 2018) found that only 2% of young people had the skills needed to ascertain whether news was true, and 60% reported that they trusted news less because of fake news. Furthermore, Herrero-Diz et al. (2020) found that young people cared less about the accuracy of news than its novelty or uniqueness and may not realise the damaging effects of sharing fake news. Thus, it is vital to increase young people's awareness, confidence and skills to help them recognise fake news online.

Therefore, we co-created a fake news intervention ‘Project Real’, in collaboration with young people and influencers, alongside support from teachers. We hypothesised that participating in Project Real would lead participants to:

H1: become more confident in their ability to recognise fake news.

H2: show increased ability to recognise fake news.

H3: intend to make more checks about news stories before sharing them.


Methodology, Methods, Research Instruments or Sources Used
Participants
One hundred and twenty-six participants completed both the pre- and post-test: 13 were aged 11, 45 aged 12 and 68 aged 13. Seventy-five were female, 42 male and 3 nonbinary; six preferred not to state their gender. Seventy-three were White, 28 Asian or Asian British, 10 Black, 11 Mixed or from multiple ethnic groups, and 4 from other ethnic groups. Participants for the focus groups were 27 pupils from two schools. Five teachers from two schools participated in the interviews; two had engaged in the co-creation and three had not.
Materials
Intervention co-creation
Pupils and teachers from three schools in Glasgow, social media influencers and academics co-created the intervention. Each school group discussed fake news and developed general ideas about which topics the project should cover and the format it should take. They created hour-long sessions with PowerPoint slides, short videos from the influencers and interactive activities. The topics were fake news, fake people, fake photos, fake stories (conspiracies), fake videos and, finally, keeping it real (where participants developed materials to teach other young people about fake news).
Pre- and post- questionnaire
To understand young people’s use of social media to access news, we asked what websites they used for news. There were options such as Youtube and Instagram, as well as space to give their own answer or to state they did not use social media for news.
To examine how confident participants were in identifying fake news, we asked participants to answer three questions including: “Generally speaking, how confident do you feel in identifying fake news?”
To examine participants' ability to identify fake news, we used a task from Maertens et al. (2021). Participants were shown four news headlines in the format of a ‘Tweet’ and asked how accurate and trustworthy the news was, how confident they were in this belief and whether they would share it.
We then asked participants what checks they would make before sharing a news story to ascertain their current behavioural practices. This was answered by selecting options such as “check if it was a trustworthy website”.
Focus groups and interviews
Semi-structured interview schedules were developed for teachers and pupils, which included questions about whether and how Project Real had impacted their behaviours and confidence in recognising fake news, for example “Did Project Real help you to feel more confident in recognising fake news?”.

Conclusions, Expected Outcomes or Findings
After completing Project Real, participants rated themselves as more confident in recognising fake news (H1). They also intended to make more checks on news before sharing it (H3). However, their ability to recognise fake news did not significantly improve (H2). Qualitative data from teachers and pupils indicated that Project Real subjectively increased their confidence in recognising fake news and their awareness of it. They also valued that the project had been co-created with young people and influencers.
Our findings show that Project Real increased participants' confidence in recognising fake news and their intentions to make more checks before sharing news. This builds on previous research suggesting that analytical thinking (Pennycook & Rand, 2020) and warnings about fake news (e.g., Ecker et al., 2010) can make people less likely to share misinformation. However, Project Real did not increase participants' ability to recognise fake news. One potential reason for this is a measurement issue: participants could not make any checks before responding to the questionnaire. The intervention was built around using a checklist to help participants identify fake news, but our measure did not allow them to use it. Had we allowed participants to make checks before giving their answers, or asked about their behavioural intentions, we might have found improvements in their ability to recognise fake news. Indeed, we found that after the intervention they intended to make more of these checks before sharing news.
In the last 10 months, the Project Real website has been visited by 33,000 users, and 15% of those visitors have downloaded all the resources. While most users have been in the UK, there has been international interest, including from Ukraine. Those considering developing similar interventions may therefore also want to use co-creation to maximise their reach and impact.

References
Guess, A., Nyhan, B., & Reifler, J. (2018). Selective exposure to misinformation: Evidence from the consumption of fake news during the 2016 US presidential campaign. European Research Council (Working Paper). Retrieved from http://www.dartmouth.edu/~nyhan/fakenews‐2016.pdf
Gunther, R., Beck, P. A., & Nisbet, E. C. (2018). “Fake news” and the defection of 2012 Obama voters in the 2016 presidential election. Electoral Studies, 61. DOI: 10.1016/j.electstud.2019.03.006
Lewandowsky, S., Ecker, U. K., & Seifert, C. M. (2012). Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest, 13, 106–131. DOI: 10.1177/1529100612451018
Lewandowsky, S., Ecker, U. K. & Cook, J. (2017). Beyond misinformation: understanding and coping with the “Post-Truth” era. Journal of Applied Research in Memory and Cognition, 6, 353–369. DOI: 10.1016/j.jarmac.2017.07.008
Marchi, R. (2012). With Facebook, blogs, and fake news, teens reject journalistic “objectivity”. Journal of Communication Inquiry, 36, 246–262. DOI: 10.1177/0196859912458700
Rajan, A. (2020, March 14). Coronavirus and a fake news pandemic—BBC News. BBC. https://www.bbc.com/news/entertainment-arts-51858555
Tandoc, E. C., Lim, Z. W., & Ling, R. (2018). Defining “fake news.” Digital Journalism, 6, 137–153. DOI: 10.1080/21670811.2017.1360143
van der Linden, S., Leiserowitz, A., Rosenthal, S., & Maibach, E. (2017). Inoculating the public against misinformation about climate change. Global Challenges, 1, 1600008. DOI: 10.1002/gch2.201600008
World Economic Forum (2013). Outlook on the Global Agenda 2014. Retrieved from: http://reports.weforum.org/outlook-14/.


06. Open Learning: Media, Environments and Cultures
Paper

Making Sense of Video Recommendations. A Qualitative Study on Children’s Algorithm Literacy in German-Speaking Switzerland

Julian Ernst

Justus-Liebig-Universität Gießen, Germany

Presenting Author: Ernst, Julian

Algorithms are a central structural element of digitalized environments. In particular, so-called recommendation engines are a type of algorithm especially relevant to everyday media use (Schrage, 2020): they are implemented in platforms such as YouTube, Instagram or TikTok, which many children in German-speaking Switzerland access routinely (Waller et al., 2019). The output of these systems are recommendations that suggest specific content, for example videos, and continuously adapt those suggestions to the user (Louridas, 2020). Recommendation systems are not used solely for the selfless purpose of paving paths through the unmanageable number of possible retrieval options: they are closely intertwined with commercial interests (Beer, 2017), reflect dominant social categories (Noble, 2018) and can shape how users construct reality (Just & Latzer, 2017).
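The adaptive logic described above can be illustrated with a deliberately minimal sketch. It is purely hypothetical, not any platform's actual system: real recommendation engines weigh engagement signals, freshness and commercial objectives, not just content similarity, and the function and data names here are invented for illustration.

```python
from collections import Counter

def recommend(watch_history, catalog, k=3):
    """Rank unwatched videos by tag overlap with the user's history.

    A toy stand-in for a recommendation engine: the more a candidate
    video's tags match what the user already watched, the higher it
    ranks. Re-running after each new video "continuously adapts" the
    suggestions, as described in the abstract.
    """
    # Build a tag profile from the videos the user has already watched.
    profile = Counter(tag for vid in watch_history for tag in catalog[vid])
    # Score every unwatched video by how well its tags match the profile.
    scores = {
        vid: sum(profile[tag] for tag in tags)
        for vid, tags in catalog.items()
        if vid not in watch_history
    }
    # Return the k highest-scoring videos.
    return sorted(scores, key=scores.get, reverse=True)[:k]
```

Even this crude sketch makes the critique above concrete: the ranking is driven entirely by past behaviour, so whoever controls the scoring function (here, plain tag overlap) controls which paths through the catalog a user is shown.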

From a media-education perspective, therefore, the ability to critically reflect on algorithms, to act self-determinedly in relation to them and thus to constructively shape the societies in which they are embedded has become increasingly important: algorithm literacy. Sharing basic assumptions of the digital and media literacy approach as proposed by Hobbs (2021), algorithm literacy refers to “the ever-changing set of knowledge, skills, and habits of mind” (ibid., p. 4) in relation to algorithms. On this basis, an ideal “algorithm literate” person demonstrates an awareness of the operation of algorithms, has knowledge of how algorithms work, is able to critically evaluate algorithms or their results, and has the skills to actively engage with them (Dogruel et al., 2022; Swart, 2021). Becoming literate in algorithms must not be understood solely as an effect of pedagogical efforts. Based on socio-phenomenological assumptions about the importance of everyday experience for the formation of competencies (Berger & Luckmann, 2016), the acquisition of algorithm literacy can also be seen as rooted in everyday interaction with algorithm-driven platforms. Although algorithms are not directly visible to people in everyday use, people make sense of their output, even without knowing the mathematical-technical details or the term “algorithm” (Bucher, 2018). In this sense, several empirical studies have addressed aspects of algorithm literacy in adolescents and adults (Bell et al., 2022; Brodsky et al., 2020; Swart, 2021). However, while the role of algorithms, including recommender systems, in children's digital “ecosystems” has been extensively discussed (Cotter & Reisdorf, 2020), few empirical studies contribute to knowledge about children's algorithm literacy.

Against this background, I conducted a qualitative study on children's algorithm literacy in primary school. The present study addresses this desideratum by empirically investigating children's algorithm literacy through the following research questions:

  • What are children's everyday experiences with algorithmic recommendations? (RQ1)
  • In which ways do they explain algorithmic recommendations? (RQ2)
  • How and to what extent do they criticize algorithmic recommendations? (RQ3)

The presentation will cover key findings of the study and implications for the teaching of algorithms in the context of media education in primary schools.


Methodology, Methods, Research Instruments or Sources Used
To investigate children's algorithm literacy, I chose a qualitative research approach. In total, I have conducted about 26 group discussions with 120 children between the ages of 11 and 13. The groups were recruited in cooperation with different primary schools in the canton of Zurich. To create a regionally diverse sample, the schools were selected from urban, suburban as well as rural districts.
In the absence of empirical research specifically on children's algorithm literacy that would have allowed the formulation of falsifiable hypotheses, and in the absence of a suitable instrument for quantification, this study is exploratory. Instead of deductively applying a set of skills, algorithm literacy was analyzed from the bottom up, starting with children's everyday experiences and their life-world situated assessments.
The understanding of algorithm literacy acquisition as located in everyday media usage was reflected in the design of the focus groups: conceptualizing algorithms as “experience technologies” (Cotter & Reisdorf, 2020), the discussions focused on a specific phenomenon through which algorithmic systems appear in users' daily lives: video recommendations on the platform YouTube, which continuously enjoys great popularity among the majority of children in German-speaking Switzerland (Waller et al., 2019). Materials included authentic screenshots of recommendations as well as recommendation bars. Furthermore, in contrast to other studies (e.g. Bell et al., 2022; Dogruel et al., 2020; Gran et al., 2021), the moderators did not explicitly ask about “algorithms”. Instead, the focus groups discussed everyday experiences with the phenomenon of video recommendations, possible explanations for their genesis, and wishes for change related to them. All focus group sessions were video-taped. The analysis is carried out through a combination of open coding of the video material and ethnomethodologically oriented fine-grained analysis of selected sequences transcribed for this purpose (Garfinkel, 1984).

Conclusions, Expected Outcomes or Findings
The presentation will outline the results of the study. So far, the analysis has revealed the following key everyday experiences that children have with algorithmic recommendations (RQ1): First, children in all groups report experiencing both pleasant and unpleasant emotions when interacting with algorithmic recommendations. While pleasure and fun are described when recommendations match one's interests and situational motives for use, negative emotions occur when recommendations do not match expectations. Second, children report observing certain quasi-algorithmic “logics” of recommendations: orders and hierarchies in the way recommendations relate to each other on the surface of a platform. These experiences are described as platform-specific. Children's explanations of algorithmic recommendations (RQ2) focus on the activities of different actors: the appearance of a recommendation is explained in terms of (a) their own media use, (b) the use of parents or siblings, or (c) the actions of vaguer 'others' such as 'YouTube'. In the discussion of algorithmic recommendations, criticism also arises (RQ3): on the one hand, negative effects of age-inappropriate video recommendations on children were discussed; in addition to “better” personalization by the platform, regulation and the platform's responsibilities were also debated. On the other hand, the entanglement of one's own time and attention with the commercial functioning of the platforms also became an issue.
Overall, the results point to a variety of manifestations of algorithm literacy. In the focus groups, children show awareness not only of algorithmic operations but also of the affects that recommendations might trigger. Children also demonstrate knowledge of how recommendation algorithms might work, especially their socially intertwined character. Furthermore, the questions raised in the discussions about the attention economy of platforms and the protection of children from harmful effects demonstrate the ability to critically evaluate algorithms in societal contexts.

References
Beer, D. (2017). The social power of algorithms. Information, Communication & Society, 20(1), 1–13. https://doi.org/10.1080/1369118X.2016.1216147
Bell, A. R., Tennfjord, M. K., Tokovska, M., & Eg, R. (2022). Exploring the role of social media literacy in adolescents’ experiences with personalization: A Norwegian qualitative study. Journal of Adolescent & Adult Literacy. https://doi.org/10.1002/jaal.1273
Berger, P. L., & Luckmann, T. (2016). Die gesellschaftliche Konstruktion der Wirklichkeit: Eine Theorie der Wissenssoziologie (M. Plessner, Trans.; 26th ed.). Fischer.
Brodsky, J., Zomberg, D., Powers, K., & Brooks, P. (2020). Assessing and fostering college students’ algorithm awareness across online contexts. Journal of Media Literacy Education, 12(3), 43–57. https://doi.org/10.23860/JMLE-2020-12-3-5
Bucher, T. (2018). If...Then. Algorithmic Power and Politics. Oxford University Press.
Cotter, K., & Reisdorf, B. C. (2020). Algorithmic Knowledge Gaps: A New Horizon of (Digital) Inequality. International Journal of Communication, 14(0), Article 0.
Dogruel, L., Facciorusso, D., & Stark, B. (2020). ‘I’m still the master of the machine.’ Internet users’ awareness of algorithmic decision-making and their perception of its effect on their autonomy. Information, Communication & Society, 25(9), 1311–1332. https://doi.org/10.1080/1369118X.2020.1863999
Dogruel, L., Masur, P., & Joeckel, S. (2022). Development and Validation of an Algorithm Literacy Scale for Internet Users. Communication Methods and Measures, 16(2), 115–133. https://doi.org/10.1080/19312458.2021.1968361
Garfinkel, H. (1984). Studies in Ethnomethodology (1st ed.). Polity.
Gran, A.-B., Booth, P., & Bucher, T. (2021). To be or not to be algorithm aware: A question of a new digital divide? Information, Communication & Society, 24(12), 1779–1796. https://doi.org/10.1080/1369118X.2020.1736124
Hobbs, R. (2021). Media Literacy in Action. Questioning Media. Rowman & Littlefield.
Just, N., & Latzer, M. (2017). Governance by algorithms: Reality construction by algorithmic selection on the Internet. Media, Culture & Society, 39(2), 238–258. https://doi.org/10.1177/0163443716643157
Louridas, P. (2020). Algorithms. MIT Press.
Mascheroni, G., & Siibak, A. (2021). Datafied Childhoods: Data Practices and Imaginaries in Children’s Lives (Vol. 124). Peter Lang.
Noble, S. U. (2018). Algorithms of Oppression. How Search Engines Reinforce Racism. New York University Press.
Schrage, M. (2020). Recommendation Engines. MIT Press.
Swart, J. (2021). Experiencing Algorithms: How Young People Understand, Feel About, and Engage With Algorithmic News Selection on Social Media. Social Media + Society, 7(2), 20563051211008828. https://doi.org/10.1177/20563051211008828
Waller, G., Suter, L., Bernath, J., Külling, C., Willemse, I., Martel, N., & Süss, D. (2019). Ergebnisbericht zur MIKE-Studie 2019.
Willson, M. (2019). Raising the ideal child? Algorithms, quantification and prediction. Media, Culture & Society, 41(5), 620–636. https://doi.org/10.1177/0163443718798901


 
Conference: ECER 2023