Conference Agenda


Please note that all times are shown in the time zone of the conference.

 
 
Session Overview
Session
06 SES 14 A: Normalizing the Body. Addressing the Lack of Diversity in Digital Technologies and What It Means for Educational Science
Time: Friday, 25/Aug/2023, 9:00am – 10:30am

Session Chair: Klaus Rummler
Session Chair: Aline Nardo
Location: Gilbert Scott, G466 LT [Floor 4]

Capacity: 114 persons

Symposium

Presentations
06. Open Learning: Media, Environments and Cultures
Symposium

Normalizing the Body. Addressing the Lack of Diversity in Digital Technologies and What It Means for Educational Science

Chair: Klaus Rummler (PH Zurich, CH)

Discussant: Aline Nardo (University of Edinburgh, GB)

Social media platforms and other online spaces form a large part of today’s (everyday) culture. Therefore, scholars across various fields are increasingly concerned with the entanglement of media practices, self-presentation, and body images on social media (Aparicio-Martinez et al., 2019; Chua & Chang, 2016; Cruz-Sáez et al., 2020; Mahon & Hevey, 2021). What is still missing from the literature, however, is the intersection between body images, algorithmic systems, and power relations in the context of educational research.

This symposium thus explores ways in which algorithmic systems can shape body images by taking data-informed discrimination (Chun, 2021a), network gaps (Chun, 2021b) and AI/ML systems (Crawford & Paglen, 2019) into account. Our assumption is that algorithmic systems, with their recommendations (Seaver, 2022) of whom to follow and what to see next, produce a largely affirming and normalizing social stream that may have a significant impact on which body images circulate. Rather than greater diversity, we assume that algorithmic influence produces a lack of diversity, and this is where educational science should respond.

With the theme ‘Normalizing the body’, we focus on the role of algorithms in processes of constructing body images that fit societal, often ‘Western’, norms and expectations. We critically question whether the initially widespread promise that digitalization, and specifically the internet and social media platforms, would make participation and representation more diverse can be fulfilled when algorithmic systems are based on discriminatory data.

As such, this symposium addresses the potential biases in and limitations of algorithmic systems, and how these may impact the re-presentation and portrayal of diverse bodies. We do this through three papers: The first, Designing the ‘normal’ body, critically reflects on the normalization of menstruating bodies in the context of self-tracking apps and socio-technical feedback loops. Drawing on media-educational and biopolitical considerations, the paper argues that algorithmic recommendations within menstrual cycle tracker apps have a disciplinary effect since they (re-)produce norms and normalities of (menstruating) bodies. The second paper, Damn Data!, explores the practices and complex entanglements of AI in creative articulative processes as part of media education. In doing so, the paper highlights, on the one hand, the explorative potential of AI/ML systems in the creative play of re-presenting bodies and, on the other hand, reflects on the inherent contingency of digital media practices. The third and final paper, Beauty and the biased, explores diversity on TikTok, focusing on issues of content regulation and biased data used in recommendation systems. To respond to this from an educational perspective, the paper examines how media education, by taking into account the power relations inherent in content regulation and media practices, can help us recognize when actions and policies take forms different from those intended.

All contributions start from an educational perspective while also drawing on approaches that reflect the interdisciplinarity of the subject. In doing so, they pursue the same pressing question: How can and must we think about and research diversity in educational science, taking increasing algorithmic influences into account?


References
Aparicio-Martinez, Perea-Moreno, Martinez-Jimenez, Redel-Macías, Pagliari, & Vaquero-Abellan. (2019). Social Media, Thin-Ideal, Body Dissatisfaction and Disordered Eating Attitudes: An Exploratory Analysis. International Journal of Environmental Research and Public Health, 16(21), 4177. https://doi.org/10.3390/ijerph16214177
Chua, T. H. H., & Chang, L. (2016). Follow me and like my beautiful selfies: Singapore teenage girls’ engagement in self-presentation and peer comparison on social media. Computers in Human Behavior, 55, 190–197. https://doi.org/10.1016/j.chb.2015.09.011
Chun, W. H. K. (2021a). Discriminating data: Correlation, neighborhoods, and the new politics of recognition. The MIT Press.
Chun, W. H. K. (2021b). The Space between Us: Network Gaps, Racism, and the Possibilities of Living in/Difference. Catalyst: Feminism, Theory, Technoscience, 7(2). https://doi.org/10.28968/cftt.v7i2.34903
Crawford, K., & Paglen, T. (2019). Excavating AI: The politics of training sets for machine learning. https://excavating.ai (last accessed: 14.12.2022)
Cruz-Sáez, S., Pascual, A., Wlodarczyk, A., & Echeburúa, E. (2020). The effect of body dissatisfaction on disordered eating: The mediating role of self-esteem and negative affect in male and female adolescents. Journal of Health Psychology, 25(8), 1098–1108. https://doi.org/10.1177/1359105317748734
Mahon, C., & Hevey, D. (2021). Processing Body Image on Social Media: Gender Differences in Adolescent Boys’ and Girls’ Agency and Active Coping. Frontiers in Psychology, 12. https://doi.org/10.3389/fpsyg.2021.626763
Seaver, N. (2022). Computing taste: Algorithms and the makers of music recommendation. University of Chicago Press.

 

Presentations of the Symposium

 

Normalizing the Menstruating Body. Self-Measurement, Algorithmic Recommendation and Little Tools of Knowledge

Lilli Riettiens (Johannes Gutenberg University Mainz)

The increasing measurement of life worlds and the “quantification of the self” in the present are accompanied by “new[] technologies of self-measurement” (Mau, 2018, 167). Self-tracking apps and corresponding tools are “continuously integrated into the course of life” (ibid.) under the digital condition (Stalder, 2018). So-called menstruation apps fall into this category: they enable users to translate bodily and embodied states relating to their menstrual cycle into data by means of predefined categories. The paper takes the phenomenon of menstruation apps as an opportunity to critically reflect on the normalization of the menstruating body in the context of socio-technical feedback loops. In a first step, it argues that the apps, with their predefined categories, can be read as little tools of knowledge (Hess & Mendelsohn, 2013) that have a disciplinary effect on a subjection-theoretical and biopolitical level (Foucault, 2008) insofar as they (re-)produce conceptions of norm or ›normality‹ of menstruating bodies. The crux is: by (un)consciously making their data available, users contribute to generating ›(a-)normality‹ via socio-technical feedback loops (Chun, 2021) and are constantly confronted with having to ask themselves: Is my cycle (currently) ›normal‹? Do my ›symptoms‹, as the app Flo calls them, correspond to what is ›normal‹ for my menstrual status? Accordingly, what might initially appear as ›personalisation‹ and thus diversification reveals itself as normalization in the course of socio-technical feedback loops and algorithmic recommendation. Starting from this problematization, the paper concludes in a second step with critical reflections on the concept of Bildung as a “response to the possibilities of things” (Zirfas & Klepacki, 2013, 43). For while “ideas about people and their behavior are inscribed in technical objects”, it is nevertheless “only the relations in actual activity” (Allert & Asmussen, 2017, 41) within or through which subjects are formed. In qualitative educational research, we must therefore ask whether Bildung under the digital condition (Stalder, 2018) ultimately emerges at the ›edges of the (little) tools‹.

References:

Allert, H., & Asmussen, M. (2017). Bildung als produktive Verwicklung. In H. Allert, M. Asmussen, & C. Richter (Eds.), Digitalität und Selbst: Interdisziplinäre Perspektiven auf Subjektivierungs- und Bildungsprozesse (pp. 27–68). Transcript.
Chun, W. H. K. (2021). Discriminating data: Correlation, neighborhoods, and the new politics of recognition. The MIT Press.
Foucault, M. (2008). The Birth of Biopolitics: Lectures at the Collège de France, 1978–1979. Palgrave Macmillan.
Hess, V., & Mendelsohn, J. A. (2013). Paper Technology und Wissensgeschichte. NTM, 21(1), 1–10.
Mau, S. (2018). Das metrische Wir. Suhrkamp.
Stalder, F. (2018). The digital condition (V. Pakis, Trans.). Polity.
Zirfas, J., & Klepacki, W. (2013). Die Performativität der Dinge: Pädagogische Reflexionen über Bildung und Design. Zeitschrift für Erziehungswissenschaft, 16(2), 43–57.
 

Damn Data! On the Explorative Role of AI in Artistic Processes

Juliane Ahlborn (Bielefeld University)

Complex algorithmic, data-driven infrastructures have rapidly inscribed themselves into almost all processes of everyday life. Although they remain hidden beneath the perceptible surface, they shape how we perceive the world around us and ourselves, and how we think and act under the digital condition (Stalder, 2018). Art is not unaffected by this. The application of AI in creative, artistic processes is becoming increasingly popular, which became obvious at the latest with the introduction of DALL-E 2 and Stable Diffusion in the spring and summer of 2022. This raises the question of how artists deal with complex algorithmic and data structures in such articulative processes and to what extent new forms of subjection result from this. Since art always operates under prevailing socio-technical as well as socio-cultural conditions, it reflects on social values and norms (McLuhan, 1964). For example, art also deals with diversity issues and discrimination problems that are reproduced or even reinforced by digital technologies (Bajohr, 2022). This bias is caused by the underlying training data with which the models were fed: “They are central to how AI systems recognize and interpret the world. These datasets shape the epistemic boundaries governing how AI systems operate, and thus are an essential part of understanding socially significant questions about AI” (Crawford & Paglen, 2019). The paper demonstrates in three steps to what extent the role of “AI” in articulative processes can be understood as explorative in relation to diversity issues and discrimination problems: In the first step, the relation between machine learning and big data is discussed. Following on from this, the second step highlights the extent to which this data can be seen as biased (Stark et al., 2021; Crawford & Paglen, 2019). In the third step, empirical material in the form of qualitative interviews and ethnographic observations is used to show how the artistic approach to such complex infrastructures and technologies is shaped and what role data plays in this process. The resulting works not only make the inscribed complexity visible but also allow it to be experienced. They can contribute to important public discussions about AI, diversity, and discrimination and sensitize audiences to these issues. In turn, from a media-educational point of view, these artistic practices may provide access to the complexity of data-driven algorithmic systems (Verständig & Ahlborn, 2020).

References:

Bajohr, H. (2022). Malen nach 0 und 1. REPUBLIK. https://www.republik.ch/2022/05/07/malen-nach-0-und-1 (last accessed: 14.12.2022)
Crawford, K., & Paglen, T. (2019). Excavating AI: The politics of training sets for machine learning. https://excavating.ai (last accessed: 14.12.2022)
McLuhan, M. (1964). Understanding Media: The Extensions of Man. Mentor.
Stalder, F. (2018). The digital condition (V. Pakis, Trans.). Polity.
Stark, L., Greene, D., & Hoffmann, A. L. (2021). Critical Perspectives on Governance Mechanisms for AI/ML Systems. In J. Roberge & M. Castelle (Eds.), The Cultural Life of Machine Learning (pp. 257–280). Springer International Publishing. https://doi.org/10.1007/978-3-030-56286-1_9
Verständig, D., & Ahlborn, J. (2020). Decoding Subjects? Über Subjektivierung und Kreativität im algorithmischen Zeitalter. In J. Holze, D. Verständig, & R. Biermann (Eds.), Medienbildung zwischen Subjektivität und Kollektivität (Vol. 45, pp. 77–94). Springer Fachmedien Wiesbaden. https://doi.org/10.1007/978-3-658-31248-0_5
 

Beauty and the Biased: How Content Regulation on TikTok Diminishes Diversity and What Media Education Can Do About It

Dan Verständig (Bielefeld University)

Social media platforms such as Instagram and TikTok enjoy great popularity, especially among adolescents. The body is consistently a key player in an individual’s self-representation on social media, and TikTok is no exception (Liu, 2021). As a short-video social media platform, TikTok can have a significant impact on how people perceive and feel about their bodies (Rodgers & Melioli, 2016; Maes & Vandenbosch, 2022). For media-educational research and practice, it is relevant to understand how body images and media practices are configured and how the representation of diverse groups in media content, including race, ethnicity, gender, sexual orientation, and ability, is impacted by algorithmic systems. Social media has the potential to amplify diversity, allowing a greater range of voices, perspectives, and experiences to be represented and heard. This can lead to more inclusive and representative content and can foster a better understanding and appreciation of different cultures and groups. However, social media platforms such as TikTok have faced criticism for not doing enough to promote diversity and inclusion. There have been instances of discrimination and mistreatment of marginalized groups on these platforms, as well as a lack of representation among content creators and in the algorithms that recommend content to users (Civila & Jaramillo-Dent, 2022). Efforts are being made to address these issues, including calls for more diverse hiring practices at tech companies and for raising awareness and improving the representation of underrepresented groups in data, algorithms, and content recommendations. The paper addresses the outlined problems in three steps: In the first step, a literature review establishes the theoretical basis at the intersection of diversity, social media, and content regulation. In a second step, a systematic conceptual analysis discusses the specific quality of face-based filters and other algorithmic features of TikTok which are supposed to promote diversity but in fact a) reinforce normalized body images, b) reproduce binary body configurations, and therefore c) shape exclusive social spaces. Finally, the third step clarifies how media education can contribute to uncovering the inversion of intended actions and measures in the relations between platforms and content creators.

References:

Civila, S., & Jaramillo-Dent, D. (2022). #Mixedcouples on TikTok: Performative Hybridization and Identity in the Face of Discrimination. Social Media + Society, 8(3), 205630512211224. https://doi.org/10.1177/20563051221122464
Khattab, M. (2019). Synching and performing: Body (re)-presentation in the short video app TikTok. WiderScreen, 21(2).
Liu, J. (2021). The Influence of the Body Image Presented Through TikTok Trend-Videos and Its Possible Reasons. 2nd International Conference on Language, Art and Cultural Exchange (ICLACE 2021), Dali, China. https://doi.org/10.2991/assehr.k.210609.072
Maes, C., & Vandenbosch, L. (2022). Adolescent girls’ Instagram and TikTok use: Examining relations with body image-related constructs over time using random intercept cross-lagged panel models. Body Image, 41, 453–459. https://doi.org/10.1016/j.bodyim.2022.04.015
Rodgers, R. F., & Melioli, T. (2016). The Relationship Between Body Image Concerns, Eating Disorders and Internet Use, Part I: A Review of Empirical Support. Adolescent Research Review, 1(2), 95–119. https://doi.org/10.1007/s40894-015-0016-6


 