
Data Life Conference recap!

Author: Juliette Davret

On November 5th, Juliette Davret organized the DATA LIFE conference, which brought together researchers and professionals for a day of thought-provoking discussions on the multiple dimensions of data life. Supported by the INTERSSECT knowledge hub, alongside the DATA LOSS and DATA STORIES projects, the Centre for Culture and Technology and the Aesthetics of Bio-Machines, the event provided a platform to critically explore the many lives of data. As big data and artificial intelligence reshape our world, questions surrounding data production, management, and usage become ever more critical.

Credit: Kathrin Maurer 

The conference began with a keynote address by Stefania Milan from the University of Amsterdam, a leading voice in critical data studies. Milan’s talk explored the dilemma of democracy through data, addressing issues of governance, data infrastructure and citizen action. She showed how data infrastructures make and un-make publics by producing data that guide and shape regulation, as well as by disciplining individuals as they participate in social and political processes. Her presentation set the tone for the day, emphasizing the importance of a critical perspective on the role of data in shaping social and political structures.

Credit: Juliette Davret  

The main discussions were organized around three sessions, each addressing different ethical and societal implications of data in our world.

The first session critically explored the socio-political implications of data systems and communication across various societal contexts, and consisted of five papers. Danielle Hynes and Samuel Mutter (Maynooth University) presented an analysis of ‘data narratives’ in the Irish housing and planning pipeline across a range of documents produced by different stakeholders, identifying three narratives – governance, commercial and ideological – and critically analysing their affordances and potential valences. The paper examined how these narratives, often overlapping and partial, shape the discourse around housing, planning and property, and highlighted the socio-political and theoretical implications of combining data and narrative in this context.

Then, Matthias Leese’s (ETH Zurich) paper examined how police information systems, often characterized by makeshift “silos,” reflect a creative, improvisational approach to managing data, particularly in relation to vulnerable populations. It argued that such bottom-up, bricolage practices, rather than rigid top-down control, can effectively support the care functions of the police, particularly in social services.

Irina Shklovski’s (University of Copenhagen) paper explored the complex and paradoxical nature of achieving data quality in the creation of training data for medical AI systems. Through an empirical investigation of data experts working in medical AI, the paper examined how dimensions of data quality—accuracy, structure, timeliness—are pursued within the constraints of regulatory compliance and practical limitations. It found that data quality functions as an aspirational yet unattainable ideal, shaped by compromises inherent in the production process.

Jef Ausloos’s (University of Amsterdam) paper critically examined the concept of the “academic data gaze,” exploring how academia’s engagement with data and digital infrastructures is shaped by, and reinforces, power dynamics and extractive logics rooted in historical and contemporary political economies. Inspired by Beer’s notion of the data gaze, it argued that academia’s increasing reliance on data-driven methods legitimizes claims of objectivity, neutrality, and universality, while obscuring the historical complicity of scientific research in oppressive systems. The paper called for a reflective praxis that interrogates the costs and exclusions of data-centric knowledge production, urging academia to confront its role in perpetuating inequities and epistemic harm.

Klaus Bruhn Jensen’s (University of Copenhagen) paper proposed a model of human-machine communication (HMC) that adapts Stuart Hall’s encoding/decoding framework to explore how humans and machines co-construct meaning through socially and computationally contextualized codes. By incorporating metacommunication, the study examined the political, ethical, and discursive dimensions of HMC, addressing challenges in aligning human-human, human-machine, and machine-machine interactions within communication theory.

From police and urban planning data management to the role of data in academic research and healthcare, this session showcased the tensions between ideals of transparency and objectivity and the realities of data practices, while highlighting the ethical and political challenges associated with contemporary data infrastructures.

Credit: Juliette Davret 

The second session brought together diverse explorations of data through artistic, curatorial, and critical research, examining how data practices intersect with human experience, memory, ethics, and psychology. Magdalena Tyzlik-Carver (Aarhus University) introduced Fermenting Data, a curatorial and research project that blends fermentation practices with data processing to explore what it means for data to “get a life”. Through workshops, exhibitions, and open-access tools, the project reclaims data as a common, accessible practice, using fermentation as both metaphor and method to challenge extractive data practices. It proposes a symbiotic, more-than-human approach to data processing, inspired by the transformative properties of bacteria, to foster ethical, tangible, and inclusive engagements with data.

Then, Kristin Byskov and Tina Ryoon Anderson (Norwegian University of Science and Technology) examined the intersection of memory, intimacy, and technology through Labyss by The Algorithmic Theatre, an experimental performance that critiques “digital amnesia” and explores the implications of AI-driven memory-learning software. By combining performing arts, visual arts, and programming, the project investigates how memories can be digitized, the ethical and social dimensions of such practices, and how artistic research methods can deepen understanding of our evolving relationship with algorithmic technologies.

Shirley Chan’s (Lund University) presentation investigated the challenges of preserving and understanding the data generated by online fandom communities over time, focusing on how platforms like Reddit and Tumblr shape data creation, circulation, and representation. Through ethnographic methods, the work explored the infrastructure of fandom, its dynamic contexts, and critical events, offering insights into how preservation efforts can enable meaningful future access and interpretation of today’s digital cultural practices.

Finally, Paul Heinicker’s (FH Potsdam) paper introduced the concept of “data sadism” to explore the unconscious desires driving data production, arguing that alongside rational motivations like knowledge, economics, or power, irrational and pleasure-driven dimensions shape our engagement with data. By extending Jacques Lacan’s sadistic schema, it critiqued the often-overlooked psychological underpinnings of data processes, aiming to make these hidden dynamics visible and enrich psychodynamic critiques of data practices.

From reclaiming data through fermentation as a more-than-human, ethical practice to exploring the implications of AI-driven memory and digital amnesia, this session highlighted innovative approaches that challenge traditional data processing and question the emotional and psychological dimensions behind our engagement with data in contemporary society.

Credit: Juliette Davret  

The third and last session of the day explored the evolving practices and political implications of data deletion and retention within state and law enforcement contexts, focusing on how digitalization reshapes governance, accountability, and power dynamics. First, Frederik Schade’s (University of Copenhagen) paper examined the shift in state bureaucracies from “cultures of destruction” to “cultures of deletion” within the context of digitalization, focusing on how deletion, unlike destruction, is intrinsic to computational systems, reversible, invisible, and framed as sustainable. Using the Danish government as a case study, it explored the political and administrative implications of deletion’s programmability and automation, highlighting its dual potential to enhance efficiency while complicating oversight, sovereignty, and accountability in digitalized democratic governance.

Next, Megan Leal Causton (Vrije Universiteit Brussel) examined “archival frictions” in European law enforcement data governance, focusing on the tensions between Europol and the European Data Protection Supervisor (EDPS) over data curation and deletion from 2019 to 2024. Using a transdisciplinary approach combining archival studies, criminology, and critical data studies, it highlighted how these frictions shape institutional power dynamics and the socio-political dimensions of data governance, contributing to debates on data power, politics, and transparency.

Vanessa Ugolini (Vrije Universiteit Brussel) concluded the third panel of the conference with a presentation on how data dies. This paper explored the concept of the “death” of data within EU security and border management, analyzing socio-political and technical aspects of data retention, anonymization, depersonalization, and erasure. It highlighted the need to consider the lifecycle and decay of data, emphasizing its implications for power structures and the re-purposing of data across large-scale information systems.

From the shift in bureaucratic cultures from destruction to deletion, to archival frictions in European data governance, and the “death” of data in security and border management, the session critically examined the socio-political and technical challenges of managing data lifecycles, highlighting the complex relationships between data, authority, and transparency.

Credit: Juliette Davret  

The conference concluded with an inspiring talk from Rob Kitchin of Maynooth University, who discussed the concept of data ecosystems and mobilities using the case study of the Irish planning system. He emphasized that the data flow metaphor does not adequately reflect the sharing and circulation of data. Kitchin encouraged participants to develop more theoretical and conceptual frameworks for understanding data mobility, given its essential role in the management and governance of society.

Credit: Juliette Davret  

The DATA LIFE conference underscored the importance of taking a multidimensional approach to understanding the complex impact of data on our world. By examining power structures, quality, biases, historical contexts, and regulatory challenges, speakers offered vital insights into the influence of data. The gathering of researchers and practitioners highlighted the need for critical and collective reflection, emphasizing that data, far from being a simple flow of information, is both a reflection and a driver of social dynamics, setting the stage for future exploration in the field of critical data studies.

A very special thanks goes to Kristin Veel, Nanna Bonde Thylstrup and Louis Ravn from the University of Copenhagen for their help in organising this conference.

Event page: https://artsandculturalstudies.ku.dk/research/daloss/events/2024/data-life-conference/ 


Exploring Activist Narratives through Immersive Storytelling

Oliver Dawkins (Data Stories – Maynooth University) and Gareth W. Young (TRANSMIXR – Trinity College Dublin)

Last week, we had the privilege of presenting a series of XR Masterclasses at Dublin’s BETA Festival. The workshops were designed to help participants explore the possibilities for creating and sharing activist narratives and stories using extended reality (XR) technologies like virtual and augmented reality (VR/AR).

The sessions were proposed by BETA to support their presentation of the immersive augmented reality experience Noire. Noire tells the story of Claudette Colvin, a 15-year-old black girl who refused to give up her seat on a segregated bus in Montgomery, Alabama, one day in March 1955. Until writer Tania de Montaigne retold this story, it had largely been forgotten, overshadowed by a similar encounter involving Rosa Parks nine months later, made famous through the support of Martin Luther King. Noire uses Microsoft HoloLens 2 headsets and spatialised sound to restage Claudette’s earlier encounter in holographic form for six simultaneous participants who share that space in mixed reality.

Taking a creative lead from Noire, our workshops invited participants to explore the use of similar technologies to create and share their stories about activist causes. In particular, we focused on demonstrating the potential for new forms of previsualisation and immersive storyboarding using VR headsets (Meta Quest 2) with open-source software (Open Brush), which is free to use and enables users to draw scenes and environments from the inside out. Participants draw or paint a scene around themselves in three dimensions, gaining an immediate sense of what the scene will feel like when shared as an immersive experience for others. The tool can be used to quickly sketch ideas in 3D but also offers excellent scope for painterly expression. Using Open Brush with the Meta Quest headset’s ‘passthrough’ mode also lets designers test what their creations might look like in mixed reality at a fraction of the cost of the more expensive HoloLens headsets.

Each session started with an introduction to Noire and a discussion with members of its team. In our Tuesday session, we were joined by Emanuela Righi and Louis Moreau, who discussed the production and technology behind Noire. Tania de Montaigne joined us for our Wednesday session to discuss narrative and storytelling. After a brief break, we moved to a broader discussion of XR technologies and their use in storytelling with the aid of technologies like volumetric video capture and 360° cameras. While volumetric video is fully spatialised, it is costly to produce and generates unwieldy volumes of data that must be processed. 360° video, by contrast, is cheaper to produce but typically limits movement to three degrees of freedom, leaving the viewer effectively stuck in a bubble. The two formats have different affordances and different implications for accessibility, interaction, and immersion, shaping the types of experiences that can be created and how they are produced. The unique characteristics of XR require adjustments to traditional storytelling methods.

We also considered the importance of realism and artifice, with reference to the documented experiences of VR users who report strong feelings of immersion and empathy even toward 3D animated content, suggesting that these responses are not as dependent on realism as we might suppose. Immersive media thus show great potential for engaging creators and users: in performative virtual and mixed-reality environments, users can enact their imagination in ways that support empathy and identification with a character or cause. At the same time, creators need to be authentic and take responsibility for the stories they tell. They must engage fully with their subject to ensure its validity and veracity, and ensure that their production does everything possible to respect the ethics and privacy concerns relevant to their subject material, particularly when representing individuals. These concerns extend to accessibility and inclusivity, requiring creators to recognize the needs, capacities, and diversity of their intended and potential audiences.

To introduce the practical component of the workshop, Gareth demonstrated the use of Open Brush by streaming the video feed from his VR headset to a shared screen in real time. Visitors then put on the headsets we provided and worked on their own scenes for about 50 minutes. While some participants took our prompt and worked on storyboarding a scene with activist or empathetic intent, others were satisfied exploring the capabilities of the tools. Both Louis and Tania from the Noire team participated. While Louis was familiar with the technology, Tania had less experience with headsets and was initially skeptical, given her prior understanding of VR as an isolating technology. However, with the headset’s passthrough feature enabled, the activity felt much more connected and collective. Tania enjoyed painting in 3D as much as our other participants, who all became deeply engrossed in the scenes they were creating.

For the workshop’s final part, we asked each participant to talk about what they created and share it with the other participants. On the first day, technical difficulties prevented each person from streaming their video feed, so participants took turns trying each other’s headsets after a brief description of what they would see. On the second day, we fixed the streaming issue so each participant could provide a tour of their creation from within the headset.

In each case, the embodied testing of each other’s scenes, rather than merely seeing them on a screen, had the most impact on our participants. What came across in the session was the unique value of being able both to create scenes and to experience the creations of others in a fully spatialised and embodied way. Comparing the results also revealed the potential for participants to develop unique personal styles of expression.

We concluded the session by suggesting creative next steps for participants who wish to develop these new workflows further. We thanked both the participants and the team from Noire for their inspiration and kind participation. Moving forward, Gareth and I are excited to explore the potential for immersive storytelling in new research and hope to encourage others to pursue their own journeys in the XR field.
