Dr Tarez Graban at Florida State University and her colleagues in the ‘Digital Scholars’ group invited me and Dr Anais Nony to discuss data surveillance from the perspective of digital humanities.
I have been involved in so many projects throughout my career that I sometimes lose track of the trajectory I have been tracing towards researching data ethics and data surveillance. The conversation with Dr Nony and the audience allowed me to (re-)visit and (re-)think the work I have produced over the past years, using theoretical concepts proposed by Dr Nony to look at my engagement with theory and practice in my not-so-short academic career so far.
I have tried to summarise our conversation below.
Situated data ethics and data practices
Designing and creating ‘artefacts’ has become an important part of my teaching and research practices. I particularly enjoy developing artivist projects that require physical interactions, as they prompt ‘affect’ (e.g., asking students to do street interviews in which they ask people to read out the terms and conditions of a social media platform that they use). In a way, I think I am tackling the problem of desensitisation to data surveillance, clicktivism and slacktivism. I exercise what Nony terms ‘cultural agency’ in her work ‘Nootechnics of the Digital’.
But why are these physical interactions effective? The materiality of the objects may have a role to play (see Jane Bennett’s work on materiality and non-human agency).
What actions can media users take to ensure a “healthy” relationship with augmentation?
A good follow-up question from the audience was how to address data ethics in the uptake of AR technologies. That allowed me to talk about situated veillance practices and contextual data ethics, agency, and control over technologies. We choose to sacrifice something in order to gain certain benefits; it is this trade-off that frames how we negotiate data ethics.
But are we really in control? Are we really at peace with consent? Or are we forced to make peace with consent, accepting the conditions? When we leave digital footprints on the internet, are we really conscious of what we are doing?
Meanings of data
Nony followed this up by cleverly picking up the ‘data journey’ framework that Dr Jo Bates, I and others developed in the AHRC-funded project ‘The Social Life of a Weather Datum’, and using it to question the (un)certainties of data as they travel and mutate. We don’t know what is going to happen after we give away our personal data, or what meanings or practices will emerge after data are re-mixed and re-configured. Given the immateriality of data, once they are gone, they are no longer the data we used to own. Data mutate, all the time. It’s paradoxical: we thought big data would help us to predict the future, but on the contrary, as data are remixed, we are observing ‘uncertainties’ and watching facts and established meanings being challenged.
Another good question was about why physical interactions are affective. I think a lot of sensory experiences cannot be captured or datafied. There is a lot more we don’t know, even if we thought we were already collecting and processing massive amounts of data. There are also data that are unprocessable (for the time being). And these unattainable, uncomputable data allow us to think about the ‘invisibility’ of data practices. A lot of digital labour is invisible: for example, preparation, cleaning, training, checking, monitoring, etc. Yet this labour is important for making sense of data and making data visible. Why do people engage in this invisible data labour when they don’t get any reward (e.g., parkrunners, digital slaves)?
And this is linked to power relationships and power dynamics in the big data realm. Who is privileged by big data? How do gender, race, ethnicity and class shape participation in data surveillance or sousveillance (a concept proposed by Steve Mann)? Who has the right to be forgotten? Who has the right to know? Whose memories can be preserved, downloaded and archived (as seen in the BBC drama ‘Years and Years’)? Who has access to emergent technologies? Unfortunately, we often find that richer and better-connected people can access and benefit from them more.
A good question from Nony was about ‘emancipatory media’. I felt empowered when I figured out how to build an Arduino prototype and a Raspberry Pi weather station. Even though it took me a long time to learn (as I’m not a developer or a coder), I felt empowered and a sense of achievement once I did it. It’s about self-development. But today’s digital, data-driven society expects efficiency, immediacy, instancy and rapid responses from people (and non-humans). Technologists and scientists, unlike humanists, expect accuracy and efficiency; they follow clock time, machine time. Humanistic approaches, however, require us to slow down, to have some ‘me time’, to have time for learning, appreciation and reflection. We’ll probably feel more liberated, more emancipated, if we enjoy ‘slow computing’ more. It’s about appreciating diversity and playfulness.
I asked the audience what data they were afraid of giving away. One said ‘intellectual productions and personal data (SSN, address, passport number, etc.)’, and another responded ‘the forced consent aspect of applications’.
I personally learned a lot from participating in this webinar. I’m grateful to Tarez for this opportunity, to my co-speaker Anais for the enlightening and intellectually inspiring conversation, and to the audience for the great questions and responses. I hope there will be another outlet where we can continue the discussion.
If you’d like to listen to the recorded webinar, visit here.