Previously in our ‘AI and data ethics’ series, Belinda Middleweek asked what happens if we develop social attachment to a machine, and Suneel Jethani considered the implications of applying a kind of Hippocratic oath to the field of data science. In this third and final post, we explore Maureen Taylor’s research on how organisations have engaged with online communities during the pandemic: were they practising genuine dialogic communication, or simply engaging in surveillance capitalism?
Maureen Taylor has been researching expressions of Corporate Social Responsibility (CSR) on Facebook during the first 16 months of the COVID-19 pandemic. This research tracked how organisations have been expressing their CSR to engage with online communities. In particular, it looked at whether they were engaging in genuine dialogic communication or simply engaging in surveillance capitalism.
Crisis communication vs surveillance capitalism
Crisis communication usually relates to a crisis caused by companies – think BP’s Deepwater Horizon oil disaster or the numerous ExxonMobil disasters. COVID-19 was not caused by any one company or group of companies, so they all had a unique opportunity to show their CSR to the world via social media. This often took the form of addressing key pandemic-related topics like customer safety, social isolation, and mental health.
If you have nothing to hide, then you are nothing.
Shoshana Zuboff, The Age of Surveillance Capitalism
On the other hand, surveillance capitalism is the practice of taking user-generated data, packaging it, and selling it – a concept that frames our data as stolen goods. It highlights data-driven inequity and the commodification of who we are.
Public as end vs Public as means
Maureen’s social media analysis looked at CSR-related behaviour on Facebook along a continuum stretching from dialogic engagement to surveillance capitalism. Dialogic engagement is nice. It’s about building relationships, listening to people, and acknowledging them – this is the ‘public as end’ in itself. On the other end, surveillance capitalism sees the ‘public as means’ to extracting data & dollars.
Using topic modelling, the study investigated what kinds of topics companies were addressing, and how and why those topics changed over the 16-month period. This was contrasted with the quality of public affect they tried to evoke and interact with.
Following the trends
In the first few months of the pandemic, companies seemed to enact dialogic communication. All kinds of CSR topics were raised and there was a degree of openness to engage with challenging issues and topics that didn’t always garner positive emotions.
As time went on there was a distinct change in behaviour, and a form of risk-averse CSR communication started to emerge. Companies responded to data from their initial postings by sticking only with what generated a positive emotive response: the thumbs-up and love-heart stories. There was a trend towards less engagement with, and response to, the community.
Perceived negative emotions, on the other hand, were ignored or not addressed in depth. Companies steered clear of topics that garnered any kind of negative response and did not engage in dialogue with the community about more challenging CSR topics. Complicated issues drifted off the radar, replaced with stories about corporate giving, social activities and sponsorship. Essentially, companies tended to use social media to drive positive emotions around their CSR and, by extension, the company itself.
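The risk-averse feedback loop described above can be sketched in a few lines: keep posting on topics whose reactions skew positive, and drop the rest. The reaction counts and the 90% threshold below are illustrative assumptions, not figures from the study.

```python
# Hypothetical sketch of the risk-averse CSR feedback loop: filter topics
# by the share of positive reactions. All numbers here are invented.
posts = [
    {"topic": "customer safety",  "likes": 120, "loves": 40, "angry": 3,  "sad": 5},
    {"topic": "mental health",    "likes": 30,  "loves": 10, "angry": 25, "sad": 40},
    {"topic": "corporate giving", "likes": 200, "loves": 90, "angry": 1,  "sad": 2},
]

def positivity(post):
    """Share of reactions that are positive (thumbs-up or love-heart)."""
    pos = post["likes"] + post["loves"]
    neg = post["angry"] + post["sad"]
    return pos / (pos + neg)

# Keep only topics where at least 90% of reactions were positive
safe_topics = [p["topic"] for p in posts if positivity(p) >= 0.9]
print(safe_topics)  # challenging topics like mental health drop out
```

The point of the sketch is how mechanical the filter is: any topic that provokes mixed feelings – precisely the topics where CSR dialogue matters – is silently removed from the content calendar.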
So what? Isn’t that what data-driven marketing is all about?
Yes and no. The idea of Corporate Social Responsibility has nothing to do with positive or negative emotions. It’s about companies operating in ways that improve rather than degrade society and the environment. When it is used solely as a marketing tool to extract more data from customers and sell them things, CSR ceases to serve its purpose.
Data-driven decision making now influences every act of corporation–public interaction, including conversations about corporate social responsibility. The danger is that this data-driven process has supplanted dialogic communication between companies and publics in order to control public responses.
Implications for educators
Analytic tools and training programmes that help business managers create false engagement have grown into a huge industry. Higher Education, in turn, has responded with great enthusiasm to this growing demand for data-driven training programmes. Teaching how to do surveillance capitalism is now a ‘learning outcome’ in some management courses.
In a related vein, we’re asked to make data-driven decisions about student and educator performance. But analysis of primary and secondary education in the US reveals that studying student data more closely doesn’t improve student outcomes in most evaluations. Despite all the professional learning time teachers have spent on student data analysis, the expected increases in student scores have not materialised. Education isn’t always like sportball, and sometimes data-driven decisions don’t work out.
What does UTS do in data and AI ethics?
The Bachelor of Communication at UTS teaches students how to use social media monitoring tools, evaluate public reactions, and adjust campaigns with data-driven insights.
CIC’s report on EdTech Ethics proposed a draft set of principles for the use of data-driven EdTech at UTS. You can read that report at EdTech Ethics Deliberative Democracy | UTS:CIC. The first principle ensures UTS is accountable to the university community and its stakeholders in the open, transparent, fair and equitable use of AAI-EdTech. This means that:
Educators are made aware if AAI-EdTech systems are ever used to measure or evaluate their performance as educators, and are able to raise concerns or complaints about these systems, which will be considered when the university reviews their use.
What is still unclear to me is whether UTS performs CSR via social media and, if so, how much of that practice is genuine dialogue with our society and how much is surveillance capitalism. This might be something that professional staff need to interrogate for themselves.
For a detailed insight into the abstracts for this session, check out TMS Research Group Kickoff Data ethics panel.pdf (sharepoint.com).