At the first meeting of the new Technology, Media and Strategy Research Group in FASS, a panel presented on the topic of ‘Data Ethics: Implications for Traditional Media, Social Media and the Communication Industry.’ In Part 1 of our ‘AI and data ethics’ series, Belinda Middleweek asked, ‘What happens if we develop social attachment to a machine?’ This week, Suneel Jethani considers the implications of applying a kind of Hippocratic oath to the field of Data Science.

Do no harm and put the wellbeing of the people first.

The core message of the Hippocratic oath

Many think of the Hippocratic oath as a key solution to the unethical practices that plague the Data Science industry. However, it is not clear what the immediate cost of breaching such an oath would be in Data Science. Moreover, anyone can produce a data harm – you don’t need to be employed as a Data Scientist, or even within the industry.

3 complications of porting the Hippocratic oath

  1. An oath needs to be anchored to a professional body – unlike Medicine, Data Science has no single professional body governing the use of data, nor a self-regulating professional community backed by state powers.
  2. Blaming bad apples – adherence to a Hippocratic oath implies that those who breach the oath are in the minority; this doesn’t really work for Data Science because there are complex systemic issues at play.
  3. Defining and anticipating harms – in Medicine, a patient will have some idea of who has harmed them and what the harm was; in Data Science, a victim may not know a harm has been done, let alone be able to identify the harmer.

The limits of responsibility

The notion of a Hippocratic oath for data scientists has some merit but also many limits. Questions remain about who bears responsibility for maintaining and enforcing such an oath. Likewise, would institutions need to take the oath as well as individuals, or does an oath devolve responsibility away from systemic change?

Tech workers are being urged to take on this responsibility, while tech companies are rarely, if ever, held to any formal account.

Oaths represent one small but meaningful element of broader change. Combined with external regulation, they could begin to address systemic problems in the data science community.

An alternative to the oath at UTS

UTS has a homegrown, comprehensive alternative to the ancient Hippocratic oath. In January 2022, CIC published a report on EdTech Ethics that specifically examined Analytics- and AI-powered Educational Technology. This deliberative democracy process brought together the UTS community – both students and staff – to produce 7 draft principles of AAI-EdTech. Each principle is elaborated with examples of what students, educators and the university can do to uphold it.

This deliberative democracy model represents an alternative approach to ethics in Data Science. It is one that draws in all stakeholders, is educative, and produces output for all stakeholders to take on. This is in stark contrast to the Hippocratic Oath model that centres individual responsibility without institutional accountability.

What does all this mean for UTS?

Responsible use of technology is explicitly mentioned in the UTS 2027 strategy. This is why UTS is extremely rigorous about data security in its selection of Educational Technology systems. In the LX.lab, I focus my work with teaching staff on officially supported systems only, and I urge caution and add caveats whenever staff ask me about unsupported tools. This can be frustrating at times when you want to experiment with new tools in teaching and learning, but you can rest assured that when we give advice, we weigh educational efficacy equally against data security.

For a detailed insight into the abstracts for this session, check out TMS Research Group Kickoff Data ethics panel.pdf.
