This post is co-authored by Kirsty Kitto, Lucy Blakemore and Mais Fatayer.
The UTS Learning Design Meetup recently gathered online to share reflections and examples of learning analytics impact from practitioners Danny Liu (University of Sydney), Kirsty Kitto (Connected Intelligence Centre, UTS), and Donna Rooney (Faculty of Arts & Social Sciences, UTS).
Fostering human relationships through learning analytics was our first recap from this event, capturing highlights from speaker Danny Liu. This second post features some key points from Kirsty’s presentation on ethics in learning analytics. Hear more and revisit the recording from her session below.
Ethics is already baked in…
“Raw data is both an oxymoron and a bad idea; to the contrary, data should be cooked with care.”
(Bowker, Memory Practices in the Sciences, p. 184)
This famous quote guides a lot of the work at CIC: there is no raw data; our assumptions are baked into how we collect it and even how we use it. So we need to be careful about how we ‘cook’ data into our work with learning and learners. A lot of work has already been done here; many well-known LA researchers have written papers laying out their position on ethics and privacy – see the further reading below.
The field is actually very concerned about ethics, privacy and treating data well, and many different frameworks have been developed to this end. One example is the ‘DELICATE’ checklist which emerged from the Learning Analytics Community Exchange (LACE) European project in 2016.
Great principles – but how do I apply them?
When we’re building out learning analytics systems, principles and frameworks that stand well on their own often conflict with other principles we also consider important. Student privacy should be preserved, for example – that’s a great principle, but respecting privacy might make it impossible to link data from a student’s learning over a lifetime to personalise their learning journey – something that is increasingly important in an era of workforce disruption and adjustment.
Another example can be found if we give students the choice to opt out of data collection – which seems like a great idea until we realise that there might be whole cohorts of students who choose to opt out, at which point our model is likely to end up biased!
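To make that second tension concrete, here is a toy simulation – our illustration, not something from the talk. It assumes two hypothetical cohorts, where the lower-engagement cohort is also more likely to opt out of data collection; any summary statistic or ‘at risk’ threshold calibrated on the opt-in data alone then ends up skewed.

```python
# Toy illustration of opt-out bias (hypothetical numbers, not real student data).
import random
import statistics

random.seed(1)

def simulate_cohort(n, mean_logins, opt_out_rate):
    """Return (weekly_logins, opted_in) pairs for n simulated students."""
    students = []
    for _ in range(n):
        logins = max(0, random.gauss(mean_logins, 3))
        opted_in = random.random() > opt_out_rate
        students.append((logins, opted_in))
    return students

# Cohort A: higher average engagement, rarely opts out.
# Cohort B: lower average engagement, often opts out (an assumed scenario).
cohort_a = simulate_cohort(500, mean_logins=20, opt_out_rate=0.05)
cohort_b = simulate_cohort(500, mean_logins=10, opt_out_rate=0.60)

everyone = cohort_a + cohort_b
observed = [logins for logins, opted_in in everyone if opted_in]

true_mean = statistics.mean(logins for logins, _ in everyone)
observed_mean = statistics.mean(observed)

print(f"True average weekly logins:     {true_mean:.1f}")
print(f"Observed (opt-in only) average: {observed_mean:.1f}")
# Any 'at risk' threshold calibrated on the observed data will be skewed
# towards the cohort that stayed in, under-serving the students who opted out.
```

Run it and the opt-in average sits noticeably above the true average, because the students most likely to need support are the ones missing from the data.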
Practical ethics: privacy at the core, and at the edges
A few years ago we started to explore a principle of practical ethics for LA (Kitto & Knight, 2019), which aims to help people developing LA solutions to navigate these ethical tensions. We need to make sure that LA practitioners are thinking about ethics right at the core of their work, rather than leaving it to ethicists and lawyers to worry about. We believe that everyone in the LA cycle needs to think about what harm might come from the analytics systems they develop, and plan to mitigate those harms, while also considering what benefits they could engineer into well-designed LA systems and tools.
It’s useful to think about ethical edge cases here: where do principles conflict with each other, and what are the unintended consequences of our work? A Canvas analytics dashboard seems like a great idea for identifying students at risk, but it can easily be misinterpreted if we do not know the learning design behind a course. For example, the course might run in block mode, with students engaging intensively on campus at occasional points in the session – highly engaged learning that an LMS dashboard will not register. Those same students may also be learning across many different environments, using a wide variety of tools and systems that are not necessarily integrated into the LMS or other analytics measures. Are we collecting all the data that actually describes that rich learning experience? If we are building predictive models on this incomplete data, how accurate are they likely to be?
Understanding data through human interaction
Even when we have access to lots of data, it is not always useful (Kitto et al. 2020). The best person to interpret a complex data trace is often the person who generated it – the student. If they don’t show up to class for a couple of weeks, they might know it’s because they had a sick grandmother and that they’re planning to catch up now. Alternatively, they may be really struggling. The context for complex data traces can mean everything, and yet it is rare that a data analyst has all of this necessary information.
A word of warning, though: if we’re going to return data to students we have to be really careful. For example, a predictive model automatically alerting a student who may already be feeling worried could just exacerbate the situation – indeed it could generate the very outcome we were trying to avoid and lead to the student dropping out! And yet, as Prinsloo and Slade (2017) famously argued, we have an obligation to act if the data suggests that a student might be struggling… Experienced student support staff around the university are often much better placed to work through any issues or interventions in ways that show empathy and understanding, and to point students towards potential solutions.
View the full recording of this presentation, and brief Q&A from the Meetup audience:
Further reading and resources
Stay connected with the UTS Connected Intelligence Centre (CIC) to see their latest work on data science and human-centred design, including learning analytics. Read more about the ‘deliberative democracy’ approach mentioned in the session and the report on edtech ethics from CIC.
Other papers and resources that are recommended include:
- Bowker, G. C. (2005). Memory practices in the sciences (Vol. 205). Cambridge, MA: MIT Press.
- Slade, S., & Prinsloo, P. (2013). Learning analytics: Ethical issues and dilemmas. American Behavioral Scientist, 57(10), 1510-1529. https://doi.org/10.1177/0002764213479366 (In fact anything by Prinsloo and Slade!)
- Prinsloo, P., & Slade, S. (2017). An elephant in the learning analytics room: The obligation to act. In Proceedings of the Seventh International Learning Analytics & Knowledge Conference (pp. 46-55). https://doi.org/10.1145/3027385.3027406
- The invited dialogue on Neil Selwyn’s 2018 LAK keynote “What’s the problem with learning analytics?”, which appeared in the Journal of Learning Analytics, 6(3).
- Cerratto Pargman, T., & McGrath, C. (2021). Mapping the ethics of learning analytics in higher education: A systematic literature review of empirical research. Journal of Learning Analytics, 8(2), 123-139. https://doi.org/10.18608/jla.2021.1
- Pardo, A., & Siemens, G. (2014). Ethical and privacy principles for learning analytics. British Journal of Educational Technology, 45(3), 438-450. https://doi.org/10.1111/bjet.12152
- Corrin, L., Kennedy, G., French, S., Buckingham Shum S., Kitto, K., Pardo, A., West, D., Mirriahi, N., & Colvin, C. (2019). The Ethics of Learning Analytics in Australian Higher Education. A Discussion Paper. Available at: https://melbourne-cshe.unimelb.edu.au/research/research-projects/edutech/the-ethical-use-of-learning-analytics
- Kitto, K., & Knight, S. (2019). Practical ethics for building learning analytics. British Journal of Educational Technology, 50(6), 2855-2870. https://doi.org/10.1111/bjet.12868
- Kitto, K., Whitmer, J., Silvers, A., & Webb, M. (2020). Creating data for learning analytics ecosystems. SoLAR position paper. Society for Learning Analytics Research (SoLAR). Available at: https://www.solaresearch.org/publications/position-papers/
- Pargman, T. C., McGrath, C., Viberg, O., & Knight, S. (2023). New vistas on responsible learning analytics: A data feminist perspective. Journal of Learning Analytics, 10(1), 133-148. https://doi.org/10.18608/jla.2023.7781
- Knight, S., Shibani, A., & Buckingham Shum, S. (2023). A reflective design case of practical micro‐ethics in learning analytics. British Journal of Educational Technology.
- Drachsler, H., & Greller, W. (2016). Privacy and analytics: it’s a DELICATE issue. A checklist for trusted learning analytics. In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK ’16). Association for Computing Machinery, New York, NY, USA, 89–98. https://doi.org/10.1145/2883851.2883893