Artificial Intelligence is here, and it’s making an impact on student learning. From a tool that analyses students’ academic writing and gives feedback in real time, to a nursing simulation ward with an AI ‘nervous system’ that supports post-simulation debriefs, Simon Buckingham Shum, Professor of Learning Informatics and leader of the Connected Intelligence Centre, explains the place of AI in UTS teaching.

Imagine you’re studying a new subject, and you’re a natural! The introductory assignments are easy, and you’re hungry for more advanced stuff. But bad luck, you need to wait for your struggling classmates to catch up before the academic can move on.

Or imagine how it feels if you’re the one struggling with the basics, panicking that you’ll never catch up, let alone find the confidence to tackle more challenging material.

Now imagine an infinitely patient expert personal coach, working at your pace and giving you the right level of challenge until you’ve mastered the material that stretches you.

Welcome to ‘adaptive learning platforms’, a form of artificial intelligence (AI) tutor that’s making an impact on student learning.
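At its simplest, an adaptive platform adjusts the difficulty of the next task based on how you handled the last one. The following is a minimal illustrative sketch, not the algorithm of any particular platform; the 1–5 difficulty scale and the one-step rule are assumptions for the example.

```python
def next_difficulty(current: int, correct: bool) -> int:
    """Step difficulty up after a correct answer, down after a mistake.

    Difficulty is clamped to an illustrative 1-5 scale, so a struggling
    student is never pushed below the basics and a strong student is
    capped at the hardest available material.
    """
    step = 1 if correct else -1
    return min(5, max(1, current + step))
```

Real platforms use far richer models of mastery, but the principle is the same: the pace is set by the individual learner's responses, not by the class average.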

AI is already here

In a university accelerated learning program, students using an AI statistics tutor (supported by a tutorial group) learned a semester’s worth of material in half the time, performing at least as well as, or better than, students on the normal course.

This is just one instance of how AI is starting to change teaching and learning. AI can already understand speech, recognise people and objects in a room, chat with you online, recommend interesting reading or events, analyse your posture, and much more.

Understandably, this raises intriguing questions for both teachers and students, so let’s unpack some of the implications for education.

A question of trust and agency

Each time a computer does something we never thought possible, it gives us pause for thought: another thing we assumed was ‘essentially human’ turns out perhaps not to be (assuming it’s not just hype).

But while educators are among the least likely to be automated away, some big questions nonetheless remain for the sector, and they boil down to trust and agency:

  • What kinds of learning does AI improve, and where will people always be needed? (machines don’t understand how a student’s personal situation is affecting their concentration, or know how to support them)
  • How can we ensure that institutions are literate enough to properly resource staff upskilling? (professional roles may well change)
  • Where is all the data in “the cloud” going, and who owns it? (governments and companies need to be held to account)
  • How do I know if I can trust an educational AI? (you entrust your life to planes you don’t understand, so why not your learning to an AI aid?)

What’s important to remember is that there is nothing inevitable about any technology: it is designed. Someone, somewhere, has a vision of what a ‘good education’ is, and is designing learning technologies to deliver it: our task is to ensure that those assumptions are valid. The power and money behind ‘Big Data’ and AI raise concerns about the agendas driving digital infrastructure that enables surveillance of any sort, especially if it’s unclear how it raises the quality of teaching and learning.

So, the challenge is to ensure that any learning technology we use aligns with our educational and ethical values, and this is exactly what we’re doing at UTS.

AI for instant feedback with AcaWriter

A challenge for many students, regardless of university or faculty, is learning how to write academically.

Understanding the hallmarks of good writing, and how to use language to achieve these effects, takes practice. It also requires good feedback, which is time-consuming for already busy teaching teams and impossible to provide instantaneously.

This challenge has made writing a strategically important focus in UTS’s Connected Intelligence Centre (CIC) for the last few years, and we now have an AI-powered tool, AcaWriter, that gives students instant feedback (note that this isn’t grading; it still takes a human to make the complex judgement about a grade!).

AcaWriter uses natural language processing to identify significant patterns in sentences to produce feedback to help students reflect on how to ‘make their thinking visible’ in their final submission.
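To give a flavour of this kind of natural language processing, here is a deliberately simplified sketch of pattern-based feedback on rhetorical ‘moves’ in a sentence. The patterns and messages below are illustrative inventions, not AcaWriter’s actual rules or wording.

```python
import re

# Hypothetical rhetorical "moves" a writing-feedback tool might detect.
# These regexes and feedback messages are illustrative only.
MOVES = {
    "contrast": re.compile(r"\b(however|in contrast|on the other hand)\b", re.I),
    "reflection": re.compile(r"\b(i realised|i learned|looking back)\b", re.I),
}

def feedback(sentence: str) -> list[str]:
    """Return feedback messages for rhetorical moves found in one sentence."""
    messages = []
    if MOVES["contrast"].search(sentence):
        messages.append("Good: you signal a contrast between ideas.")
    if MOVES["reflection"].search(sentence):
        messages.append("Good: you make your reflection explicit.")
    if not messages:
        messages.append("Consider signalling your reasoning more explicitly.")
    return messages
```

Production systems like AcaWriter use far more sophisticated linguistic analysis than keyword matching, but the design goal is the same: surface the patterns, prompt the student to reflect, and leave the judgement to them.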

Through continuous evaluation and refinement, the tool has been piloted with several thousand students, and the results are encouraging enough for us to make it available (soon!) to all UTS undergraduate and postgraduate students.

Take a tour of AcaWriter

UTS is, to our knowledge, the first university in the world to provide instant feedback to students on reflective writing, and one of just a handful to provide such feedback on other writing genres.

AcaWriter research is winning international awards, and piloting it has helped several UTS academics win UTS Learning and Teaching Innovation grants and awards. Universities around the world are enquiring about its availability, and we’re now looking at tools to help students struggling with more rudimentary aspects of their writing.

It’s been critical to understand what it takes for faculty staff to trust AcaWriter, and our Writing Analytics team has documented how we work closely with educators to co-design the software. This way of designing educational AI underpins other tools in the CIC pipeline.

AI for personalised feedback with OnTask

Continuing the key theme of improving feedback to students, OnTask enables a teaching team to send personalised feedback to every student. This week, every UTS Business School student in the 1,500-strong Accounting for Business Decisions subject received a feedback email suited to their progress.

OnTask is similar in spirit to the “adaptive learning platforms” introduced at the start, but instead of coaching a small curriculum in depth, it can be used in any discipline, and can draw on student activity across a wide range of platforms to provide personalised feedback on study strategies. Check out this introductory video, and this briefing about a similar tool called ECoach, which is used with large student cohorts in the US.
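The core idea behind this style of tool is simple: instructor-authored rules map each student’s activity data to a tailored message. A minimal sketch follows; the field names, thresholds, and wording are assumptions for illustration, not OnTask’s actual rule syntax.

```python
from dataclasses import dataclass

@dataclass
class Student:
    """Illustrative activity record; real tools draw on many more signals."""
    name: str
    quiz_score: float   # 0-100
    videos_watched: int

def personalise(student: Student) -> str:
    """Compose a feedback message from simple instructor-authored rules."""
    if student.quiz_score >= 80:
        advice = "great work on the quiz; try the extension exercises next."
    elif student.videos_watched < 2:
        advice = "your quiz score suggests revisiting this week's videos before the tutorial."
    else:
        advice = "you've engaged with the videos; book a consult to go over the quiz topics."
    return f"Hi {student.name}, {advice}"
```

The instructor stays in control of every rule and every message; the software simply applies those rules to each of the 1,500 students, which no human could do by hand each week.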

AI in the next generation of simulation nursing wards

A very different example is our work with UTS’s Health Faculty. We’re prototyping the next generation of their simulation wards, in which the room ‘knows’ what the nurses are doing when they engage in teamwork exercises with the manikin patients. The room can now sense who is doing what, where and when, because we’ve given it an AI ‘nervous system’: the sensory signals are integrated, analysed, and then fed back to the team as a visual timeline, to support post-simulation debriefing.
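The integration step can be pictured as merging timestamped events from each sensor stream into a single ordered timeline for the debrief. The sketch below is illustrative only; the event fields and sensor names are assumptions, not the ward’s actual data model.

```python
from datetime import datetime

# Hypothetical events from three sensor streams; fields are illustrative.
events = [
    {"t": "2019-05-01T10:02:00", "source": "position", "msg": "Nurse A at bedside"},
    {"t": "2019-05-01T10:00:30", "source": "audio", "msg": "Handover briefing begins"},
    {"t": "2019-05-01T10:03:10", "source": "manikin", "msg": "Patient vitals alarm"},
]

def build_timeline(events: list[dict]) -> list[dict]:
    """Merge events from all sensor streams into one time-ordered timeline."""
    return sorted(events, key=lambda e: datetime.fromisoformat(e["t"]))
```

Once ordered, the timeline can be rendered visually so the team can replay who did what, where and when, and discuss it in the debrief.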

AI Ethics: co-designing with academics and students

It’s important to emphasise that we’re not parachuting these new tools onto faculties and requiring academics to teach with them. Each tool is based on the best available research (approved by our Human Research Ethics Committee), and has been co-designed with academics, and piloted with students to get their feedback on what works for them. Faculties must decide which tools, if any, they want to try.

This is how CIC works as a research-inspired innovation centre for UTS. In the end, it’s our academics and students who will use these tools, so human-centred design and rigorous research are required to build trust in these novel tools.

The future is exciting

Hopefully this excites you about the potential of educational AI that we can trust, because we understand its strengths and weaknesses. CIC specialises in educational uses of data, analytics and AI, so if you’ve got any questions, or think these tools might help your students, just get in touch and we’ll connect you to the right people.
