There has been a whirlwind of talk about ChatGPT and its effects on learning and teaching. We may have to wait for the dust to settle before the full impact of generative AI on academic integrity becomes clear. Yet there are aspects of AI tools that academics can incorporate into their classes today. Assessments, too, might be due for a refresh.
We spoke to three academics in the Faculty of Engineering and Information Technology at UTS about how they plan to use ChatGPT to prepare students for an AI-literate workforce.
- Dr Jun Li is a Senior Lecturer in the School of Computer Science and teaches Machine Learning
- Dr Baki Kocaballi is a Senior Lecturer in the School of Computer Science
- Professor Keith Willey is the Director of Innovation in Engineering & IT Education at UTS
Learning from AI
Baki wants students to engage with AI. As he sees it, “once our students graduate there will be expectations to use AI tools in their work, to make it efficient and effective.” He warns that “if we just ignore it and create a stigma around it, our graduates will become less relevant. We need to embrace these technologies.”
Baki points to existing technologies as an example. “For instance, pilots rely 95% on autopilot. They are there to intervene when needed. As a passenger, would you want to get on a plane that’s fully autopilot? No. Or completely human operated? No. So, there is this collaboration model that we are going into as well.”
Jun finds ChatGPT a helpful ideation springboard, with limits.
“My experience is that ChatGPT enhances creativity. It works like a mirror. The AI responses help form the context of your idea, connect it to the relevant background, fill gaps in some technical details, and help build the big picture. But it does not innovate new ideas beyond the user’s imagination, so I’d call it a mirror.”
Keith believes AI can help students think outside the box. He believes the capacity of AI to introduce users to multiple perspectives and viewpoints should enhance creativity. “Accepting an existing solution, without thinking, stifles creativity,” he suggests. “You can’t blame AI for that.”
ChatGPT in our classes
Baki plans to incorporate ChatGPT into his subject as part of ideation activities. “Two things that ChatGPT does amazingly well are ideation and summarisation. Ideation is a stage where you are open to errors and it does it remarkably well. As a designer, you should use it.” In his own research, Baki is looking at asking AI to generate prototype images for UI layouts based on prompts.
Jun would like to encourage students to use ChatGPT in every learning scenario. “I would try to demonstrate how to utilise tools like it, and I very likely would learn some innovative usage from the students.”
Keith wants students to graduate ready to use all the available tools in their professional practice, including AI tools. “AI has the potential to significantly streamline preliminary work, allowing students to focus on developing and demonstrating higher-level learning outcomes,” he believes. “The only way students are going to develop their skills for effectively using AI and to recognise its strengths and weaknesses is to use it.”
Rethinking assessments
AI tools may fast-track improvements in how we assess students. Baki thinks students need to be ethically responsible and cite when they use ChatGPT. “Maybe we need an overhaul of the entire syllabus. The topics might be the same, but the way we teach them can be different. There would still be some situations where we need to test knowledge. In those scenarios we could isolate students, with no digital access.”
Jun thinks ChatGPT makes it more difficult to continue the current practice of assessing “human (students’) intellectual efforts using tools of machine intelligence that are inferior to ChatGPT”. He explains that ChatGPT has “democratised machine intelligence so that the public tool is superior to some institutional automatic anti-plagiarism tools.” Even when ChatGPT is used to plagiarise, errors can slip through, as ChatGPT can give “completely wrong information” while sounding convincingly correct.
Looking ahead, Keith suggests we consider “focusing learning and assessment on the process students use to learn rather than just the product they produce. I would only discourage AI use in instances where it would inhibit the learning and demonstration of subject learning outcomes.”
Moving forward
Jun hopes that AI continues becoming more “democratised and decentralised”, and that we address AI literacy and the legal models regarding data and privacy.
Keith suggests first steps to help academics appreciate the impact of AI and develop learning activities in which students use AI. He recommends that academics submit their previous semesters’ learning and assessment activities to ChatGPT. “This will allow them to experience firsthand how the AI responds.”
Baki envisages students benefiting in an AI future. “Tools like ChatGPT are here to stay. I want our students to be well-equipped for the not-too-distant future when AI tools will most likely be an integral part of work practices. Having more ideas empowers students. We can’t ban Google. Sooner or later AI will be part of it. But a bit of friction may be good, it gives us ideas and time to think.”