Last week’s AI x L&T program of events aimed to create a visible forum for facilitating informed action around the usage, ethics and issues of incorporating AI into higher education at UTS. Below, with the help of Lucy Blakemore, Rhiannon Hall and Jenny Wallace, I summarise the key things that stood out to us across the nine events.

Missed out on an event? Check out videos from the week

Videos of the events from AI x L&T Week are now available on our YouTube channel.

1. Having hallucinations? You’re not alone – GenAI is still unreliable

We were constantly reminded that tools like ChatGPT are language models, not knowledge models. Try generating a biography of yourself via AI and you might be surprised to read that you have completed a degree you never started or appeared in a documentary you have never seen – both true (false) stories from panellists at the midweek Do androids dream of HDs? event. Others have even found themselves falsely labelled as terrorists by AI.

So, despite sounding authoritative, GenAI can still be wildly inaccurate. A guest speaker who responded to questions in the guise of ChatGPT (‘Chatty G’) at the aforementioned ‘Androids’ panel discussion stressed its unreliability several times within characteristically predictable, long-winded answers. For all its potential usefulness, there is still a lot to filter through, and not all of it is obviously wrong. Going forward, our students will need the critical skills to validate what gets churned out for them and avoid getting caught out by false information.

2. Everyone wants to know how AI can be used positively for accessibility

The most-attended event of the week was AI for Accessibility: Empowering students through technology in education. AI’s transformative role in education brings both pros and cons for inclusion. Concerns about equity arose often throughout the week, particularly around access to increasingly expensive platforms, but the disruption has also produced many positives that will enhance student support and accessibility.

AI tools have the potential to make a huge difference for learners and professionals with disabilities. They’re not perfect and bring risks (particularly ethical ones) but, as part of a suite of support tools, they provide a practical means of levelling the playing field. Fiona Given, who has cerebral palsy and uses multiple tools for writing and communication at work, said: “If I had gone through university in the GenAI era, I may not have had those battle scars [from having to produce so much written text].”

3. Tools need to keep adapting to keep pace with GenAI

Getting the right tool to work with adaptable, evolving AI is not simple – as we found when we were about to adopt the Turnitin AI detection tool but quickly realised the risks of turning on an untested tool too early. (The door is not completely closed on that particular AI detection tool, but that’s another blog post for another day.)

At the Bias, fact-checking and evaluating information presentation, the UTS Library explained how they already had an established information validation tool for students in the form of CRAAP (Currency, Relevance, Authority, Accuracy, Purpose). Advances in AI led to the LibrAiry’s development of a new tool specifically for AI platforms – ROBOT (Reliability, Objective, Bias, Ownership and Type). There is currently no such thing as the perfect evaluation tool, so the UTS Library will continue to adapt. A combination of the two models (CRAAP ROBOT?) is one consideration. But, either way, the race is on to adapt tools as fast as AI adapts itself.

4. There’s more to explore with how educators themselves are using ChatGPT

It’s not just students using ChatGPT – there are myriad ways GenAI can create shortcuts or provide a starting point for professional and academic staff too. During the Teaching in the age of AI: Experiences at UTS case studies, Anna Lidfors Lindqvist explained that AI was being used as a feedback tool for tutors – the tutor writes dot points, which are then expanded and structured by ChatGPT. The tutors can also refine their prompts so the AI elaborates more effectively, making the task of giving written feedback quicker. A rough sketch of what that might look like in code is below.
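
To make the workflow concrete, here is a minimal sketch of dot-point expansion using a large language model. It is illustrative only – the case study didn’t specify a tool or API, so the OpenAI Python SDK, the model name, the prompt wording and the example dot points below are all assumptions rather than details of the UTS approach.

```python
# Illustrative sketch only: expands a tutor's dot points into structured feedback.
# The SDK, model name and prompt wording are assumptions, not the UTS setup.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment


def expand_feedback(dot_points: list[str], subject: str) -> str:
    """Turn a tutor's brief marking notes into structured written feedback."""
    prompt = (
        f"You are a tutor in {subject}. Expand the following marking notes into "
        "constructive, well-structured written feedback for a student. "
        "Keep the tone encouraging and be specific about how to improve.\n\n"
        + "\n".join(f"- {point}" for point in dot_points)
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model; any capable chat model would do
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content


# Invented example notes, just to show the shape of the input.
print(expand_feedback(
    ["clear structure", "referencing inconsistent", "conclusion doesn't address the brief"],
    subject="a first-year design subject",
))
```

The value for tutors is in the division of labour: the human supplies the judgement (the dot points), and the model supplies the wording and structure, which the tutor then checks and edits before sending.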

At the Research on AI for teaching and learning event, Angela Hoo showed us how GenAI was being used as an interactive tutor for students to have constructive master/novice conversations and role-plays with. The School of Computer Science is exploring how a customised Q&A can respond to students in a personalised and interactive way, with students able to give feedback on the quality of the responses.

5. The students have spoken – they want UTS to continue to encourage them to use GenAI in their studies

Deep in the bowels of UTS, a fiery debate between students and staff took place on Thursday night. The topic: Should UTS encourage students to use generative AI in their studies? While there were strong arguments on either side, the students came out on top with the affirmative. And this was echoed strongly in the last event of the week, the student-voiced AI: what’s integrity got to do with it? The message from the students on the panel was that it’s important they are given detailed guidance and support on using AI in their subjects. A shared understanding of the best ways to use AI while maintaining academic integrity is needed for both students and staff to move confidently into the future.

Feature image by Fidel Fernando
