UTS provides students with a lot of information to help them prepare for online (especially AI-invigilated) exams. Centrally coordinated emails from UTS Exams are plentiful and there is a comprehensive website. But do students read these emails? Or act on them?

This year I’ve taken on the role of designing, coordinating and teaching Accounting and Accountability, a new first year Business subject. We use flipped learning in our course, which involves guided online learning and activities, with tutorials focused on problem solving in small groups. Our large group workshops (350 students) are used for co-curricular activities: diving deep into academic integrity, understanding academic writing, building a solid LinkedIn profile and, most recently, understanding our online exams process at UTS.

Unpacking the mysteries of online exams

As soon as the exam timetable was released, there was apprehension and anxiety amongst the student population. What exactly does AI-invigilated mean? How is ‘closed book’ different from ‘restricted open book’? What am I permitted to do? What will the system flag as cheating?

Misunderstandings are common! Most of my students come to UTS straight from high school, where exam types are consistent and terms mean the same thing across all subjects. However, it is slightly more complicated at university! My fellow first year coordinators Drs Simone Faulkner, Mihalja Gavin and Ruth Weatherall encountered this earlier in the session: we all have an assessment that requires written work, but we all have different requirements.

So when it came to exams, I knew we needed to provide clear comparative information about our first year subjects and the different types of exams. I worked with my fellow Bachelor of Business first year coordinators to collate the following slide, and designed a 1.5-hour workshop to guide students through some key points.

A sample timetable for the Bachelor of Business exams and assessments

Less anxiety, more head space for exams

I worked with the UTS Exams team to develop an extensive workshop agenda, which included explaining the consent process for AI-invigilated exams, showing how to create a ProctorU account, outlining what AI invigilation flags, and describing the investigation process for flagged incidents (which may not be misconduct at all). Factoring in time to access UTS systems and field questions, we needed the full 1.5 hours to work through everything.

Our goal was to reduce student anxiety and apprehension towards these exams, allowing students more mental space to focus on exam content and skill preparation. Those of you familiar with Sally Kift’s work on transition pedagogy will know that these types of activities are part of the transition component, helping students build a shared understanding of university practices. Dr Kathy Egea, our First and Further Year Experience Manager, also notes that this type of practice aligns with Australian Higher Education Standard 1.3 on good practice in online learning contexts.

Was it 1.5 hours well spent?

According to anonymous student polling data, yes! We used Mentimeter to gauge student confidence before and after the workshop; the pre-workshop poll showed that a significant proportion of students had low confidence in their understanding, with a strong shift towards confidence after the workshop.

Column chart showing student confidence in online exams pre- and post-workshop: pre-workshop, almost all students reported low confidence; post-workshop, most students felt fairly confident.

Sessions were conducted on-campus and also online via Zoom, with a recording posted along with the slides.

Across five 1.5-hour workshops, we fielded more than 200 individual questions related to AI-invigilated exams, which we are now collating into a set of Frequently Asked Questions on Canvas.

What’s in it for academics?

Whilst information and instructions are clearly set out in communications, students tell us that they are overwhelmed with institutional emails. Making time to support them makes sense from a student perspective, but there are plenty of potential staff benefits, too:

  • Reducing the risk of flagged incidents in ProctorU – and with it the workload for the Exams investigation team, and for us as subject coordinators in withholding and then releasing certain results. This will also reduce emails about results withheld because of flagged incidents.
  • Reducing the number of queries we receive via Teams, email or Canvas discussions (I’ve already prepared some standard text, with links back to our slides, recording and FAQs, that we can copy and paste).
  • Reducing the number of queries made to tutors, by providing this information centrally through workshops.
  • Providing consistent messaging to students, rather than relying on students’ own networks or on different tutors who may have incomplete or imperfect information about the processes.

…so will it work?

Time will tell! Our exam is coming up at the end of May, when I’ll be closely tracking the following information:

  • number of queries we receive on the topic
  • number of incidents flagged by ProctorU
  • number of views of the video recording
  • number of page accesses for our Frequently Asked Questions
  • any free text comments from the Student Feedback Survey (SFS)

As this is a brand new subject, we don’t have much data to compare against – but hopefully, with this kind of support and guidance, we can improve both the student and staff experience of online exams.
