This post was co-authored by Ann Wilson, Jenny Wallace, Caroline Havery and Rosalie Goldsmith.
In the previous blog in this series we offered a three-tier strategy for thinking about assessment in the age of GenAI, and suggested that the three tiers might be assessments:
- where GenAI could be used as an assistant
- where GenAI could be used
- where GenAI could not be used
(From UCL: Using AI tools in assessment).
In this blog we will explore those assessments where GenAI can be used as an assistant, for example:
- drafting and structuring content
- supporting the writing process in a limited manner
- acting as a support tutor
- supporting a particular process, such as testing code or translating content
- giving feedback on content or proofreading it
There are many examples of these uses and you may be aware of ones that are specific to your discipline. We would love to hear from you about those, as sharing practice will be a great way to build our capacity in this new era.
In Assessment redesign for generative AI: A taxonomy of options and their viability, Jason Lodge from UQ sets out the need to rethink assessment in the age of GenAI, offering a range of options for embracing GenAI in assessment redesign. In The digital evolution of higher education: From high-cost failures to high-risk futures, the authors point out that assessment is already compromised by the advent of GenAI, and that one of the positives of this is the opportunity, in fact the deadset need, to rethink our assessments.
GenAI in assessment at UTS
At UTS we have a number of examples of assessment that use GenAI in an assistive capacity. Consider this one from High Performance Science in the Masters of Sports and Exercise Science, in the Faculty of Health.
Example from Sports and Exercise Science
Students are asked to write a research report in which they critically evaluate an innovative technology, method or intervention that could be implemented in a high-performance program for a particular set of athletes. The research report needs to draw on evidence from the literature and expert opinion. Students need to conclude by recommending whether the intervention is suitable for use with their group of athletes. Students then undertake the same task using ChatGPT, and critically compare the recommendations they reached based on evidence and expert opinion with the recommendations suggested by ChatGPT.
Example from FEIT
Another example of how GenAI is being used as a learning resource is in the FEIT subject Communication for IT Professionals. As part of their group project for the Engineers Without Borders Challenge, students need to generate project ideas and investigate possible solutions. Students are given guidelines on prompting ChatGPT using the ROSE Model (Role, Objective, Style, Exemplar), which encourages them to be specific in their prompts and to engage in dialogue with ChatGPT to refine their search.
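To make the ROSE structure more concrete, here is a minimal sketch of how a prompt might be assembled along those four dimensions. The role, objective, style and exemplar wording below are purely illustrative and are not drawn from the subject materials.

```python
# Illustrative only: assembling a ROSE-structured prompt (Role, Objective, Style, Exemplar).
# The wording of each element is hypothetical, not taken from the subject guidelines.

rose_prompt = {
    "Role": "You are a humanitarian engineering mentor advising a first-year student team.",
    "Objective": "Suggest three design directions for improving access to clean water in a remote community.",
    "Style": "Plain language, dot points, no more than 60 words per idea.",
    "Exemplar": "Level of detail expected: 'A gravity-fed sand filter built from locally available materials...'",
}

# Combine the four elements into a single prompt that could be pasted into ChatGPT.
prompt_text = "\n".join(f"{part}: {text}" for part, text in rose_prompt.items())
print(prompt_text)
```

Students would then paste the resulting text into ChatGPT and refine it through follow-up questions, in keeping with the dialogue-based approach the guidelines encourage.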
Example from Speech Pathology
Professor Bronwyn Hemsley’s students in Social Media in Speech Pathology were allowed and encouraged to use GenAI in an assessment item in which they designed a strategic plan for the use of social media in a private practice. While not required, the use of GenAI was supported by an allowable-use statement in the subject outline. The only requirement was that students declare any use of GenAI in an appendix, along with a short reflection on how the output of the GenAI (ChatGPT) had assisted them in their task or influenced their final submission.
Ultimately, only one of the 12 groups opted to use the GenAI (ChatGPT) in the process of doing the assignment, primarily to ‘get ideas’ in the early stages and not in the drafting of content submitted. However, it was apparent that all groups could have been assisted with the structure of the assignment, keeping the content within the word limits and addressing the rubric elements. When asked about their reluctance to use GenAI in the assignment, students commented that they were unsure whether it would affect the quality of their written work and put their mark at risk of being lower, or considered that it would take more time to learn how to use it, fix its factual inaccuracies, edit the outputs, and complete the appendix and reflection.
They were also concerned about plagiarism, given the widespread attention this problem is receiving in the media and academic discourse. In short, the students did not perceive a net gain in using GenAI, and saw too much risk to their marks at a busy time in their training and the final session of their course. From this case example, it is possible that students may be more likely to avoid than embrace legitimate and allowable use of GenAI. They may benefit from GenAI being a required element of an assignment, so that they grapple with its inadequacies and demonstrate competence in its use (e.g. prompting, critical appraisal, correction of outputs, attribution of content and transparency). Students might also need modelling of its use, with examples demonstrating more explicitly the various ways GenAI can improve written outputs in real-world situations and scenarios.
So, what next?
There are a number of issues at play in using GenAI. We need to equip our students to work in a world with GenAI, and so we need to teach them how to use it, how to use it well, and how to employ evaluative judgement about whether the machine has produced a good version of whatever it has been asked to do.
We need to provide specific guidance to our students about the strengths and limitations of AI, and to make our expectations clear about how they can use AI as an assistive tool in their studies. We all need to learn how to prompt GenAI to deliver the content we want at the quality we require.
We need to explore the best ways to use GenAI in your discipline – what is happening in your industry? How is GenAI being used in the workplace?