Strategies and approaches for assessment design that facilitate academic integrity while providing an engaging learning experience.
While AI has created challenges, it also presents opportunities to re-imagine assessment, particularly in the longer term. Below are some considerations for assessment re-design that use AI to enrich the student learning experience while maintaining academic integrity.
Adopting these strategies and approaches would likely require major assessment redesign. This takes time and needs to comply with faculty committee approval processes before implementation. Discussing your ideas with peers and seeking support from IML, LX.lab or PGLD (as appropriate) is strongly recommended for major re-designs in this evolving space.
AI is already part of acceptable, even expected, professional practice in many disciplines. By keeping up to date with AI use in our own disciplines, we can build authenticity by aligning our teaching and assessment with these uses as they evolve. Explaining how new forms of assessment prepare students for their future careers should build engagement and enthusiasm for the tasks.
What makes an assessment ‘authentic’? Find out in our resource on authentic assessment.
As detection tools develop, we may be able to differentiate between assessments, or parts of assessments, that allow AI and those that don’t. Scaffolded assessments delivered in steps across the session might lend themselves to this approach. AI use could be allowed in early formative steps but not for the final submission.
At program or degree level, AI might be permitted in earlier subjects while students develop disciplinary knowledge, with students then required to demonstrate their own skills and knowledge without AI in the final stages. Reverse scaffolding might also suit some disciplines, where students need to build their own disciplinary knowledge first in order to use or interrogate AI effectively later on.
This approach focuses on assessment for learning and gives students greater opportunity to demonstrate evidence of their learning journey; for more information see Assessment for learning.
Assessments could start with an AI output, then require students to ‘show their working’ in developing it: how have they interrogated the AI initially (what prompts did they put in), and how have they fact-checked and corrected the output?
This might involve students submitting the initial raw output (including their prompts), an annotated version with comments indicating what needs further checking, an explanation of their research process (academic literacy, library searches, etc), and tracked changes across multiple versions showing their corrections to the final piece. To keep marking time reasonable, these could be appendices to the final version, or an executive summary or reflection piece on how the final version was generated.
The potential for AI use in assessment invites us to think deeply about exactly what we are assessing – what do we want students to learn from this task, beyond what could be performed by AI? How can our marking criteria and rubrics reflect this?
The increasing use of writing tools like Grammarly and Word Editor has downplayed the importance of submissions being well written. It's now a baseline expectation, but remains in marking criteria as a de minimis requirement.
What aspects of our marking criteria should carry less weight because they can be performed by an AI? And which should now carry more? Consider whether areas where AI is weak, like factual accuracy and referencing, should be made more explicit.
ePortfolios are digital collections of work that provide evidence of a student's learning, growth and achievements over the period of a subject or course, and can be used to demonstrate the student learning process throughout.
Overall, ePortfolios are an assessment type that can maintain academic integrity and enhance the student learning process throughout a subject or course. They do this by providing a space for a comprehensive and reflective record of a student's academic journey to develop, while also promoting critical thinking, goal setting and collaboration.
Consider assessing students with an interactive oral assessment such as a semi-structured or unscripted dialogue between the student and assessor. You could also include authentic assessment elements that simulate a workplace scenario or task, such as an interview or negotiation. Filming or audio recording live assessments is recommended where possible to allow review in the event of a disputed mark.
Video presentations offer an asynchronous alternative to live class presentations. The video presentations can be used as a basis for class discussions, or uploaded to Canvas discussion boards for peer review and questions from other students. Be aware that AI could be used to generate scripts, and even to voice the videos. Requiring the student to appear on the screen (eg, in the top right corner) and including slides or infographics as part of the assessment may help offset these risks.
What you need to know about developing video assessment tasks
Consider including a class participation mark or giving an existing one greater weight. This will encourage both attendance and engagement with synchronous class discussions, where AI use is currently difficult. A broader ‘course engagement’ mark would allow students to contribute asynchronously at any point during the subject. This could combine live class participation with Canvas analytics on site use (students should be advised of this tracking), and assessment of the quality of contributions to discussion boards and other activities. Students could also complete a self-assessment, submit a sample of their best replies or produce a short reflection as part of this task.