Event details

Date: Monday 29 November

Time: 9:30am – 4pm

Location: The program will be held online via Zoom, with opportunities for Covid-safe social gatherings on campus for those registered.

The 2021 UTS Learning and Teaching Forum is a cross-faculty, University-wide event for the UTS teaching and learning community. This year, the focus will be on enhancing the student experience by exploring the ways we create opportunities and support learning through ‘feedback’. 

The forum aims to model effective feedback processes, creating spaces for dialogue throughout the concurrent sessions and the MS Teams site, bringing together multiple perspectives from emerging research, academics and professional staff, students, and industry, and inviting you to plan how you might put your own learning into practice in your teaching or professional work. There will also be the opportunity to consider how feedback might be enhanced through current and emerging tools and technologies.

Zoom

To participate in this forum, please make sure you have the latest Zoom version (5.8.4). You can update Zoom by following these instructions.

MS Teams site

Join the MS Teams site to view the full program with Zoom links.

Program

Time | Program Activity
9:30 – 9:35am | Acknowledgement of Country
9:35 – 9:45am | Opening Address: Feedback at UTS

Professor Shirley Alexander 
Deputy Vice-Chancellor and Vice President (Education and Students) 
9:45 – 10:30am | Enhancing feedback processes in higher education through the aggregation of marginal gains

Professor Naomi Winstone, Director of the Surrey Institute of Education

Even the highest-quality feedback on students’ work will not have an impact on their learning unless students actively engage with and implement the advice. My recent programme of research has focused on students’ cognitive, motivational, and emotional landscapes and how they influence the ways in which students receive, process, and implement feedback on their work. In this talk, I will argue that maximising the impact of feedback is fundamentally an issue of design, where opportunities for students to develop the skills required for effective use of feedback, and opportunities to apply feedback, can transform the role of students in assessment. In particular, I will argue that meaningful change can be achieved through seeking to improve small elements of feedback processes, where such improvements combine to create significant impact on learning.
10:30 – 10:40am | Overview of the day
10:40 – 10:50am | Meet our students
10:50 – 11:05am | Morning tea

Concurrent sessions 1 (11:05 – 11:50am)

Presentations will be held concurrently in different Zoom rooms.

1. View the long program on Teams

2. Click on the corresponding Zoom link

Work Integrated Learning & Feedback Loops

Achieving dual outcomes in a client-facing software development studio

Asif Gill

FEIT 

Achieving both student learning and client project outcomes is difficult in a client-facing software development studio subject. The Software Development Studio subject was reorganised in 2021 with a view to achieving these dual outcomes. From the student perspective, a balanced, cross-skilled teaming approach was applied to tackle complex client requirements. From the client perspective, clear deliverables and expectations were set at the beginning and continuously managed throughout the semester via 360-degree feedback loops. This resulted in enhanced student learning and client experience. The three key takeaways:

(1) create a balanced and cross-skill team structure,

(2) be realistic about deliverables, and

(3) focus on proof-of-concept level delivery rather than professional production-level outcomes.

Making it real: How feedback improved simulated practice

David Kennedy, David Anderson, Georgia Fisher,  Peter Rizkallah, Jacqueline Benson and Carolyn Hayes

Health

Simulation education is becoming commonplace in health education (3, 6) as it can provide realistic learning experiences that enhance clinical reasoning and performance (2, 4, 5, 9, 12). Acute hospital-based care can be daunting for physiotherapy students who have little or no experience in this area. Physiotherapy students must integrate theoretical knowledge with the practical environment of the hospital (lines, tubes, etc.), recognize and respond to changes in the patient and environment, communicate as part of the interdisciplinary team, and do so while safely providing care. Simulation education is one way to provide authentic, case-based experiences that are safe testing grounds for physiotherapy students in the acute setting (1, 7, 8, 10, 11, 13).

We developed a series of simulations using a mix of high-fidelity manikin practicals and role-play sessions where students alternate playing the patient and physiotherapist roles providing students with an authentic, practice-based learning experience. Students are also assessed in this environment, which occurs just before their first clinical placement and provides insight into how they may perform.

Student feedback has been outstanding, with multiple requests to expand the time in simulation. They have also found the simulation sessions good preparation for the real thing.
“The case studies done in the simulation rooms were incredibly helpful and made me feel more confident moving into placement and work in the future.” However, we also received feedback to make the sessions better managed and more realistic: “(take a) time focused approach in the simulation rooms, which would help get more people working through the practice scenarios and more closely replicate the demands of our workplace.” Thus, each year, we have modified these sessions by taking on student feedback about how the simulations run, their organisation, how role-play is enacted, and the environmental setup.

Although we continue to get positive feedback about how the sessions are run, we don’t know how well these sessions prepare students for placement or the workforce. Indeed, since introducing the sessions in 2018, a few students have failed their acute-care clinical placement. These students also did poorly in the subject and, importantly, did poorly on the simulated assessment. Hence, we are undertaking a research project to assess physiotherapy students’ knowledge and self-efficacy in acute care physiotherapy before and after the subject runs and, importantly, after their first placement. The data from the survey will inform us how to modify and enhance the simulation sessions going forward.

Seeking and utilizing feedback as a student and UPASS facilitator 

Calvin Kok and Lin Htet Aung

U:PASS

How has feedback shaped Lin’s career path and how do UPASS facilitators support students who are not as proactive? Lin’s personal experience and story is about how he has embraced his difficulties and evolved to carry on with his studies and university life. Lin’s story shows the influence and importance of verbal feedback for a student. It shows how feedback from mentors can be beneficial and powerful for the growth of a student and assist in the decision making of a student’s career progression. It reminds us that feedback is a constant dialogue.

This student presentation then flips this idea and asks: what if students are shy and not as proactive as Lin? How can we elicit feedback from shy students? U:PASS leader Calvin will show the different ways he has tried to gather feedback from students, and some of the reasons why students may not participate in these discussions with another student. Using the tools Calvin employs to elicit feedback from students, this discussion explores what worked best and suggests how we can integrate this into our current learning space.

Feedback Literacy & Peer Feedback

Finding feedback opportunities in online teaching in education

Leonie Seaton

FASS

Having undertaken several workshops at UTS exploring feedback to students I wanted to put my new understandings into practice. Initially my subject was to be taught face-to-face, but lockdown meant that the subject was online. Weaver (2006) indicates that students value feedback but that it needs to be well-connected to the assessment task rather than general in nature. I wanted to provide feedback to my students prior to assessment submission so that they had an opportunity to learn from the feedback and to improve their assessments. I also wanted to develop students’ feedback literacy through peer feedback activities. Peer feedback supports students to develop the skills of knowing what a good example looks like and exposes them to new ideas (Winstone & Carless, 2019).

The subject included three opportunities for students to receive feedback. The first activity involved students sharing their ideas on a group assignment in class, with peers providing feedback on the topic. I framed the activity as a staff meeting, since the students will become teachers. The second activity involved students submitting an essay to me for feedback prior to submission. The final activity was the use of AcaWriter to gain feedback on a reflection piece.

The first activity had limited success, with few comments provided, typically by the same confident students. This may be due to the class being online; however, I feel I needed to provide guidelines for the feedback, such as topic, relevance of ideas, and resources. I had assumed students would be able to use the rubric provided to give feedback, and that this would not be a new activity for them. In future, to assist students to provide feedback to each other, I will explicitly provide the rubric sections as areas to be used for feedback.

The second activity went well with about 50% of students asking for feedback. The comments were taken up by students and their assignments improved as a result, in terms of closer focus to the criteria.

The third activity was to introduce students to the AcaWriter program and show them how it works to provide feedback on a reflective essay. Reflective writing may be a new style for students, so I wanted to provide them with feedback that would assist them to understand the style, the moves needed to do this well, and how to meet the assessment criteria to a high standard. I provided a video lecture on the topic to further support their work. I felt that the lecture, coupled with the instant feedback from the program, would provide useful support for their writing. This is currently the focus for an assignment due on November 19; I will be able to share the outcome after that date.


Using technology-enhanced feedback to build advanced physical assessment skills in nurses online

Gail Forrest and Michelle Hrlec

Health

Comprehensive Physical Assessment is a core subject in the OPM course, Master of Advanced Nursing. OPM courses present logistical and pedagogical challenges because subjects are delivered over 6 weeks and in online mode. The challenge: How to teach a skills-based subject such as advanced physical assessment in online mode? Traditionally, this subject is taught using live demonstrations, manikins and simulated patients. Historically, feedback between the teaching team and the students occurred in real time.

The academic team, in collaboration with the Postgraduate learning design team, designed and developed an authentic learning experience that utilised the affordances of Canvas and other technology to provide regular opportunities for students and teachers to give and receive feedback.

The fundamental pedagogical design is based on case-based learning through simulated patients, case studies and the use of authentic artifacts such as sounds, visuals, video, animation and diagnostic data and reports.
Types of feedback included:

1. Automated feedback to ensure mastery of important concepts before progressing.
2. Asynchronous feedback from both the teaching team and other students.
3. Realistic feedback from authentic interactions (including sound and visual elements, e.g. breath sounds).
4. Observing the feedback other students received, which was also key to the learning process.

Activity types: drag and drop, multiple choice, text entry, click and reveal, poll questions, comment boxes, resource sharing, video responses, interactive diagrams, interactive case studies, click-and-play sounds and video, and interpreting diagnostic data. Technology used: Canvas, H5P, Padlet boards, Genially, Canvas LTIs such as comment boxes and social polls, and video, image and media software.

Students are taken through a modular learning sequence that scaffolds their learning in preparation for completing their assessments, one of which is an online Observed Structured Clinical Examination (OSCE) that is usually conducted face-to-face. Students found the content engaging, and those who had previously experienced a face-to-face OSCE reported that the virtual OSCE was comparable. In fact, a number of students acknowledged aspects of the virtual OSCE that enhanced their examination experience and provided an opportunity for deeper learning, which has been recognised in the literature (Coyne et al., 2021). These included having to use more complex critical thinking when ‘thinking aloud’ and clinical reasoning when asked to provide rationales for their decision making. The introduction of virtual OSCEs provides education institutions with an opportunity to robustly assess and provide feedback to students in remote and rural communities. A key learning from this work is that an online experience can be as rich, engaging and effective as one delivered face-to-face. It is important not to re-create the face-to-face experience but to develop an online learning experience that is equally effective, especially in the evolving field of digital health technologies, telehealth and virtual care.

Peer review of architecture crit presentations: a communication skills focus

Emily Edwards

IML (ALL Team)

As part of the Embedding English Language (EEL) program, I design and teach Language Development Tutorials (LDTs) each session for first-year undergraduate students in the School of Architecture who have been identified as needing additional language support. In 2021, I focused on improving the communication skills these students need to perform their final oral ‘crit’ presentations, in which they present their designs to a panel of tutors and industry professionals.

With this focus in mind, one of the LDT activities I created was a peer review activity around communication skills. Peer review involves students providing feedback on each other’s work in relation to criteria or standards, and this has been conceptualised as a form of ‘feedback as learning’ (Moore & Teather, 2013). Peer review helps students to develop their critical evaluation skills, their understanding of the criteria or standards, and then apply this learning to their own work (Moore & Teather, 2013) – in short, to improve their ‘feedback literacy’ (Hoo, Deneen & Boud, 2021).

First, I created a list of six criteria for doing a successful crit presentation, based on the communication skills we had been practicing in the LDTs over Spring session. These criteria formed the basis of a worksheet, which was shared via Google docs. Some students volunteered to practice their final crits, and the other students took notes to evaluate their peers’ presentations using the criteria, and to specify what each presenter did well and what they could improve. After each presentation, students went into breakout rooms to discuss their evaluations. Following this, we had a class discussion to reflect on how students could improve further.

The activity helped consolidate what students had learnt about communicating in architecture, and also gave them timely support to feed into their final crit presentations (1–2 weeks later). Some students were able to evaluate others’ work very clearly, pinpoint specific areas for improvement and pose feedback questions in relation to the criteria, such as “did your design follow any key concepts?” and “what makes it unique?”.

I learnt that peer review of crit communication skills was a very useful activity for architecture LDT students. In future LDTs, I would do this activity more than once and encourage all students to practice presenting. The use of one class Google doc was a bit clunky, so I would create separate docs for each group next time for a smoother experience in online LDTs.

Automated Feedback & Personalised Feedback

Providing Automated Video Feedback to Increase Student Engagement

Keith Heggart

FASS 

The Graduate Certificate in Learning Design is a new course in the Faculty of Arts and Social Sciences. During COVID-19, the course shifted to a fully online mode. While this mode allowed for wider participation, I was concerned about being able to foster students’ sense of belonging in the course, and therefore their engagement and motivation.

To address this challenge, I was keen to explore ways to provide students with personalised feedback. I had two aims: 1) develop a workflow that would be sustainable as GCLD student numbers increased, and 2) demonstrate to students the role of automated feedback as an aspect of learning design and learning analytics.

Working with Dr Lim, I deployed the OnTask rule-based messaging system to send students personalised emails. The emails were designed to increase individual students’ feelings of belonging and engagement by demonstrating that they were valued members of the learning community. At the start of each module, students completed a reading/ listening task, and posted a related question or reflection on a discussion forum. At the end of each module, students also completed a reflection rating how confident they felt that they had met the module learning objectives. Emails were tailored to each student based on this information.

After some initial experimentation, we decided to make use of videos to provide engaging and personalised feedback. To do this, students’ confidence about their progress was solicited via an embedded survey. This was used to send a personalised video message at four different levels (no response, low confidence, medium confidence, high confidence).
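
For readers unfamiliar with rule-based messaging, the logic described above can be pictured as a simple lookup from a survey response to one of four pre-recorded videos. The sketch below is illustrative only and is not the OnTask interface; the response values and file names are hypothetical placeholders.

```python
# Illustrative sketch (not the OnTask interface): choosing one of four
# pre-recorded video messages from a student's self-reported confidence.
# Response values and file paths are hypothetical placeholders.
from typing import Optional

CONFIDENCE_TO_VIDEO = {
    None: "videos/no_response.mp4",        # student did not answer the survey
    "low": "videos/low_confidence.mp4",
    "medium": "videos/medium_confidence.mp4",
    "high": "videos/high_confidence.mp4",
}

def select_video(survey_response: Optional[str]) -> str:
    """Return the personalised video matching a student's survey response."""
    return CONFIDENCE_TO_VIDEO.get(survey_response, CONFIDENCE_TO_VIDEO[None])

# Example: a student who reported medium confidence receives the medium-level video.
print(select_video("medium"))   # videos/medium_confidence.mp4
print(select_video(None))       # videos/no_response.mp4
```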

The immediate outcome was increased engagement with the course material. Students commented that they appreciated the videos, and felt as if I was talking directly to them. This meant they felt confident undertaking the further modules. Many students also responded by email – assuring me that they were engaged, and were going to remain on track or catch up. Interestingly, students wanted to watch all levels of the video, and not just the one assigned to their responses!

Given this was a reasonably small-scale study, I plan to continue experimenting both with student comments within the personalised email, to see if that stimulates more student-student discussion, and also with different ways of offering the video content to students in order to provide them with agency in the process.

Building individual connections online: using OnTask to personalise feedback

Nicole Sutton and Lisa-Angelique Lim

Business and Connected Intelligence Centre (CIC)

Remote, online learning raises the stakes for students’ capacity for self-directed learning, with an expectation that they will independently complete preparatory tasks before attending online classes. Although Canvas allows for cohort-wide announcements, I wanted to trial a more personalised approach to encouraging students to engage in effective self-directed learning tasks outside of class.

In a postgraduate subject, I trialled using OnTask to create automated emails to students (n=43), providing personalised feedback, reminders and advice about how best to approach the subject. During the semester, I sent six email batches, which were personalised to students’ attendance in class, engagement with material on Canvas, and formative assessment outcomes. I then tracked students’ subsequent engagement with materials on Canvas.
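
As a rough illustration of how this kind of personalisation works, each student's attendance, Canvas activity and formative assessment result can be mapped to tailored paragraphs that are assembled into one email. This is a generic sketch, not OnTask itself; the field names, thresholds and wording are hypothetical.

```python
# Generic sketch of rule-based email personalisation (not OnTask itself).
# Field names, thresholds and message wording are hypothetical.
from typing import Optional

def compose_email(name: str, attended_class: bool,
                  opened_prep_module: bool, quiz_score: Optional[float]) -> str:
    """Assemble a personalised message from simple engagement rules."""
    paragraphs = [f"Hi {name},"]
    if attended_class:
        paragraphs.append("Great to see you in class this week.")
    else:
        paragraphs.append("We missed you in class this week - the recording is on Canvas.")
    if not opened_prep_module:
        paragraphs.append("The preparation module for next week is now open on Canvas.")
    if quiz_score is not None and quiz_score < 50:
        paragraphs.append("Your practice quiz result suggests revisiting this week's "
                          "examples before the next assessment.")
    return "\n\n".join(paragraphs)

# Example batch: one tailored email per student record.
students = [
    {"name": "Alex", "attended_class": True, "opened_prep_module": False, "quiz_score": 72},
    {"name": "Sam", "attended_class": False, "opened_prep_module": False, "quiz_score": 41},
]
for s in students:
    print(compose_email(**s), end="\n---\n")
```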

Improving self-directed learning

The OnTask emails appeared successful in increasing the proportion of students who completed the preparatory tasks before class. In the weeks in which reminders were sent, an average of 29.5% of students completed the preparation before class, compared to an average of 25.6% of students in other weeks.

Positive student perceptions
19 students (44% of the cohort) responded to a survey soliciting their perceptions of the personalised emails. Students reported that the feedback and support “Improved my overall learning experience for this subject” (average 4.37 out of 6).

Building individual connections with students
An unexpected outcome was the unsolicited individual email responses I received from students, particularly following feedback on the first assessment task (an online quiz worth 15%). Nine students (20% of cohort) individually wrote to me in the days following, thanking me for the feedback, expressing their personal feelings about their result and reflecting on their approach to their study. By responding to each of these emails, a series of authentic, individualised dialogues emerged with each student about their progress in the subject.

Somewhat paradoxically, automated feedback can build authentic connections between teachers and students online, when the feedback is personalised and timed appropriately. The first assessment item provides an opportune moment to engage with students about their learning habits and approaches. I look forward to exploring whether such approaches can scale to larger cohorts.

AI is OK. Automated feedback in a large first year Business subject

Simone Faulkner and Lisa-Angelique Lim

Business and Connected Intelligence Centre (CIC)

Timely and personalised feedback is necessary for students to be able to revise their work and to learn from mistakes (Dawson et al. 2019; Li & Di Luca 2014). However, in the Bachelor of Business program, large cohorts and a sizeable teaching team present significant challenges to being able to support all students with timely and personalised feedback on their written assessment drafts before final submission.

In this paper I present my experience of coordinating Managing People and Organisations in 2021, specifically how I addressed the challenges noted above in the Autumn 2021 teaching session. In that session, I had 780 students, 15 tutors and 25 tutorials. I amended the scaffolding of my assessment task – a 1500-word essay – to include the AI feedback tool AcaWriter, which was tailored for the task. Students were required to upload a draft of their essay to AcaWriter, review and act on the automated feedback, then write a 500-word reflection on the effects of this process. The three key intentions for taking this approach were: (1) to give students timely access to feedback/feedforward, (2) to standardise the feedback students receive across tutorials, and (3) to introduce good time management practice to students.

We surveyed students at the end of the Autumn teaching session. From the analysis of students’ responses, we learnt that the three intentions above were satisfied, but there were other outcomes. Using AcaWriter during the drafting process helped students improve their academic writing, and gave them the skills to critique their own work. Additionally, there was a small but significant improvement in the grades of those students who accessed AcaWriter at least once.

Based on the findings of this pilot implementation of AcaWriter, our preliminary conclusion is that, for first year students who are new to the systems and structures of tertiary education, a tool such as AcaWriter is very beneficial for real-time feedback on academic writing. However, in order to advocate its usefulness, there is a need to embed feedback literacy within the curriculum. This will be our aim for the implementation of AcaWriter in the next teaching session.

Evaluative Judgement & Feedback Literacy

Developing evaluative judgements in 3rd year accounting students through collaborative activities

Amanda White

Business

The study of auditing revolves around learning how to make complex judgments with imperfect information. The regulatory guidelines are based on principles rather than rules, and students often struggle to navigate this process.

In 22522, Assurance Services and Audit, students work collaboratively (in Zoom breakout rooms) and transparently in classes using Microsoft Teams to help develop their skills at making these complex judgments. After working in groups, students come back together for a quick lesson on how to provide FEEDBACK to their peers. They then return to their groups (in breakout rooms) and provide feedback to one other peer group. Students then return to the main session, where the coordinator and co-teacher lead students in discussion – providing feedback on their feedback, and also feedback on the submissions. We also mark the most correct solutions with a clear graphic marker to help students come back to those works and compare them against their own.

Students who stayed for this part of the workshop found it very beneficial (as per Menti polling); however, not all students choose to stay for this activity – instead, most choose to opt out. The class is taught online with students overseas, so a recording is provided for equity reasons and because many experience dropouts, but I believe this also influences many students to think they can delay this practical learning task until later. It does work for some, because they do demonstrate adequate abilities in making professional judgments during exams, but many international students who are offshore certainly struggle.

In 2022, when we will continue to have students offshore, I’ll need to consider what strategies can be implemented to help encourage international students to participate (perhaps in native-language groups so they would feel more comfortable?).

Developing students’ feedback literacy via peer feedback: FEIT Studio insights

Amara Atif and Beata Francis

FEIT

This abstract shares insights from two enabling activities used to develop students’ feedback literacy via peer feedback in a new FEIT subject (Social Impact of IS Studio). In this studio, students work collaboratively to conceptualise, develop and complete a project; evaluate and give peer feedback; and reflect on the process and learning experiences. The intention is to engage students in the feedback process so that they understand and appreciate the role of ‘effective feedback’ in their learning, and to encourage reflection so they can make sound academic judgments about their own work and the work of others (Joughin et al. 2021). We define effective feedback here as feedback that is kind, justified, specific, and constructive. The two enabling activities are described below:

1) Pre-engagement of students with interactive videos to understand peer feedback – in each video, the person receiving feedback gives feedback on the feedback, closing the feedback loop and improving how advice is given:

Scenario 1: Using feedback to foster good group work
Scenario 2: Giving feedback and guidance to a peer creating project artefacts
Scenario 3: Giving feedback on a project presentation/brief

2) Summative assessment tasks related to peer feedback
At various points during the semester, students had to complete:

(a) an intra-group evaluation (10%), exercising critical thinking and creative provocation within one’s own team to enhance a specific prototype

(b) an inter-group evaluation, reviewing and evaluating another group’s work for the purpose of building team capacity in enhancing a specific prototype.

Students were provided with separate feedback rubrics in Canvas for both assessment tasks. Anonymity was built into the inter-group evaluation to remove peer pressure from students. Students were instructed to mark against the rubrics and to justify their peer feedback with examples in free-text comments. Students providing evaluations were themselves assessed on the effectiveness of the constructive feedback they gave to peers.

The impact on student learning is evidenced in student comments during the semester. Students mentioned that allowing them to participate in the process of giving feedback was very helpful, as they received “more” feedback and from different viewpoints. Providing feedback to their peers on the feedback given also gave them the opportunity to improve their collaboration, communication, creativity, and critical thinking. Students noticed that this feedback loop nudged them into self-reflecting on what they could do better. Moreover, students giving each other feedback gave me more time to focus on moderating the iterative learning process and offering assistance to those students who were struggling with the process of learning.

Feedback from the ground up: assessing video-based tasks

David Yeats, Natasha Sutevski and Andrew Francois

LX.lab

In 2021, close to 500 video assessment tasks were found across UTS faculties. There is little information available about assessing video tasks using a rubric that focuses not only on content but also on communication skills. There is also wide variance in how closely video assessments align with course outcomes and subject objectives.

For each type of assessment, the evaluation criteria can change slightly; however, for most assessments our rubric can be used to evaluate audio-visual language and communication, ethics, discipline knowledge and professional readiness, and interpersonal communication.

A workshop called ‘Design, implement and assess video-based tasks’ is now a regular part of the LX program of events.

We take an action research approach, working with academics on their video rubrics before session starts. We can collaboratively identify where tasks require scaffolding and the opportunities to embed feedback processes to better prepare students to meet the rubric criteria.

Inclusion/Belonging, Personalised Feedback & Feedback Literacy

Building Inclusion through a Whole of Institution Language Support program

Rosalie Goldsmith

IML

The Embedding English Language program provides a whole-of-institution approach to developing all UTS students’ disciplinary and professional language. The program screens all commencing students for their academic language level, and then provides targeted, subject-specific language development for students identified as requiring additional support, whether they are domestic or international. Those students who are identified as needing additional academic language support are required to attend language development tutorials (LDTs) attached to one of their core subjects. The tutorials focus both on developing students’ disciplinary language and on understanding and meeting the requirements of the subject assessment tasks.

Building inclusion through feedback in language development activities

The LDTs help students to develop confidence and to develop a sense of community where they feel comfortable to actively participate. Students frequently work with a partner or in a small group, to practise and give feedback on their own and others’ language development. Although students may initially find group work challenging, they develop confidence with practice: “At first it was so hard to me to talk with … classmate. But step by step my tutor helped us to improve our speaking and feel free to talk with … our classmates” [Design student focus group interview, Autumn 2020]. The sense of community developed in the LDTs helps students to learn, “because when you’re under the relaxed environment you will got a lot of information. If you’re nervous you just panicking … oh no I cannot do this” [Health student focus group interview, Autumn 2020].

The EEL program also enables all students to track their language development throughout their degree, and to get explicit feedback on their language by completing a series of milestone tasks, that is, existing assessment tasks within disciplinary subjects that are used to identify students who may need further language development.

Snapshot of EEL program across UTS

This presentation will include a snapshot of how the EEL program is implemented in each faculty.

Impact

Feedback from faculty staff and students has been positive – they are very grateful for the provision of the language support; feedback from students indicates that they have gained confidence and competence in responding to assessment tasks; feedback from tutors indicates that a greater sense of belonging was evident: e.g. students understood what other services are available at UTS.

 

Mentors as a bridge

Anna Furlong, Eleftheria Papadopoulou, Laurence Cai and Paris Oudomvilay

Business (IBP Mentors)

Integrated Business Perspectives (IBP) has been a part of the Business core, enrolling over 2000 mostly first-year students each year. In 2014, the IBP Mentor program was created to incorporate former students into the subject as mentors, as a support mechanism. Since these beginnings, the IBP Mentor program has evolved to be fundamental to the success of the subject, with student mentors in many ways acting as a bridge between tutors and their students, improving feedback practices and learning outcomes. Moreover, mentors provide diverse perspectives to students and tutors: translating vital information and experiences, providing effective feedback, smoothing the university transition for students, and offering a level of empathic understanding that teaching staff cannot necessarily provide. These elements contribute beyond just the transition to university, providing a human-centred foundation through personalised mentorship to support student success throughout their time at university and beyond. In this session you’ll hear from four experienced student mentors, with a total of 19 semesters as IBP Mentors, who will share their wealth of experience, providing deep insights into the learning process within a mentor-facilitated learning environment, the dynamics of peer feedback in the tutorial environment, and how these experiences play a crucial role in shaping their personal futures.

Student to student feedback using interactive video

Sally Creagh

FEIT 

We noticed that there did not seem to be many resources on student-to-student feedback available.

So, we took an opportunity to make three short videos with the LX Media team with the fictional context of a student group doing an assignment. The scenario for video one was two group members chatting after a group meeting, discussing how the meeting went and modelling useful feedback. Later, each member of the fictional group makes a portfolio and gives a presentation, so the scenarios for videos two and three show the same two students modelling giving each other feedback about that. The videos include three sections:

1) A voice-over & slide introduction covering the principles of feedback we wanted to convey.

2) The acted scenes modelling the language, body language and essential elements of helpful feedback.

3) The acted scene is then replayed, punctuated with pop-up interactive activities testing comprehension of the principles. The interactive elements were made using H5P.

The interactive videos were placed in the UTS-wide and FEIT shared repositories on Kaltura. They have been embedded in UG & PG subjects in FEIT and DAB Canvas sites, including in subjects for fashion, architecture and systems thinking for managers.

We had technical issues for one of the scenarios where the footage was unusable and had to be replaced with animation. In future, I would check rushes before ending the shoot. I would also test the videos on a focus group before releasing them to test the efficacy of interactive elements. Finally, I would plan how to promote the existence of the videos so they could be more widely utilised.

Inclusion/ Belonging

Reflections on embedding Indigenous knowledge and cultural capability into the teaching of Corporate Law

Robin Bowley

Law

Over recent years, the need for law graduates to have the knowledge and skills to interact in a culturally competent manner with Indigenous Australians has been widely recognised. There has accordingly been a concerted effort to incorporate Indigenous knowledge and cultural capability graduate attributes into subjects across the whole gamut of Australian law courses.

This paper reflects upon the experience of embedding Indigenous knowledge and cultural capability into the teaching of Corporate Law at the University of Technology Sydney.

Corporate Law is ordinarily taught in the latter stages of Australian law courses, as the subject both builds upon the knowledge developed in earlier subjects and introduces students to business processes and terminology. As Corporate Law is a continually evolving, content-heavy subject, incorporating new content might appear challenging to law teachers.

The paper discusses how Indigenous knowledge has been incorporated into the teaching of Corporate Law at UTS through tutorial discussions and research essay questions on the function of the Office of the Registrar of Indigenous Corporations; analysis of cases relating to the governance of Indigenous corporations regulated under the Corporations (Aboriginal and Torres Strait Islander) Act 2006 (Cth); and discussions of Indigenous cultural heritage matters that companies should factor into their decision-making processes. It concludes with practical suggestions for law teachers who are planning to incorporate Indigenous knowledge and cultural capability into the teaching of Corporate Law, and into law for business courses more broadly.

“Storytelling – Listening and Learning” – Developing a space for dialogue and critical reflexivity across the embedding of the Indigenous Graduate Attribute in curriculum

Associate Professor Annette Gainsford 

Associate Dean Indigenous Teaching and Learning – Office of the PVC Indigenous Leadership and Engagement

As the Associate Dean Indigenous I provide leadership of Indigenous teaching and learning initiatives, primarily related to the implementation of the Indigenous Graduate Attribute. My role is to enhance the quality of Indigenous teaching and learning, including professional development relating to course design, development and delivery. My work is multi-disciplinary and uses a range of strategies to initiate connections between academics and the embedding of the Indigenous Graduate Attribute across discipline-specific courses. One effective strategy is the use of storytelling, which can be implemented in several ways: to provide a space for dialogue between Indigenous and non-Indigenous peoples’ perspectives in curriculum, and as a means of reflective practice to assess Indigenous course intended learning outcomes.

Oral storytelling is a way in which Indigenous peoples pass on their traditions from one generation to the next (Archibald, 2008). In the same way that storytelling effectively passes on Indigenous traditions, storytelling creates a space for dialogue between Indigenous and non-Indigenous peoples to share knowledge as an effective teaching and learning methodology (Gainsford and Robertson, 2019). For example, at the level of course design and subject development, storytelling can be used as a way for course teams to openly consult with Indigenous academics, community members and industry experts in relation to Indigenous course intended learning outcomes and culturally appropriate resources. This interaction has dual benefits: firstly, for the cultural authenticity of the embedded Indigenous content across the course, and secondly, to enhance the professional knowledge and capabilities of non-Indigenous academics who are delivering the content at subject level. Storytelling can also be included as a valid form of assessment (Kovach, 2015). Used as a form of assessment, deep listening and learning through storytelling can enable students to critically reflect on how knowledge of Indigenous contexts and the nature of the profession can inform their future professional capability.

Leveraging the power of interactive systems for enhancing students’ experience

Morteza Saberi

FEIT

Nowadays, our students use advanced information systems in their personal and professional lives, so they are quite familiar with the features of these systems. While coordinating and teaching 31266, I found that teaching the theoretical aspects of information systems without going into their real applications is challenging for teaching staff and boring for students. Our students, who are exposed to advanced information systems daily, would like to see them in their subject too; otherwise, they see the subject as boring, unattractive and outdated. To this end, I revised the tutorial content to make it more engaging and gave students access to interactive systems.

Leveraging the power of interactive systems assisted me in receiving students’ feedback during the tutorial while they worked with these systems. I also realised that referring to those systems during the lecture has a direct and positive impact on students’ engagement. That’s why I plan to use them in my lecture material for the next teaching round to provide students with a more impactful and engaging lecture.

Time | Program Activity
11:50 – 11:55am | Stretch break

Concurrent sessions 2 (11:55am – 12:40pm)

Presentations will be held concurrently in different Zoom rooms.

1. View the long program on Teams

2. Click on the corresponding Zoom link

Work Integrated Learning, Self-Feedback & Personalised Feedback

Professional reflective journey of SER students

Kellie Ellis and WIL Team

Health

We wanted to build capacity in our students in both developing professional skills and creating a professional portfolio. The challenge/opportunity arose from wanting assessments to reflect real-world, work-ready skills.

The students developed professional skills in CV writing and interview techniques, which they enhanced and developed across their professional journey (pre- and post-placement CV submission and interview with subject staff). Work placement learning objectives were set in consultation with workplace supervisors and revisited in a mid-point check-in and in the final supervisor evaluation. The students wrote reflective pieces on both these experiences, their key learnings and their agility in the workplace, along with keeping a log of workplace activities.

The outcomes showed what we had hoped: the students went not only on placement but on a professional learning journey, enhancing both their professional skills and their professional portfolio. The assessments demonstrated where students’ skills were at with regard to both these areas, i.e. some students had a great professional portfolio (and would get an interview) but might fail to impress with their professional skills.

Moving forward, we are looking to improve the logbook and are currently working on ways to make it more meaningful for the students, rather than simply listing daily activities. The team is also investigating the potential for a student mentoring program involving past students. UTS Careers and industry experts assist with the delivery of content in a coursework day presented to students at the beginning of the subject.

Industry practice-oriented formative feedback in the online environment

Gabriela Quintana Vigiola

DAB

Urban Planning is a field in continuous change. Due to the practicality and applicability of the discipline, industry engagement and feedback are fundamental to enabling a meaningful, practice-oriented learning process in planning education. The instability experienced by Sydneysiders (including students, academics and industry) since the start of COVID in 2020 forced the planning program to reinvent how it engages with industry in order to continue to provide the same high-quality learning experience to our students.

To guarantee an enhanced learning experience that continued to engage industry in providing feedback and building professional practice capabilities in the online environment, I encouraged all planning staff, as I did in my own subjects, to continue inviting industry partners to deliver online lectures, both live and pre-recorded. In this process, we needed to adapt not just to staff’s IT capabilities, but also to each industry invitee’s. When pre-recorded lectures were delivered through Canvas, live Q&A sessions were organised for industry guests to provide formative feedback on student queries, as well as to deepen the skills developed through the recorded lecture. Industry guests were also invited as panel members for students’ presentations, to provide live feedback that shaped the students’ professional capabilities.

Students watched pre-recorded lectures, participated in the discussion boards providing great insights on the different concepts addressed by guests and, most importantly, interacted with practitioners, which enabled them to apply the concepts and skills learnt in professional reports (assignments). Students also had the opportunity to present in live sessions to industry partners, giving them the chance to gain practical presentation skills, which are fundamental in the planning discipline. Finally, students got to improve their assignments by incorporating the feedback to make them more industry-relevant.

The online environment brought several opportunities and challenges. Some industry experts were more open to participating in the live sessions as they could do so from wherever was convenient for them. Students had the advantage of having real-life case studies discussed and recorded for them to watch at their own pace, both before and after the scheduled session. Additionally, having the industry feedback recorded allowed them to better implement the suggestions in their assignments. The main challenge was the need to be flexible around staff’s and guests’ technical skills with regard to the online technology.

Stillbirth and grief – responding to feedback, preparing midwives for practice

Loretta Musgrave, Annabel Sheehy, Kathleen Baird and Vanessa Scarf

Health

Midwives care for women who experience perinatal loss such as stillbirth and neonatal death. Care in these sentinel events is emotionally complex and clinically challenging. Midwifery students have minimal clinical exposure to perinatal loss. As a result, many new midwives feel unskilled and unprepared for bereavement encounters.

To address the gap in their clinical learning, three subjects in the 2nd year of the Bachelor of Midwifery program were redesigned to deliver scaffolded education sessions on the topic. Utilising multiple subjects provided considerable time for the complex and emotionally-challenging subject matter to be considered using different teaching and learning approaches.

In the subject 92363, Complex Labour, Birth and the Puerperium, students attended an online tutorial with a clinician experienced in perinatal loss, who used case studies in small group work to help students examine personal beliefs about death and grief, as well as the role of the midwife. The students listened to an episode of the podcast The Unthinkable (rnz.co.nz), which prompted discussion among the students. Students also completed the Safer Baby Bundle, an e-learning module developed by the Australian Centre of Research Excellence in Stillbirth, which focuses on practice gaps in stillbirth prevention. Students also completed the Stillbirth CRE IMPROVE module, which is designed to support healthcare professionals in responding to women who have experienced stillbirth.

Clinical skills and midwifery practice were also explored in 92365, Midwifery Practice 4. Using birth stories as practical examples, referencing the Safer Baby Bundle, students were provided with information regarding communication and conversations that arise in clinical practice. These skills were then simulated in the labs.

Finally, in 92372, Working with Diversity, psycho-social complexities of pregnancy were explored to learn about trauma-informed care. This was followed by a presentation from a couple and a perinatal loss midwife, who shared personal stories of stillbirth. After this, a ‘Q&A’ session allowed students to ask questions and deepen their understanding.

Student engagement during the learning activities was considerable and positive. Students were offered support if they found the class content distressing. Verbal reports from students highlighted that the blended teaching and learning was beneficial and enhanced learning. Students highly valued the use of women’s and midwives’ stories and appreciated the lived experience and truthfulness of the guest speakers. Students were asked to reflect on the presentation and delivery of the content and provide feedback via the Student Feedback Survey, which is still underway.

Feedback Literacy & Continuous Loop

The Feedback Roadmap – visually representing Feedback in the GDMLP

Christine Giles

Law

In the Graduate Diploma in Migration Law and Practice the focus is on providing authentic assessment tasks. All formative non-assessable tasks and the feedback provided on those tasks feed forward into the assessable tasks in the subject and in following subjects in the course. In turn the assessable tasks feed forward to the Independent Capstone Assessment that graduates must pass for industry registration, and beyond to professional practice.

There are three main challenges. The first is to ensure our students appreciate from the outset the value of completing the essential formative tasks so as to encourage them to complete the tasks, even though they do not directly carry any marks. The second is to inspire students to embrace all the opportunities for feedback and to value it in all the forms they receive it. The third is to assist students to implement the feedback they receive to help them judge the quality of their own work and continually improve it.

In the Spring session, students in our introductory subject (78300) were provided with a short video presentation containing a visual representation of integrated feedback and an explanation of how the different opportunities for feedback related to upcoming assessable tasks in their first subject, to subjects to come, to the Capstone Assessment and, finally, to future practice. In the Autumn 2021 session, 73% of students in 78300 completed the formative tasks. This increased to 85% in Spring 2021.

Many students find feedback confronting, but we need them to engage with it. We provide a number of resources to encourage them to make the most of feedback and to help them understand that feedback is a two-way process, and that they need to be active players in the process of working with and applying information from others to future learning tasks. This session we introduced a task inviting students to create an Action Plan explaining how they would implement the feedback provided on their written work when revising it. We reintroduced our ‘Feedback Forecast’, which aims to make students more receptive to feedback.

We are currently working on a simple visual representation of feedback opportunities to provide to students in each of our six subjects. This will be in the form of an ‘at a glance road map’ aiming to provide a quick reference guide and a clear understanding of the benefits of completing the formative and assessable tasks.

Feedback in an IT Professional and Ethics subject

Faezeh Karimi

FEIT

Providing “constructive” and “timely” feedback in “large” subjects is challenging. In a postgraduate IT subject with ~400 students annually, we have utilised different forms of feedback (peer feedback/review, verbal feedback, feedback on assignments) on a range of tasks (project ideas, presentations, class discussion) to facilitate better learning. Our experience shows that peer feedback requires informative context for students on why they are providing feedback or reviews to their peers, detailed instructions on what to review and how to review others’ work, and finally a chance to reflect on and apply the feedback in related tasks. Teacher feedback also requires a coordinated effort from the teaching team to provide a seamless experience for students.

The Practice of Reframing Feedback for Student Learning and Wellbeing

Clarice Tan and Rachel Khalef

UTS Learning Hub

Student Learning Hub Guides Clarice and Rachel have facilitated micro-training sessions at the Hub to help students answer questions, learn a skill or make a connection.

Drawing from their personal experiences as students and running the ‘Embracing and Using Feedback’ micro-training session, they discuss the strategy of reframing different forms of feedback and its value as a resource for future learning. They also provide insight into a range of issues that may have shaped the way students perceive feedback.

Automated Feedback & Feedback Literacy

Measuring and enhancing Student Feedback Literacy through Automated Feedback

Ian Farmer, Lisa-Angelique Lim and Amara Atif

FEIT and Connected Intelligence Centre (CIC)

Student feedback literacy is an essential part of student learning and contributes significantly to a student’s capacity for lifelong learning. Feedback literacy frameworks (e.g., Carless & Winstone, 2020) have helpfully defined what competencies students and teachers need to be feedback literate. Timely and personalised feedback fosters student feedback literacy by helping students know what specific actions they should take, allowing them to improve their learning and performance while still in their studies. However, for teachers to scale this level of feedback and support to large-enrolment subjects is a momentous challenge. A viable solution is automated, data-informed feedback such as OnTask, which can facilitate instructors’ efforts to foster student feedback literacy.

As a first step towards improving students’ feedback literacy, we piloted the use of OnTask in a postgraduate Business Intelligence subject (enrolment = 101). The primary motivation of the subject coordinator (Amara Atif, FEIT) was to use this automated feedback system to support students’ learning through timely and personalised feedback that could be enacted by students, and to foster a sense of belonging in this fully online subject.

We used OnTask to send five personalised feedback emails to students over the Spring 2021 session. The feedback was personalised to students’ assignment submissions and engagement with key learning activities in Canvas. Feedback messages were crafted in line with the principles of “warm, wise feedback”, especially in terms of a growth mindset. A designed HTML template was applied to the automated messages, with the objective of increasing student response to the feedback. In addition to OnTask emails, rule-based Canvas announcements were also used to provide additional feedback to students.

We are still in the process of collecting data to understand the impact of the feedback intervention. A mid-point check-in survey asking students to rate their feedback’s helpfulness and provide reasons for their ratings (optional) was deployed in Week 7. Of the 52 students who responded, 92% rated their feedback as “Somewhat helpful” or “Very helpful”. Simple thematic analysis of students’ reasons for their rating indicates that: (1) students found the feedback usable for knowing where they needed to improve; (2) students appreciated the support shown by the teacher as demonstrated through the personalised messages.

Students’ positive reactions in this pilot implementation of OnTask provide initial evidence of students’ feedback literacy in response to automated feedback. Most students appreciated their feedback, and importantly, they found their feedback to be usable, meaning that they could enact it.

Face-to-face teamwork: timely, personalised, and evidence-based feedback

Gloria Milena Fernández Nieto, Simon Buckingham Shum, Roberto Martinez Maldonado, Kirsty Kitto, Carmen Axisa and Doug Elliott

Connected Intelligence Centre (CIC) and Health

While online learning is a critical part of the UTS experience, face-to-face teamwork remains an important competency in many disciplines and professions. Our work is tackling the challenge of providing timely, personalised, evidence-based feedback on teamwork in collaborative classrooms. In collaboration with the Health Faculty, our co-design work with both students and academics has identified opportunities for learning analytics to augment feedback for nursing students as they engage in care simulations.

In the context of teamwork simulations, nursing students volunteered to wear location and physiological sensors, while the classroom recorded movement, audio and video. Multimodal learning analytics then aggregate and visualise these low-level data streams as static and interactive visuals, providing a form of formative feedback: evidence of learning outcomes in patient-centred care and effective teamwork for students and instructors to discuss. We then evaluated the visualisations of teamwork with students and instructors, using ‘data storytelling’ as a design strategy to foreground the most important patterns in the activity data.
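To give a sense of the aggregation step involved, the sketch below reduces hypothetical low-level positioning samples to a simple per-student movement summary. The identifiers, sampling format and the single modality are assumptions for illustration; the actual pipeline fuses several modalities into purpose-built visualisations.

```python
import math
from collections import defaultdict

# Hypothetical low-level positioning samples: (student_id, seconds, x_m, y_m).
# A real multimodal pipeline would also fuse audio, video and physiological streams.
samples = [
    ("nurse_1", 0, 1.0, 2.0), ("nurse_1", 5, 1.4, 2.3), ("nurse_1", 10, 3.0, 2.1),
    ("nurse_2", 0, 4.0, 4.0), ("nurse_2", 5, 4.1, 4.0), ("nurse_2", 10, 4.1, 4.2),
]

# Group samples into per-student traces, ordered by time.
tracks = defaultdict(list)
for sid, t, x, y in sorted(samples, key=lambda r: (r[0], r[1])):
    tracks[sid].append((x, y))

# Aggregate each trace into a simple formative summary:
# total distance moved during the simulation.
for sid, points in tracks.items():
    dist = sum(math.dist(a, b) for a, b in zip(points, points[1:]))
    print(f"{sid}: moved {dist:.1f} m across {len(points)} samples")
```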

A growing body of evidence demonstrates that, through iterative co-design, we can provide new insights. Instructors have confirmed that visualising their own movement around the space is helpful as formative feedback on their professional learning and for reflecting on the strategies that different instructors take, and they can envisage using the generated reports to assist in student debriefings. Students report that the availability of instant, timely, interactive visualisations of their teamwork has great value. Extensive discussions of data privacy are informing where students and staff consider appropriate boundaries for ‘embodied’ learning analytics of this sort. We are moving towards software that will enable this to scale robustly.

We have learnt that working hand-in-hand with educators and students is a powerful strategy to explore the new possibilities of multimodal learning analytics, in a safe manner that respects educators’ expertise and ethical concerns. We, as researchers, should not assume that we know all our stakeholders’ needs, or that everything we generate will be useful or easy to interpret.

Teacher feedback literacy and the skillful use of automated feedback tools

Simon Buckingham Shum & Lisa-Angelique Lim

Connected Intelligence Centre (CIC)

Recent years have seen significant changes to (i) how feedback is understood by teachers, and (ii) how it can be supported by technologies. These two ‘tectonic forces’ are redefining the contours of both the conceptual and digital landscapes for feedback, and should be brought into productive dialogue, such that the effective design and deployment of automated feedback (AF) tools is (i) grounded in the scholarship of feedback literacy, and (ii) promotes feedback literate competencies (for both teachers and students). However, to date, there has been no systematic account of the nature of teachers’ feedback competencies when using AF tools. In the absence of clear guidance on how AF tools should be deployed responsibly, they risk being ‘bolted on’ to courses without due consideration of how they will affect student sensemaking and action and, used poorly, could even undermine students’ own feedback competencies.

First, we distinguished ‘closed’ AF tools from ‘open’ ones, where open tools give the teacher significant agency in designing the behaviour of the tool to support their feedback practices. We then reviewed the empirical evidence around the use of four AF tools (including OnTask and AcaWriter, which are used at UTS) through the lens of Boud and Dawson’s (2021) Teacher Feedback Literacy Competency framework. We ask:
(1) Which teacher feedback competencies are evident in the skilled use of open AF tools?
(2) Does the skilled use of open AF tools extend our conception of teacher feedback competencies?

Our analysis indicates that the skilled use of open AF tools requires teachers to demonstrate a wide range of competencies. Our review also draws attention to four new competencies associated with the skilled use of open AF tools, which extend our conception of teacher feedback literacy.

This provides a conceptual map for the future design, deployment and evaluation of open AF tools, to help create feedback-rich environments at UTS that can scale responsibly and effectively. Building on this analysis, we will be running training sessions next year to help build the feedback literacy of UTS academics and tutors through the skillful use of open AF tools.

Personalised Feedback & Peer Feedback

Assessment as learning – personalised peer feedback to develop effective and culturally aware communicators of science

Vanessa Crump and Yvonne Davila

Science

Scientists of the 21st century must be able to communicate with other scientists, decision-makers and with members of our culturally diverse public. This requires scientists to consider the background of their target audience so that the knowledges they are sharing will be clearly understood. Our aim was to support students enrolled in ‘Advanced Communication Skills in Science’, in the Master of Science, to become effective and culturally aware communicators of science.

Assessment as learning is a key feature of this subject, with regular peer review of drafts and work samples. The final assessment task is an oral presentation in the style of a ‘three-minute thesis’. We saw this as an opportunity to contribute to the development of the Indigenous graduate attribute, through awareness and appreciation of multiple ways of developing understandings of nature. In 2021 the task was modified and students were asked to communicate a scientific message related to a scientific perspective of Aboriginal and Torres Strait Islander Knowledges. Students prepared a draft slide and dedicated class time was set for peers and the tutor to provide personalised feedback to students for their final presentation.

Students demonstrated an outstanding ability to provide effective feedback to their peers and to communicate about Indigenous knowledges respectfully in a concise oral format. Students were surveyed at the start and end of the semester (ethics approved) to establish their understanding of concepts relating to Indigenous knowledges and their reflections on their learning experiences. Most students reported greater familiarity with concepts such as Indigenous Science by the end of the semester, and also provided richer definitions of what this means. When asked if understanding Indigenous knowledges and cultural practices might impact their practice as a scientist, many students felt their perspectives had changed and that reflecting on their cultural values and beliefs had helped improve their cultural capability. Most responded that this subject challenged (at least to a degree) some firmly held assumptions, ideas, and beliefs.

Feedback from students provided a guide for future development in the subject: ‘The examples used … were really interesting and helped me to get my head around the concepts – please include a few more to show how interlinked the two concepts (Western and Indigenous Science) are’. Following on, the subject coordinator will revise the subject content to include more opportunities to engage with Indigenous science knowledges as a key component of effective science communication.

Making the case for intensive courses for animation

Maria Juergens

DAB

While studying for the graduate certificate in teaching and learning, I researched intensive modes of delivery for a degree in animation. Based on feedback from students and teachers, I have concluded that intensive courses in context subjects will benefit everyone. Students need more flexible study options; students with lower GPAs in particular prefer intensive modes, and intensive modes are associated with higher retention rates. Casual academics will have more work, because students who fail a subject in the autumn or spring semester can repeat it as an intensive course during the summer or winter break and continue with their studies without falling behind. Our course is moving to FASS next year, and I will present my findings to my course director and FASS.

Providing a human connection in online feedback for academic language support

Sang Eun Oh and Andrew Pyke

SSU: UTS HELPS

The purpose of this session is to share information about the video feedback program piloted in Spring 2021 and to outline why it was introduced, how it was run, and what happened. The concept of online written feedback is not new to HELPS: it has been part of the HELPS Assignment Support Loop (i.e., initial enquiries: Drop-in Advice –> planning a draft as a group: WriteNow writing support –> feedback on drafts: Online writing review –> discussion on drafts: One-to-one consultations) since its trial in Spring 2018. We learned from the experience of providing written feedback that educational elements can be scaffolded to facilitate independent learning even through an online-based service, when it is placed within the Assignment Support Loop.

Disruption in communication
The Loop was disrupted when remote service delivery became the norm in 2020. Until then, HELPS had been able to communicate our caring attitude and build rapport during our first encounter (i.e., Drop-in sessions). However, in the absence of that initial contact, it became clear that the message of caring was often lost in written-only communication, particularly for students with a lower level of language proficiency. As increasing numbers of students reported emotional and environmental difficulties due to the absence of human connection, we had to act decisively to overcome the challenge.

What did we do?
We decided to ‘show’ that we cared by providing video feedback in addition to the annotated written feedback. Videos were kept short (less than 3 minutes) and contained a first-name greeting and a summary of the written feedback, delivered with a smile. We anticipated that providing video feedback would replicate the successful student engagement reported in the previous study by Cavalry et al. (2019).
Approximately 50% of online feedback sessions in Spring 2021 were scheduled to include the additional video feedback. The video feedback was sent to a random selection of students who requested online feedback on their drafts, and the Zoom recording function was used as the video tool. To evaluate the trial, students were asked to respond to a survey covering their needs, their level of engagement and what they learned, and a focus group interview will be scheduled to triangulate the survey results.

What did we learn?
The final results are yet to be analysed. However, there was a general sense of positivity from students, who began to request the video feedback themselves. Nevertheless, there are many challenges for the program to be effective (e.g., scheduling issues and technical limitations), and further analysis is needed to understand the learning impact of the video format of feedback.

Student Feedback

Engineering Mathematics Assessment Outcomes: Learning how to learn and successful application to the Engineering Discipline

Danica Solina, Chris Wong, Kate Crawford and Mary Coupland

Health

Traditionally, educators evaluate the effectiveness of a new assessment regime via pass rates, grades and student satisfaction. The impact of the changes on the intended learning outcomes may not be apparent for several years, until the new cohort of students can be compared with the prior cohort.

A more direct measure would be to look at the impact of the new assessment regime on a chain of subjects, especially when the subject being assessed is a pre-requisite.

In this presentation we report on the impact that changing assessment regimes (over 12 years) in an initial calculus subject (Mathematics 1) had on two subsequent subjects: Mathematics 2 and Fundamentals of Mechanical Engineering (FME).

We show how the introduction, in 2017, of Threshold Tests resulted in an inversion of the grade distribution in both Mathematics 2 and FME toward higher grades. Previously less than 30% of students obtained >75% overall in both subjects. With the introduction of Threshold Tests in Mathematics 1, more than 60% obtained >75% in these two follow-on subjects. This indicates students have learned how to learn mathematics and, more importantly, learned how to apply mathematics to their discipline area.

The implications for Destructive and Constructive Feedback

Aiza Khan and Devashree Veerappan

U:PASS

Constructive feedback: saying ‘well done’ and explaining precisely how to improve and how to access support – not only generic feedback, but helpful feedback that we can use in other assessments.

Destructive feedback: no feedback at all, non-specific feedback on group work, and cut-and-paste feedback.

This presentation will discuss the positives and negatives of the feedback students receive. We will first explore how feedback can be destructive, and the implications that hinder students’ progress and ability to improve in future assessments. We will discuss our own experiences with feedback, including why feedback should be personalised rather than generic, and how receiving no feedback at all can be detrimental.

Then, we will consider how feedback can be constructive and the ways students can use it to develop themselves. Constructive feedback can be positive, allowing students to reflect on their work, while also helping them to access support.

Use of feedback to enhance the quality of student-generated OERs

Mais Fatayer

LX.lab

Strategies that tap into student-generated content and renewable assessments to generate OERs have been researched for several years (Wiley, Webb, Weston & Tonks, 2017). Probably the biggest question facing these studies concerns the quality of content, particularly when students – novice teachers, though expert learners – are given the role of OER authors. There is empirical evidence showing the value of tapping into student-generated content such as renewable assessments (Fatayer, 2016; Tualaulelei, 2020). Engaging students in constructing learning resources is a form of active learning, contributes to their portfolios, and gives academics an incentive to repurpose student-generated content in assessment design to develop OERs. At the heart of this process is the feedback mechanism that scaffolds students’ knowledge production through a set of tasks, where students can use the information to improve their work and make a difference in their learning (Ajjawi & Regehr, 2019).

Previous research has proposed a number of approaches to evaluating the quality of OERs (Achieve, 2011; Vladoiu and Constantinescu, 2013; UKOER Evaluation & Synthesis, 2014). The evaluation criteria developed in these studies cover instructional design, technology-related aspects of OERs, alignment to learning standards, quality of explanation of the subject matter, utility of materials designed to support teaching, quality of assessment, opportunities for deeper learning and assurance of accessibility. It is common practice for academics to use these evaluation criteria to provide feedback on assignments. However, when students are required to use evaluation criteria not only to assess their own work, but also to assess its quality as a resource for teaching others, further adjustment of the criteria is needed. Additionally, even with detailed instruments for evaluating OERs, the existing criteria may not be suitable for evaluating openness in learning resources, which is significant for creating OERs. For example, the use of open publishing licences and the currency of learning resources are important aspects of OERs, as openness indicates the flexibility of content to evolve through reusability and the contributions of others; hence these criteria need to be evaluated in an explicit manner.

This presentation will demonstrate an approach that uses a set of evaluation criteria for student-generated OERs focusing on three aspects: (i) technical; (ii) openness; and (iii) educational. Importantly, the approach uses feedback opportunities during the academic semester to enhance the quality of student-generated OERs. Furthermore, academics, as co-creators and facilitators of student-generated OERs, offer formal and informal feedback to ensure the accuracy of the learning content and its alignment to learning objectives. The presentation also includes results from a previous study demonstrating the use of the evaluation criteria on student-generated OERs. By the end of the presentation, academics will be able to use a set of evaluation criteria to develop a rubric that assesses student-generated OERs.
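To make the three aspects concrete, the sketch below shows one possible way to encode such a rubric and aggregate ratings per aspect. The specific criteria, scale and grouping are illustrative assumptions rather than the instrument used in the study.

```python
# Illustrative rubric for student-generated OERs, grouped by the three aspects
# named above (technical, openness, educational). Criteria here are invented
# examples, not the presenter's actual instrument.

RUBRIC = {
    "technical": ["Media render correctly on common devices",
                  "Files are in editable, widely supported formats"],
    "openness": ["An open licence (e.g. CC BY) is clearly stated",
                 "Sources are attributed so others can reuse and adapt"],
    "educational": ["Content aligns with the stated learning objectives",
                    "Explanations are accurate and pitched to the audience"],
}

def score_oer(ratings):
    """Average 0-3 ratings per aspect; ratings are keyed by criterion text."""
    summary = {}
    for aspect, criteria in RUBRIC.items():
        scores = [ratings.get(c, 0) for c in criteria]
        summary[aspect] = sum(scores) / len(scores)
    return summary

# Example: rate everything 2, except a clearly stated open licence (3).
example = {c: 2 for cs in RUBRIC.values() for c in cs}
example["An open licence (e.g. CC BY) is clearly stated"] = 3
print(score_oer(example))  # {'technical': 2.0, 'openness': 2.5, 'educational': 2.0}
```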

Time | Program Activity
12:40 – 1:30pm | Lunch break – Picnic on the Alumni Green for those registered and pictures with Santa 🙂

Technology Showcase (1:30 – 2:05pm)

Presentations will be held concurrently in different breakout rooms. During the 30-minute tech showcase, you may join up to TWO sessions. You will join the first room and be prompted to join your second room after 15 minutes. Each session will run for approximately 10-15 minutes.

1. View the long program and decide which sessions you wish to join​

2. Find and click ‘Breakout rooms’ on your Zoom bar at the bottom of the screen​

3. Click ‘Join’ for your chosen session​

Audio Feedback: Save Time and Create Connection

Marty van de Weyer

LX.lab

It’s easy to create audio clips instead of text responses in Canvas, and students really appreciate the added feeling of connection and understanding. Using the framework we will cover, it’s also faster than adding typed comments.

‘Message students who’: Tips for efficient responsive feedback in Canvas

Anna Stack

LX.lab

Use Canvas to send tailored messages to groups of students in a subject based on specific criteria. This feature can cut down on individual email time while still providing a targeted and personalised experience for students.

Live feedback with Zoom polls

Richard Ingold and David Yeats

LX.lab

Zoom polls are a great way to gauge students’ prior knowledge, understanding and engagement, and to add some interactivity to your online classes. We’ll show you how to set them up.

Mentimeter: Increase engagement and gauge understanding

Delwar Hossain

LX.lab

Find out about Mentimeter (our online polling tool) to design activities to increase engagement, collaboration and reflection as well as gauge understanding and opinions in class. Mentimeter provides instant feedback and can be integrated with various pedagogical practices in an active learning space to suit your students and learning environment.

Collaborative digital tools for rapid feedback

Mais Fatayer

LX.lab

Collaborative digital tools (like MS Word, MS Whiteboard, and MS Teams) make it possible for students to carry out cooperative learning efficiently and effectively in different learning modes. Importantly, teachers can provide real time feedback using a range of features such as comments and reactions! 

AcaWriter for 24/7 instant feedback to support students’ academic writing

Simon Buckingham Shum

Connected Intelligence Centre (CIC)

Academic writing is hard; students need feedback to know how they can improve. Academics don’t always have time to look at students’ drafts and provide formative feedback on drafts-in-progress. In this presentation, we showcase AcaWriter and how it can be used to support students’ writing through instant feedback any time of the day. 

Automated feedback to support students’ learning: OnTask

Lisa-Angelique Lim 

Connected Intelligence Centre (CIC)

As teaching and learning increasingly takes place online, how can we communicate timely and personalised feedback to support students’ learning, and at the same time foster a sense of belonging? In this presentation, we showcase how you can use OnTask to support your students’ learning, and to build a sense of belonging, in your subject. 

Using TRACK Learner to improve graduate employability

Joseph Jung, Alireza Ahadi, Georgia Markakis, Simon Buckingham Shum and Kirsty Kitto

Connected Intelligence Centre (CIC)

The TRACK project aims to help UTS prepare its learners with the skills needed for the workforce of the future. Student- and staff-facing tools help us link our curriculum to in-demand skills, and support students in working out gaps in their CVs with respect to different career goals. Come along to this demonstration to learn how you might use these tools in your course design and class activities!

Concurrent sessions 3 (2:05 – 2:55pm)

Presentations will be held concurrently in different Zoom rooms.

1. View the long program on Teams

2. Click on the corresponding Zoom link

Work Integrated Learning & Evaluative Judgement

Tea for Two: Generating professional identity as part of the learning-assessment cycle

Katherine Bates

FASS

The English suite of subjects in the Bachelor of Arts Bachelor of Education degree actively encourages two-way investment practices. However, a challenge has been to find ways that emancipate students from one-way information transactions, which replicate the banking model of education. This is particularly evidenced by an increase in dependency on assessment exemplars. While the procedural component of assessment construction is necessary, cut-and-paste assessment exemplars are problematic in a number of ways.

Primarily, replicating an action without further cognitive processing decreases the potential for skill internalisation (Zimmerman, 2013). It also limits demonstrable independence in applying, evaluating and creating new products autonomously. This is of particular concern because the work of teachers requires them to apply curriculum and pedagogical knowledge to each class they teach as they design and adapt programs to suit learning needs. Finally, with regard to professional identity, teachers are required to create and lead two-way active learning communities, and employers are looking for these graduate qualities in teaching professionals.

A pedagogy for empowerment was adopted to replicate professional practice and encourage students to invest in their own learning. It involved students in self-recognition, self-confirmation and reflexive thinking. To address this issue, I provided a nested practice that included:

1. Assessment tasks which contribute to professional practice = Compare and evaluate one’s own work that emulates tasks they undertake in their career.
2. Annotated excerpt exemplars = Developing understanding of what quality performance looks like.
3. Templates with instructions for each task element = Reducing cognitive load in organising ideas to free learners’ focus for applying and creating.
4. Recorded instructional bites aligning with constituent parts = Developing self-regulatory strategies by working on assessments in logical chunks.
5. T-4-2 podcasts with students who were pacing their study and assessment completion = Modelling self-regulated learning.
6. Self-marking to encourage self-recognition and self-confirmation through an evaluative judgement process = Judging the quality of work against a reference-based criterion and modifying performance accordingly.
7. Revisiting assessments across course work to include reflective and reflexive practices = Providing opportunities to improve product quality using constructive feedback cycles.

The approach provided the necessary procedural knowledge about assessment tasks while encouraging the cognitive process dimension of learning. However, I would include T-4-2 podcasts at more regular intervals over the course and maintain self-regulatory discussion boards for sharing positive affirmations and successful study habits.

Feedback as professional practice

Roger Hadgraft and Beata Francis

FEIT

The FEIT Summer Studios ran during 2018-2020, with a pause in 2021. The Studios are opportunities for both academics and students to engage with complex challenges in a collaborative environment. Key principles have included design thinking and agile methodology, both key aspects of professional practice in our disciplines. Agile enabled us to incorporate regular feedback into each studio: groups received feedback on their progress in every class, with formal gateways established at the quarter points (sessions 3, 6, 9 and 12), when they submitted their portfolios for formal feedback.

Students worked collaboratively on their project across 12 classes of 3 hours each, spread across 6 weeks in Jan-Feb (2 classes per week, Mon and Thurs). The four phases of the session were: define the problem, explore solutions, develop a prototype solution, and refine the solution for the final presentation, which formed part of the exhibition day, when the work of all the studios was on display for interested onlookers. Students demonstrated considerable pride in their achievements at this final presentation.

Agile methodology relies on a process of continual improvement in the development of a product. It was developed in the software industry for situations where the end requirements are difficult to define at the start, and where it is better to build and refine than to use the traditional design-and-construct process.

Applied in our Summer Studios, this process enabled studio leaders to continually monitor student and group progress and to constantly nudge them towards the common goal. Given that students played a big role in defining their project, academics then became facilitators and project managers to ensure successful outcomes through regular feedback, both formative and summative. Assessment was by 100% portfolio, with four formal assessment points.

Feedback from students has been extremely positive, with many students wanting more of these subjects within the formal curriculum. We now have 80+ studios spread across the Faculty, with more under development.

We refined our staff training between the years. In particular, we realised in 2018 that academics needed a clearer model for how to manage students working on challenges where they have not been taught the answer. This student-centred, studio-like approach has then been adopted by many of those academics in their other subjects.

We also realised that we had to teach students how to work and learn in studios. If they don’t know the answer, they require process skills to work towards finding an answer. This is the nature of professional practice.

Portfolium: A Whole of Course WIL Approach to Reflective Practice Among Public Communication Students

Kate Delmo

FASS

Our undergraduate students in the Bachelor of Communication (Public Communication) are equipped with theoretical and practical knowledge for developing and executing communication strategies that address issues of social impact. Once they graduate, we, their educators, are confident that they have been introduced to the outputs and skillsets expected of them by the industry. During COVID-19, we expected students to adapt and be flexible in the changing teaching and learning environment. However, our Discipline realised that we had not provided our students with opportunities to reflect on the outcomes of their learning. Do we know what they truly learned? Have we asked them whether working on assessments taught them other skills that, albeit not assessed, are important to them as future communication professionals? How do we know what they know?

Hence, in line with the reaccreditation of our undergraduate degree in 2022, I initiated a whole of course approach to reflective practice amongst our students in the newly revised Bachelor of Communication (Strategic Communication) degree through the use of Portfolium, UTS’ recently acquired digital portfolio software. Reflexivity as part of student learning is critical to producing graduates who are workplace ready. It is a key element of the Work Integrated Learning (WIL) initiative of the University, which further prepares students to think about industry practice while they are at university. With the approval of my Discipline colleagues, I pilot-tested Portfolium in Spring 2021 in the undergraduate first-year subject that I coordinate, 54042 Principles of Public Relations (175 students). I liaised with the LX.lab to embed Portfolium in the subject’s Canvas site. I used Portfolium as a reflective tool for skills development amongst students enrolled in the subject. I created two Portfolium tasks in Canvas that served as hurdle tasks towards completion of both the theory and practical assessments required by the subject. I asked students to:

1) Create their Portfolium profiles

2) Write Reflective Notes in response to questions that prompted them to focus on the skills the assessments required, and

3) Upload digital artifacts as evidence of the work that they have done.

The Portfolium tasks were not assessed. It was important that I asked students to focus on skills development, rather than on actual assessment submissions, because skills are what employers look for.

Because the Portfolium tasks during the pilot test period were not officially assessed, compliance was only 15%. What is notable, however, is what those students (the 15% who completed the tasks) demonstrated in terms of skills reflection. Anecdotal testimonials from selected students and written comments from my teaching team members highlight that Portfolium can help students build their emerging professional identities from their first-year subjects all the way through to their capstone subject. Students learned that having the space to think about skills relevant to industry while working on subject assessments allows them to make the connection between university and professional work. This is key to what WIL encourages students to appreciate. Also, preparing a digital portfolio that can be easily shared with potential employers builds students’ confidence in developing their narrative as Strategic Communication graduates at UTS.

1) There needs to be a Discipline-wide discussion about whether the Portfolium reflective tasks should be marked as components of assessments. This may increase students’ motivation to be present in the Portfolium community.

2) It is important for our Discipline to map which assessment/s per subject in our degree has/have a Portfolium component. This is critical to achieving the whole of course approach that can help students gradually build their websites as they mature in their degrees at UTS.

3) I will bring in UTS Careers to help us identify skillset development touch points across subjects.

Work Integrated Learning, Peer Feedback & Self-Feedback

The introduction of industry and peer feedback to group design project proposals

Jeremy Lindeck, Eva Cheng, Tim Boye and Tanvi Bhatia

FEIT

Engineering Communication and Communication for IT Professionals are group work-based subjects that teach 1800 first-year students yearly. Students learn and apply professional practice skills through a group design proposal for Engineers Without Borders (EWB).

The main assessment is an end-of-semester group report. Previously, at mid-semester, groups submitted a proposed outline of their design solution (worth 15%). This outline was often largely unchanged in their final report, indicating that tutor feedback was minimally applied. We aimed for students to take ownership of their feedback and to gain a greater understanding of the task and of the role of feedback ‘conversations’ in learning.

We recruited volunteer ‘Design Guides’ to give validity and more context to the feedback received. Because the Design Guides are higher-year students or industry professionals, we felt our students would relate to their similar profile and industry experience. The Design Guides’ role was to visit classes, ask questions and give feedback on the design processes.
To encourage student ownership, we instituted a mid-semester 3-stage peer review process (5% pass/fail mark for each stage):

Stage 1: 3-min group presentation outlining the proposed design
Stage 2: Each group’s draft report is reviewed by another group, which completes a checklist of open-ended questions, presents the feedback and checklist to the report’s authors, and finally discusses the draft with the Design Guide.
Stage 3: Each group submits a summary of their feedback, and how they would apply it to their project.

Students were engaged in the process. Some groups said they learned more from giving feedback than receiving it: giving feedback clarified their understanding of their design, allowed them to compare their work, and inspired ideas for improvement. Because the task was a pass/fail assessment, there was less pressure on students, so they were able to look at their work more objectively. Further, the Design Guides provided a different perspective and a glimpse of what first-year students can aspire to.

Overall, students engaged in and enjoyed the peer feedback process and working with the Design Guides. This gave them confidence in the validity of the feedback received and that they were “on the right track”. It also allowed students to reflect on the design process and its application in their project. The main difficulty in introducing peer feedback was the logistics of moving student groups and Design Guides online. We look forward to implementing this approach next year for in-person learning, especially as we are teaching in the large collaborative classrooms.

The role of feedback for integrating authentic research tasks into teaching practice to motivate students and develop the skills of self-directed learning

Diep Nguyen

FEIT

The art of teaching is to motivate students to develop the skills of self-directed learning. This has become increasingly critical, especially for fast-changing fields like engineering and information technology, where the life-cycle of new technologies has become significantly shorter. In this presentation, we discuss how to motivate and equip undergraduate students with self-directed learning skills via authentic research tasks, and the role of feedback in this process. Specifically, as part of authentic assessments in flipped learning, we expose undergraduate students to authentic research tasks/activities in which they work collaboratively in groups to investigate emerging technologies, replicate research results or an aspect of the technologies, then identify shortcomings and offer ideas to improve the practical systems. This process allows students to learn not only through reading and experimenting, but also by applying what they learn/read to state-of-the-art systems/technologies.

Unlike for postgraduate/HDR students, the success of authentic research tasks for second- or third-year undergraduate students, who do not yet have fundamental background knowledge of the field, is underpinned by creating effective and timely feedback channels. First, the collaborative research task requires students to discuss and give peer feedback to each other. Second, early feedback from teachers/instructors plays a pivotal role in motivating and shaping the students’ research directions and methodologies, guiding them through the early challenges of what to read and where to find and select the right materials and the right simulation/coding tools. Students benefit from actively discussing with their peers, tutors and instructors to get feedback on why their simulations/experiments do not work and how to fix them. In our teaching, especially during the COVID pandemic, we even created on-demand, 24/7 feedback channels via Teams, where students could reach their peers/instructors/tutors for answers/comments almost whenever they had questions, and in an interactive manner. Third, to provide and design suitable research tasks for students, we regularly seek feedback from our colleagues and industry partners on emerging technologies. In this regard, our students benefit from UTS industry partnerships as well as the collaborative teaching/coaching team (e.g., via UTS Rapido, a technology transfer initiative at our faculty). Motivated by the research tasks, several students have gone on to work on open practical problems for their capstones, winning various awards and presenting their working prototypes/systems at industry exhibitions (e.g., CEBIT Sydney).

With active mentorship and effective feedback points, collaborative research projects, as authentic assessment tasks, can become the driver of students’ learning journeys. Such an exploration journey helps undergraduate students develop the skills to equip themselves with state-of-the-art knowledge and become innovative leaders.

Embedding career awareness into the Law curriculum: Preliminary Steps and future plans

Robin Bowley and Ruth Wilcock

Law

A Law degree can open a wide range of career opportunities, including but by no means limited to private practice as a solicitor or barrister, as well as roles within large corporations, government, non-government organisations and the media. As Law graduates progress through their careers, they can often experience career planning challenges. These can include deciding whether their current role is right for them; how they might transition into alternative roles; how they might draw upon their existing knowledge, skills and experience; and how they might identify and address gaps in their current skills.

This paper will discuss a project that is underway within the Law Faculty to educate Law students about useful resources and strategies to assist them in the often challenging process of planning their future careers.

The project is being embedded into subjects at various stages of the Law degree program. The targeted modules provide an opportunity for students to critically reflect on important issues such as career design and decision making; building a professional identity; appreciating the expectations of industry and professionalism in the legal services industry; and strategies for navigating different workplace scenarios.
Whilst the full implementation of the project has been somewhat complicated by the COVID-19 disruptions, encouraging feedback on the targeted modules has been received through student focus groups and is being incorporated to further improve the resources for this project.

As well as assisting in the incorporation of work-integrated learning into the Law degree program, it is intended that this project will assist Law graduates to develop the professional skills and passion for life-long learning that will help them to succeed in the legal services industry of tomorrow.

Self-Feedback & Feedback Literacy

Object AND reflect: Encouraging student engagement with feedback

Catherine Robinson

Law

Requests for review of results as a ‘defensive response’ are an ongoing challenge for academics, professional staff, and students. The issue this presentation aims to address is improving the quality of requests by encouraging students to demonstrate engagement with, and reflection on, feedback, i.e., to object and reflect. It will be delivered from the perspectives of diverse student cohorts and assessments: a research essay task in a postgraduate core law subject (weighting 35%), and a collaborative problem-solving task in an undergraduate law offering to the Business School (weighting 30%).

The presentation will examine how this was achieved, including through strategic communication about feedback expectations, student opportunities to interact with feedback, and ‘showing vs telling’ (Nicol, 2010). In addition to improved quality, there was a reduction in the volume of requests and in further reviews escalated to the Responsible Academic Officer. Importantly, students displayed a high level of self-awareness, as demonstrated by their ability to reflect on their performance and identify areas for improvement.

The presentation will conclude with lessons learnt and implications for future feedback.

Using rubrics as an educational intervention

Sonia Matiuk, Judy Smith, Caroline Havery, Amanda Wilson, Carmen Axisa and Elizabeth Brogan

Health

We set out to see whether embedding an in-class learning activity, in which students examine a rubric used for marking and feedback on a written assignment, could enhance the clarity of the rubric and the assessment requirements for students, and consequently make marking easier for the teaching team. The subject this activity was piloted in is a first-year undergraduate nursing subject which runs intensively over 4 weeks with a cohort of 550 students.

The in-class activity involved students reviewing and critiquing a rubric to improve their understanding of an assessment task before they even started writing the assessment. Embedding this activity at the end of the first tutorial meant students could make connections with one another, learn through peer engagement and all had equal opportunity in the development of the rubric. Student comments were then incorporated into the final rubric that was made available to students for preparation of the assessment and used for marking.

It didn’t go quite as hoped, with few students providing feedback on the rubric itself to modify it; however, the questions they came up with in response to the rubric criteria indicated they understood what was required of them in the assessment. Also, some comments made by students weren’t necessarily focused on the current rubric, but on the activity itself, which provided some interesting insight into student perspectives on what activities they would like to engage in to prepare them for academic writing.

Our preliminary conclusion is that this is a worthwhile learning activity to explore further for development and implementation in this subject and others in the degree. To enable this, we have secured a small School-based grant to explore student perspectives via focus groups once the ethics proposal has been approved.

Making feedback actionable for students

Joshua Dymock

UTS HELPS

At UTS HELPS, we support thousands of students every year across a huge range of disciplines and assignment genres. While many students come to us with the belief that we will “fix” their writing, our approach is instead to give feedback that prompts students to take action themselves to improve their writing. While this approach has been successfully honed over many years of face-to-face consultations, the challenge was to translate this success to online writing reviews, where the advisor and the student have no direct interaction. Writing reviews are popular with busy online students who may struggle to find a suitable time to book a consultation, but the sessions present a challenge in ensuring that the feedback is still actionable for students, and delivered consistently across a large team of learning advisors.

We developed a set of guidelines for giving actionable feedback, and combined this with peer review and scaffolded self-reflection amongst the UTS HELPS team. The aim was to ensure that feedback given to students is both consistent and actionable.

Feedback from students suggests that even those who just wanted someone to fix the problems in their writing came to appreciate the importance of learning to understand for themselves what their main writing issues are, and more importantly, how to address them. We learnt that scaffolding, where students need a high level of support in the early stages, but gradually require less support as their skill level and confidence increases, is still possible through written feedback.

Random reviews of the feedback being given to students suggest that there is still sufficient inconsistency to warrant further action. Although scaffolded self-reflection has been useful in encouraging advisors to reflect on how well they are adhering to the guidelines, peer review is likely to be required on an ongoing basis to maintain the quality and consistency of feedback.

Peer Feedback

Utilising Peer Feedback in Online Digital Marketing Subjects

David Waller, Kaye Chan, and Melissa Clarke

Business

The Graduate Certificate in Marketing and Digital Strategy is an OPM (Online Program Management) course for students who do not have a marketing background and would like to understand the basics of marketing and digital marketing. It started in session 5, 2021 with Customer Centric Marketing. During development it was known that this would not only be a foundations subject, but that the students would either not have previously studied Marketing, or not have studied for many years, and would not have studied a course that is 100% online.

The content on the Canvas pages was broken up with general questions and tasks. To encourage engagement, the students were asked to not only write their answers but to provide peer feedback on what their fellow students had written, including their main assessments. While it was greeted with doubt in the design stage, in practice the students posted a variety of interesting feedback/comments.

The peer evaluation tasks produced a range of responses, which greatly helped in bonding with the students and in engaging them with the content and with each other. The peer feedback was categorised under the headings below, with an example for each:
Connecting with peers
– Based on name, pets, location or excitement – “Awesome Alex, nice meeting you :)”
Cheering Peers
– Encouragement – “Great analysis Kate!”
– Support – “Great insights about your company Jacqueline!”

Asking Questions
– Clarifying questions – “Have other airlines responded to the pandemic? if so, why did you choose Qantas?”
– Question prompting further discussion – “Do you have any insight on how the organisation plans to continue their growth and success post COVID?”

Personal Touch
– Personally relating to assessment case – “Interesting analysis for this segment! This sounds like it’s being targeted towards me haha…”
– Personally benefiting from the peer’s example – “Hopefully, these insights help me to make a purchase for running shoes in the future; I love how well explained you write. Thank you for sharing this!”
– Justifying a different view – “Apart from the sustainable focus, perhaps a more local focus with perhaps athletes in the local community.”
– Relating to their example – “I will definitely try this!”
– Learning from their example – “I hadn’t heard of situation with mislabelling products, that’s so interesting!”

The OPM subjects have a continuous improvement process, which means subjects are reviewed each time they run. While the subject Customer Centric Marketing is being reviewed, the online tasks with the encouragement of peer evaluation will remain. As one student said: “I am blown away by the answers and constructive feedback.”

Laboratory classes during lockdown: lessons in good feedback

Annette Dowd and Camille Dickson-Deane

Science

The COVID-19 pandemic imposed restrictions on laboratory classes, resulting in highly modified course designs. We ran an optics laboratory course for 50 students in 2020 under “social distancing” restrictions (hybrid mode) and in 2021 fully online. Evaluation of these two courses, in terms of student learning outcomes and student feedback, shows that the 2020 hybrid design was received by students as successful, but the 2021 online design was not as well received as its hybrid version. The question is: why, and what have we learnt?

In this presentation we look at the difference in student experience in the two courses from the perspective of feedback. Traditional science laboratory course design includes plentiful natural feedback opportunities, which are usually taken for granted by students and academics. As more aspects of the course are transferred to online platforms, the interfaces for feedback become mediated by technology. Attention needs to be paid to designing the way in which feedback is to be shared via these technologies.

Students in the hybrid course learnt vicariously 50% of the time by joining a synchronous session with one student partner in the laboratory, and they learnt face-to-face the other 50% of the time in a laboratory with demonstrator, peers, and their online partner. The experiments and theory-based activities were identical. Students in the fully online course learnt vicariously 100% of the time by joining a synchronous session run by the demonstrator in the laboratory. The first situation resulted in a highly engaged group of students but the second resulted in a group of students who rarely engaged with their peers or demonstrator and who eventually displayed more misconceptions in related assessment items.

When we understand feedback as a dialogue, it becomes obvious that the hubbub of the hybrid course laboratory (peer-peer and peer-demonstrator conversations) was an environment rich in just-in-time feedback loops. In contrast, the fully online course had feedback loop opportunities with the demonstrator and peers in a Zoom session, a design created out of necessity and unfortunately seldom used. This presentation will outline the lessons learned from understanding the factors that contributed to the success of the hybrid offering.

How can students from minority groups thrive through feedback? 

Betty Mekonnen and Elham Hafiz

Inclusive Practice Fellowship

For students entering university with lower levels of academic readiness or with multiple competing commitments, the stressors associated with the academic demands of a course can feel overwhelming. Students should be able to attend their classes with the intention of being confident learners; however, this is not always the case. Our sense of belonging tends to be displaced by the Eurocentric and patriarchal curricula that society currently embraces. Through transformative feedback, peer mentoring and co-creation, we can enable minority students to thrive.

Continuous Loop Feedback, Closing the loop & Technology

Learning progress checks

Murray Elder

Science

The challenge in large first-year maths subjects with diverse cohorts is to maintain engagement and to sustain a gradual building-up of skills week by week. Many subjects at UTS have at most 3 large assessment items, with the effect that students ignore the subject until a large assessment is due and then try to learn the content specifically for that giant task, which is stressful for them and ineffective. My approach is to use short and frequent (weekly) assessment tasks called “learning progress checks” that students can confidently do and discuss, and on which they get rapid feedback (right/wrong answers, suggested improvements to their method/proof style/setting out, or just a “well done”) from their tutor or me. Lecture content, tutorial practice tasks and assessment are all explicitly linked in a weekly cycle, so the answer to the lecture question “will this be on the test?” is a definite “yes”.

I will show SFS comments, examples of the learning progress checks, and how Canvas is used for them.

Human X Institution X Feedback: Looking for Levers to Adjust Feedback Practice

Martin van de Weyer

LX.lab

Feedback as part of teaching is a deeply human process; however, it is fundamentally embedded in an institutional framework. A balance between the affective, logistical, pedagogical, monetary and digital/physical can be difficult to find.

In order to understand the needs of the humans at the core of this issue, and to identify the most effective levers for support and change from an institutional point of view, a team within the LX.lab has spent the past year using a human-centred design (HCD) methodology to map this problem space. Conversations with students, sessional teaching staff, subject coordinators, and learning and teaching support staff have revealed a number of key foci to attend to at the current time: reinforcing the rhetoric of a ‘feedback ecosystem’; considering ways to make feedback ecosystem changes relevant to stakeholders (by tying them to priorities or reducing load); finding ways to enable and encourage ‘feedback on feedback’; creating institutional value for formative/‘informal’ (non-assessment response) feedback; focusing on student feedback literacy/agency; supporting sessional staff in feedback practice; and finally, pursuing and promoting audio/video feedback.

This investigation and synthesis also led to the creation of a framework for addressing future learning and teaching problems of a similar scale and multifaceted nature.
This presentation will unpack, and provide the opportunity to discuss, these insights and outcomes with the aim of highlighting the most useful avenues you can pursue to grow and support your own feedback practice regardless of your role.

Teaming up in Zoom – an affordance based technology enhanced feedback system for Public Relations Pedagogy

Sameera Durrani

FASS

Zoom offered the key affordance of breakout rooms. Teams offered the affordance of Worksheets/Posts/Slides – captured, documented text that could be reviewed at students’ convenience, including by those who missed the class due to poor internet.

– Collaboration in Teaching Teams: I designed one Team for teaching staff, and one for every tutorial. All teams had 12 channels, one for each week. A recurring Zoom link was posted in the General channel for each tutorial. Tutors were encouraged to post in the Teaching Team each week to let me know if they had any issues, or if their students were confused about anything. This was helpful for troubleshooting across a large cohort.
– Each week, I designed two documents for my tutors: tutorial slides and a Word-doc-based worksheet. I uploaded the designs in the Teaching Team (Files tab). Tutors could download and amend them, then upload them into the corresponding weekly channel in their own team.
– Efficient Tutor and Peer Feedback: Students were assigned to approx. 5 breakout rooms. As a team, each breakout room worked on a challenge. Tutors could follow student progress for ALL teams in real time in the Word doc worksheet, as it happened. They typed feedback into a Tutor Feedback section, and dropped in via Zoom on groups that seemed ‘stuck’ (i.e. were not typing). In certain activities, students were told to give each other feedback. All feedback occurred in class time, synchronously.

1. SFS outcomes have been overwhelmingly positive (e.g. in 54044, students expressed satisfaction with the design across all criteria at an average of 80 percent and above, and dissatisfaction with any criterion did not exceed single digits). Average Tutor SFS exceeded 4.5 for all six groups.
2. Synchronous feedback in worksheets saves time and creates a permanent record. I intend to repeat the format next year.
3. Linking Assessments to Tutorials has also proved effective for in class engagement.
4. The format still struggles to engage some international students. They tend to not post in Teams, as they are conscious of language proficiency issues. I’m thinking about testing tools like Mentimeter next year.

Time | Program Activity
2:55 – 3:05pm | Afternoon tea
3:05 – 3:35pm | Staff and Students’ Reflections and Takeaways
3:35 – 3:50pm | 2021 Learning and Teaching Award Winners Announced
3:50 – 4:00pm | Closing remarks
4:00 – 5:30pm | Celebrations and canapés on the Alumni Green for those registered