For example, in Business Statistics students were previously given ‘clean’ data sets so they could focus on performing simple data analysis. One danger of this approach is that it presents an oversimplified picture of what real-world practical tasks involve and is often perceived as too abstract. This abstraction reduces student engagement with assessments because it is more difficult for students to see their tasks in context.

We argue that actively engaging students and meeting learning outcomes require including the complexities underlying an assessment task. The expected outcome is that the complexity of a problem forces active engagement with the problem and, hence, enhances learning experiences. To do so, we have addressed relevance by allowing students to: (a) work with real data that contains ‘issues’ such as missing responses, outliers, and ‘noisy’ data; (b) answer a mixture of ambiguous and direct managerial questions that do not penalise interpretation but reward justification of approach; (c) work within a context that is relevant to them; (d) present results that respond to managerial demands, improving written skills (e.g., efficient writing) and focusing on outcomes for practice rather than simply presenting a result; and (e) research how the assignment context might be approached by someone from their chosen major (e.g., accounting).
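To illustrate the kind of data ‘issues’ referred to in point (a), the following is a minimal sketch of how a student might first inspect a messy survey data set before any analysis. The file and column names (survey.csv, spend, satisfaction) are hypothetical and are not taken from the actual assessment; the example uses Python with pandas, but any general-purpose tool would serve.

    # Minimal illustration of inspecting 'messy' survey data before analysis.
    # The file and column names here are hypothetical.
    import pandas as pd

    df = pd.read_csv("survey.csv")

    # Missing responses: count blank answers for each question
    print(df.isna().sum())

    # Outliers: flag 'spend' values falling outside 1.5 * IQR of the middle 50%
    q1, q3 = df["spend"].quantile([0.25, 0.75])
    iqr = q3 - q1
    outliers = df[(df["spend"] < q1 - 1.5 * iqr) | (df["spend"] > q3 + 1.5 * iqr)]
    print(f"{len(outliers)} potential outliers in 'spend'")

    # 'Noisy' data: a quick summary reveals implausible values
    # (e.g., a satisfaction rating outside the survey's 1-10 scale)
    print(df["satisfaction"].describe())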

The goal of this innovation is to improve student perceptions of subject relevance in the context of their degree studies, permitting them to make more links between different subjects’ learning objectives. It also allows students to understand the day-to-day challenges of industry practice, making them more work-ready and valuable to potential employers. A benefit to educators is that student work is now better able to contribute to academic knowledge: student analysis of more realistic problems can give academics a basis for examining similar issues in their own research, offering synergies previously not achievable with student work.

The idea has evolved over several semesters, drawing on suggestions from students, staff and colleagues. The structure of these ideas has been reviewed against the teaching and learning literature, including adaptations of problem-based learning. Formal and informal measures of perceptions have been used to evaluate the practical implementation of the ideas and the beneficial outcomes for student learning and engagement. Quantitative measures, in the form of the subject-feedback survey, reveal immediate and significant improvements in the area of subject relevance. Qualitative measures, in the form of discussion-board postings by students, reveal that the approach places demands on student learning (e.g., “it is hard”) but offers recognised rewards (e.g., “I love the challenge and debating this with others in my group”). We continue to monitor the teaching practice, but hope to develop better measures in this regard.

Text by Paul F. Burke and Luke Greenacre from UTS Business School.

  • In my opinion, there is a limit to how much can be taught in a classroom environment.
    We’ve been in business for over 20 years and have employed a number of graduates during this time. There is a significant difference between those who have on-the-job experience (e.g., via an internship) and someone who has only studied in a classroom.

    These days, we are reluctant to hire people who are just joining the workforce for the first time because it is such a gamble. We like to see that the person has some work experience (even if it’s just a job at the local Macca’s).
