What is it with qualitative analysis? Why is it that such a beautiful, detailed and useful collection of data can feel so wildly untameable, bafflingly vague and definitely not very ‘scientific’, to those who live in a world of p-values and t-tests?

I was one of two qualitative research practitioners at a recent community of practice meeting with Science academics who were keen to understand more about researching their teaching practice. In Part 2 of this post, Lucy Blakemore shares practical qualitative observation techniques and advice drawn from past qualitative projects on student experience.

For now, read on to explore some key concepts we discussed in relation to qualitative data analysis, including what makes it so challenging, and how we can think about it differently.

What makes qualitative data analysis difficult?

The meeting started with a question for participants: ‘What do you think makes qualitative data analysis difficult?’. The answers, captured below in AnswerGarden, illustrate some common themes we hear about qualitative data analysis.

For some, it feels so much more subjective than quantitative analysis, even though (fallible) humans are involved in the design and analysis of both. Along the same theme, it feels more open to bias, it can’t be put into formulae, and it doesn’t feel as easily comparable as other types of data. Sample sizes are limited, and human data is just so messy.

In qualitative analysis, the problem is not that there are no rules; it is that there are so many sets of rules. How do you choose? What matters to you, in each project or point of inquiry? 

If you want to feel like you’re doing ‘proper’ objective science, then validity, reliability and transparency might be a priority. Looking for maximum insight? The ends (insights) may justify playful means in the analysis process. Likewise, approaches vary if you’re using research for political disruption or activism, or for some other practical relevance and utility. Occam’s razor cuts differently in some of these cases.

The limitations of coding

Many academics come to qualitative research through the pragmatic-looking lens of coding, where transcripts and verbal data sets can be scoured for recurrent themes and wrangled into limited lists of coded commonalities, sometimes reached through a series of discussions and compromises with co-researchers looking at the data together.

There are advantages to this approach, including software (such as NVivo, available to UTS staff) that does some of the heavy lifting for you in the first stages of analysis. There’s also a lot of literature outlining clear processes to follow and suggested techniques for coding.

On the downside, however, that same software can sometimes unearth mounds of meaningless ‘insights’, and the guiding processes can act more as a life-raft to cling to rather than a lighthouse guiding you to insight. The apparent validation of multiple researchers coding and compromising can also lead to lowest-common-denominator thinking, stifling the rich personal insights which can lead to breakthrough moments.

Let’s get serious – and play!

In recent years, qualitative data analysis in academia has begun to embrace alternatives to coding, which take us into much more playful, creative and insightful places. Drawn from Pat Thomson’s blog post, ‘Play with your data’, some examples include:

  • Creating/surfacing random associations between concepts (put three post-its in a triangle and try to link them)
  • ‘Scatter gun’ techniques (writing themes on scraps of paper, throwing them in the air, seeing where they land)
  • Using redactions (blot out words that are not interesting, and look for patterns in the remainder)
  • Juxtaposition (placing images side by side, noticing what doesn’t fit – creating a cabinet of curiosities)

In fact, anything that generates new insights can be used as a technique for qualitative analysis; if it takes you to new and interesting places, then it’s a useful approach.

Re-framing what we search for in qualitative data

In considering different approaches to qualitative data analysis, we return to important questions about the focus of the research, the reason for the project and the intended outcomes.

If you tend to look for safety in technique, why not consider what freedom makes possible in the analysis process? If you know you crave a clear, neat resolution and narrowing things down, try opening up the question, actively seeking out messiness and things that can’t be resolved.

Perhaps you collect themes, codes and categories like a hoarder, reluctant to let a comment go? Force yourself to whittle down to only the most meaningful themes this time. If you’re searching for the security of a procedure, something concrete to do, but want to keep the idea of being open to subjective researcher insight, you could also consider using a synoptic units approach.

When you next get the chance to dive into a qualitative data set, have a think about how you can challenge the less helpful habits, and instead search for something different.

Drop-in Qualitative Data Analysis Clinics – join us!

In 2021, ResHub is trialling a monthly drop-in Qualitative Data Analysis Clinic for anyone working with qualitative data, where we workshop specific issues, and look together at data that attendees bring. If you have some data and have reached a sticking point, or think you’ve made progress and want to test out your interpretations or arguments, then please register and bring your data along (appropriately anonymised, if necessary)! 

Even if you don’t have your own data to bring at this stage, simply drop-in and be part of a community sharing interest in qualitative data. We look forward to seeing you at a clinic soon!

Photo by Kelli McClintock on Unsplash

