Grad Coach

Qualitative Data Analysis Methods 101:

The “big 6” methods + examples.

By: Kerryn Warren (PhD) | Reviewed By: Eunice Rautenbach (D.Tech) | May 2020 (Updated April 2023)

Qualitative data analysis methods. Wow, that’s a mouthful. 

If you’re new to the world of research, qualitative data analysis can look rather intimidating. So much bulky terminology and so many abstract, fluffy concepts. It certainly can be a minefield!

Don’t worry – in this post, we’ll unpack the most popular analysis methods, one at a time, so that you can approach your analysis with confidence and competence – whether that’s for a dissertation, thesis or really any kind of research project.


What (exactly) is qualitative data analysis?

To understand qualitative data analysis, we need to first understand qualitative data – so let’s step back and ask the question, “what exactly is qualitative data?”.

Qualitative data refers to pretty much any data that’s “not numbers”. In other words, it’s not the stuff you measure using a fixed scale or complex equipment, nor do you analyse it using complex statistics or mathematics.

So, if it’s not numbers, what is it?

Words, you guessed? Well… sometimes, yes. Qualitative data can, and often does, take the form of interview transcripts, documents and open-ended survey responses – but it can also involve the interpretation of images and videos. In other words, qualitative isn’t just limited to text-based data.

So, how’s that different from quantitative data, you ask?

Simply put, qualitative research focuses on words, descriptions, concepts or ideas – while quantitative research focuses on numbers and statistics. Qualitative research investigates the “softer side” of things to explore and describe, while quantitative research focuses on the “hard numbers”, to measure differences between variables and the relationships between them. If you’re keen to learn more about the differences between qual and quant, we’ve got a detailed post over here.


So, qualitative analysis is easier than quantitative, right?

Not quite. In many ways, qualitative data can be challenging and time-consuming to analyse and interpret. At the end of your data collection phase (which itself takes a lot of time), you’ll likely have many pages of text-based data or hours upon hours of audio to work through. You might also have subtle nuances of interactions or discussions that have danced around in your mind, or that you scribbled down in messy field notes. All of this needs to work its way into your analysis.

Making sense of all of this is no small task and you shouldn’t underestimate it. Long story short – qualitative analysis can be a lot of work! Of course, quantitative analysis is no piece of cake either, but it’s important to recognise that qualitative analysis still requires a significant investment in terms of time and effort.


In this post, we’ll explore qualitative data analysis by looking at some of the most common analysis methods we encounter. We’re not going to cover every possible qualitative method and we’re not going to go into heavy detail – we’re just going to give you the big picture. That said, we will of course include links to loads of extra resources so that you can learn more about whichever analysis method interests you.

Without further delay, let’s get into it.

The “Big 6” Qualitative Analysis Methods 

There are many different types of qualitative data analysis, all of which serve different purposes and have unique strengths and weaknesses. We’ll start by outlining the analysis methods and then we’ll dive into the details for each.

The 6 most popular methods (or at least the ones we see at Grad Coach) are:

  • Content analysis
  • Narrative analysis
  • Discourse analysis
  • Thematic analysis
  • Grounded theory (GT)
  • Interpretive phenomenological analysis (IPA)

Let’s take a look at each of them…

QDA Method #1: Qualitative Content Analysis

Content analysis is possibly the most common and straightforward QDA method. At the simplest level, content analysis is used to evaluate patterns within a piece of content (for example, words, phrases or images) or across multiple pieces of content or sources of communication – for example, a collection of newspaper articles or political speeches.

With content analysis, you could, for instance, identify the frequency with which an idea is shared or spoken about – like the number of times a Kardashian is mentioned on Twitter. Or you could identify patterns of deeper underlying interpretations – for instance, by identifying phrases or words in tourist pamphlets that highlight India as an ancient country.

Because content analysis can be used in such a wide variety of ways, it’s important to go into your analysis with a very specific question and goal, or you’ll get lost in the fog. With content analysis, you’ll group large amounts of text into codes, summarise these into categories, and possibly even tabulate the data to calculate the frequency of certain concepts or variables. Because of this, content analysis provides a small splash of quantitative thinking within a qualitative method.
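To make that “small splash of quantitative thinking” concrete, here’s a minimal sketch in Python (using made-up text snippets and codes purely for illustration) of how you might tabulate how often each code appears once your content has been coded:

```python
from collections import Counter

# Hypothetical coded segments from tourist pamphlets – each segment has one or more codes
coded_segments = [
    {"text": "An ancient land of temples and traditions", "codes": ["ancient heritage"]},
    {"text": "Explore centuries-old forts and palaces", "codes": ["ancient heritage", "architecture"]},
    {"text": "Modern cities with a vibrant nightlife", "codes": ["modernity"]},
]

# Tabulate how frequently each code occurs across the sample
code_counts = Counter(code for segment in coded_segments for code in segment["codes"])

for code, count in code_counts.most_common():
    print(f"{code}: {count}")
# ancient heritage: 2 / architecture: 1 / modernity: 1
```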

Naturally, while content analysis is widely useful, it’s not without its drawbacks. One of the main issues with content analysis is that it can be very time-consuming, as it requires lots of reading and re-reading of the texts. Also, because of its multidimensional focus on both qualitative and quantitative aspects, it is sometimes accused of losing important nuances in communication.

Content analysis also tends to concentrate on a very specific timeline and doesn’t take into account what happened before or after that timeline. This isn’t necessarily a bad thing though – just something to be aware of. So, keep these factors in mind if you’re considering content analysis. Every analysis method has its limitations, so don’t be put off by these – just be aware of them! If you’re interested in learning more about content analysis, the video below provides a good starting point.

QDA Method #2: Narrative Analysis 

As the name suggests, narrative analysis is all about listening to people telling stories and analysing what that means. Since stories serve a functional purpose of helping us make sense of the world, we can gain insights into the ways that people deal with and make sense of reality by analysing their stories and the ways they’re told.

You could, for example, use narrative analysis to explore whether how something is being said is important. For instance, the narrative of a prisoner trying to justify their crime could provide insight into their view of the world and the justice system. Similarly, analysing the ways entrepreneurs talk about the struggles in their careers or cancer patients telling stories of hope could provide powerful insights into their mindsets and perspectives. Simply put, narrative analysis is about paying attention to the stories that people tell – and more importantly, the way they tell them.

Of course, the narrative approach has its weaknesses, too. Sample sizes are generally quite small due to the time-consuming process of capturing narratives. Because of this, along with the multitude of social and lifestyle factors which can influence a subject, narrative analysis can be quite difficult to reproduce in subsequent research. This means that it’s difficult to test the findings of some of this research.

Similarly, researcher bias can have a strong influence on the results here, so you need to be particularly careful about the potential biases you can bring into your analysis when using this method. Nevertheless, narrative analysis is still a very useful qualitative analysis method – just keep these limitations in mind and be careful not to draw broad conclusions. If you’re keen to learn more about narrative analysis, the video below provides a great introduction to this qualitative analysis method.

QDA Method #3: Discourse Analysis 

Discourse is simply a fancy word for written or spoken language or debate. So, discourse analysis is all about analysing language within its social context. In other words, analysing language – such as a conversation or a speech – within the culture and society in which it takes place. For example, you could analyse how a janitor speaks to a CEO, or how politicians speak about terrorism.

To truly understand these conversations or speeches, the culture and history of those involved in the communication are important factors to consider. For example, a janitor might speak more casually with a CEO in a company that emphasises equality among workers. Similarly, a politician might speak more about terrorism if there was a recent terrorist incident in the country.

So, as you can see, by using discourse analysis, you can identify how culture, history or power dynamics (to name a few) have an effect on the way concepts are spoken about. So, if your research aims and objectives involve understanding culture or power dynamics, discourse analysis can be a powerful method.

Because there are many social influences in terms of how we speak to each other, the potential use of discourse analysis is vast. Of course, this also means it’s important to have a very specific research question (or questions) in mind when analysing your data and looking for patterns and themes, or you might land up going down a winding rabbit hole.

Discourse analysis can also be very time-consuming  as you need to sample the data to the point of saturation – in other words, until no new information and insights emerge. But this is, of course, part of what makes discourse analysis such a powerful technique. So, keep these factors in mind when considering this QDA method. Again, if you’re keen to learn more, the video below presents a good starting point.

QDA Method #4: Thematic Analysis

Thematic analysis looks at patterns of meaning in a data set – for example, a set of interviews or focus group transcripts. But what exactly does that… mean? Well, a thematic analysis takes bodies of data (which are often quite large) and groups them according to similarities – in other words, themes. These themes help us make sense of the content and derive meaning from it.

Let’s take a look at an example.

With thematic analysis, you could analyse 100 online reviews of a popular sushi restaurant to find out what patrons think about the place. By reviewing the data, you would then identify the themes that crop up repeatedly within the data – for example, “fresh ingredients” or “friendly wait staff”.

So, as you can see, thematic analysis can be pretty useful for finding out about people’s experiences, views and opinions. Therefore, if your research aims and objectives involve understanding people’s experiences or views of something, thematic analysis can be a great choice.

Since thematic analysis is a bit of an exploratory process, it’s not unusual for your research questions to develop, or even change, as you progress through the analysis. While this is somewhat natural in exploratory research, it can also be seen as a disadvantage as it means that data needs to be re-reviewed each time a research question is adjusted. In other words, thematic analysis can be quite time-consuming – but for a good reason. So, keep this in mind if you choose to use thematic analysis for your project and budget extra time for unexpected adjustments.

Thematic analysis takes bodies of data and groups them according to similarities (themes), which help us make sense of the content.

QDA Method #5: Grounded theory (GT) 

Grounded theory is a powerful qualitative analysis method where the intention is to create a new theory (or theories) using the data at hand, through a series of “tests” and “revisions”. Strictly speaking, GT is more a research design type than an analysis method, but we’ve included it here as it’s often referred to as a method.

What’s most important with grounded theory is that you go into the analysis with an open mind and let the data speak for itself – rather than dragging existing hypotheses or theories into your analysis. In other words, your analysis must develop from the ground up (hence the name). 

Let’s look at an example of GT in action.

Assume you’re interested in developing a theory about what factors influence students to watch a YouTube video about qualitative analysis. Using grounded theory, you’d start with this general overarching question about the given population (i.e., graduate students). First, you’d approach a small sample – for example, five graduate students in a department at a university. Ideally, this sample would be reasonably representative of the broader population. You’d interview these students to identify what factors lead them to watch the video.

After analysing the interview data, a general pattern could emerge. For example, you might notice that graduate students are more likely to watch a video about qualitative methods if they are just starting on their dissertation journey, or if they have an upcoming test about research methods.

From here, you’ll look for another small sample – for example, five more graduate students in a different department – and see whether this pattern holds true for them. If not, you’ll look for commonalities and adapt your theory accordingly. As this process continues, the theory would develop . As we mentioned earlier, what’s important with grounded theory is that the theory develops from the data – not from some preconceived idea.

So, what are the drawbacks of grounded theory? Well, some argue that there’s a tricky circularity to grounded theory. For it to work, in principle, you should know as little as possible regarding the research question and population, so that you reduce the bias in your interpretation. However, in many circumstances, it’s also thought to be unwise to approach a research question without knowledge of the current literature . In other words, it’s a bit of a “chicken or the egg” situation.

Regardless, grounded theory remains a popular (and powerful) option. Naturally, it’s a very useful method when you’re researching a topic that is completely new or has very little existing research about it, as it allows you to start from scratch and work your way from the ground up.

Grounded theory is used to create a new theory (or theories) by using the data at hand, as opposed to existing theories and frameworks.

QDA Method #6: Interpretive Phenomenological Analysis (IPA)

Interpretive. Phenomenological. Analysis. IPA . Try saying that three times fast…

Let’s just stick with IPA, okay?

IPA is designed to help you understand the personal experiences of a subject (for example, a person or group of people) concerning a major life event, an experience or a situation. This event or experience is the “phenomenon” that makes up the “P” in IPA. Such phenomena may range from relatively common events – such as motherhood, or being involved in a car accident – to those which are extremely rare – for example, someone’s personal experience in a refugee camp. So, IPA is a great choice if your research involves analysing people’s personal experiences of something that happened to them.

It’s important to remember that IPA is subject-centred. In other words, it’s focused on the experiencer. This means that, while you’ll likely use a coding system to identify commonalities, it’s important not to lose the depth of experience or meaning by trying to reduce everything to codes. Also, keep in mind that since your sample size will generally be very small with IPA, you often won’t be able to draw broad conclusions about the generalisability of your findings. But that’s okay as long as it aligns with your research aims and objectives.

Another thing to be aware of with IPA is personal bias. While researcher bias can creep into all forms of research, self-awareness is critically important with IPA, as it can have a major impact on the results. For example, a researcher who was a victim of a crime himself could insert his own feelings of frustration and anger into the way he interprets the experience of someone who was kidnapped. So, if you’re going to undertake IPA, you need to be very self-aware or you could muddy the analysis.

IPA can help you understand the personal experiences of a person or group concerning a major life event, an experience or a situation.

How to choose the right analysis method

In light of all of the qualitative analysis methods we’ve covered so far, you’re probably asking yourself the question, “How do I choose the right one?”

Much like all the other methodological decisions you’ll need to make, selecting the right qualitative analysis method largely depends on your research aims, objectives and questions. In other words, the best tool for the job depends on what you’re trying to build. For example:

  • Perhaps your research aims to analyse the use of words and what they reveal about the intention of the storyteller and the cultural context of the time.
  • Perhaps your research aims to develop an understanding of the unique personal experiences of people who have experienced a certain event, or
  • Perhaps your research aims to develop insight regarding the influence of a certain culture on its members.

As you can probably see, each of these research aims is distinctly different, and therefore different analysis methods would be suitable for each one. For example, narrative analysis would likely be a good option for the first aim, while grounded theory wouldn’t be as relevant.

It’s also important to remember that each method has its own set of strengths, weaknesses and general limitations. No single analysis method is perfect. So, depending on the nature of your research, it may make sense to adopt more than one method (this is called triangulation). Keep in mind though that this will of course be quite time-consuming.

As we’ve seen, all of the qualitative analysis methods we’ve discussed make use of coding and theme-generating techniques, but the intent and approach of each analysis method differ quite substantially. So, it’s very important to come into your research with a clear intention before you decide which analysis method (or methods) to use.

Start by reviewing your research aims, objectives and research questions to assess what exactly you’re trying to find out – then select a qualitative analysis method that fits. Never pick a method just because you like it or have experience using it – your analysis method (or methods) must align with your broader research aims and objectives.

No single analysis method is perfect, so it can often make sense to adopt more than one  method (this is called triangulation).

Let’s recap on QDA methods…

In this post, we looked at six popular qualitative data analysis methods:

  • First, we looked at content analysis , a straightforward method that blends a little bit of quant into a primarily qualitative analysis.
  • Then we looked at narrative analysis , which is about analysing how stories are told.
  • Next up was discourse analysis – which is about analysing conversations and interactions.
  • Then we moved on to thematic analysis – which is about identifying themes and patterns.
  • From there, we dug into grounded theory – which is about starting from scratch with a specific question and using the data alone to build a theory in response to that question.
  • And finally, we looked at IPA – which is about understanding people’s unique experiences of a phenomenon.

Of course, these aren’t the only options when it comes to qualitative data analysis, but they’re a great starting point if you’re dipping your toes into qualitative research for the first time.

If you’re still feeling a bit confused, consider our private coaching service , where we hold your hand through the research process to help you develop your best work.




Qualitative Data Analysis: Step-by-Step Guide (Manual vs. Automatic)

When we conduct qualitative research, need to explain changes in metrics, or want to understand people's opinions, we always turn to qualitative data. Qualitative data is typically generated through:

  • Interview transcripts
  • Surveys with open-ended questions
  • Contact center transcripts
  • Texts and documents
  • Audio and video recordings
  • Observational notes

Compared to quantitative data, which captures structured information, qualitative data is unstructured and has more depth. It can answer our questions, help us formulate hypotheses and build understanding.

It's important to understand the differences between quantitative data & qualitative data. But unfortunately, analyzing qualitative data is difficult. While tools like Excel, Tableau and Power BI crunch and visualize quantitative data with ease, there are a limited number of mainstream tools for analyzing qualitative data. The majority of qualitative data analysis still happens manually.

That said, there are two new trends that are changing this. First, there are advances in natural language processing (NLP) which is focused on understanding human language. Second, there is an explosion of user-friendly software designed for both researchers and businesses. Both help automate the qualitative data analysis process.

In this post, we want to teach you how to conduct a successful qualitative data analysis. There are two primary qualitative data analysis methods: manual and automatic. We’ll guide you through the steps to conduct a manual analysis, look at what’s involved, and show the role technology – specifically software solutions powered by NLP – can play in automating the process.

More businesses are switching to fully-automated analysis of qualitative customer data because it is cheaper, faster, and just as accurate. Primarily, businesses purchase subscriptions to feedback analytics platforms so that they can understand customer pain points and sentiment.


We’ll take you through 5 steps to conduct a successful qualitative data analysis. Within each step, we will highlight the key differences between the manual and automated approaches. Here's an overview of the steps:

The 5 steps to doing qualitative data analysis

  • Gathering and collecting your qualitative data
  • Organizing and connecting your qualitative data
  • Coding your qualitative data
  • Analyzing the qualitative data for insights
  • Reporting on the insights derived from your analysis

What is Qualitative Data Analysis?

Qualitative data analysis is a process of gathering, structuring and interpreting qualitative data to understand what it represents.

Qualitative data is non-numerical and unstructured. Qualitative data generally refers to text, such as open-ended responses to survey questions or user interviews, but also includes audio, photos and video.

Businesses often perform qualitative data analysis on customer feedback. And within this context, qualitative data generally refers to verbatim text data collected from sources such as reviews, complaints, chat messages, support centre interactions, customer interviews, case notes or social media comments.

How is qualitative data analysis different from quantitative data analysis?

Understanding the differences between quantitative & qualitative data is important. When it comes to analyzing data, Qualitative Data Analysis serves a very different role to Quantitative Data Analysis. But what sets them apart?

Qualitative Data Analysis dives into the stories hidden in non-numerical data such as interviews, open-ended survey answers, or notes from observations. It uncovers the ‘whys’ and ‘hows’, giving a deep understanding of people’s experiences and emotions.

Quantitative Data Analysis on the other hand deals with numerical data, using statistics to measure differences, identify preferred options, and pinpoint root causes of issues.  It steps back to address questions like "how many" or "what percentage" to offer broad insights we can apply to larger groups.

In short, Qualitative Data Analysis is like a microscope,  helping us understand specific detail. Quantitative Data Analysis is like the telescope, giving us a broader perspective. Both are important, working together to decode data for different objectives.

Qualitative Data Analysis methods

Once all the data has been captured, there are a variety of analysis techniques available and the choice is determined by your specific research objectives and the kind of data you’ve gathered.  Common qualitative data analysis methods include:

Content Analysis

This is a popular approach to qualitative data analysis. Other qualitative analysis techniques may fit within its broad scope – thematic analysis, for example, can be seen as a form of content analysis. Content analysis is used to identify the patterns that emerge from text by grouping content into words, concepts, and themes. It is also useful for quantifying the relationships between the grouped content. The Columbia School of Public Health has a detailed breakdown of content analysis.

Narrative Analysis

Narrative analysis focuses on the stories people tell and the language they use to make sense of them.  It is particularly useful in qualitative research methods where customer stories are used to get a deep understanding of customers’ perspectives on a specific issue. A narrative analysis might enable us to summarize the outcomes of a focused case study.

Discourse Analysis

Discourse analysis is used to get a thorough understanding of the political, cultural and power dynamics that exist in specific situations.  The focus of discourse analysis here is on the way people express themselves in different social contexts. Discourse analysis is commonly used by brand strategists who hope to understand why a group of people feel the way they do about a brand or product.

Thematic Analysis

Thematic analysis is used to deduce the meaning behind the words people use. This is accomplished by discovering repeating themes in text. These meaningful themes reveal key insights into data and can be quantified, particularly when paired with sentiment analysis . Often, the outcome of thematic analysis is a code frame that captures themes in terms of codes, also called categories. So the process of thematic analysis is also referred to as “coding”. A common use-case for thematic analysis in companies is analysis of customer feedback.

Grounded Theory

Grounded theory is a useful approach when little is known about a subject. Grounded theory starts by formulating a theory around a single data case. This means that the theory is “grounded” in actual data rather than being purely speculative. Additional cases can then be examined to see if they are relevant and can add to the original grounded theory.


Challenges of Qualitative Data Analysis

While Qualitative Data Analysis offers rich insights, it comes with its challenges. Each unique QDA method has its unique hurdles. Let’s take a look at the challenges researchers and analysts might face, depending on the chosen method.

  • Time and Effort (Narrative Analysis): Narrative analysis, which focuses on personal stories, demands patience. Sifting through lengthy narratives to find meaningful insights can be time-consuming and requires dedicated effort.
  • Being Objective (Grounded Theory): Grounded theory, building theories from data, faces the challenges of personal biases. Staying objective while interpreting data is crucial, ensuring conclusions are rooted in the data itself.
  • Complexity (Thematic Analysis): Thematic analysis involves identifying themes within data, a process that can be intricate. Categorizing and understanding themes can be complex, especially when each piece of data varies in context and structure. Thematic Analysis software can simplify this process.
  • Generalizing Findings (Narrative Analysis): Narrative analysis, dealing with individual stories, makes drawing broad conclusions challenging. Extending findings from a single narrative to a broader context requires careful consideration.
  • Managing Data (Thematic Analysis): Thematic analysis involves organizing and managing vast amounts of unstructured data, like interview transcripts. Managing this can be a hefty task, requiring effective data management strategies.
  • Skill Level (Grounded Theory): Grounded theory demands specific skills to build theories from the ground up. Finding or training analysts with these skills poses a challenge, requiring investment in building expertise.

Benefits of qualitative data analysis

Qualitative Data Analysis (QDA) is like a versatile toolkit, offering a tailored approach to understanding your data. The benefits it offers are as diverse as the methods. Let’s explore why choosing the right method matters.

  • Tailored Methods for Specific Needs: QDA isn't one-size-fits-all. Depending on your research objectives and the type of data at hand, different methods offer unique benefits. If you want emotive customer stories, narrative analysis paints a strong picture. When you want to explain a score, thematic analysis reveals insightful patterns.
  • Flexibility with Thematic Analysis: thematic analysis is like a chameleon in the toolkit of QDA. It adapts well to different types of data and research objectives, making it a top choice for any qualitative analysis.
  • Deeper Understanding, Better Products: QDA helps you dive into people's thoughts and feelings. This deep understanding helps you build products and services that truly match what people want, ensuring satisfied customers.
  • Finding the Unexpected: Qualitative data often reveals surprises that we miss in quantitative data. QDA offers new ideas and perspectives, surfacing insights we might otherwise miss.
  • Building Effective Strategies: Insights from QDA are like strategic guides. They help businesses in crafting plans that match people’s desires.
  • Creating Genuine Connections: Understanding people’s experiences lets businesses connect on a real level. This genuine connection helps build trust and loyalty, priceless for any business.

How to do Qualitative Data Analysis: 5 steps

Now we are going to show how you can do your own qualitative data analysis. We will guide you through this process step by step. As mentioned earlier, you will learn how to do qualitative data analysis manually , and also automatically using modern qualitative data and thematic analysis software.

To get the best value from the analysis and research process, it’s important to be super clear about the nature and scope of the question that’s being researched. This will help you select the data collection channels that are most likely to help you answer your question.

Depending on whether you are a business looking to understand customer sentiment, or an academic surveying a school, your approach to qualitative data analysis will be unique.

Once you’re clear, there’s a sequence to follow. And, though there are differences in the manual and automatic approaches, the process steps are mostly the same.

The use case for our step-by-step guide is a company looking to collect customer feedback data and analyze it in order to improve customer experience. By analyzing the customer feedback, the company derives insights about its business and its customers. You can follow these same steps regardless of the nature of your research. Let’s get started.

Step 1: Gather your qualitative data and conduct research

The first step of qualitative research is data collection. Put simply, data collection is gathering all of your data for analysis. A common situation is that qualitative data is spread across various sources.

Classic methods of gathering qualitative data

Most companies use traditional methods for gathering qualitative data: conducting interviews with research participants, running surveys, and running focus groups. This data is typically stored in documents, CRMs, databases and knowledge bases. It’s important to examine which data is available and needs to be included in your research project, based on its scope.

Using your existing qualitative feedback

As it becomes easier for customers to engage across a range of different channels, companies are gathering increasingly large amounts of both solicited and unsolicited qualitative feedback.

Most organizations have now invested in Voice of Customer programs , support ticketing systems, chatbot and support conversations, emails and even customer Slack chats.

These new channels provide companies with new ways of getting feedback, and also allow the collection of unstructured feedback data at scale.

The great thing about this data is that it contains a wealth of valuable insights and that it’s already there! When you have a new question about user behavior or your customers, you don’t need to create a new research study or set up a focus group. You can find most answers in the data you already have.

Typically, this data is stored in third-party solutions or a central database, but there are ways to export it or connect to a feedback analysis solution through integrations or an API.

Utilize untapped qualitative data channels

There are many online qualitative data sources you may not have considered. For example, you can find useful qualitative data in social media channels like Twitter or Facebook. Online forums, review sites, and online communities such as Discourse or Reddit also contain valuable data about your customers, or research questions.

If you are considering performing a qualitative benchmark analysis against competitors - the internet is your best friend. Gathering feedback in competitor reviews on sites like Trustpilot, G2, Capterra, Better Business Bureau or on app stores is a great way to perform a competitor benchmark analysis.

Customer feedback analysis software often has integrations into social media and review sites, or you could use a solution like DataMiner to scrape the reviews.

G2.com reviews of the product Airtable. You could pull reviews from G2 for your analysis.

Step 2: Connect & organize all your qualitative data

Now you have all this qualitative data, but there’s a problem: the data is unstructured. Before feedback can be analyzed and assigned any value, it needs to be organized in a single place. Why is this important? Consistency!

If all data is easily accessible in one place and analyzed in a consistent manner, you will have an easier time summarizing and making decisions based on this data.

The manual approach to organizing your data

The classic method of structuring qualitative data is to plot all the raw data you’ve gathered into a spreadsheet.

Typically, research and support teams would share large Excel sheets and different business units would make sense of the qualitative feedback data on their own. Each team collects and organizes the data in a way that best suits them, which means the feedback tends to be kept in separate silos.

An alternative and a more robust solution is to store feedback in a central database, like Snowflake or Amazon Redshift .

Keep in mind that when you organize your data in this way, you are often preparing it to be imported into another software. If you go the route of a database, you would need to use an API to push the feedback into a third-party software.
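As a rough illustration of the spreadsheet/database route (assuming your feedback has already been exported to CSV files; the file and column names below are purely hypothetical), consolidating scattered sources into one table might look something like this in Python:

```python
import pandas as pd

# Hypothetical exports from different channels, each containing a free-text comment column
surveys = pd.read_csv("survey_responses.csv")    # e.g. columns: date, nps_score, comment
support = pd.read_csv("support_tickets.csv")     # e.g. columns: date, channel, comment
reviews = pd.read_csv("app_store_reviews.csv")   # e.g. columns: date, rating, comment

# Keep a consistent set of columns and record where each row came from
frames = []
for source_name, df in [("survey", surveys), ("support", support), ("review", reviews)]:
    df = df.copy()
    df["source"] = source_name
    frames.append(df[["date", "source", "comment"]])

feedback = pd.concat(frames, ignore_index=True)
feedback.to_csv("all_feedback.csv", index=False)  # one consistent table, ready for coding
```

From here, the combined table can be pushed into whichever coding tool or platform you’re using.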

Computer-assisted qualitative data analysis software (CAQDAS)

Traditionally within the manual analysis approach (but not always), qualitative data is imported into CAQDAS software for coding.

In the early 2000s, CAQDAS software was popularised by developers such as ATLAS.ti, NVivo and MAXQDA and eagerly adopted by researchers to assist with the organizing and coding of data.  

The benefits of using computer-assisted qualitative data analysis software:

  • Assists in the organizing of your data
  • Opens you up to exploring different interpretations of your data analysis
  • Allows you to share your dataset more easily and enables group collaboration (which also allows for secondary analysis)

However, you still need to code the data, uncover the themes and do the analysis yourself. Therefore, it is still a manual approach.

The user interface of CAQDAS software 'NVivo'

Organizing your qualitative data in a feedback repository

Another solution to organizing your qualitative data is to upload it into a feedback repository where it can be unified with your other data, and easily searched and tagged. There are a number of software solutions that act as a central repository for your qualitative research data. Here are a couple of solutions that you could investigate:

  • Dovetail: Dovetail is a research repository with a focus on video and audio transcriptions. You can tag your transcriptions within the platform for theme analysis. You can also upload your other qualitative data such as research reports, survey responses, support conversations, and customer interviews. Dovetail acts as a single, searchable repository. And makes it easier to collaborate with other people around your qualitative research.
  • EnjoyHQ: EnjoyHQ is another research repository with similar functionality to Dovetail. It boasts a more sophisticated search engine, but it has a higher starting subscription cost.

Organizing your qualitative data in a feedback analytics platform

If you have a lot of qualitative customer or employee feedback, from the likes of customer surveys or employee surveys, you will benefit from a feedback analytics platform. A feedback analytics platform is software that automates the process of both sentiment analysis and thematic analysis. Companies use the integrations offered by these platforms to directly tap into their qualitative data sources (review sites, social media, survey responses, etc.). The data collected is then organized and analyzed consistently within the platform.

If you have data prepared in a spreadsheet, it can also be imported into feedback analytics platforms.

Once all this rich data has been organized within the feedback analytics platform, it is ready to be coded and themed, within the same platform. Thematic is a feedback analytics platform that offers one of the largest libraries of integrations with qualitative data sources.

Some of the qualitative data integrations offered by Thematic

Step 3: Coding your qualitative data

Your feedback data is now organized in one place – whether that’s a spreadsheet, CAQDAS software, a feedback repository or a feedback analytics platform. The next step is to code your feedback data so that you can extract meaningful insights from it.

Coding is the process of labelling and organizing your data in such a way that you can then identify themes in the data, and the relationships between these themes.

To simplify the coding process, you will take small samples of your customer feedback data, come up with a set of codes, or categories capturing themes, and label each piece of feedback, systematically, for patterns and meaning. Then you will take a larger sample of data, revising and refining the codes for greater accuracy and consistency as you go.

If you choose to use a feedback analytics platform, much of this process will be automated and accomplished for you.

The terms to describe different categories of meaning (‘theme’, ‘code’, ‘tag’, ‘category’ etc) can be confusing as they are often used interchangeably.  For clarity, this article will use the term ‘code’.

To code means to identify key words or phrases and assign them to a category of meaning. “I really hate the customer service of this computer software company” would be coded as “poor customer service”.
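As a toy illustration of deductive coding (a predefined code frame applied via simple keyword matching – in practice, manual coding relies on human judgement rather than string matching, and the codes and keywords below are hypothetical), a sketch might look like this:

```python
# Hypothetical flat code frame: code -> keywords that suggest it
code_frame = {
    "poor customer service": ["customer service", "support", "rude"],
    "pricing": ["price", "expensive", "cost"],
    "ease of use": ["easy to use", "intuitive", "confusing"],
}

def assign_codes(comment: str) -> list[str]:
    """Return every code whose keywords appear in the comment (case-insensitive)."""
    text = comment.lower()
    return [code for code, keywords in code_frame.items()
            if any(keyword in text for keyword in keywords)]

print(assign_codes("I really hate the customer service of this computer software company"))
# ['poor customer service']
```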

How to manually code your qualitative data

  • Decide whether you will use deductive or inductive coding. Deductive coding is when you create a list of predefined codes, and then assign them to the qualitative data. Inductive coding is the opposite of this: you create codes based on the data itself. Codes arise directly from the data and you label them as you go. You need to weigh up the pros and cons of each coding method and select the most appropriate.
  • Read through the feedback data to get a broad sense of what it reveals. Now it’s time to start assigning your first set of codes to statements and sections of text.
  • Keep repeating step 2, adding new codes and revising the code description as often as necessary.  Once it has all been coded, go through everything again, to be sure there are no inconsistencies and that nothing has been overlooked.
  • Create a code frame to group your codes. The coding frame is the organizational structure of all your codes. There are two commonly used types of coding frames: flat and hierarchical. A hierarchical code frame will make it easier for you to derive insights from your analysis.
  • Based on the number of times a particular code occurs, you can now see the common themes in your feedback data. This is insightful! If ‘bad customer service’ is a common code, it’s time to take action.

We have a detailed guide dedicated to manually coding your qualitative data .

Example of a hierarchical coding frame in qualitative data analysis

Using software to speed up manual coding of qualitative data

An Excel spreadsheet is still a popular method for coding. But various software solutions can help speed up this process. Here are some examples.

  • CAQDAS / NVivo - CAQDAS software has built-in functionality that allows you to code text within their software. You may find the interface the software offers easier for managing codes than a spreadsheet.
  • Dovetail/EnjoyHQ - You can tag transcripts and other textual data within these solutions. As they are also repositories you may find it simpler to keep the coding in one platform.
  • IBM SPSS - SPSS is a statistical analysis software that may make coding easier than in a spreadsheet.
  • Ascribe - Ascribe’s ‘Coder’ is a coding management system. Its user interface will make it easier for you to manage your codes.

Automating the qualitative coding process using thematic analysis software

In solutions which speed up the manual coding process, you still have to come up with valid codes and often apply codes manually to pieces of feedback. But there are also solutions that automate both the discovery and the application of codes.

Advances in machine learning have now made it possible to read, code and structure qualitative data automatically. This type of automated coding is offered by thematic analysis software .

Automation makes it far simpler and faster to code the feedback and group it into themes. By incorporating natural language processing (NLP) into the software, the AI looks across sentences and phrases to identify common themes and meaningful statements. Some automated solutions detect repeating patterns and assign codes to them, while others make you train the AI by providing examples. You could say that the AI learns the meaning of the feedback on its own.
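To give a rough feel for the idea (this is not how Thematic or any particular vendor actually works – just a toy sketch using off-the-shelf Python libraries and made-up comments), you could cluster feedback into rough theme groups like this:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

feedback = [
    "Support took three days to reply to my ticket",
    "The agent on chat was rude and unhelpful",
    "Love the new dashboard, very easy to navigate",
    "The interface is clean and intuitive",
    "Way too expensive for what you get",
    "Pricing doubled this year without warning",
]

# Represent each comment as a TF-IDF vector, then group similar comments together
vectors = TfidfVectorizer(stop_words="english").fit_transform(feedback)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(vectors)

for cluster in range(3):
    print(f"Cluster {cluster}:")
    for comment, label in zip(feedback, labels):
        if label == cluster:
            print("  -", comment)
```

In a real system, each cluster would then be reviewed, named and refined – much like the manual code-revision loop described above.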

Thematic automates the coding of qualitative feedback regardless of source. There’s no need to set up themes or categories in advance. Simply upload your data and wait a few minutes. You can also manually edit the codes to further refine their accuracy.  Experiments conducted indicate that Thematic’s automated coding is just as accurate as manual coding .

Paired with sentiment analysis and advanced text analytics - these automated solutions become powerful for deriving quality business or research insights.

You could also build your own , if you have the resources!

The key benefits of using an automated coding solution

Automated analysis can often be set up fast and there’s the potential to uncover things that would never have been revealed if you had given the software a prescribed list of themes to look for.

Because the model applies a consistent rule to the data, it captures phrases or statements that a human eye might have missed.

Complete and consistent analysis of customer feedback enables more meaningful findings. Leading us into step 4.

Step 4: Analyze your data: Find meaningful insights

Now we are going to analyze our data to find insights. This is where we start to answer our research questions. Keep in mind that step 4 and step 5 (tell the story) have some overlap. This is because creating visualizations is part of both the analysis process and the reporting process.

The task of uncovering insights is to scour through the codes that emerge from the data and draw meaningful correlations from them. It is also about making sure each insight is distinct and has enough data to support it.

Part of the analysis is to establish how much each code relates to different demographics and customer profiles, and identify whether there’s any relationship between these data points.

Manually create sub-codes to improve the quality of insights

If your code frame only has one level, you may find that your codes are too broad to be able to extract meaningful insights. This is where it is valuable to create sub-codes to your primary codes. This process is sometimes referred to as meta coding.

Note: If you take an inductive coding approach, you can create sub-codes as you are reading through your feedback data and coding it.

While time-consuming, this exercise will improve the quality of your analysis. Here is an example of what sub-codes could look like.

Example of sub-codes

You need to carefully read your qualitative data to create quality sub-codes. But as you can see, the depth of analysis is greatly improved. By calculating the frequency of these sub-codes you can get insight into which  customer service problems you can immediately address.
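Sticking with the customer service example, a hierarchical code frame can be represented and tallied quite simply – the codes, sub-codes and counts below are hypothetical:

```python
from collections import Counter

# Hypothetical coded feedback: (primary code, sub-code) assigned to each piece of feedback
coded_feedback = [
    ("customer service", "slow response times"),
    ("customer service", "slow response times"),
    ("customer service", "unhelpful answers"),
    ("pricing", "unexpected charges"),
]

# The frequency of each (code, sub-code) pair shows which specific problems to tackle first
sub_code_counts = Counter(coded_feedback)
for (code, sub_code), count in sub_code_counts.most_common():
    print(f"{code} > {sub_code}: {count}")
# customer service > slow response times: 2 ...
```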

Correlate the frequency of codes to customer segments

Many businesses use customer segmentation. And you may have your own respondent segments that you can apply to your qualitative analysis. Segmentation is the practice of dividing customers or research respondents into subgroups.

Segments can be based on:

  • Demographics
  • And any other data type that you care to segment by

It is particularly useful to see the occurrence of codes within your segments. If one of your customer segments is considered unimportant to your business, but they are the cause of nearly all customer service complaints, it may be in your best interest to focus attention elsewhere. This is a useful insight!
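If each piece of feedback carries both a segment label and a code, a simple cross-tabulation shows where each issue concentrates. Here’s a minimal sketch using pandas, with hypothetical segments and codes:

```python
import pandas as pd

# Hypothetical coded feedback with the segment each customer belongs to
feedback = pd.DataFrame({
    "segment": ["enterprise", "enterprise", "free tier", "free tier", "free tier"],
    "code": ["poor customer service", "pricing", "poor customer service",
             "poor customer service", "ease of use"],
})

# Occurrence of each code within each segment
print(pd.crosstab(feedback["segment"], feedback["code"]))
```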

Manually visualizing coded qualitative data

There are formulas you can use to visualize key insights in your data. The formulas we will suggest are imperative if you are measuring a score alongside your feedback.

If you are collecting a metric alongside your qualitative data, a key visualization is the impact of each code on that metric. Impact answers the question: “What’s the impact of a code on my overall score?”. Using Net Promoter Score (NPS) as an example, first you need to:

  • Calculate the overall NPS (A)
  • Calculate the NPS for the subset of responses that do not contain that code (B)
  • Subtract B from A

Then you can use this simple formula to calculate code impact on NPS.
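As a minimal sketch of that calculation (with hypothetical responses, and recalling that NPS is the percentage of promoters, scores 9–10, minus the percentage of detractors, scores 0–6):

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

# Hypothetical responses: (NPS score, codes assigned to the accompanying comment)
responses = [
    (10, ["ease of use"]),
    (9,  ["ease of use"]),
    (3,  ["poor customer service"]),
    (6,  ["poor customer service", "pricing"]),
    (8,  []),
]

overall = nps([score for score, _ in responses])                 # A: overall NPS
without = nps([score for score, codes in responses
               if "poor customer service" not in codes])         # B: NPS without the code
impact = overall - without                                       # A - B
print(f"Impact of 'poor customer service' on NPS: {impact:.1f} points")
```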

Visualizing qualitative data: Calculating the impact of a code on your score

You can then visualize this data using a bar chart.

You can download our CX toolkit - it includes a template to recreate this.

Trends over time

This analysis can help you answer questions like: “Which codes are linked to decreases or increases in my score over time?”

We need to compare two sequences of numbers: NPS over time and code frequency over time. Using Excel, calculate the correlation between the two sequences, which can be either positive (the more codes the higher the NPS, see picture below), or negative (the more codes the lower the NPS).

Now you need to plot code frequency against the absolute value of code correlation with NPS. Here is the formula:

Analyzing qualitative data: Calculate which codes are linked to increases or decreases in my score

The visualization could look like this:

Visualizing qualitative data trends over time
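As a minimal sketch of the correlation step (with hypothetical monthly numbers – in a real analysis you’d compute this for every code and then plot frequency against the absolute correlation, as described above):

```python
import numpy as np

# Hypothetical monthly series
nps_over_time = [42, 40, 37, 35, 33, 30]        # NPS each month
code_frequency = [12, 15, 21, 24, 29, 35]       # monthly mentions of "slow response times"

# Pearson correlation between the two sequences (equivalent to Excel's CORREL)
correlation = np.corrcoef(nps_over_time, code_frequency)[0, 1]
print(f"Correlation: {correlation:.2f}")         # strongly negative: more mentions, lower NPS

# Chart coordinates for this code: x = |correlation|, y = average code frequency
x = abs(correlation)
y = sum(code_frequency) / len(code_frequency)
```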

These are two examples, but there are more. For a third manual formula, and to learn why word clouds are not an insightful form of analysis, read our visualizations article .

Using a text analytics solution to automate analysis

Automated text analytics solutions enable codes and sub-codes to be pulled out of the data automatically. This makes it far faster and easier to identify what’s driving negative or positive results. And to pick up emerging trends and find all manner of rich insights in the data.

Another benefit of AI-driven text analytics software is its built-in capability for sentiment analysis, which provides the emotive context behind your feedback and other qualitative text data.

Thematic provides text analytics that goes further by allowing users to apply their expertise on business context to edit or augment the AI-generated outputs.

Since the move away from manual research is generally about reducing the human element, adding human input to the technology might sound counter-intuitive. However, this input is mostly there to make sure important business nuances in the feedback aren’t missed during coding. The result is more accurate analysis. This is sometimes referred to as augmented intelligence.

Codes displayed by volume within Thematic. You can 'manage themes' to introduce human input.

Step 5: Report on your data: Tell the story

The last step of analyzing your qualitative data is to report on it, to tell the story. At this point, the codes are fully developed and the focus is on communicating the narrative to the audience.

A coherent outline of the qualitative research, the findings and the insights is vital for stakeholders to discuss and debate before they can devise a meaningful course of action.

Creating graphs and reporting in PowerPoint

Typically, qualitative researchers take the tried and tested approach of distilling their report into a series of charts, tables and other visuals which are woven into a narrative for presentation in PowerPoint.

Using visualization software for reporting

With data transformation and APIs, the analyzed data can be shared with data visualization software, such as Power BI, Tableau, Google Data Studio or Looker. Power BI and Tableau are among the most popular options.

Visualizing your insights inside a feedback analytics platform

Feedback analytics platforms, like Thematic, incorporate visualization tools that intuitively turn key data and insights into graphs. This removes the time-consuming work of constructing charts to visually identify patterns and creates more time to focus on building a compelling narrative that highlights the insights, in bite-size chunks, for executive teams to review.

Using a feedback analytics platform with visualization tools means you don’t have to use a separate product for visualizations. You can export graphs into PowerPoint straight from the platform.

Two examples of qualitative data visualizations within Thematic

Conclusion - Manual or Automated?

There are those who remain deeply invested in the manual approach, whether because it’s familiar, because they’re reluctant to spend money and time learning new software, or because they’ve been burned by the overpromises of AI.

For projects that involve small datasets, manual analysis makes sense, for example if the objective is simply to answer a straightforward question like “Do customers prefer concept X or concept Y?”. If the findings are being extracted from a small set of focus groups and interviews, sometimes it’s easier to just read them.

However, as new generations come into the workplace, technology-driven solutions feel more comfortable and practical, and the merits are undeniable, especially if the objective is to go deeper and understand the ‘why’ behind customers’ preference for X or Y, and even more so if time and money are considerations.

The ability to collect a free flow of qualitative feedback data at the same time as the metric means AI can cost-effectively scan, crunch, score and analyze a ton of feedback from one system in one go. And time-intensive processes like focus groups, or coding, that used to take weeks, can now be completed in a matter of hours or days.

But aside from the ever-present business case to speed things up and keep costs down, there are also powerful research imperatives for automated analysis of qualitative data: namely, accuracy and consistency.

Finding insights hidden in feedback requires consistency, especially in coding, not to mention catching the ‘unknown unknowns’ that can skew research findings and steering clear of cognitive bias.

Some say that without manual data analysis researchers won’t get an accurate “feel” for the insights. However, the larger the dataset, the harder it is to sort through and organize feedback that has been pulled from different places, and the more difficult it is to stay on course, the greater the risk of drawing incorrect or incomplete conclusions.

Though the process steps for qualitative data analysis have remained pretty much unchanged since sociologist Paul Felix Lazarsfeld paved the path a hundred years ago, the impact digital technology has had on the types of qualitative feedback data and the approach to the analysis is profound.

If you want to try an automated feedback analysis solution on your own qualitative data, you can get started with Thematic .



Data Analysis for Qualitative Research: 6 Step Guide

Data analysis for qualitative research is not intuitive. This is because qualitative data stands in opposition to traditional data analysis methodologies: while data analysis is concerned with quantities, qualitative data is by definition unquantified. But there is a straightforward, methodical approach that anyone can use to get reliable results when performing data analysis for qualitative research. The process consists of 6 steps that I’ll break down in this article:

  • Perform interviews (if necessary)
  • Gather all documents and transcribe any non-paper records
  • Decide whether to code the data, analyze word frequencies, or both
  • Decide what interpretive angle you want to take: content analysis, narrative analysis, discourse analysis, framework analysis, and/or grounded theory
  • Compile your data in a spreadsheet using document-saving techniques (Windows and Mac)
  • Identify trends in words, themes, metaphors, natural patterns, and more

To complete these steps, you will need:

  • Microsoft Word
  • Microsoft Excel
  • Internet access

You can get the free Intro to Data Analysis eBook to cover the fundamentals and ensure strong progression in all your data endeavors.

What is qualitative research?

Qualitative research is not the same as quantitative research. In short, qualitative research is the interpretation of non-numeric data. It usually aims at drawing conclusions that explain why a phenomenon occurs, rather than simply establishing that it occurs. Here’s a great quote from a nursing magazine about quantitative vs qualitative research:

“A traditional quantitative study… uses a predetermined (and auditable) set of steps to confirm or refute [a] hypothesis. In contrast, qualitative research often takes the position that an interpretive understanding is only possible by way of uncovering or deconstructing the meanings of a phenomenon. Thus, a distinction between explaining how something operates (explanation) and why it operates in the manner that it does (interpretation) may be [an] effective way to distinguish quantitative from qualitative analytic processes involved in any particular study.” (EBN)

Learn to Interpret Your Qualitative Data

This article explains what data analysis is and how to do it. To learn how to interpret the results, visualize them, and write an insightful report, sign up for our handbook below.


Step 1a: Data collection methods and techniques in qualitative research: interviews and focus groups

Step 1 is collecting the data that you will need for the analysis. If you are not performing any interviews or focus groups to gather data, then you can skip this step. It’s for people who need to go into the field and collect raw information as part of their qualitative analysis.

Since the whole point of an interview, and of qualitative analysis in general, is to understand a research question better, you should start by making sure you have a specific, refined research question. Whether you’re a researcher by trade or a data analyst working on a one-time project, you must know specifically what you want to understand in order to get results.

Good research questions are specific enough to guide action but open enough to leave room for insight and growth. Examples of good research questions include:

  • Good : To what degree does living in a city impact the quality of a person’s life? (open-ended, complex)
  • Bad : Does living in a city impact the quality of a person’s life? (closed, simple)

Once you understand the research question, you need to develop a list of interview questions. These questions should likewise be open-ended and provide liberty of expression to the responder. They should support the research question in an active way without prejudicing the response. Examples of good interview questions include:

  • Good : Tell me what it’s like to live in a city versus in the country. (open, not leading)
  • Bad : Don’t you prefer the city to the country because there are more people? (closed, leading)

Some additional helpful tips include:

  • Begin each interview with a neutral question to get the person relaxed
  • Limit each question to a single idea
  • If you don’t understand, ask for clarity
  • Do not pass any judgements
  • Do not spend more than 15 minutes on an interview, lest the quality of responses drop

Focus groups

The alternative to interviews is focus groups. Focus groups are a great way to get an idea of how people communicate their opinions in a group setting, rather than in a one-on-one setting as in interviews.

In short, focus groups are gatherings of small groups of people from representative backgrounds who receive instruction, or “facilitation,” from a focus group leader. Typically, the leader will ask questions to stimulate conversation, reformulate questions to bring the discussion back to focus, and prevent the discussion from turning sour or giving way to bad faith.

Focus group questions should be open-ended like their interview neighbors, and they should stimulate some degree of disagreement. Disagreement often leads to valuable information about differing opinions, as people tend to say what they mean if contradicted.

However, focus group leaders must be careful not to let disagreements escalate, as anger can make people lie to be hurtful or simply to win an argument. And lies are not helpful in data analysis for qualitative research.

Step 1b: Tools for qualitative data collection

When it comes to data analysis for qualitative research, the tools you use to collect data should align to some degree with the tools you will use to analyze the data.

As mentioned in the intro, you will be focusing on analysis techniques that only require the traditional Microsoft suite programs: Microsoft Excel and Microsoft Word . At the same time, you can source supplementary tools from various websites, like Text Analyzer and WordCounter.

In short, the tools for qualitative data collection that you need are Excel and Word , as well as web-based free tools like Text Analyzer and WordCounter . These online tools are helpful in the quantitative part of your qualitative research.

Step 2: Gather all documents & transcribe non-written docs

Once you have your interviews and/or focus group transcripts, it’s time to decide if you need other documentation. If you do, you’ll need to gather it all into one place first, then develop a strategy for how to transcribe any non-written documents.

When do you need documentation other than interviews and focus groups? Two situations usually call for it. First, if you have little funding, you may not be able to afford to run expensive interviews and focus groups.

Second, social science researchers typically focus on documents since their research questions are less concerned with subject-oriented data, while hard science and business researchers typically focus on interviews and focus groups because they want to know what people think, and they want to know today.

Non-written records

Other factors at play include the type of research, the field, and specific research goal. For those who need documentation and to describe non-written records, there are some steps to follow:

  • Put all hard copy source documents into a sealed binder (I use plastic paper holders with elastic seals).
  • If you are sourcing directly from printed books or journals, you will need to digitize them by scanning them and making them text-readable by the computer. To do so, turn all PDFs into Word documents using online tools such as PDF to Word Converter. This process is never foolproof, and it may be a source of error in the data collection, but it’s part of the process.
  • If you are sourcing online documents, try as often as possible to get computer-readable PDF documents that you can easily copy/paste or convert. Locked PDFs are essentially a lost cause.
  • Transcribe any audio files into written documents. There are free online tools available to help with this, such as 360converter. If you run a test through the system, you’ll see that the output is not 100% accurate. The best way to use this tool is as a first-draft generator; you can then correct and complete it with old-fashioned, direct transcription.

Step 3: Decide on the type of qualitative research

Before step 3 you should have collected your data, transcribed it all into written-word documents, and compiled it in one place. Now comes the interesting part. You need to decide what you want to get out of your research by choosing an analytic angle, or type of qualitative research.

The available types of qualitative research are as follows. Each of them takes a unique angle, and you must choose the one that gets you the information you want from the analysis. In addition, each has a different impact on the data analysis techniques (coding vs word frequency) that we use.

  • Content analysis
  • Narrative analysis
  • Discourse analysis
  • Framework analysis, and/or
  • Grounded theory

From a high level, content, narrative, and discourse analysis are actionable independent tactics, whereas framework analysis and grounded theory are ways of honing and applying the first three.

Content analysis

  • Definition: Content analysis is the identification and labelling of themes of any kind within a text.
  • Focus: Identifying any kind of pattern in written text, transcribed audio, or transcribed video. This could be thematic, word repetition, or idea repetition. Most often, the patterns we find are ideas that make up an argument.
  • Goal: To simplify, standardize, and quickly reference ideas from any given text. Content analysis is a way to pull the main ideas from huge documents for comparison. In this way, it’s more a means to an end.
  • Pros: The huge advantage of doing content analysis is that you can quickly process huge amounts of text using the simple coding and word frequency techniques we will look at below. To use a metaphor, it is to qualitative documents what SparkNotes are to books.
  • Cons: The downside to content analysis is that it’s quite general. If you have a very specific, narrative research question, then tracing “any and all ideas” will not be very helpful to you.

Narrative analysis

  • Definition: Narrative analysis is the reformulation and simplification of interview answers or documentation into small narrative components to identify story-like patterns.
  • Focus: Understanding the text based on its narrative components as opposed to themes or other qualities.
  • Goal: To reference the text from an angle closer to the nature of texts in order to obtain further insights.
  • Pros: Narrative analysis is very useful for getting perspective on a topic in which you’re extremely limited. It can be easy to get tunnel vision when you’re digging for themes and ideas from a reason-centric perspective. Turning to a narrative approach will help you stay grounded. More importantly, it helps reveal different kinds of trends.
  • Cons: Narrative analysis adds another layer of subjectivity to the instinctive nature of qualitative research. Many see it as too dependent on the researcher to hold any critical value.

Discourse analysis

  • Definition: Discourse analysis is the textual analysis of naturally occurring speech. Any oral expression must be transcribed before undergoing legitimate discourse analysis.
  • Focus: Understanding ideas and themes through language communicated orally rather than pre-processed on paper.
  • Goal: To obtain insights from an angle outside the traditional content analysis of text.
  • Pros: Provides a considerable advantage in some areas of study where the goal is to understand how people communicate an idea, versus the idea itself. For example, discourse analysis is important in political campaigning: people rarely vote for the candidate who most closely corresponds to their beliefs, but rather for the person they like the most.
  • Cons: As with narrative analysis, discourse analysis is more subjective in nature than content analysis, which focuses on ideas and patterns. Some do not consider it rigorous enough to be a legitimate subset of qualitative analysis, but these people are few.

Framework analysis

  • Definition: Framework analysis is a kind of qualitative analysis that includes 5 ordered steps: coding, indexing, charting, mapping, and interpreting. In most ways, framework analysis is a synonym for qualitative analysis; the significant difference is the importance it places on the perspective used in the analysis.
  • Focus: Understanding patterns in themes and ideas.
  • Goal: Creating one specific framework for looking at a text.
  • Pros: Framework analysis is helpful when the researcher clearly understands what he/she wants from the project, as it’s a limiting approach. Since each of its steps has defined parameters, framework analysis is very useful for teamwork.
  • Cons: It can lead to tunnel vision.
Grounded theory

  • Definition: The use of content, narrative, and discourse analysis to examine a single case, in the hope that discoveries from that case will lead to a foundational theory used to examine other, similar cases.
  • Focus: A broad approach using multiple techniques in order to establish patterns.
  • Goal: To develop a foundational theory.
  • Pros: When successful, grounded theories can revolutionize entire fields of study.
  • Cons: It’s very difficult to establish grounded theories, and there’s an enormous amount of risk involved.

Step 4: Coding, word frequency, or both

Coding in data analysis for qualitative research is the process of writing 2-5 word codes that summarize at least one paragraph of text (not writing computer code). This allows researchers to keep track of and analyze those codes. On the other hand, word frequency is the process of counting the presence and orientation of words within a text, which makes it the quantitative element in qualitative data analysis.

Video example of coding for data analysis in qualitative research

In short, coding in the context of data analysis for qualitative research follows 2 steps (video below):

  • Reading through the text one time
  • Adding 2-5 word summaries each time a significant theme or idea appears

Let’s look at a brief example of how to code for qualitative research in this video:

Click here for a link to the source text.

Example of word frequency processing

Word frequency is the process of finding a specific word or identifying the most common words through 3 steps:

  • Decide if you want to find 1 word or identify the most common ones
  • Use Word’s “Replace” function to find a word or phrase
  • Use Text Analyzer to find the most common terms

Here’s another look at word frequency processing and how to do it. Let’s look at the same example above, but from a quantitative perspective.

Imagine we are already familiar with melanoma and KITs, and we want to analyze the text based on these keywords. One thing we can do is look for these words using the Replace function in Word:

  • Locate the search bar
  • Click replace
  • Type in the word
  • See the total results

Here’s a brief video example:

Another option is to use an online Text Analyzer. This methodology won’t help us find a specific word, but it will help us discover the top-performing phrases and words. All you need to do is put in a link to a target page or paste in a text. I pasted the abstract from our source text, and what turns up is as expected. Here’s a picture:

text analyzer example
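If you prefer to stay offline, the same counting can be done with a short Python script instead of an online Text Analyzer. The text below is a stand-in for your own transcript or abstract, and the stopword list is deliberately tiny.

```python
from collections import Counter
import re

# Placeholder text: paste your own transcript or abstract here.
text = """Melanoma patients with KIT mutations may respond to targeted
therapy. KIT mutations are rare, but melanoma screening can find them."""

words = re.findall(r"[a-z']+", text.lower())
stopwords = {"the", "a", "an", "to", "with", "may", "are", "but", "can", "them", "and"}
freq = Counter(w for w in words if w not in stopwords)

print(freq.most_common(5))  # e.g. [('melanoma', 2), ('kit', 2), ('mutations', 2), ...]
```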

Step 5: Compile your data in a spreadsheet

After you have some coded data in the Word document, you need to get it into Excel for analysis. This process requires saving the Word doc with an .htm extension, which makes it a website. Once you have the website, it’s as simple as opening that page, scrolling to the bottom, and copying/pasting the comments, or codes, into an Excel document.

You will need to wrangle the data slightly in order to make it readable in Excel. I’ve made a video to explain this process and placed it below.
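As an alternative to the .htm route, if your codes are stored as Word comments in a .docx file, a short script can pull them straight into a CSV that Excel will open. This is a sketch under that assumption; the file names are hypothetical, and it expects the document to actually contain comments.

```python
import csv
import zipfile
import xml.etree.ElementTree as ET

W = "{http://schemas.openxmlformats.org/wordprocessingml/2006/main}"

def extract_comments(docx_path):
    """Return the text of every Word comment (i.e. your codes) in a .docx file."""
    # A .docx is a zip archive; comments live in word/comments.xml
    # (this file only exists if the document has at least one comment).
    with zipfile.ZipFile(docx_path) as z:
        root = ET.fromstring(z.read("word/comments.xml"))
    return ["".join(t.text or "" for t in c.iter(f"{W}t"))
            for c in root.iter(f"{W}comment")]

# Hypothetical file names; writes one code per row, ready to open in Excel.
codes = extract_comments("coded_interviews.docx")
with open("codes.csv", "w", newline="") as f:
    csv.writer(f).writerows([code] for code in codes)
```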

Step 6: Identify trends & analyze!

There are literally thousands of different ways to analyze qualitative data, and in most situations, the best technique depends on the information you want to get out of the research.

Nevertheless, there are a few go-to techniques. The most important of these is counting occurrences. In this short video, we finish the example from above by counting the number of times our codes appear. In this way, it’s very similar to word frequency (discussed above).
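Continuing the spreadsheet sketch from Step 5 (the codes.csv file is the hypothetical output of that step, one code per row), counting and ranking occurrences takes only a few lines:

```python
import csv
from collections import Counter

# Read the codes compiled in Step 5 and rank them by how often they appear.
with open("codes.csv", newline="") as f:
    codes = [row[0] for row in csv.reader(f) if row]

for code, count in Counter(codes).most_common():
    print(f"{code}: {count}")
```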

A few other options include:

  • Ranking each code on a set of relevant criteria and clustering
  • Pure cluster analysis
  • Causal analysis

We cover different types of analysis like this on the website, so be sure to check out other articles on the home page .

How to analyze qualitative data from an interview

To analyze qualitative data from an interview, follow the same 6 steps described above:

  • Perform the interviews
  • Transcribe the interviews onto paper
  • Decide whether to code the data (open, axial, selective coding), analyze word frequencies, or both
  • Decide what interpretive angle you want to take
  • Compile your data in a spreadsheet using document-saving techniques (for Windows and Mac)
  • Identify trends in words, themes, metaphors, natural patterns, and more


The Ultimate Guide to Qualitative Research - Part 2: Handling Qualitative Data


Qualitative data analysis

Analyzing qualitative data is the next step after you have completed your qualitative data collection. The qualitative analysis process aims to identify themes and patterns that emerge across the data.


In simplified terms, qualitative research methods involve non-numerical data collection followed by an explanation based on the attributes of the data. For example, if you are asked to explain in qualitative terms a thermal image displayed in multiple colors, then you would explain the color differences rather than the heat's numerical value. If you have a large amount of data (e.g., of group discussions or observations of real-life situations), the next step is to transcribe and prepare the raw data for subsequent analysis.

Researchers can conduct studies fully based on qualitative methodology, or researchers can preface a quantitative research study with a qualitative study to identify issues that were not originally envisioned but are important to the study. Quantitative researchers may also collect and analyze qualitative data following their quantitative analyses to better understand the meanings behind their statistical results.

Conducting qualitative research can especially help build an understanding of how and why certain outcomes were achieved (in addition to what was achieved). For example, qualitative data analysis is often used for policy and program evaluation research since it can answer certain important questions more efficiently and effectively than quantitative approaches.


Qualitative data analysis can also answer important questions about the relevance, unintended effects, and impact of programs, such as:

  • Were expectations reasonable?
  • Did processes operate as expected?
  • Were key players able to carry out their duties?
  • Were there any unintended effects of the program?

The importance of qualitative data analysis

Qualitative approaches have the advantage of allowing for more diversity in responses and the capacity to adapt to new developments or issues during the research process itself. While qualitative analysis of data can be demanding and time-consuming to conduct, many fields of research utilize qualitative software tools that have been specifically developed to provide more succinct, cost-efficient, and timely results.


Qualitative data analysis is an important part of research and building greater understanding across fields for a number of reasons. First, cases for qualitative data analysis can be selected purposefully according to whether they typify certain characteristics or contextual locations. In other words, qualitative data permits deep immersion into a topic, phenomenon, or area of interest. Rather than seeking generalizability to the population that the sample of participants represents, qualitative research aims to construct an in-depth and nuanced understanding of the research topic.

Secondly, the role or position of the researcher in qualitative analysis of data is given greater critical attention. This is because, in qualitative data analysis, the possibility of the researcher taking a ‘neutral' or transcendent position is seen as more problematic in practical and/or philosophical terms. Hence, qualitative researchers are often exhorted to reflect on their role in the research process and make this clear in the analysis.


Thirdly, while qualitative data analysis can take a wide variety of forms, it largely differs from quantitative research in the focus on language, signs, experiences, and meaning. In addition, qualitative approaches to analysis are often holistic and contextual rather than analyzing the data in a piecemeal fashion or removing the data from its context. Qualitative approaches thus allow researchers to explore inquiries from directions that could not be accessed with only numerical quantitative data.

Establishing research rigor

Systematic and transparent approaches to the analysis of qualitative data are essential for rigor. For example, many qualitative research methods require researchers to carefully code data and discern and document themes in a consistent and credible way.

Perhaps the most traditional division in the way qualitative and quantitative research have been used in the social sciences is for qualitative methods to be used for exploratory purposes (e.g., to generate new theory or propositions) or to explain puzzling quantitative results, while quantitative methods are used to test hypotheses.

After you’ve collected relevant data, what is the best way to look at it? As always, it will depend on your research question. For instance, if you employed an observational research method to learn about a group’s shared practices, an ethnographic approach could be appropriate to explain the various dimensions of culture. If you collected textual data to understand how people talk about something, then a discourse analysis approach might help you generate key insights about language and communication.

The qualitative data coding process involves iterative categorization and recategorization, ensuring the evolution of the analysis to best represent the data. The procedure typically concludes with the interpretation of patterns and trends identified through the coding process.

To start off, let’s look at two broad approaches to data analysis.

Deductive analysis

Deductive analysis is guided by pre-existing theories or ideas. It starts with a theoretical framework, which is then used to code the data. The researcher can thus use this theoretical framework to interpret their data and answer their research question.

The key steps include coding the data based on the predetermined concepts or categories and using the theory to guide the interpretation of patterns among the codings. Deductive analysis is particularly useful when researchers aim to verify or extend an existing theory within a new context.

Inductive analysis

Inductive analysis involves the generation of new theories or ideas based on the data. The process starts without any preconceived theories or codes, and patterns, themes, and categories emerge out of the data.


The researcher codes the data to capture any concepts or patterns that seem interesting or important to the research question. These codes are then compared and linked, leading to the formation of broader categories or themes. The main goal of inductive analysis is to allow the data to 'speak for itself' rather than imposing pre-existing expectations or ideas onto the data.

Deductive and inductive approaches can be seen as sitting on opposite poles, and all research falls somewhere within that spectrum. Most often, qualitative analysis approaches blend both deductive and inductive elements to contribute to the existing conversation around a topic while remaining open to potential unexpected findings. To help you make informed decisions about which qualitative data analysis approach fits with your research objectives, let's look at some of the common approaches for qualitative data analysis.

Content analysis

Content analysis is a research method used to identify patterns and themes within qualitative data. This approach involves systematically coding and categorizing specific aspects of the content in the data to uncover trends and patterns. An often important part of content analysis is quantifying frequencies and patterns of words or characteristics present in the data.

It is a highly flexible technique that can be adapted to various data types, including text, images, and audiovisual content. While content analysis can be exploratory in nature, it is also common to use pre-established theories and follow a more deductive approach to categorizing and quantifying the qualitative data.

Thematic analysis

Thematic analysis is a method used to identify, analyze, and report patterns or themes within the data. This approach moves beyond counting explicit words or phrases and also focuses on identifying implicit concepts and themes within the data.


Researchers conduct detailed coding of the data to ascertain repeated themes or patterns of meaning. Codes can be categorized into themes, and the researcher can analyze how the themes relate to one another. Thematic analysis is flexible in terms of the research framework, allowing for both inductive (data-driven) and deductive (theory-driven) approaches. The outcome is a rich, detailed, and complex account of the data.

Grounded theory

Grounded theory is a systematic qualitative research methodology that is used to inductively generate theory that is 'grounded' in the data itself. Analysis takes place simultaneously with data collection, and researchers iterate between data collection and analysis until a comprehensive theory is developed.

Grounded theory is characterized by simultaneous data collection and analysis, the development of theoretical codes from the data, purposeful sampling of participants, and the constant comparison of data with emerging categories and concepts. The ultimate goal is to create a theoretical explanation that fits the data and answers the research question.

Discourse analysis

Discourse analysis is a qualitative research approach that emphasizes the role of language in social contexts. It involves examining communication and language use beyond the level of the sentence, considering larger units of language such as texts or conversations.


Discourse analysts typically investigate how social meanings and understandings are constructed in different contexts, emphasizing the connection between language and power. It can be applied to texts of all kinds, including interviews, documents, case studies, and social media posts.

Phenomenological research

Phenomenological research focuses on exploring how human beings make sense of an experience and delves into the essence of this experience. It strives to understand people's perceptions, perspectives, and understandings of a particular situation or phenomenon.


It involves in-depth engagement with participants, often through interviews or conversations, to explore their lived experiences. The goal is to derive detailed descriptions of the essence of the experience and to interpret what insights or implications this may bear on our understanding of this phenomenon.


Whatever your data analysis approach, start with ATLAS.ti

Qualitative data analysis done quickly and intuitively with ATLAS.ti. Download a free trial today.

Now that we've summarized the major approaches to data analysis, let's look at the broader process of research and data analysis. Suppose you need to do some research to find answers to any kind of research question, be it an academic inquiry, business problem, or policy decision. In that case, you need to collect some data. There are many methods of collecting data: you can collect primary data yourself by conducting interviews, focus groups, or a survey, for instance. Another option is to use secondary data sources. These are data previously collected for other projects, historical records, reports, statistics – basically everything that exists already and can be relevant to your research.


The data you collect should always be a good fit for your research question. For example, if you are interested in how many people in your target population like your brand compared to others, it is no use to conduct interviews or a few focus groups. The sample will be too small to get a representative picture of the population. If your questions are about "how many…", "what is the spread…" etc., you need to conduct quantitative research. If you are interested in why people like different brands, their motives, and their experiences, then conducting qualitative research can provide you with the answers you are looking for.

Let's describe the important steps involved in conducting research.

Step 1: Planning the research

As the saying goes: "Garbage in, garbage out." Suppose you find out after you have collected data that

  • you talked to the wrong people
  • asked the wrong questions
  • a couple of focus groups sessions would have yielded better results because of the group interaction, or
  • a survey including a few open-ended questions sent to a larger group of people would have been sufficient and required less effort.

Think thoroughly about sampling, the questions you will be asking, and in which form. If you conduct a focus group or an interview, you are the research instrument, and your data collection will only be as good as you are. If you have never done it before, seek some training and practice. If you have other people do it, make sure they have the skills.


Step 2: Preparing the data

When you conduct focus groups or interviews, think about how to transcribe them. Do you want to run them online or offline? If online, check out which tools can serve your needs, both in terms of functionality and cost. For any audio or video recordings, you can consider using automatic transcription software or services. Automatically generated transcripts can save you time and money, but they still need to be checked. If you don't do this yourself, make sure that you instruct the person doing it on how to prepare the data.

  • How should the final transcript be formatted for later analysis?
  • Which names and locations should be anonymized?
  • What kind of speaker IDs to use?

What about survey data? Some survey data programs will immediately provide basic descriptive-level analysis of the responses. ATLAS.ti will support you with the analysis of the open-ended questions. For this, you need to export your data as an Excel file. ATLAS.ti's survey import wizard will guide you through the process.

Other kinds of data such as images, videos, audio recordings, text, and more can be imported to ATLAS.ti. You can organize all your data into groups and write comments on each source of data to maintain a systematic organization and documentation of your data.


Step 3: Exploratory data analysis

You can run a few simple exploratory analyses to get to know your data. For instance, you can create a word list or word cloud of all your text data or compare and contrast the words in different documents. You can also let ATLAS.ti find relevant concepts for you. There are many tools available that can automatically code your text data, so you can also use these codings to explore your data and refine your coding.


For instance, you can get a feeling for the sentiments expressed in the data. Who is more optimistic, pessimistic, or neutral in their responses? ATLAS.ti can auto-code the positive, negative, and neutral sentiments in your data. Naturally, you can also simply browse through your data and highlight relevant segments that catch your attention or attach codes to begin condensing the data.


Step 4: Build a code system

Whether you start with auto-coding or manual coding, after having generated some first codes, you need to bring some order to your code system to develop a cohesive understanding. You can build your code system by sorting codes into groups and creating categories and subcodes. As this process requires reading and re-reading your data, you will become very familiar with your data. A tool like ATLAS.ti qualitative data analysis software will support you in the process and make it easier to review your data, modify codings if necessary, change code labels, and write operational definitions to explain what each code means. A simple sketch of what such a code system might look like follows below.
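To make the shape of a code system concrete outside any particular software, here is a plain-Python sketch of categories, codes, and operational definitions. All of the names are invented, and a dedicated tool would manage this structure for you.

```python
# A hypothetical code system: categories group codes, and each code carries
# a short operational definition so everyone applies it the same way.
code_system = {
    "Customer service": {
        "slow response": "Waiting longer than expected for a reply",
        "rude staff": "Perceived impoliteness in any interaction",
    },
    "Product": {
        "missing feature": "A capability the respondent expected but did not find",
        "confusing UI": "Difficulty locating or understanding a function",
    },
}

for category, codes in code_system.items():
    print(category)
    for code, definition in codes.items():
        print(f"  - {code}: {definition}")
```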


Step 5: Query your coded data and write up the analysis

Once you have coded your data, it is time to take the analysis a step further. When using software for qualitative data analysis, it is easy to compare and contrast subsets in your data, such as groups of participants or sets of themes.


For instance, you can query the various opinions of female vs. male respondents. Is there a difference between consumers from rural or urban areas or among different age groups or educational levels? Which codes occur together throughout the data set? Are there relationships between various concepts, and if so, why?

Step 6: Data visualization

Data visualization brings your data to life. It is a powerful way of seeing patterns and relationships in your data. For instance, diagrams allow you to see how your codes are distributed across documents or specific subpopulations in your data.


Exploring coded data on a canvas, moving around code labels in a virtual space, linking codes and other elements of your data set, and thinking about how they are related and why – all of these will advance your analysis and spur further insights. Visuals are also great for communicating results to others.

Step 7: Data presentation

The final step is to summarize the analysis in a written report. You can now put together the memos you have written about the various topics, select some salient quotes that illustrate your writing, and add visuals such as tables and diagrams. If you follow the steps above, you will already have all the building blocks, and you just have to put them together in a report or presentation.

When preparing a report or a presentation, keep your audience in mind. Does your audience better understand numbers than long sections of detailed interpretations? If so, add more tables, charts, and short supportive data quotes to your report or presentation. If your audience loves a good interpretation, add your full-length memos and walk your audience through your conceptual networks and illustrative data quotes.


Qualitative data analysis begins with ATLAS.ti

For tools that can make the most out of your data, check out ATLAS.ti with a free trial.


6 qualitative data analysis examples to inspire you

Qualitative data analysis is complex, and without seeing examples of successful QDA in action, it can seem like an overwhelming, time-consuming process. 

But the value of QDA—the customer insights and ideas you'll uncover—makes the process worth it, and you might be surprised at how efficient (and even fun!) some QDA methods can be.


When you think about data, you probably think quantitative first: facts, figures, and numbers. You can line them up neatly in a spreadsheet, and suddenly they just make sense. 

You know qualitative data is crucial too , but how do you organize and interpret all those words, emotions, and motivations once you collect them? 

This guide looks at six qualitative data analysis examples from companies that got real results. For each one, we look at the type of analysis used and how it played a role in the company’s success—so you can walk away with exciting new techniques to try.

Find clarity on what customers want

Hotjar's product experience insights help teams collect qualitative data so you can deliver a better customer experience.

Get inspired with 6 qualitative data analysis examples 

All companies can benefit from qualitative data analysis to better understand their customers. The question is: which QDA methods are the most effective? 

Qualitative data analysis isn't a one-size-fits-all process —different teams can benefit from different qualitative data analysis types. For example, you might be looking for ways to analyze product reviews, while another team might be trying to make sense of thousands of survey responses.

Sometimes a glimpse into the successful processes of other companies can help you pick up new tricks of your own. Here are six qualitative data analysis examples to inspire you to improve your own process:

1. Art.com 

Art.com is an ecommerce company selling art prints. Their 100% happiness guarantee—they’ll issue a full refund, no questions asked—shows their commitment to putting customers first. But to be proactive—so you can create a delightful customer experience from the start —it helps to collect and analyze data to see what people really want and need.

Their approach to qualitative data analysis

Art.com used Net Promoter Score® (NPS) surveys to ask customers to rate, and then comment in their own words, whether they'd recommend the company to friends or colleagues. 

Collecting the data was one thing, but analyzing it was another. One person was tasked with combing through spreadsheets of insights, using the program’s 'search' function to manually find key words and phrases.

Art.com wanted a Natural Language Processing (NLP) solution to analyze the data for them, so they turned to a tool called Thematic , which allowed them to automatically find and sort survey responses by customized themes. (Note: this qualitative data analysis type is simply called—you guessed it—thematic analysis.)

One Thematic feature essential to Art.com’s analysis was the ability to see how customers' feelings about the company, their products, and the buying experience impacted the bottom line. In other words, the tool allowed them to chart qualitative data alongside quantitative performance data to make actionable changes.

Thematic’s Impact tool

Thematic’s Impact tool

But analysis doesn’t have to be done in a silo. Remember how Art.com had one person poring over data all alone? Thematic enabled the company to create a plan for sharing the responsibility for data analysis. Now Art.com has Team Consumer Leaders: team members who take ownership of the analysis processes each month.

Qualitative data analysis for the win

The results: Art.com spent less time manually combing through data, and shifted the load from one person to a whole team of analysts through data democratization . Plus, they gained a better understanding of customers’ feelings and reactions from NPS surveys, because they could analyze the impact the results had on business performance. 

If this was your company: automatically classifying feedback into categories or themes makes it easier to base decisions on qualitative data versus just a hunch. Follow Art.com's example of using QDA to make customer-centric product decisions and deliver a better user experience.

Pro tip: use Hotjar Net Promoter Score® (NPS) Surveys to create and customize surveys to give your customers. 

In addition to simple rating scales, Hotjar's NPS surveys let you ask short follow-up questions, to gain additional context in the voice of the customer (VoC). You can put these surveys directly on your website, or email them to your customer list. 

With NPS surveys, you can gather valuable insights about what your customers are really thinking —and analyze the responses to find ways to improve their experience.

2. Matalan

A household name in the UK, Matalan offers savings on family goods at over 200 retail locations. When they migrated to a new website, their big question was: how can we provide the same smooth experience online we’re known for in-store?

To find the answer to that question, Matalan turned to Hotjar (that’s us! 👋). The user experience team at Matalan started by using our Survey tool to check in with customers, to see what they thought of the new site. Then they dug into a couple of other Hotjar tools for added context—for example, they found that pairing the Feedback widget with Session Recordings was the eye-opening combo they needed:

Hotjar really empowers you to be able to see exactly what your users are doing, how they’re feeling and ultimately their reactions to the changes you make. Without Hotjar we would still be making decisions based on gut instinct instead of qualitative user feedback.

But the Matalan team didn’t stop there. They built a custom dashboard in Google Data Studio as a home base for analyzing their results. When they integrated their Feedback results with Google Data Studio, they could conduct qualitative analysis using the same method we mentioned above: thematic analysis. Organizing the information by theme helped the team spot trends that they could use to inform website changes to A/B test.

Matalan’s Hotjar data in a custom Google Data Studio dashboard

The results: after using Hotjar to create hypotheses about customer behavior, Matalan’s success rate in split testing for the website went up by 17%. Then, by adding Google Data Studio into the picture, they could dig even deeper into the analytical process. They also found this was a great way to get more eyes on data within the company—and open the lines of communication across teams.

If this was your company: qualitative data analysis can help create clarity around the real user experience, and can help you make customer-centric design decisions to reduce friction for website visitors.

Pro tip : want to follow Matalan’s lead? 

Hotjar has a step-by-step process for open-ended question analysis . Read our tutorial to learn how to export survey results into Google Sheets—we’ve even included a template to get you started.

3. Yatter

Yatter is an agency that helps businesses generate more pay-per-click leads so they can scale and grow. Gavin Bell, Yatter’s founder, helps optimize his clients’ (and his own) social media ads and landing pages to drive traffic and make sales.

Yatter's approach to qualitative data analysis

Gavin’s style of analysis fits squarely into one of the qualitative data analysis types called diagnostic or root-cause analysis. Essentially, this method investigates why people make decisions by looking for outliers or patterns in data, and can be used for both qualitative and quantitative research.

For their qualitative data analysis, Yatter leans heavily on Hotjar Recordings to understand the user experience on websites—and make improvements accordingly. Gavin’s tip? Always watch five recordings of a customer interacting with a site before making any changes to it.

On one ecommerce website selling car parts that he was working on, Gavin knew that users left during the checkout process, and he wanted to understand why. He watched user after user get confused during checkout and click on the menu icon instead. As a result, Gavin decided to remove the menu button from that page.

A Tweet showing Yatter’s success with Hotjar

On his personal site, watching recordings helped Gavin realize that leads spent a long time coming up with a user name to enter in a form. Seeing this behavior led Gavin to auto-fill the form with users’ emails, saving them several seconds in the process and improving their journey.

Hotjar lets you go granular and really understand the individuals using the page. In other words, it turns data into life.

The results: by watching session recordings, Gavin could spot even the smallest bugs and stumbling blocks and find solutions. For example, Yatter increased conversions for one client by 20% just by removing the menu button from the checkout page. For his own page, Gavin was happy to have saved time for visitors, knowing that satisfied leads and customers are the ones that stick around.

If this was your company: in addition to driving sales, qualitative data analysis provides you with empathetic insights into who customers are, why they do what they do, and what they need to be happy , so you can make the right changes at the right time to create customer delight .

4. WatchShop

An independent retailer based in the UK, WatchShop specializes in selling brand-name and luxury watches directly to the consumer (also known as business-to-consumer, or B2C). The company created its first ecommerce website back in 2007, and continuously makes changes and improvements to the site. WatchShop's goals? To help more leads find the site and optimize their CX.

WatchShop already knew the value of behavioral data—which is why they watched Hotjar Session Recordings. 😉 But they needed help understanding the qualitative insights they were collecting, so they explored a QDA method called sentiment analysis.

Sentiment analysis focuses on emotion in textual data from surveys, reviews, emails, and other sources. Put simply, sentiment analysis helps you understand how customers feel—and why they feel that way. 

WatchShop selected Lumoa , an artificial intelligence-based tool, to help streamline all their text-based data sources. The software then produced an overall customer sentiment score, which functions as a key performance indicator (KPI) that all stakeholders can monitor.

When their customer sentiment score substantially dropped or increased at any point, WatchShop used QDA to understand why. Then, they tasked the appropriate teams to fix the negatives and take advantage of the positives.

Since Lumoa can integrate with other platforms, WatchShop connected it with TrustPilot, a ratings site, to analyze customer reviews. WatchShop also uses Lumoa to analyze competitors’ reviews, to look at how other brands are perceived—and to figure out what they can learn from their peers.

The results: for one of their clients, WatchShop hoped to improve Product Listing Pages. Using sentiment analysis, the company uncovered issues in the customer journey they hadn’t noticed before, and used their learnings to develop ideas for website changes. In the first round of tests, the company’s conversion rate improved by 4%, and after the second round, conversion rates increased by 10%. 

If this was your company: using a QDA tool like Lumoa helps teams centralize the analytics process, so you can quickly interpret large volumes of qualitative data. Sorting this data also helps you prioritize initiatives based on which issues are most important to your customers.

5. Materials Market

Materials Market does just what their name promises: facilitates trade between construction customers and the suppliers that have the materials they need. The UK-based ecommerce company wants their website to run as smoothly as possible for customers—so they turned to Hotjar for help.

Qualitative data analysis doesn’t have to be fancy to be effective. Andrew Haehn, one of the founders of Materials Market and the Operations Director, takes a simple approach.

Over breakfast every morning, Andrew watches 20 minutes of Hotjar Recordings, carefully observing how users interact with the site. While he eats, he analyzes what’s going well and what needs improvement.

Why this approach works: consistency . By watching recordings each day, Andrew becomes familiar with users’ standard behaviors—and more attuned to what might be throwing them off track. 

To be even more effective, Andrew sorts recordings by relevance: Hotjar’s algorithm helps him find the most valuable recordings—those marked 'high' or 'very high'—to help him prioritize his time.

Hotjar’s relevance algorithm surfaces the most useful recordings

One tip from Andrew is to analyze qualitative data alongside quantitative data—from Hotjar’s Heatmaps, for example, which visually depict the most and least popular areas of a web page—to spot areas of confusion and verify user experience issues.

Qualitative data analysis for the win 

The results: Materials Market used Hotjar to collect and analyze qualitative data—and quickly discovered ways to improve the customer experience. Some of the company’s impressive results after watching recordings included: 

A decrease in cart abandonment rate from 25% to 4%

An increase in conversion rate of paying customers from 0.5% to 1.6% (in a single month)

An increase of more than £10,000 in revenue (due to the improved conversion rate)

If this was your company: qualitative data analysis complements quantitative data analysis to help minimize customers' frustrations and maximize profits. Setting time limits and sorting recordings by relevance keeps the analytical process quick and painless.

6. MURAL

MURAL, a company offering digital whiteboard solutions, specializes in creative and collaborative problem solving. So, it’s only natural that they used the same techniques in their approach to qualitative data analysis.

MURAL has been refining its qualitative data analysis skills for years, using different methods along the way. Eventually, as the company grew, it sought out a centralized hub for analyzing customer feedback and other insights.

MURAL, under co-founder and Head of Product Augustin Soler, turned to EnjoyHQ as their platform of choice. EnjoyHQ helped the company collate qualitative data, generate metrics from that data, and conduct thematic analysis. 

As a team that craves data visualization, they export results from EnjoyHQ onto a MURAL whiteboard so they can arrange information to spark discussion and collaboration. Then they use qualitative data analysis as part of their planning process: product teams can home in on a particular feature they plan to update or release down the road, analyze results for that feature, and use the findings to inform their work.

A MURAL canvas displaying data from EnjoyHQ

The results: EnjoyHQ helped MURAL shape their qualitative data analysis process—now they can analyze customer feedback in a more structured way, leading to improved communication and collaboration.

If this was your company: collecting and analyzing qualitative data is vital to optimizing product decisions. Don't be afraid to try new qualitative data analysis methods—or to customize solutions to meet your specific needs.

Pro tip: personalized communication shows customers you care, which can improve brand loyalty and trust. 

For example, when MURAL releases new features, they follow up by sending emails to the people who requested them. Customers then know the company was listening and is taking action to meet their specific needs.

Find ways to make your qualitative data work for you

The qualitative data analysis examples on this page show the clear results that come from focusing on customer insights.  

Qualitative data amplifies the success you're already achieving from crunching numbers in quantitative analysis. By using new types of qualitative data analysis in your team’s processes, you can stop relying on your gut—and instead make data-backed, user-centric product decisions.

FAQs about qualitative data analysis

What are some examples of qualitative data analysis?

Qualitative data analysis examples include taking a closer look at results from surveys, online reviews, website recordings, emails, interviews, and other text sources by using tools and methods like:

Thematic analysis with tools like Thematic.com and EnjoyHQ

Sentiment analysis with tools like Lumoa

Root-cause analysis with tools like Hotjar

What are the types of qualitative data analysis?

There are many qualitative data analysis types to explore. Some types include: 

Root-cause analysis: working backward from a pattern or problem in the data to identify its underlying cause

Thematic analysis: looking for common themes that emerge

Sentiment analysis: exploring what people feel and why

Narrative analysis: examining the stories people tell 

How can you get started with qualitative data analysis?

You don’t have to make big investments of time or money to start qualitative data analysis. All you need to get started is a free Hotjar account to collect product experience insights, and a few minutes a day to watch session recordings and review survey and feedback responses. Look for trends or patterns that stand out, and consider why users behave the way they do. What might they be thinking or feeling?

What are the benefits of qualitative data analysis?

Qualitative data analysis can have many benefits for a company, by helping stakeholders think about their product, website, and customers in new ways. Some specific advantages include:

Building customer empathy

Improving customer acquisition and retention

Boosting engagement 

Recognizing confusion about messaging

Improving website experiences

How do qualitative and quantitative data analysis work together?

Qualitative and quantitative data analysis go hand in hand. Quantitative analysis is a good starting point to find out what is happening in your business, but qualitative analysis helps you figure out why. Using the two together can help you understand how customers’ thoughts, feelings, and behaviors are driving key financial metrics in your business.


Chapter 18. Data Analysis and Coding

Introduction

Piled before you lie hundreds of pages of fieldnotes you have taken, observations you’ve made while volunteering at city hall. You also have transcripts of interviews you have conducted with the mayor and city council members. What do you do with all this data? How can you use it to answer your original research question (e.g., “How do political polarization and party membership affect local politics?”)? Before you can make sense of your data, you will have to organize and simplify it in a way that allows you to access it more deeply and thoroughly. We call this process coding . [1] Coding is the iterative process of assigning meaning to the data you have collected in order to both simplify and identify patterns. This chapter introduces you to the process of qualitative data analysis and the basic concept of coding, while the following chapter (chapter 19) will take you further into the various kinds of codes and how to use them effectively.

To those who have not yet conducted a qualitative study, the sheer amount of collected data will be a surprise. Qualitative data can be absolutely overwhelming—it may mean hundreds if not thousands of pages of interview transcripts, or fieldnotes, or retrieved documents. How do you make sense of it? Students often want very clear guidelines here, and although I try to accommodate them as much as possible, in the end, analyzing qualitative data is a bit more of an art than a science: “The process of bringing order, structure, and interpretation to a mass of collected data is messy, ambiguous, time-consuming, creative, and fascinating. It does not proceed in a linear fashion: it is not neat. At times, the researcher may feel like an eccentric and tormented artist; not to worry, this is normal” ( Marshall and Rossman 2016:214 ).

To complicate matters further, each approach (e.g., Grounded Theory, deep ethnography, phenomenology) has its own language and bag of tricks (techniques) when it comes to analysis. Grounded Theory, for example, uses in vivo coding to generate new theoretical insights that emerge from a rigorous but open approach to data analysis. Ethnographers, in contrast, are more focused on creating a rich description of the practices, behaviors, and beliefs that operate in a particular field. They are less interested in generating theory and more interested in getting the picture right, valuing verisimilitude in the presentation. And then there are some researchers who seek to account for the qualitative data using almost quantitative methods of analysis, perhaps counting and comparing the uses of certain narrative frames in media accounts of a phenomenon. Qualitative content analysis (QCA) often includes elements of counting (see chapter 17). For these researchers, having very clear hypotheses and clearly defined “variables” before beginning analysis is standard practice, whereas the same would be expressly forbidden by those researchers, like grounded theorists, taking a more emergent approach.

All that said, there are some helpful techniques to get you started, and these will be presented in this and the following chapter. As you become more of an expert yourself, you may want to read more deeply about the tradition that speaks to your research. But know that there are many excellent qualitative researchers who use what works for any given study, taking what they can from each tradition. Most of us find this permissible (but watch out for the methodological purists that exist among us).


Qualitative Data Analysis as a Long Process!

Although most of this and the following chapter will focus on coding, it is important to understand that coding is just one (very important) aspect of the long data-analysis process. We can consider seven phases of data analysis, each of which is important for moving your voluminous data into “findings” that can be reported to others. The first phase involves data organization. This might mean creating a special password-protected Dropbox folder for storing your digital files. It might mean acquiring computer-assisted qualitative data-analysis software ( CAQDAS ) and uploading all transcripts, fieldnotes, and digital files to its storage repository for eventual coding and analysis. Finding a helpful way to store your material can take a lot of time, and you need to be smart about this from the very beginning. Losing data because of poor filing systems or mislabeling is something you want to avoid. You will also want to ensure that you have procedures in place to protect the confidentiality of your interviewees and informants. Filing signed consent forms (with names) separately from transcripts and linking them through an ID number or other code that only you have access to (and store safely) are important.

Once you have all of your material safely and conveniently stored, you will need to immerse yourself in the data. The second phase consists of reading and rereading or viewing and reviewing all of your data. As you do this, you can begin to identify themes or patterns in the data, perhaps writing short memos to yourself about what you are seeing. You are not committing to anything in this third phase but rather keeping your eyes and mind open to what you see. In an actual study, you may very well still be “in the field” or collecting interviews as you do this, and what you see might push you toward either concluding your data collection or expanding so that you can follow a particular group or factor that is emerging as important. For example, you may have interviewed twelve international college students about how they are adjusting to life in the US but realized as you read your transcripts that important gender differences may exist and you have only interviewed two women (and ten men). So you go back out and make sure you have enough female respondents to check your impression that gender matters here. The seven phases do not proceed entirely linearly! It is best to think of them as recursive; conceptually, there is a path to follow, but it meanders and flows.

Coding is the activity of the fourth phase . The second part of this chapter and all of chapter 19 will focus on coding in greater detail. For now, know that coding is the primary tool for analyzing qualitative data and that its purpose is to both simplify and highlight the important elements buried in mounds of data. Coding is a rigorous and systematic process of identifying meaning, patterns, and relationships. It is a more formal extension of what you, as a conscious human being, are trained to do every day when confronting new material and experiences. The “trick” or skill is to learn how to take what you do naturally and semiconsciously in your mind and put it down on paper so it can be documented and verified and tested and refined.

At the conclusion of the coding phase, your material will be searchable, intelligible, and ready for deeper analysis. You can begin to offer interpretations based on all the work you have done so far. This fifth phase might require you to write analytic memos, beginning with short (perhaps a paragraph or two) interpretations of various aspects of the data. You might then attempt stitching together both reflective and analytical memos into longer (up to five pages) general interpretations or theories about the relationships, activities, patterns you have noted as salient.

As you do this, you may be rereading the data, or parts of the data, and reviewing your codes. It’s possible you get to this phase and decide you need to go back to the beginning. Maybe your entire research question or focus has shifted based on what you are now thinking is important. Again, the process is recursive , not linear. The sixth phase requires you to check the interpretations you have generated. Are you really seeing this relationship, or are you ignoring something important you forgot to code? As we don’t have statistical tests to check the validity of our findings as quantitative researchers do, we need to incorporate self-checks on our interpretations. Ask yourself what evidence would exist to counter your interpretation and then actively look for that evidence. Later on, if someone asks you how you know you are correct in believing your interpretation, you will be able to explain what you did to verify this. Guard yourself against accusations of “ cherry-picking ,” selecting only the data that supports your preexisting notion or expectation about what you will find. [2]

The seventh and final phase involves writing up the results of the study. Qualitative results can be written in a variety of ways for various audiences (see chapter 20). Due to the particularities of qualitative research, findings do not exist independently of their being written down. This is different for quantitative research or experimental research, where completed analyses can somewhat speak for themselves. A box of collected qualitative data remains a box of collected qualitative data without its written interpretation. Qualitative research is often evaluated on the strength of its presentation. Some traditions of qualitative inquiry, such as deep ethnography, depend on written thick descriptions, without which the research is wholly incomplete, even nonexistent. All of that practice journaling and writing memos (reflective and analytical) help develop writing skills integral to the presentation of the findings.

Remember that these are seven conceptual phases that operate in roughly this order but with a lot of meandering and recursivity throughout the process. This is very different from quantitative data analysis, which is conducted fairly linearly and processually (first you state a falsifiable research question with hypotheses, then you collect your data or acquire your data set, then you analyze the data, etc.). Things are a bit messier when conducting qualitative research. Embrace the chaos and confusion, and sort your way through the maze. Budget a lot of time for this process. Your research question might change in the middle of data collection. Don’t worry about that. The key to being nimble and flexible in qualitative research is to start thinking and continue thinking about your data, even as it is being collected. All seven phases can be started before all the data has been gathered. Data collection does not always precede data analysis. In some ways, “qualitative data collection is qualitative data analysis.… By integrating data collection and data analysis, instead of breaking them up into two distinct steps, we both enrich our insights and stave off anxiety. We all know the anxiety that builds when we put something off—the longer we put it off, the more anxious we get. If we treat data collection as this mass of work we must do before we can get started on the even bigger mass of work that is analysis, we set ourselves up for massive anxiety” ( Rubin 2021:182–183 ; emphasis added).

The Coding Stage

A code is “a word or short phrase that symbolically assigns a summative, salient, essence-capturing, and/or evocative attribute for a portion of language-based or visual data” ( Saldaña 2014:5 ). Codes can be applied to particular sections of or entire transcripts, documents, or even videos. For example, one might code a video taken of a preschooler trying to solve a puzzle as “puzzle,” or one could take the transcript of that video and highlight particular sections or portions as “arranging puzzle pieces” (a descriptive code) or “frustration” (a summative emotion-based code). If the preschooler happily shouts out, “I see it!” you can denote the code “I see it!” (this is an example of an in vivo, participant-created code). As one can see from even this short example, there are many different kinds of codes and many different strategies and techniques for coding, more of which will be discussed in detail in chapter 19. The point to remember is that coding is a rigorous systematic process—to some extent, you are always coding whenever you look at a person or try to make sense of a situation or event, but you rarely do this consciously. Coding is the process of naming what you are seeing and how you are simplifying the data so that you can make sense of it in a way that is consistent with your study and in a way that others can understand and follow and replicate. Another way of saying this is that a code is “a researcher-generated interpretation that symbolizes or translates data” ( Vogt et al. 2014:13 ).
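
As a purely illustrative sketch (not something from the chapter itself), the three codes from the preschooler example could be recorded as structured coded segments in plain Python; the segment wording is invented, but the record structure approximates the kind of link between data and code that CAQDAS programs keep for you.

```python
# Illustrative sketch: the preschooler puzzle example stored as coded segments.
# Each record links a portion of the data to a researcher-assigned code and
# notes what kind of code it is (descriptive, emotion-based, or in vivo).
coded_segments = [
    {"segment": "Child lines the puzzle pieces up along the table edge.",
     "code": "arranging puzzle pieces", "code_type": "descriptive"},
    {"segment": "Child frowns, sighs, and pushes two pieces together repeatedly.",
     "code": "frustration", "code_type": "emotion-based"},
    {"segment": "Child shouts: 'I see it!'",
     "code": "I see it!", "code_type": "in vivo"},
]

for record in coded_segments:
    print(f'{record["code"]:<25} ({record["code_type"]}): {record["segment"]}')
```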

As with qualitative data analysis generally, coding is often done recursively, meaning that you do not merely take one pass through the data to create your codes. Saldaña ( 2014 ) differentiates first-cycle coding from second-cycle coding. The goal of first-cycle coding is to “tag” or identify what emerges as important codes. Note that I said emerges—you don’t always know from the beginning what will be an important aspect of the study or not, so the coding process is really the place for you to begin making the kinds of notes necessary for future analyses. In second-cycle coding, you will want to be much more focused—no longer gathering wholly new codes but synthesizing what you have into metacodes.

You might also conceive of the coding process in four parts (figure 18.1). First, identify a representative or diverse sample set of interview transcripts (or fieldnotes or other documents). This is the group you are going to use to get a sense of what might be emerging. In my own study of career obstacles to success among first-generation and working-class persons in sociology, I might select one interview from each career stage: a graduate student, a junior faculty member, a senior faculty member.

Figure 18.1. The coding process in four parts

Second, code everything (“open coding”). See what emerges, and don’t limit yourself in any way. You will end up with a ton of codes, many more than you will eventually keep, but this is an excellent way to not foreclose an interesting finding too early in the analysis. Note the importance of starting with a sample of your collected data, because otherwise, open coding all your data is, frankly, impossible and counterproductive. You will just get stuck in the weeds.

Third, pare down your coding list. Where you may have begun with fifty (or more!) codes, you probably want no more than twenty remaining. Go back through the weeds and pull out everything that does not have the potential to bloom into a nicely shaped garden. Note that you should do this before tackling all of your data . Sometimes, however, you might need to rethink the sample you chose. Let’s say that the graduate student interview brought up some interesting gender issues that were pertinent to female-identifying sociologists, but both the junior and the senior faculty members identified as male. In that case, I might read through and open code at least one other interview transcript, perhaps a female-identifying senior faculty member, before paring down my list of codes.

This is also the time to create a codebook if you are using one, a master guide to the codes you are using, including examples (see Sample Codebooks 1 and 2 ). A codebook is simply a document that lists and describes the codes you are using. It is easy to forget what you meant the first time you penciled a coded notation next to a passage, so the codebook allows you to be clear and consistent with the use of your codes. There is not one correct way to create a codebook, but generally speaking, the codebook should include (1) the code (either name or identification number or both), (2) a description of what the code signifies and when and where it should be applied, and (3) an example of the code to help clarify (2). Listing all the codes down somewhere also allows you to organize and reorganize them, which can be part of the analytical process. It is possible that your twenty remaining codes can be neatly organized into five to seven master “themes.” Codebooks can and should develop as you recursively read through and code your collected material. [3]
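
As a rough illustration of that three-part structure (a sketch, not a prescribed format), a minimal codebook can be kept as structured data in which every code carries a name, a description of when to apply it, and an example. The snippet below borrows the "cultural capital" and "debt" codes mentioned in this chapter, adds an invented "mentorship" code to the applied list, and runs a simple check that every code actually applied appears in the codebook.

```python
# A minimal codebook kept as structured data: each code has a name,
# a description of when to apply it, and a clarifying example.
# Illustrative sketch only; the "mentorship" code below is hypothetical.
codebook = {
    "cultural capital": {
        "description": "Any mention of not fitting in with peers "
                       "(accent, dress, schooling background).",
        "example": "'I never knew which fork to use at department dinners.'",
    },
    "debt": {
        "description": "Any discussion of debt, whether personal or abstract.",
        "example": "'I'm still paying off my undergraduate loans.'",
    },
}

# Codes applied during a coding pass (hypothetical).
applied_codes = ["cultural capital", "debt", "mentorship"]

# Consistency check: flag applied codes that are missing from the codebook,
# so the codebook and the coding stay in sync.
missing = [code for code in applied_codes if code not in codebook]
if missing:
    print("Applied but not defined in the codebook:", missing)
```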

Fourth, using the pared-down list of codes (or codebook), read through and code all the data. I know many qualitative researchers who work without a codebook, but it is still a good practice, especially for beginners. At the very least, read through your list of codes before you begin this “ closed coding ” step so that you can minimize the chance of missing a passage or section that needs to be coded. The final step is…to do it all again. Or, at least, do closed coding (step four) again. All of this takes a great deal of time, and you should plan accordingly.

Researcher Note

People often say that qualitative research takes a lot of time. Some say this because qualitative researchers often collect their own data. This part can be time consuming, but to me, it’s the analytical process that takes the most time. I usually read every transcript twice before starting to code, then it usually takes me six rounds of coding until I’m satisfied I’ve thoroughly coded everything. Even after the coding, it usually takes me a year to figure out how to put the analysis together into a coherent argument and to figure out what language to use. Just deciding what name to use for a particular group or idea can take months. Understanding this going in can be helpful so that you know to be patient with yourself.

—Jessi Streib, author of The Power of the Past and Privilege Lost 

Note that there is no magic in any of this, nor is there any single “right” way to code or any “correct” codes. What you see in the data will be prompted by your position as a researcher and your scholarly interests. Where the above codes on a preschooler solving a puzzle emerged from my own interest in puzzle solving, another researcher might focus on something wholly different. A scholar of linguistics, for example, may focus instead on the verbalizations made by the child during the discovery process, perhaps even noting particular vocalizations (incidence of grrrs and gritting of the teeth, for example). Your recording of the codes you used is the important part, as it allows other researchers to assess the reliability and validity of your analyses based on those codes. Chapter 19 will provide more details about the kinds of codes you might develop.

Saldaña ( 2014 ) lists seven “necessary personal attributes” for successful coding. To paraphrase, they are the following:

  • Having (or practicing) good organizational skills
  • Perseverance
  • The ability and willingness to deal with ambiguity
  • Flexibility
  • Creativity, broadly understood, which includes “the ability to think visually, to think symbolically, to think in metaphors, and to think of as many ways as possible to approach a problem” (20)
  • Commitment to being rigorously ethical
  • Having an extensive vocabulary [4]

Writing Analytic Memos during/after Coding

Coding the data you have collected is only one aspect of analyzing it. Too many beginners have coded their data and then wondered what to do next. Coding is meant to help organize your data so that you can see it more clearly, but it is not itself an analysis. Thinking about the data, reviewing the coded data, and bringing in the previous literature (here is where you use your literature review and theory) to help make sense of what you have collected are all important aspects of data analysis. Analytic memos are notes you write to yourself about the data. They can be short (a single page or even a paragraph) or long (several pages). These memos can themselves be the subject of subsequent analytic memoing as part of the recursive process that is qualitative data analysis.

Short analytic memos are written about impressions you have about the data, what is emerging, and what might be of interest later on. You can write a short memo about a particular code, for example, and why this code seems important and where it might connect to previous literature. For example, I might write a paragraph about a “cultural capital” code that I use whenever a working-class sociologist says anything about “not fitting in” with their peers (e.g., not having the right accent or hairstyle or private school background). I could then write a little bit about Bourdieu, who originated the notion of cultural capital, and try to make some connections between his definition and how I am applying it here. I can also use the memo to raise questions or doubts I have about what I am seeing (e.g., Maybe the type of school belongs somewhere else? Is this really the right code?). Later on, I can incorporate some of this writing into the theory section of my final paper or article. Here are some types of things that might form the basis of a short memo: something you want to remember, something you noticed that was new or different, a reaction you had, a suspicion or hunch that you are developing, a pattern you are noticing, any inferences you are starting to draw. Rubin ( 2021 ) advises, “Always include some quotation or excerpt from your dataset…that set you off on this idea. It’s happened to me so many times—I’ll have a really strong reaction to a piece of data, write down some insight without the original quotation or context, and then [later] have no idea what I was talking about and have no way of recreating my insight because I can’t remember what piece of data made me think this way” ( 203 ).

All CAQDAS programs include spaces for writing, generating, and storing memos. You can link a memo to a particular transcript, for example. But you can just as easily keep a notebook at hand in which you write notes to yourself, if you prefer the more tactile approach. Drawing pictures that illustrate themes and patterns you are beginning to see also works. The point is to write early and write often, as these memos are the building blocks of your eventual final product (chapter 20).

In the next chapter (chapter 19), we will go a little deeper into codes and how to use them to identify patterns and themes in your data. This chapter has given you an idea of the process of data analysis, but there is much yet to learn about the elements of that process!

Qualitative Data-Analysis Samples

The following three passages are examples of how qualitative researchers describe their data-analysis practices. The first, by Harvey, is a useful example of how data analysis can shift the original research questions. The second example, by Thai, shows multiple stages of coding and how these stages build upward to conceptual themes and theorization. The third example, by Lamont, shows a masterful use of a variety of techniques to generate theory.

Example 1: “Look Someone in the Eye” by Peter Francis Harvey ( 2022 )

I entered the field intending to study gender socialization. However, through the iterative process of writing fieldnotes, rereading them, conducting further research, and writing extensive analytic memos, my focus shifted. Abductive analysis encourages the search for unexpected findings in light of existing literature. In my early data collection, fieldnotes, and memoing, classed comportment was unmistakably prominent in both schools. I was surprised by how pervasive this bodily socialization proved to be and further surprised by the discrepancies between the two schools.…I returned to the literature to compare my empirical findings.…To further clarify patterns within my data and to aid the search for disconfirming evidence, I constructed data matrices (Miles, Huberman, and Saldaña 2013). While rereading my fieldnotes, I used ATLAS.ti to code and recode key sections (Miles et al. 2013), punctuating this process with additional analytic memos. ( 2022:1420 )

Example 2: “Policing and Symbolic Control” by Mai Thai ( 2022 )

Conventional to qualitative research, my analyses iterated between theory development and testing. Analytical memos were written throughout the data collection, and my analyses using MAXQDA software helped me develop, confirm, and challenge specific themes.…My early coding scheme which included descriptive codes (e.g., uniform inspection, college trips) and verbatim codes of the common terms used by field site participants (e.g., “never quit,” “ghetto”) led me to conceptualize valorization. Later analyses developed into thematic codes (e.g., good citizens, criminality) and process codes (e.g., valorization, criminalization), which helped refine my arguments. ( 2022:1191–1192 )

Example 3: The Dignity of Working Men by Michèle Lamont ( 2000 )

To analyze the interviews, I summarized them in a 13-page document including socio-demographic information as well as information on the boundary work of the interviewees. To facilitate comparisons, I noted some of the respondents’ answers on grids and summarized these on matrix displays using techniques suggested by Miles and Huberman for standardizing and processing qualitative data. Interviews were also analyzed one by one, with a focus on the criteria that each respondent mobilized for the evaluation of status. Moreover, I located each interviewee on several five-point scales pertaining to the most significant dimensions they used to evaluate status. I also compared individual interviewees with respondents who were similar to and different from them, both within and across samples. Finally, I classified all the transcripts thematically to perform a systematic analysis of all the important themes that appear in the interviews, approaching the latter as data against which theoretical questions can be explored. ( 2000:256–257 )

Sample Codebook 1

This is an abridged version of the codebook used to analyze qualitative responses to a question about how class affects careers in sociology. Note the use of numbers to organize the flow, supplemented by highlighting techniques (e.g., bolding) and subcoding numbers.

01. CAPS: Any reference to “capitals” in the response, even if the specific words are not used

01.1: cultural capital
01.2: social capital
01.3: economic capital

(can be mixed: “01.12” = both cultural and social capital; “01.23” = both social and economic)

01. CAPS (bold): a reference to “capitals” in which the specific words are used [thus, a bolded 01.23 means that both social capital and economic capital were mentioned specifically]

02. DEBT: discussion of debt

02.1: mentions personal issues around debt
02.2: discusses debt but in the abstract only (e.g., “people with debt have to worry”)

03. FirstP: how the response is positioned

03.1: neutral or abstract response
03.2: discusses self (“I”)
03.3: discusses others (“they”)

Sample Coded Passage:

* Question: What other codes jump out to you here? Shouldn’t there be a code for feelings of loneliness or alienation? What about an emotions code?

Sample Codebook 2

This is an example that uses "word" categories only, with descriptions and examples for each code.

Further Readings

Elliott, Victoria. 2018. “Thinking about the Coding Process in Qualitative Analysis.” Qualitative Report 23(11):2850–2861. Addresses common questions that those new to coding ask, including the use of “counting” and how to shore up reliability.

Friese, Susanne. 2019. Qualitative Data Analysis with ATLAS.ti. 3rd ed. A good guide to ATLAS.ti, arguably the most used CAQDAS program. Organized around a series of “skills training” to get you up to speed.

Jackson, Kristi, and Pat Bazeley. 2019. Qualitative Data Analysis with NVIVO . 3rd ed. Thousand Oaks, CA: SAGE. If you want to use the CAQDAS program NVivo, this is a good affordable guide to doing so. Includes copious examples, figures, and graphic displays.

LeCompte, Margaret D. 2000. “Analyzing Qualitative Data.” Theory into Practice 39(3):146–154. A very practical and readable guide to the entire coding process, with particular applicability to educational program evaluation/policy analysis.

Miles, Matthew B., and A. Michael Huberman. 1994. Qualitative Data Analysis: An Expanded Sourcebook . 2nd ed. Thousand Oaks, CA: SAGE. A classic reference on coding. May now be superseded by Miles, Huberman, and Saldaña (2019).

Miles, Matthew B., A. Michael Huberman, and Johnny Saldaña. 2019. Qualitative Data Analysis: A Methods Sourcebook. 4th ed. Thousand Oaks, CA: SAGE. A practical methods sourcebook for all qualitative researchers at all levels using visual displays and examples. Highly recommended.

Saldaña, Johnny. 2014. The Coding Manual for Qualitative Researchers . 2nd ed. Thousand Oaks, CA: SAGE. The most complete and comprehensive compendium of coding techniques out there. Essential reference.

Silver, Christina. 2014. Using Software in Qualitative Research: A Step-by-Step Guide. 2nd ed. Thousand Oaks, CA: SAGE. If you are unsure which CAQDAS program you are interested in using or want to compare the features and usages of each, this guidebook is quite helpful.

Vogt, W. Paul, Elaine R. Vogt, Diane C. Gardner, and Lynne M. Haeffele. 2014. Selecting the Right Analyses for Your Data: Quantitative, Qualitative, and Mixed Methods. New York: The Guilford Press. User-friendly reference guide to all forms of analysis; may be particularly helpful for those engaged in mixed-methods research.

  • When you have collected content (historical, media, archival) that interests you because of its communicative aspect, content analysis (chapter 17) is appropriate. Whereas content analysis is both a research method and a tool of analysis, coding is a tool of analysis that can be used for all kinds of data to address any number of questions. Content analysis itself includes coding. ↵
  • Scientific research, whether quantitative or qualitative, demands we keep an open mind as we conduct our research, that we are “neutral” regarding what is actually there to find. Students who are trained in non-research-based disciplines such as the arts or philosophy or who are (admirably) focused on pursuing social justice can too easily fall into the trap of thinking their job is to “demonstrate” something through the data. That is not the job of a researcher. The job of a researcher is to present (and interpret) findings—things “out there” (even if inside other people’s hearts and minds). One helpful suggestion: when formulating your research question, if you already know the answer (or think you do), scrap that research. Ask a question to which you do not yet know the answer. ↵
  • Codebooks are particularly useful for collaborative research so that codes are applied and interpreted similarly. If you are working with a team of researchers, you will want to take extra care that your codebooks remain in synch and that any refinements or developments are shared with fellow coders. You will also want to conduct an “intercoder reliability” check, testing whether the codes you have developed are clearly identifiable so that multiple coders are using them similarly. Messy, unclear codes that can be interpreted differently by different coders will make it much more difficult to identify patterns across the data. ↵
  • Note that this is important for creating/denoting new codes. The vocabulary does not need to be in English or any particular language. You can use whatever words or phrases capture what it is you are seeing in the data. ↵

Process coding: A first-cycle coding process in which gerunds are used to identify conceptual actions, often for the purpose of tracing change and development over time. Widely used in the Grounded Theory approach.

In vivo coding: A first-cycle coding process in which terms or phrases used by the participants become the code applied to a particular passage. It is also known as “verbatim coding,” “indigenous coding,” “natural coding,” “emic coding,” and “inductive coding,” depending on the tradition of inquiry of the researcher. It is common in Grounded Theory approaches and has even given its name to one of the primary CAQDAS programs (“NVivo”).

CAQDAS: Computer-assisted qualitative data-analysis software. These are software packages that can serve as a repository for qualitative data and that enable coding, memoing, and other tools of data analysis. See chapter 17 for particular recommendations.

Cherry-picking: The purposeful selection of some data to prove a preexisting expectation or desired point of the researcher where other data exists that would contradict the interpretation offered. Note that it is not cherry-picking to select a quote that typifies the main finding of a study, although it would be cherry-picking to select a quote that is atypical of a body of interviews and then present it as if it is typical.

Open coding: A preliminary stage of coding in which the researcher notes particular aspects of interest in the data set and begins creating codes. Later stages of coding refine these preliminary codes. Note: in Grounded Theory, open coding has a more specific meaning and is often called initial coding: data are broken down into substantive codes in a line-by-line manner, and incidents are compared with one another for similarities and differences until the core category is found. See also closed coding.

Codebook: A set of codes, definitions, and examples used as a guide to help analyze interview data. Codebooks are particularly helpful and necessary when research analysis is shared among members of a research team, as codebooks allow for standardization of shared meanings and code attributions.

Closed coding: The final stages of coding, after the refinement of codes has created a complete list or codebook, in which all the data is coded using this refined list or codebook. Compare to open coding.

Emotion coding: A first-cycle coding process in which emotions and emotionally salient passages are tagged.

Introduction to Qualitative Research Methods Copyright © 2023 by Allison Hurst is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License , except where otherwise noted.


How to use and assess qualitative research methods

Loraine Busetto

1 Department of Neurology, Heidelberg University Hospital, Im Neuenheimer Feld 400, 69120 Heidelberg, Germany

Wolfgang Wick

2 Clinical Cooperation Unit Neuro-Oncology, German Cancer Research Center, Heidelberg, Germany

Christoph Gumbinger

Associated data.

Not applicable.

This paper aims to provide an overview of the use and assessment of qualitative research methods in the health sciences. Qualitative research can be defined as the study of the nature of phenomena and is especially appropriate for answering questions of why something is (not) observed, assessing complex multi-component interventions, and focussing on intervention improvement. The most common methods of data collection are document study, (non-) participant observations, semi-structured interviews and focus groups. For data analysis, field-notes and audio-recordings are transcribed into protocols and transcripts, and coded using qualitative data management software. Criteria such as checklists, reflexivity, sampling strategies, piloting, co-coding, member-checking and stakeholder involvement can be used to enhance and assess the quality of the research conducted. Using qualitative in addition to quantitative designs will equip us with better tools to address a greater range of research problems, and to fill in blind spots in current neurological research and practice.

The aim of this paper is to provide an overview of qualitative research methods, including hands-on information on how they can be used, reported and assessed. This article is intended for beginning qualitative researchers in the health sciences as well as experienced quantitative researchers who wish to broaden their understanding of qualitative research.

What is qualitative research?

Qualitative research is defined as “the study of the nature of phenomena”, including “their quality, different manifestations, the context in which they appear or the perspectives from which they can be perceived” , but excluding “their range, frequency and place in an objectively determined chain of cause and effect” [ 1 ]. This formal definition can be complemented with a more pragmatic rule of thumb: qualitative research generally includes data in form of words rather than numbers [ 2 ].

Why conduct qualitative research?

Because some research questions cannot be answered using (only) quantitative methods. For example, one Australian study addressed the issue of why patients from Aboriginal communities often present late or not at all to specialist services offered by tertiary care hospitals. Using qualitative interviews with patients and staff, it found one of the most significant access barriers to be transportation problems, including some towns and communities simply not having a bus service to the hospital [ 3 ]. A quantitative study could have measured the number of patients over time or even looked at possible explanatory factors – but only those previously known or suspected to be of relevance. To discover reasons for observed patterns, especially the invisible or surprising ones, qualitative designs are needed.

While qualitative research is common in other fields, it is still relatively underrepresented in health services research. The latter field is more traditionally rooted in the evidence-based-medicine paradigm, as seen in " research that involves testing the effectiveness of various strategies to achieve changes in clinical practice, preferably applying randomised controlled trial study designs (...) " [ 4 ]. This focus on quantitative research and specifically randomised controlled trials (RCT) is visible in the idea of a hierarchy of research evidence which assumes that some research designs are objectively better than others, and that choosing a "lesser" design is only acceptable when the better ones are not practically or ethically feasible [ 5 , 6 ]. Others, however, argue that an objective hierarchy does not exist, and that, instead, the research design and methods should be chosen to fit the specific research question at hand – "questions before methods" [ 2 , 7 – 9 ]. This means that even when an RCT is possible, some research problems require a different design that is better suited to addressing them. Arguing in JAMA, Berwick uses the example of rapid response teams in hospitals, which he describes as " a complex, multicomponent intervention – essentially a process of social change" susceptible to a range of different context factors including leadership or organisation history. According to him, "[in] such complex terrain, the RCT is an impoverished way to learn. Critics who use it as a truth standard in this context are incorrect" [ 8 ] . Instead of limiting oneself to RCTs, Berwick recommends embracing a wider range of methods , including qualitative ones, which for "these specific applications, (...) are not compromises in learning how to improve; they are superior" [ 8 ].

Research problems that can be approached particularly well using qualitative methods include assessing complex multi-component interventions or systems (of change), addressing questions beyond “what works”, towards “what works for whom when, how and why”, and focussing on intervention improvement rather than accreditation [ 7 , 9 – 12 ]. Using qualitative methods can also help shed light on the “softer” side of medical treatment. For example, while quantitative trials can measure the costs and benefits of neuro-oncological treatment in terms of survival rates or adverse effects, qualitative research can help provide a better understanding of patient or caregiver stress, visibility of illness or out-of-pocket expenses.

How to conduct qualitative research?

Given that qualitative research is characterised by flexibility, openness and responsivity to context, the steps of data collection and analysis are not as separate and consecutive as they tend to be in quantitative research [ 13 , 14 ]. As Fossey puts it : “sampling, data collection, analysis and interpretation are related to each other in a cyclical (iterative) manner, rather than following one after another in a stepwise approach” [ 15 ]. The researcher can make educated decisions with regard to the choice of method, how they are implemented, and to which and how many units they are applied [ 13 ]. As shown in Fig.  1 , this can involve several back-and-forth steps between data collection and analysis where new insights and experiences can lead to adaption and expansion of the original plan. Some insights may also necessitate a revision of the research question and/or the research design as a whole. The process ends when saturation is achieved, i.e. when no relevant new information can be found (see also below: sampling and saturation). For reasons of transparency, it is essential for all decisions as well as the underlying reasoning to be well-documented.

Fig. 1. Iterative research process

While it is not always explicitly addressed, qualitative methods reflect a different underlying research paradigm than quantitative research (e.g. constructivism or interpretivism as opposed to positivism). The choice of methods can be based on the respective underlying substantive theory or theoretical framework used by the researcher [ 2 ].

Data collection

The methods of qualitative data collection most commonly used in health research are document study, observations, semi-structured interviews and focus groups [ 1 , 14 , 16 , 17 ].

Document study

Document study (also called document analysis) refers to the review by the researcher of written materials [ 14 ]. These can include personal and non-personal documents such as archives, annual reports, guidelines, policy documents, diaries or letters.

Observations

Observations are particularly useful to gain insights into a certain setting and actual behaviour – as opposed to reported behaviour or opinions [ 13 ]. Qualitative observations can be either participant or non-participant in nature. In participant observations, the observer is part of the observed setting, for example a nurse working in an intensive care unit [ 18 ]. In non-participant observations, the observer is “on the outside looking in”, i.e. present in but not part of the situation, trying not to influence the setting by their presence. Observations can be planned (e.g. for 3 h during the day or night shift) or ad hoc (e.g. as soon as a stroke patient arrives at the emergency room). During the observation, the observer takes notes on everything or certain pre-determined parts of what is happening around them, for example focusing on physician-patient interactions or communication between different professional groups. Written notes can be taken during or after the observations, depending on feasibility (which is usually lower during participant observations) and acceptability (e.g. when the observer is perceived to be judging the observed). Afterwards, these field notes are transcribed into observation protocols. If more than one observer was involved, field notes are taken independently, but notes can be consolidated into one protocol after discussions. Advantages of conducting observations include minimising the distance between the researcher and the researched, the potential discovery of topics that the researcher did not realise were relevant and gaining deeper insights into the real-world dimensions of the research problem at hand [ 18 ].

Semi-structured interviews

Hijmans & Kuyper describe qualitative interviews as “an exchange with an informal character, a conversation with a goal” [ 19 ]. Interviews are used to gain insights into a person’s subjective experiences, opinions and motivations – as opposed to facts or behaviours [ 13 ]. Interviews can be distinguished by the degree to which they are structured (i.e. a questionnaire), open (e.g. free conversation or autobiographical interviews) or semi-structured [ 2 , 13 ]. Semi-structured interviews are characterized by open-ended questions and the use of an interview guide (or topic guide/list) in which the broad areas of interest, sometimes including sub-questions, are defined [ 19 ]. The pre-defined topics in the interview guide can be derived from the literature, previous research or a preliminary method of data collection, e.g. document study or observations. The topic list is usually adapted and improved at the start of the data collection process as the interviewer learns more about the field [ 20 ]. Across interviews the focus on the different (blocks of) questions may differ and some questions may be skipped altogether (e.g. if the interviewee is not able or willing to answer the questions or for concerns about the total length of the interview) [ 20 ]. Qualitative interviews are usually not conducted in written format as it impedes on the interactive component of the method [ 20 ]. In comparison to written surveys, qualitative interviews have the advantage of being interactive and allowing for unexpected topics to emerge and to be taken up by the researcher. This can also help overcome a provider or researcher-centred bias often found in written surveys, which by nature, can only measure what is already known or expected to be of relevance to the researcher. Interviews can be audio- or video-taped; but sometimes it is only feasible or acceptable for the interviewer to take written notes [ 14 , 16 , 20 ].

Focus groups

Focus groups are group interviews to explore participants’ expertise and experiences, including explorations of how and why people behave in certain ways [ 1 ]. Focus groups usually consist of 6–8 people and are led by an experienced moderator following a topic guide or “script” [ 21 ]. They can involve an observer who takes note of the non-verbal aspects of the situation, possibly using an observation guide [ 21 ]. Depending on researchers’ and participants’ preferences, the discussions can be audio- or video-taped and transcribed afterwards [ 21 ]. Focus groups are useful for bringing together homogeneous (to a lesser extent heterogeneous) groups of participants with relevant expertise and experience on a given topic on which they can share detailed information [ 21 ]. Focus groups are a relatively easy, fast and inexpensive method to gain access to information on interactions in a given group, i.e. “the sharing and comparing” among participants [ 21 ]. Disadvantages include less control over the process and a lesser extent to which each individual may participate. Moreover, focus group moderators need experience, as do those tasked with the analysis of the resulting data. Focus groups can be less appropriate for discussing sensitive topics that participants might be reluctant to disclose in a group setting [ 13 ]. Moreover, attention must be paid to the emergence of “groupthink” as well as possible power dynamics within the group, e.g. when patients are awed or intimidated by health professionals.

Choosing the “right” method

As explained above, the school of thought underlying qualitative research assumes no objective hierarchy of evidence and methods. This means that each choice of single or combined methods has to be based on the research question that needs to be answered and a critical assessment with regard to whether or to what extent the chosen method can accomplish this – i.e. the “fit” between question and method [ 14 ]. It is necessary for these decisions to be documented when they are being made, and to be critically discussed when reporting methods and results.

Let us assume that our research aim is to examine the (clinical) processes around acute endovascular treatment (EVT), from the patient’s arrival at the emergency room to recanalization, with the aim to identify possible causes for delay and/or other causes for sub-optimal treatment outcome. As a first step, we could conduct a document study of the relevant standard operating procedures (SOPs) for this phase of care – are they up-to-date and in line with current guidelines? Do they contain any mistakes, irregularities or uncertainties that could cause delays or other problems? Regardless of the answers to these questions, the results have to be interpreted based on what they are: a written outline of what care processes in this hospital should look like. If we want to know what they actually look like in practice, we can conduct observations of the processes described in the SOPs. These results can (and should) be analysed in themselves, but also in comparison to the results of the document analysis, especially as regards relevant discrepancies. Do the SOPs outline specific tests for which no equipment can be observed or tasks to be performed by specialized nurses who are not present during the observation? It might also be possible that the written SOP is outdated, but the actual care provided is in line with current best practice. In order to find out why these discrepancies exist, it can be useful to conduct interviews. Are the physicians simply not aware of the SOPs (because their existence is limited to the hospital’s intranet) or do they actively disagree with them or does the infrastructure make it impossible to provide the care as described? Another rationale for adding interviews is that some situations (or all of their possible variations for different patient groups or the day, night or weekend shift) cannot practically or ethically be observed. In this case, it is possible to ask those involved to report on their actions – being aware that this is not the same as the actual observation. A senior physician’s or hospital manager’s description of certain situations might differ from a nurse’s or junior physician’s one, maybe because they intentionally misrepresent facts or maybe because different aspects of the process are visible or important to them. In some cases, it can also be relevant to consider to whom the interviewee is disclosing this information – someone they trust, someone they are otherwise not connected to, or someone they suspect or are aware of being in a potentially “dangerous” power relationship to them. Lastly, a focus group could be conducted with representatives of the relevant professional groups to explore how and why exactly they provide care around EVT. The discussion might reveal discrepancies (between SOPs and actual care or between different physicians) and motivations to the researchers as well as to the focus group members that they might not have been aware of themselves. For the focus group to deliver relevant information, attention has to be paid to its composition and conduct, for example, to make sure that all participants feel safe to disclose sensitive or potentially problematic information or that the discussion is not dominated by (senior) physicians only. The resulting combination of data collection methods is shown in Fig.  2 .

Fig. 2. Possible combination of data collection methods

Attributions for icons: “Book” by Serhii Smirnov, “Interview” by Adrien Coquet, FR, “Magnifying Glass” by anggun, ID, “Business communication” by Vectors Market; all from the Noun Project

The combination of multiple data sources as described in this example can be referred to as “triangulation”, in which multiple measurements are carried out from different angles to achieve a more comprehensive understanding of the phenomenon under study [ 22 , 23 ].

Data analysis

To analyse the data collected through observations, interviews and focus groups, these first need to be converted into protocols and transcripts (see Fig. 3). Interviews and focus groups can be transcribed verbatim, with or without annotations for behaviour (e.g. laughing, crying, pausing) and with or without phonetic transcription of dialects and filler words, depending on what is expected or known to be relevant for the analysis. In the next step, the protocols and transcripts are coded, that is, marked (or tagged, labelled) with one or more short descriptors of the content of a sentence or paragraph [ 2 , 15 , 23 ]. Jansen describes coding as “connecting the raw data with ‘theoretical’ terms” [ 20 ]. In a more practical sense, coding makes raw data sortable. This makes it possible to extract and examine all segments describing, say, a tele-neurology consultation from multiple data sources (e.g. SOPs, emergency room observations, staff and patient interviews). In a process of synthesis and abstraction, the codes are then grouped, summarised and/or categorised [ 15 , 20 ]. The end product of the coding or analysis process is a descriptive theory of the behavioural pattern under investigation [ 20 ]. The coding process is usually performed with qualitative data management software, the most common being NVivo, MAXQDA and ATLAS.ti. It should be noted that these are data management tools which support the analysis performed by the researcher(s) [ 14 ].
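To illustrate how coding makes raw data sortable, the following minimal Python sketch tags a few invented data segments with codes and then retrieves everything labelled with one code across several hypothetical sources; the file names, text snippets and code labels are all made up for this illustration.

```python
from collections import defaultdict

# Each segment is a chunk of a protocol or transcript, tagged with one or more codes.
# Sources, text snippets and code labels are invented for this illustration.
segments = [
    {"source": "SOP_EVT.pdf",
     "text": "Tele-neurology consult to be requested within 10 minutes of arrival.",
     "codes": ["tele-neurology consultation", "time target"]},
    {"source": "ER_observation_day1.txt",
     "text": "Video link to the neurologist failed twice; phone was used instead.",
     "codes": ["tele-neurology consultation", "technical problem"]},
    {"source": "interview_nurse_03.txt",
     "text": "The tele-consult usually works well, but at night it takes longer.",
     "codes": ["tele-neurology consultation", "night shift"]},
]

# Index segments by code so that all material on one topic can be pulled together
# across data sources (SOPs, observation protocols, interview transcripts).
by_code = defaultdict(list)
for segment in segments:
    for code in segment["codes"]:
        by_code[code].append(segment)

for segment in by_code["tele-neurology consultation"]:
    print(f'{segment["source"]}: {segment["text"]}')
```

Dedicated packages such as NVivo, MAXQDA or ATLAS.ti provide this kind of code-and-retrieve functionality at scale, alongside memoing and querying tools.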

Fig. 3. From data collection to data analysis

How to report qualitative research?

Protocols of qualitative research can be published separately and in advance of the study results. However, the aim is not the same as for RCT protocols, i.e. to pre-define and set in stone the research questions and primary or secondary endpoints. Rather, it is a way to describe the research methods in detail, which might not be possible in the results paper given journals’ word limits. Qualitative research papers are usually longer than their quantitative counterparts to allow for deep understanding and so-called “thick description”. In the methods section, the focus is on transparency about the methods used, including why, how and by whom they were implemented in the specific study setting, so as to enable a discussion of whether and how this may have influenced data collection, analysis and interpretation. The results section usually starts with a paragraph outlining the main findings, followed by more detailed descriptions of, for example, the commonalities, discrepancies or exceptions per category [ 20 ]. Here it is important to support the main findings with relevant quotations, which may add information, context, emphasis or real-life examples [ 20 , 23 ]. It is subject to debate in the field whether it is relevant to state the exact number or percentage of respondents supporting a certain statement (e.g. “Five interviewees expressed negative feelings towards XYZ”) [ 21 ].

How to combine qualitative with quantitative research?

Qualitative methods can be combined with other methods in multi- or mixed methods designs, which “[employ] two or more different methods […] within the same study or research program rather than confining the research to one single method” [ 24 ]. Reasons for combining methods can be diverse, including triangulation for corroboration of findings, complementarity for illustration and clarification of results, expansion to extend the breadth and range of the study, explanation of (unexpected) results generated with one method with the help of another, or offsetting the weakness of one method with the strength of another [ 1 , 17 , 24 – 26 ]. The resulting designs can be classified according to when, why and how the different quantitative and/or qualitative data strands are combined. The three most common types of mixed methods designs are the convergent parallel design, the explanatory sequential design and the exploratory sequential design. The designs are shown with examples in Fig. 4.

Fig. 4. Three common mixed methods designs

In the convergent parallel design, a qualitative study is conducted in parallel to and independently of a quantitative study, and the results of both studies are compared and combined at the stage of interpretation. Using the above example of EVT provision, this could entail setting up a quantitative EVT registry to measure process times and patient outcomes in parallel to conducting the qualitative research outlined above, and then comparing the results. Amongst other things, this would make it possible to assess whether interview respondents’ subjective impressions of patients receiving good care match modified Rankin Scores at follow-up, or whether observed delays in care provision are exceptions or the rule when compared to door-to-needle times as documented in the registry.

In the explanatory sequential design, a quantitative study is carried out first, followed by a qualitative study to help explain the quantitative results. This would be an appropriate design if the registry alone had revealed relevant delays in door-to-needle times and the qualitative study were used to understand where and why these occurred, and how they could be reduced.

In the exploratory sequential design, the qualitative study is carried out first and its results help to inform and build the quantitative study in the next step [ 26 ]. If the qualitative study around EVT provision had shown a high level of dissatisfaction among the staff members involved, a quantitative questionnaire investigating staff satisfaction could be set up in the next step, informed by the qualitative findings on the topics about which dissatisfaction had been expressed. Amongst other things, the questionnaire design would make it possible to widen the reach of the research to more respondents from different (types of) hospitals, regions, countries or settings, and to conduct sub-group analyses for different professional groups.
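As a toy illustration of the convergent parallel logic, the Python sketch below juxtaposes hypothetical registry door-to-needle times with equally hypothetical qualitative observation notes, case by case; all case IDs, numbers and the 60-minute cut-off are invented for the example.

```python
# Hypothetical quantitative strand: door-to-needle times (minutes) from an EVT registry.
registry_minutes = {"case_01": 38, "case_02": 95, "case_03": 41, "case_04": 112}

# Hypothetical qualitative strand: was a delay observed or reported for this case?
observed_delay = {"case_01": False, "case_02": True, "case_03": True, "case_04": True}

DELAY_CUTOFF = 60  # invented cut-off for calling a case "delayed" in this toy example

# Compare the two strands at the interpretation stage: do qualitative impressions of
# delay match the registry, and are observed delays exceptions or the rule?
for case, minutes in registry_minutes.items():
    quant_delayed = minutes > DELAY_CUTOFF
    qual_delayed = observed_delay[case]
    verdict = "strands agree" if quant_delayed == qual_delayed else "strands diverge -> examine"
    print(f"{case}: {minutes} min, delay observed qualitatively: {qual_delayed} ({verdict})")
```

Cases where the two strands diverge would be flagged for closer interpretation rather than treated as errors in either strand.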

How to assess qualitative research?

A variety of assessment criteria and lists have been developed for qualitative research, varying in their focus and comprehensiveness [ 14 , 17 , 27 ]. However, none of these has been established as the “gold standard” in the field. In the following, we therefore focus on a set of commonly used assessment criteria that, from a practical standpoint, a researcher can look for when assessing a qualitative research report or paper.

Assessors should check the authors’ use of and adherence to the relevant reporting checklists (e.g. the Standards for Reporting Qualitative Research (SRQR)) to make sure all items that are relevant for this type of research are addressed [ 23 , 28 ]. Discussions of quantitative measures in addition to, or instead of, these qualitative measures can be a sign of lower quality of the research (paper). Providing and adhering to such a checklist contributes to an important quality criterion for qualitative research, namely transparency [ 15 , 17 , 23 ].

Reflexivity

While methodological transparency and complete reporting are relevant for all types of research, some additional criteria must be taken into account for qualitative research. One of these is reflexivity, i.e. sensitivity to the relationship between the researcher and the researched, including how contact was established and maintained, as well as the background and experience of the researcher(s) involved in data collection and analysis. Depending on the research question and the population to be researched, this can be limited to professional experience, but it may also include gender, age or ethnicity [ 17 , 27 ]. These details are relevant because in qualitative research, as opposed to quantitative research, the researcher as a person cannot be isolated from the research process [ 23 ]. It may influence the conversation when an interviewed patient speaks to an interviewer who is a physician, or when an interviewee is asked to discuss a gynaecological procedure with a male interviewer, and the reader must therefore be made aware of these details [ 19 ].

Sampling and saturation

The aim of qualitative sampling is for all variants of the objects of observation that are deemed relevant for the study to be present in the sample “ to see the issue and its meanings from as many angles as possible” [ 1 , 16 , 19 , 20 , 27 ] , and to ensure “information-richness [ 15 ]. An iterative sampling approach is advised, in which data collection (e.g. five interviews) is followed by data analysis, followed by more data collection to find variants that are lacking in the current sample. This process continues until no new (relevant) information can be found and further sampling becomes redundant – which is called saturation [ 1 , 15 ] . In other words: qualitative data collection finds its end point not a priori , but when the research team determines that saturation has been reached [ 29 , 30 ].
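Schematically, this iterative collect–analyse–collect cycle can be pictured as a loop that stops once a new batch of data yields no new codes. In the Python sketch below, collect_batch and extract_codes are invented stand-ins for real fieldwork and for the researchers’ own coding work, so the “stopping rule” is purely illustrative.

```python
# Schematic sketch of iterative sampling until saturation -- not a formal stopping rule.
# collect_batch() and extract_codes() are invented stand-ins for real data collection
# (e.g. five interviews per round) and for the researchers' own coding work.

def collect_batch(batch_number, size=5):
    """Pretend to collect one batch of interviews."""
    return [f"transcript_{batch_number}_{i}" for i in range(size)]

def extract_codes(transcripts, batch_number):
    """Pretend to code the batch; later batches yield fewer genuinely new topics."""
    pool = ["delay", "handover", "imaging quality", "night shift", "communication"]
    return pool[: max(1, len(pool) - 2 * (batch_number - 1))]

known_codes = set()
batch = 1
while True:
    transcripts = collect_batch(batch)
    new_codes = set(extract_codes(transcripts, batch)) - known_codes
    print(f"Batch {batch}: {len(new_codes)} new code(s)")
    if not new_codes:            # no new relevant information -> saturation reached
        break
    known_codes |= new_codes     # otherwise look for the missing variants and continue
    batch += 1

print(f"Stopped after {batch} batches with codes: {sorted(known_codes)}")
```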

This need to reach saturation is also the reason why most qualitative studies use deliberate rather than random sampling strategies. This is generally referred to as “purposive sampling”, in which researchers pre-define which types of participants or cases they need to include so as to cover all variations expected to be of relevance, based on the literature, previous experience or theory (i.e. theoretical sampling) [ 14 , 20 ]. Other types of purposive sampling include (but are not limited to) maximum variation sampling, critical case sampling, and extreme or deviant case sampling [ 2 ]. In the above EVT example, a purposive sample could include all relevant professional groups and/or all relevant stakeholders (patients, relatives) and/or all relevant times of observation (day, night and weekend shifts).

Assessors of qualitative research should check whether the considerations underlying the sampling strategy were sound and whether or how researchers tried to adapt and improve their strategies in stepwise or cyclical approaches between data collection and analysis to achieve saturation [ 14 ].

Good qualitative research is iterative in nature, i.e. it goes back and forth between data collection and analysis, revising and improving the approach where necessary. One example of this is the pilot interview, in which different aspects of the interview (especially the interview guide, but also, for example, the site of the interview or whether it can be audio-recorded) are tested with a small number of respondents, evaluated and revised [ 19 ]. In doing so, the interviewer learns which wording or types of questions work best, or what the best length of an interview is for patients who have trouble concentrating for an extended time. Of course, the same reasoning applies to observations or focus groups, which can also be piloted.

Ideally, coding should be performed by at least two researchers, especially at the beginning of the coding process, when a common approach must be defined, including a useful coding list (or tree) and a shared understanding of what the individual codes mean [ 23 ]. An initial sub-set of transcripts, or all of them, can be coded independently by the coders and then compared and consolidated in regular discussions within the research team. This helps to make sure that codes are applied consistently to the research data.
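A minimal sketch of such a consolidation step, with invented segment IDs and codes, might look as follows; disagreements are simply listed for discussion in the team rather than resolved automatically.

```python
# Comparing two coders' independent code assignments on the same (invented) segments
# and flagging disagreements for discussion and consolidation in the research team.

coder_a = {"seg01": "delay", "seg02": "handover", "seg03": "communication", "seg04": "delay"}
coder_b = {"seg01": "delay", "seg02": "communication", "seg03": "communication", "seg04": "imaging"}

disagreements = {
    seg: (coder_a[seg], coder_b[seg])
    for seg in coder_a
    if coder_a[seg] != coder_b[seg]
}

print(f"Raw agreement: {1 - len(disagreements) / len(coder_a):.0%}")
for seg, (code_a, code_b) in sorted(disagreements.items()):
    print(f"{seg}: coder A = '{code_a}', coder B = '{code_b}' -> discuss and consolidate")
```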

Member checking

Member checking, also called respondent validation, refers to the practice of checking back with study respondents to see if the research is in line with their views [ 14 , 27 ]. This can happen after data collection or analysis or when first results are available [ 23 ]. For example, interviewees can be provided with (summaries of) their transcripts and asked whether they believe this to be a complete representation of their views or whether they would like to clarify or elaborate on their responses [ 17 ]. Respondents’ feedback on these issues then becomes part of the data collection and analysis [ 27 ].

Stakeholder involvement

In those niches where qualitative approaches have been able to evolve and grow, a new trend has seen the inclusion of patients and their representatives not only as study participants (i.e. “members”, see above) but as consultants to and active participants in the broader research process [ 31 – 33 ]. The underlying assumption is that patients and other stakeholders hold unique perspectives and experiences that add value beyond their own single story, making the research more relevant and beneficial to researchers, study participants and (future) patients alike [ 34 , 35 ]. Using the example of patients on or nearing dialysis, a recent scoping review found that 80% of clinical research did not address the top 10 research priorities identified by patients and caregivers [ 32 , 36 ]. In this sense, the involvement of the relevant stakeholders, especially patients and relatives, is increasingly being seen as a quality indicator in and of itself.

How not to assess qualitative research

The above overview does not include certain items that are routine in assessments of quantitative research. What follows is a non-exhaustive, non-representative, experience-based list of the quantitative criteria often applied to the assessment of qualitative research, as well as an explanation of the limited usefulness of these endeavours.

Protocol adherence

Given the openness and flexibility of qualitative research, it should not be assessed by how well it adheres to pre-determined and fixed strategies – in other words: its rigidity. Instead, the assessor should look for signs of adaptation and refinement based on lessons learned from earlier steps in the research process.

Sample size

For the reasons explained above, qualitative research does not require specific sample sizes, nor does it require that the sample size be determined a priori [ 1 , 14 , 27 , 37 – 39 ]. Sample size can only be a useful quality indicator when related to the research purpose, the chosen methodology and the composition of the sample, i.e. who was included and why.

Randomisation

While some authors argue that randomisation can be used in qualitative research, this is not commonly the case, as neither its feasibility nor its necessity or usefulness has been convincingly established for qualitative research [ 13 , 27 ]. Relevant disadvantages include the negative impact of an overly large sample size as well as the possibility (or probability) of selecting “quiet, uncooperative or inarticulate individuals” [ 17 ]. Qualitative studies do not use control groups, either.

Interrater reliability, variability and other “objectivity checks”

The concept of “interrater reliability” is sometimes used in qualitative research to assess to what extent the coding of two co-coders overlaps. However, it is not clear what this measure tells us about the quality of the analysis [ 23 ]. Such scores can therefore be included in qualitative research reports, preferably with some additional information on what the score means for the analysis, but they are not a requirement. Relatedly, it is not relevant for the quality or “objectivity” of qualitative research to have different people recruit the study participants, collect the data and analyse them. Experience even shows that it can be better to have the same person or team perform all of these tasks [ 20 ]. First, when researchers introduce themselves during recruitment, this can enhance trust when the interview takes place days or weeks later with the same researcher. Second, when the audio recording is transcribed for analysis, the researcher who conducted the interviews will usually remember the interviewee and the specific interview situation during data analysis. This can provide additional contextual information for the interpretation of the data, e.g. on whether something might have been meant as a joke [ 18 ].
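If such a score is nevertheless reported, Cohen’s kappa is one common choice; the short Python sketch below computes it for two coders’ invented labels on the same segments, simply to show what the number refers to.

```python
from collections import Counter

# Two coders' labels for the same six (invented) segments.
coder_a = ["delay", "delay", "handover", "communication", "delay", "handover"]
coder_b = ["delay", "handover", "handover", "communication", "delay", "delay"]

n = len(coder_a)
observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n  # proportion of agreement

# Agreement expected by chance, from each coder's marginal label frequencies.
freq_a, freq_b = Counter(coder_a), Counter(coder_b)
expected = sum((freq_a[label] / n) * (freq_b[label] / n) for label in set(coder_a) | set(coder_b))

kappa = (observed - expected) / (1 - expected)
print(f"Observed agreement {observed:.2f}, chance agreement {expected:.2f}, kappa {kappa:.2f}")
```

The same unweighted statistic is also available as cohen_kappa_score in scikit-learn.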

Not being quantitative research

Being qualitative rather than quantitative research should not be used as an assessment criterion if it is applied irrespective of the research problem at hand. Similarly, qualitative research should not be required to be combined with quantitative research per se – unless mixed methods research is judged to be inherently better than single-method research. In that case, the same criterion should be applied to quantitative studies without a qualitative component.

The main take-away points of this paper are summarised in Table 1. We aimed to show that, if conducted well, qualitative research can answer specific research questions that cannot be adequately answered using (only) quantitative designs. Seeing qualitative and quantitative methods as equal will help us become more aware and critical of the “fit” between the research problem and our chosen methods: I can conduct an RCT to determine the reasons for transportation delays of acute stroke patients – but should I? It also provides us with a greater range of tools to tackle a greater range of research problems more appropriately and successfully, filling in the blind spots on one half of the methodological spectrum to better address the whole complexity of neurological research and practice.

Table 1. Take-away points

Author contributions: LB drafted the manuscript; WW and CG revised the manuscript; all authors approved the final version.

Funding: No external funding.

Competing interests: The authors declare no competing interests.



The Oxford Handbook of Qualitative Research (2nd edn)


29 Qualitative Data Analysis Strategies

Johnny Saldaña, School of Theatre and Film, Arizona State University

  • Published: 02 September 2020

This chapter provides an overview of selected qualitative data analysis strategies with a particular focus on codes and coding. Preparatory strategies for a qualitative research study and data management are first outlined. Six coding methods are then profiled using comparable interview data: process coding, in vivo coding, descriptive coding, values coding, dramaturgical coding, and versus coding. Strategies for constructing themes and assertions from the data follow. Analytic memo writing is woven throughout as a method for generating additional analytic insight. Next, display and arts-based strategies are provided, followed by recommended qualitative data analytic software programs and a discussion on verifying the researcher’s analytic findings.

Qualitative Data Analysis Strategies

Anthropologist Clifford Geertz ( 1983 ) charmingly mused, “Life is just a bowl of strategies” (p. 25). Strategy , as I use it here, refers to a carefully considered plan or method to achieve a particular goal. The goal in this case is to develop a write-up of your analytic work with the qualitative data you have been given and collected as part of a study. The plans and methods you might employ to achieve that goal are what this article profiles.

Some may perceive strategy as an inappropriate, if not manipulative, word, suggesting formulaic or regimented approaches to inquiry. I assure you that is not my intent. My use of strategy is dramaturgical in nature: Strategies are actions that characters in plays take to overcome obstacles to achieve their objectives. Actors portraying these characters rely on action verbs to generate belief within themselves and to motivate them as they interpret their lines and move appropriately on stage.

What I offer is a qualitative researcher’s array of actions from which to draw to overcome the obstacles to thinking to achieve an analysis of your data. But unlike the prescripted text of a play in which the obstacles, strategies, and outcomes have been predetermined by the playwright, your work must be improvisational—acting, reacting, and interacting with data on a moment-by-moment basis to determine what obstacles stand in your way and thus what strategies you should take to reach your goals.

Another intriguing quote to keep in mind comes from research methodologist Robert E. Stake ( 1995 ), who posited, “Good research is not about good methods as much as it is about good thinking” (p. 19). In other words, strategies can take you only so far. You can have a box full of tools, but if you do not know how to use them well or use them creatively, the collection seems rather purposeless. One of the best ways we learn is by doing . So, pick up one or more of these strategies (in the form of verbs) and take analytic action with your data. Also keep in mind that these are discussed in the order in which they may typically occur, although humans think cyclically, iteratively, and reverberatively, and each research project has its unique contexts and needs. Be prepared for your mind to jump purposefully and/or idiosyncratically from one strategy to another throughout the study.

Qualitative Data Analysis Strategy: To Foresee

To foresee in qualitative data analysis (QDA) is to reflect beforehand on what forms of data you will most likely need and collect, which thus informs what types of data analytic strategies you anticipate using. Analysis, in a way, begins even before you collect data (Saldaña & Omasta, 2018 ). As you design your research study in your mind and on a text editing page, one strategy is to consider what types of data you may need to help inform and answer your central and related research questions. Interview transcripts, participant observation field notes, documents, artifacts, photographs, video recordings, and so on are not only forms of data but also foundations for how you may plan to analyze them. A participant interview, for example, suggests that you will transcribe all or relevant portions of the recording and use both the transcription and the recording itself as sources for data analysis. Any analytic memos (discussed later) you make about your impressions of the interview also become data to analyze. Even the computing software you plan to employ will be relevant to data analysis because it may help or hinder your efforts.

As your research design formulates, compose one to two paragraphs that outline how your QDA may proceed. This will necessitate that you have some background knowledge of the vast array of methods available to you. Thus, surveying the literature is vital preparatory work.

Qualitative Data Analysis Strategy: To Survey

To survey in QDA is to look for and consider the applicability of the QDA literature in your field that may provide useful guidance for your forthcoming data analytic work. General sources in QDA will provide a good starting point for acquainting you with the data analysis strategies available for the variety of methodologies or genres in qualitative inquiry (e.g., ethnography, phenomenology, case study, arts-based research, mixed methods). One of the most accessible (and humorous) is Galman’s ( 2013 ) The Good, the Bad, and the Data , and one of the most richly detailed is Frederick J. Wertz et al.’s ( 2011 ) Five Ways of Doing Qualitative Analysis . The author’s core texts for this chapter come from The Coding Manual for Qualitative Researchers (Saldaña, 2016 ) and Qualitative Research: Analyzing Life (Saldaña & Omasta, 2018 ).

If your study’s methodology or approach is grounded theory, for example, then a survey of methods works by authors such as Barney G. Glaser, Anselm L. Strauss, Juliet Corbin, and, in particular, the prolific Kathy Charmaz ( 2014 ) may be expected. But there has been a recent outpouring of additional book publications in grounded theory by Birks and Mills ( 2015 ), Bryant ( 2017 ), Bryant and Charmaz ( 2019 ), and Stern and Porr ( 2011 ), plus the legacy of thousands of articles and chapters across many disciplines that have addressed grounded theory in their studies.

Fields such as education, psychology, social work, healthcare, and others also have their own QDA methods literature in the form of texts and journals, as well as international conferences and workshops for members of the profession. It is important to have had some university coursework and/or mentorship in qualitative research to suitably prepare you for the intricacies of QDA, and you must acknowledge that the emergent nature of qualitative inquiry may require you to adopt analysis strategies that differ from what you originally planned.

Qualitative Data Analysis Strategy: To Collect

To collect in QDA is to receive the data given to you by participants and those data you actively gather to inform your study. Qualitative data analysis is concurrent with data collection and management. As interviews are transcribed, field notes are fleshed out, and documents are filed, the researcher uses opportunities to carefully read the corpus and make preliminary notations directly on the data documents by highlighting, bolding, italicizing, or noting in some way any particularly interesting or salient portions. As these data are initially reviewed, the researcher also composes supplemental analytic memos that include first impressions, reminders for follow-up, preliminary connections, and other thinking matters about the phenomena at work.

Some of the most common fieldwork tools you might use to collect data are notepads, pens and pencils; file folders for hard-copy documents; a laptop, tablet, or desktop with text editing software (Microsoft Word and Excel are most useful) and Internet access; and a digital camera and voice recorder (functions available on many electronic devices such as smartphones). Some fieldworkers may even employ a digital video camera to record social action, as long as participant permissions have been secured. But everything originates from the researcher. Your senses are immersed in the cultural milieu you study, taking in and holding onto relevant details, or significant trivia , as I call them. You become a human camera, zooming out to capture the broad landscape of your field site one day and then zooming in on a particularly interesting individual or phenomenon the next. Your analysis is only as good as the data you collect.

Fieldwork can be an overwhelming experience because so many details of social life are happening in front of you. Take a holistic approach to your entrée, but as you become more familiar with the setting and participants, actively focus on things that relate to your research topic and questions. Keep yourself open to the intriguing, surprising, and disturbing (Sunstein & Chiseri-Strater, 2012 , p. 115), because these facets enrich your study by making you aware of the unexpected.

Qualitative Data Analysis Strategy: To Feel

To feel in QDA is to gain deep emotional insight into the social worlds you study and what it means to be human. Virtually everything we do has an accompanying emotion(s), and feelings are both reactions and stimuli for action. Others’ emotions clue you to their motives, values, attitudes, beliefs, worldviews, identities, and other subjective perceptions and interpretations. Acknowledge that emotional detachment is not possible in field research. Attunement to the emotional experiences of your participants plus sympathetic and empathetic responses to the actions around you are necessary in qualitative endeavors. Your own emotional responses during fieldwork are also data because they document the tacit and visceral. It is important during such analytic reflection to assess why your emotional reactions were as they were. But it is equally important not to let emotions alone steer the course of your study. A proper balance must be found between feelings and facts.

Qualitative Data Analysis Strategy: To Organize

To organize in QDA is to maintain an orderly repository of data for easy access and analysis. Even in the smallest of qualitative studies, a large amount of data will be collected across time. Prepare both a hard drive and hard-copy folders for digital data and paperwork, and back up all materials for security from loss. I recommend that each data unit (e.g., one interview transcript, one document, one day’s worth of field notes) have its own file, with subfolders specifying the data forms and research study logistics (e.g., interviews, field notes, documents, institutional review board correspondence, calendar).
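As one way to set up such a repository, the short Python sketch below creates a hypothetical folder tree for a small study; the folder names are illustrative examples of the data forms and logistics mentioned above, not a prescribed structure.

```python
# Sketch: creating an orderly project repository for a small qualitative study.
# Folder names are hypothetical examples of the data forms and logistics mentioned above.

from pathlib import Path

project = Path("my_qual_study")
subfolders = [
    "interviews",          # one file per interview transcript
    "field_notes",         # one file per day of participant observation
    "documents",           # collected documents and artifacts
    "analytic_memos",      # researcher memos and jottings
    "irb_correspondence",  # institutional review board paperwork
    "calendar",            # study logistics
]

for name in subfolders:
    (project / name).mkdir(parents=True, exist_ok=True)

print(f"Created {len(subfolders)} subfolders under {project}/ -- remember to back everything up.")
```

The same structure can of course be set up by hand; the point is simply one file per data unit, grouped by form and backed up for security from loss.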

For small-scale qualitative studies, I have found it quite useful to maintain one large master file with all participant and field site data copied and combined with the literature review and accompanying researcher analytic memos. This master file is used to cut and paste related passages together, deleting what seems unnecessary as the study proceeds and eventually transforming the document into the final report itself. Cosmetic devices such as font style, font size, rich text (italicizing, bolding, underlining, etc.), and color can help you distinguish between different data forms and highlight significant passages. For example, descriptive, narrative passages of field notes are logged in regular font. “Quotations, things spoken by participants, are logged in bold font.”   Observer’s comments, such as the researcher’s subjective impressions or analytic jottings, are set in italics.

Qualitative Data Analysis Strategy: To Jot

To jot in QDA is to write occasional, brief notes about your thinking or reminders for follow-up. A jot is a phrase or brief sentence that will fit on a standard-size sticky note. As data are brought and documented together, take some initial time to review their contents and jot some notes about preliminary patterns, participant quotes that seem vivid, anomalies in the data, and so forth.

As you work on a project, keep something to write with or to voice record with you at all times to capture your fleeting thoughts. You will most likely find yourself thinking about your research when you are not working exclusively on the project, and a “mental jot” may occur to you as you ruminate on logistical or analytic matters. Document the thought in some way for later retrieval and elaboration as an analytic memo.

Qualitative Data Analysis Strategy: To Prioritize

To prioritize in QDA is to determine which data are most significant in your corpus and which tasks are most necessary. During fieldwork, massive amounts of data in various forms may be collected, and your mind can be easily overwhelmed by the magnitude of the quantity, its richness, and its management. Decisions will need to be made about the most pertinent data because they help answer your research questions or emerge as salient pieces of evidence. As a sweeping generalization, approximately one half to two thirds of what you collect may become unnecessary as you proceed toward the more formal stages of QDA.

To prioritize in QDA is also to determine what matters most in your assembly of codes, categories, patterns, themes, assertions, propositions, and concepts. Return to your research purpose and questions to keep you framed for what the focus should be.

Qualitative Data Analysis Strategy: To Analyze

To analyze in QDA is to observe and discern patterns within data and to construct meanings that seem to capture their essences and essentials. Just as there are a variety of genres, elements, and styles of qualitative research, so too are there a variety of methods available for QDA. Analytic choices are most often based on what methods will harmonize with your genre selection and conceptual framework, what will generate the most sufficient answers to your research questions, and what will best represent and present the project’s findings.

Analysis can range from the factual to the conceptual to the interpretive. Analysis can also range from a straightforward descriptive account to an emergently constructed grounded theory to an evocatively composed short story. A qualitative research project’s outcomes may range from rigorously achieved, insightful answers to open-ended, evocative questions; from rich descriptive detail to a bullet-point list of themes; and from third-person, objective reportage to first-person, emotion-laden poetry. Just as there are multiple destinations in qualitative research, there are multiple pathways and journeys along the way.

Analysis is accelerated as you take cognitive ownership of your data. By reading and rereading the corpus, you gain intimate familiarity with its contents and begin to notice significant details as well as make new connections and insights about their meanings. Patterns, categories, themes, and their interrelationships become more evident the more you know the subtleties of the database.

Since qualitative research’s design, fieldwork, and data collection are most often provisional, emergent, and evolutionary processes, you reflect on and analyze the data as you gather them and proceed through the project. If preplanned methods are not working, you change them to secure the data you need. There is generally a postfieldwork period when continued reflection and more systematic data analysis occur, concurrent with or followed by additional data collection, if needed, and the more formal write-up of the study, which is in itself an analytic act. Through field note writing, interview transcribing, analytic memo writing, and other documentation processes, you gain cognitive ownership of your data; and the intuitive, tacit, synthesizing capabilities of your brain begin sensing patterns, making connections, and seeing the bigger picture. The purpose and outcome of data analysis is to reveal to others through fresh insights what we have observed and discovered about the human condition. Fortunately, there are heuristics for reorganizing and reflecting on your qualitative data to help you achieve that goal.

Qualitative Data Analysis Strategy: To Pattern

To pattern in QDA is to detect similarities within and regularities among the data you have collected. The natural world is filled with patterns because we, as humans, have constructed them as such. Stars in the night sky are not just a random assembly; our ancestors pieced them together to form constellations like the Big Dipper. A collection of flowers growing wild in a field has a pattern, as does an individual flower’s patterns of leaves and petals. Look at the physical objects humans have created and notice how pattern oriented we are in our construction, organization, and decoration. Look around you in your environment and notice how many patterns are evident on your clothing, in a room, and on most objects themselves. Even our sometimes mundane daily and long-term human actions are reproduced patterns in the form of routines, rituals, rules, roles, and relationships (Saldaña & Omasta, 2018 ).

This human propensity for pattern-making follows us into QDA. From the vast array of interview transcripts, field notes, documents, and other forms of data, there is this instinctive, hardwired need to bring order to the collection—not just to reorganize it but to look for and construct patterns out of it. The discernment of patterns is one of the first steps in the data analytic process, and the methods described next are recommended ways to construct them.

Qualitative Data Analysis Strategy: To Code

To code in QDA is to assign a truncated, symbolic meaning to each datum for purposes of qualitative analysis—primarily patterning and categorizing. Coding is a heuristic—a method of discovery—for exploring the meanings of individual sections of data. These codes function as a way of patterning and classifying the data, and of later reorganizing them into emergent categories for further analysis. Different types of codes exist for different types of research genres and qualitative data analytic approaches, but this chapter will focus on only a few selected methods. First, a code can be defined as follows:

A code in qualitative data analysis is most often a word or short phrase that symbolically assigns a summative, salient, essence-capturing, and/or evocative attribute for a portion of language-based or visual data. The data can consist of interview transcripts, participant observation field notes, journals, documents, open-ended survey responses, drawings, artifacts, photographs, video, Internet sites, e-mail correspondence, academic and fictional literature, and so on. The portion of data coded … can range in magnitude from a single word to a full paragraph, an entire page of text or a stream of moving images.… Just as a title represents and captures a book or film or poem’s primary content and essence, so does a code represent and capture a datum’s primary content and essence. (Saldaña, 2016 , p. 4)

One helpful precoding task is to divide or parse long selections of field note or interview transcript data into shorter stanzas . Stanza division unitizes or “chunks” the corpus into more manageable paragraph-like units for coding assignments and analysis. The transcript sample that follows illustrates one possible way of inserting line breaks between self-standing passages of interview text for easier readability.
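
For researchers who keep transcripts as plain text files, this "chunking" step can also be approximated with a short script. The sketch below is only an illustration, assuming a transcript whose self-standing passages are already separated by blank lines; the file name is hypothetical.

```python
# A minimal sketch: split a plain-text transcript into stanza-like units
# wherever a blank line appears, so each unit can receive its own code(s).
from pathlib import Path

def parse_stanzas(transcript_text: str) -> list[str]:
    """Return the transcript as a list of stanzas (blank-line-delimited chunks)."""
    chunks = transcript_text.split("\n\n")
    return [chunk.strip() for chunk in chunks if chunk.strip()]

# Hypothetical file name, for illustration only.
text = Path("interview_01.txt").read_text(encoding="utf-8")
for number, stanza in enumerate(parse_stanzas(text), start=1):
    print(f"Stanza {number}: {stanza[:60]}...")
```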

Process Coding

As a first coding example, the following interview excerpt about an employed, single, lower middle-class adult male’s spending habits during a difficult economic period in the United States is coded in the right-hand margin in capital letters. The superscript numbers match the beginning of the datum unit with its corresponding code. This method is called process coding (Charmaz, 2014 ), and it uses gerunds (“-ing” words) exclusively to represent action suggested by the data. Processes can consist of observable human actions (e.g., BUYING BARGAINS), mental or internal processes (e.g., THINKING TWICE), and more conceptual ideas (e.g., APPRECIATING WHAT YOU’VE GOT). Notice that the interviewer’s (I) portions are not coded, just the participant’s (P). A code is applied each time the subtopic of the interview shifts—even within a stanza—and the same codes can (and should) be used more than once if the subtopics are similar. The central research question driving this qualitative study is, “In what ways are middle-class Americans influenced and affected by an economic recession?”

Different researchers analyzing this same piece of data may develop completely different codes, depending on their personal lenses, filters, and angles. The previous codes are only one person’s interpretation of what is happening in the data, not a definitive list. The process codes have transformed the raw data units into new symbolic representations for analysis. A listing of the codes applied to this interview transcript, in the order they appear, reads:

BUYING BARGAINS

QUESTIONING A PURCHASE

THINKING TWICE

STOCKING UP

REFUSING SACRIFICE

PRIORITIZING

FINDING ALTERNATIVES

LIVING CHEAPLY

NOTICING CHANGES

STAYING INFORMED

MAINTAINING HEALTH

PICKING UP THE TAB

APPRECIATING WHAT YOU’VE GOT

Coding the data is the first step in this approach to QDA, and categorization is just one of the next possible steps.
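
One hedged way to keep track of such code applications outside a word processor is a simple list of (datum, code) pairs. In the sketch below, the datum strings are invented placeholders standing in for transcript units, and only a few of the process codes listed above appear.

```python
# Illustrative only: each coded datum is stored as a (datum, process_code) pair.
# The datum strings are invented placeholders, not the actual transcript.
coded_data = [
    ("Two-for-one chicken at the grocery store", "BUYING BARGAINS"),
    ("Wondering whether a purchase is really needed", "QUESTIONING A PURCHASE"),
    ("Pausing before spending on non-essentials", "THINKING TWICE"),
    ("Hitting the all-you-can-eat buffet for dinner", "STOCKING UP"),
]

# List the codes in the order they first appear, as in the chapter's code listing.
codes_in_order = list(dict.fromkeys(code for _, code in coded_data))
print(codes_in_order)
```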

Qualitative Data Analysis Strategy: To Categorize

To categorize in QDA is to cluster similar or comparable codes into groups for pattern construction and further analysis. Humans categorize things in innumerable ways. Think of an average apartment or house’s layout. The rooms of a dwelling have been constructed or categorized by their builders and occupants according to function. A kitchen is designated as an area to store and prepare food and to store the cooking and dining materials, such as pots, pans, and utensils. A bedroom is designated for sleeping, a closet for clothing storage, a bathroom for bodily functions and hygiene, and so on. Each room is like a category in which related and relevant patterns of human action occur. There are exceptions now and then, such as eating breakfast in bed rather than in a dining area or living in a small studio apartment in which most possessions are contained within one large room (but nonetheless are most often organized and clustered into subcategories according to function and optimal use of space).

The point is that the patterns of social action we designate into categories during QDA are not perfectly bounded. Category construction is our best attempt to cluster the most seemingly alike things into the most seemingly appropriate groups. Categorizing is reorganizing and reordering the vast array of data from a study because it is from these fewer, broader, meaning-rich units that we can better grasp the particular features of each one and the categories’ possible interrelationships with one another.

One analytic strategy with a list of codes is to classify them into similar clusters. The same codes share the same category, but it is also possible that a single code can merit its own group if you feel it is unique enough. After the codes have been classified, a category label is applied to each grouping. Sometimes a code can also double as a category name if you feel it best summarizes the totality of the cluster. Like coding, categorizing is an interpretive act, because there can be different ways of separating and collecting codes that seem to belong together. The cut-and-paste functions of text editing software are most useful for exploring which codes share something in common.

Below is my categorization of the 15 codes generated from the interview transcript presented earlier. Like the gerunds for process codes, the categories have also been labeled as “-ing” words to connote action. And there was no particular reason why 15 codes resulted in three categories—there could have been less or even more, but this is how the array came together after my reflections on which codes seemed to belong together. The category labels are ways of answering why they belong together. For at-a-glance differentiation, I place codes in CAPITAL LETTERS and categories in upper- and lowercase Bold Font :

Category 1: Thinking Strategically

Category 2: Spending Strategically

Category 3: Living Strategically

Notice that the three category labels share a common word: strategically . Where did this word come from? It came from analytic reflection on the original data, the codes, and the process of categorizing the codes and generating their category labels. It was the analyst’s choice based on the interpretation of what primary action was happening. Your categories generated from your coded data do not need to share a common word or phrase, but I find that this technique, when appropriate, helps build a sense of unity to the initial analytic scheme.
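
If you prefer a plain data structure to cut-and-paste, the clustering step can be mirrored as a mapping from category labels to code lists. The partial grouping below is drawn only from the code-to-category pairings named in the analytic memo presented later in this chapter; it is an illustration, not a complete scheme.

```python
# A hedged sketch of categorization as a mapping from category labels to codes.
# Only pairings explicitly named in the later analytic memo are shown; the
# remaining codes would be sorted into categories by the analyst.
categories: dict[str, list[str]] = {
    "Thinking Strategically": ["THINKING TWICE", "QUESTIONING A PURCHASE"],
    "Spending Strategically": [],  # assignments left open in this illustration
    "Living Strategically": ["PRIORITIZING", "REFUSING SACRIFICE", "FINDING ALTERNATIVES"],
}

for category, codes in categories.items():
    print(f"{category}: {', '.join(codes) if codes else '(to be clustered)'}")
```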

The three categories— Thinking Strategically, Spending Strategically , and Living Strategically —are then reflected on for how they might interact and interplay. This is where the next major facet of data analysis, analytic memos, enters the scheme. But a necessary section on the basic principles of interrelationship and analytic reasoning must precede that discussion.

Qualitative Data Analysis Strategy: To Interrelate

To interrelate in QDA is to propose connections within, between, and among the constituent elements of analyzed data. One task of QDA is to explore the ways our patterns and categories interact and interplay. I use these terms to suggest the qualitative equivalent of statistical correlation, but interaction and interplay are much more than a simple relationship. They imply interrelationship . Interaction refers to reverberative connections—for example, how one or more categories might influence and affect the others, how categories operate concurrently, or whether there is some kind of domino effect to them. Interplay refers to the structural and processual nature of categories—for example, whether some type of sequential order, hierarchy, or taxonomy exists; whether any overlaps occur; whether there is superordinate and subordinate arrangement; and what types of organizational frameworks or networks might exist among them. The positivist construct of cause and effect becomes influences and affects in QDA.

There can even be patterns of patterns and categories of categories if your mind thinks conceptually and abstractly enough. Our minds can intricately connect multiple phenomena, but only if the data and their analyses support the constructions. We can speculate about interaction and interplay all we want, but it is only through a more systematic investigation of the data—in other words, good thinking—that we can plausibly establish any possible interrelationships.

Qualitative Data Analysis Strategy: To Reason

To reason in QDA is to think in ways that lead to summative findings, causal probabilities, and evaluative conclusions. Unlike quantitative research, with its statistical formulas and established hypothesis-testing protocols, qualitative research has no standardized methods of data analysis. Rest assured, there are recommended guidelines from the field’s scholars and a legacy of analysis strategies from which to draw. But the primary heuristics (or methods of discovery) you apply during a study are retroductive, inductive, substructive, abductive , and deductive reasoning.

Retroduction is historic reconstruction, working backward to figure out how the current conditions came to exist. Induction is what we experientially explore and infer to be transferable from the particular to the general, based on an examination of the evidence and an accumulation of knowledge. Substruction takes things apart to more carefully examine the constituent elements of the whole. Abduction is surmising from a range of possibilities that which is most likely, those explanatory hunches of plausibility based on clues. Deduction is what we generally draw and conclude from established facts and evidence.

It is not always necessary to know the names of these five ways of reasoning as you proceed through analysis. In fact, you will more than likely reverberate quickly from one to another depending on the task at hand. But what is important to remember about reasoning is:

to examine the evidence carefully and make reasonable inferences;

to base your conclusions primarily on the participants’ experiences, not just your own;

not to take the obvious for granted, because sometimes the expected will not happen;

to accept that your hunches can sometimes be right and, at other times, quite wrong; and

to logically yet imaginatively think about what is going on and how it all comes together.

Futurists and inventors propose three questions when they think about creating new visions for the world: What is possible (induction)? What is plausible (abduction)? What is preferable (deduction)? These same three questions might be posed as you proceed through QDA and particularly through analytic memo writing, which is substructive and retroductive reflection on your analytic work thus far.

Qualitative Data Analysis Strategy: To Memo

To memo in QDA is to reflect in writing on the nuances, inferences, meanings, and transfer of coded and categorized data plus your analytic processes. Like field note writing, perspectives vary among practitioners as to the methods for documenting the researcher’s analytic insights and subjective experiences. Some advise that such reflections should be included in field notes as relevant to the data. Others advise that a separate researcher’s journal should be maintained for recording these impressions. And still others advise that these thoughts be documented as separate analytic memos. I prescribe the latter as a method because it is generated by and directly connected to the data themselves.

An analytic memo is a “think piece” of reflective free writing, a narrative that sets in words your interpretations of the data. Coding and categorizing are heuristics to detect some of the possible patterns and interrelationships at work within the corpus, and an analytic memo further articulates your retroductive, inductive, substructive, abductive, and deductive thinking processes on what things may mean. Though the metaphor is a bit flawed and limiting, think of codes and their consequent categories as separate jigsaw puzzle pieces and their integration into an analytic memo as the trial assembly of the complete picture.

What follows is an example of an analytic memo based on the earlier process coded and categorized interview transcript. It is intended not as the final write-up for a publication, but as an open-ended reflection on the phenomena and processes suggested by the data and their analysis thus far. As the study proceeds, however, initial and substantive analytic memos can be revisited and revised for eventual integration into the final report. Note how the memo is dated and given a title for future and further categorization, how participant quotes are occasionally included for evidentiary support, and how the category names are bolded and the codes kept in capital letters to show how they integrate or weave into the thinking:

April 14, 2017
EMERGENT CATEGORIES: A STRATEGIC AMALGAM

There’s a popular saying: “Smart is the new rich.” This participant is Thinking Strategically about his spending through such tactics as THINKING TWICE and QUESTIONING A PURCHASE before he decides to invest in a product. There’s a heightened awareness of both immediate trends and forthcoming economic bad news that positively affects his Spending Strategically. However, he seems unaware that there are even more ways of LIVING CHEAPLY by FINDING ALTERNATIVES. He dines at all-you-can-eat restaurants as a way of STOCKING UP on meals, but doesn’t state that he could bring lunch from home to work, possibly saving even more money. One of his “bad habits” is cigarettes, which he refuses to give up; but he doesn’t seem to realize that by quitting smoking he could save even more money, not to mention possible health care costs. He balks at the idea of paying $2.00 for a soft drink, but doesn’t mind paying $6.00–$7.00 for a pack of cigarettes. Penny-wise and pound-foolish. Addictions skew priorities. Living Strategically, for this participant during “scary times,” appears to be a combination of PRIORITIZING those things which cannot be helped, such as pet care and personal dental care; REFUSING SACRIFICE for maintaining personal creature-comforts; and FINDING ALTERNATIVES to high costs and excessive spending. Living Strategically is an amalgam of thinking and action-oriented strategies.

There are several recommended topics for analytic memo writing throughout the qualitative study. Memos are opportunities to reflect on and write about:

A descriptive summary of the data;

How the researcher personally relates to the participants and/or the phenomenon;

The participants’ actions, reactions, and interactions;

The participants’ routines, rituals, rules, roles, and relationships;

What is surprising, intriguing, or disturbing (Sunstein & Chiseri-Strater, 2012 , p. 115);

Code choices and their operational definitions;

Emergent patterns, categories, themes, concepts, assertions, and propositions;

The possible networks and processes (links, connections, overlaps, flows) among the codes, patterns, categories, themes, concepts, assertions, and propositions;

An emergent or related existent theory;

Any problems with the study;

Any personal or ethical dilemmas with the study;

Future directions for the study;

The analytic memos generated thus far (i.e., metamemos);

Tentative answers to the study’s research questions; and

The final report for the study. (adapted from Saldaña & Omasta, 2018 , p. 54)

Since writing is analysis, analytic memos expand on the inferential meanings of the truncated codes, categories, and patterns as a transitional stage into a more coherent narrative with hopefully rich social insight.

Qualitative Data Analysis Strategy: To Code—A Different Way

The first example of coding illustrated process coding, a way of exploring general social action among humans. But sometimes a researcher works with an individual case study in which the language is unique or with someone the researcher wishes to honor by maintaining the authenticity of his or her speech in the analysis. These reasons suggest that a more participant-centered form of coding may be more appropriate.

In Vivo Coding

A second frequently applied method of coding is called in vivo coding. The root meaning of in vivo is “in that which is alive”; it refers to a code based on the actual language used by the participant (Strauss, 1987 ). The words or phrases in the data record you select as codes are those that seem to stand out as significant or summative of what is being said.

Using the same transcript of the male participant living in difficult economic times, in vivo codes are listed in the right-hand column. I recommend that in vivo codes be placed in quotation marks as a way of designating that the code is extracted directly from the data record. Note that instead of 15 codes generated from process coding, the total number of in vivo codes is 30. This is not to suggest that there should be specific numbers or ranges of codes used for particular methods. In vivo codes, however, tend to be applied more frequently to data. Again, the interviewer’s questions and prompts are not coded, just the participant’s responses:

The 30 in vivo codes are then extracted from the transcript and could be listed in the order they appear, but this time they are placed in alphabetical order as a heuristic to prepare them for analytic action and reflection:

“ALL-YOU-CAN-EAT”

“ANOTHER DING IN MY WALLET”

“BAD HABITS”

“CHEAP AND FILLING”

“COUPLE OF THOUSAND”

“DON’T REALLY NEED”

“HAVEN’T CHANGED MY HABITS”

“HIGH MAINTENANCE”

“INSURANCE IS JUST WORTHLESS”

“IT ALL ADDS UP”

“LIVED KIND OF CHEAP”

“NOT A BIG SPENDER”

“NOT AS BAD OFF”

“NOT PUTTING AS MUCH INTO SAVINGS”

“PICK UP THE TAB”

“SCARY TIMES”

“SKYROCKETED”

“SPENDING MORE”

“THE LITTLE THINGS”

“THINK TWICE”

“TWO-FOR-ONE”
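
The alphabetizing heuristic, and the kind of "eyeballing" for recurring words described in the memo that follows, can be approximated with a short script. The list below reuses only a few of the in vivo codes shown above, purely for illustration.

```python
# Illustration: alphabetize in vivo codes and tally recurring words across them,
# approximating the "eyeballing" heuristic described in the surrounding text.
from collections import Counter

in_vivo_codes = [
    '"SCARY TIMES"', '"LIVED KIND OF CHEAP"', '"CHEAP AND FILLING"',
    '"ANOTHER DING IN MY WALLET"', '"IT ALL ADDS UP"', '"THINK TWICE"',
]

for code in sorted(in_vivo_codes):
    print(code)

word_counts = Counter(word.strip('"') for code in in_vivo_codes for word in code.split())
print(word_counts.most_common(5))  # in this small sample, "CHEAP" surfaces most often
```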

Even though no systematic categorization has been conducted with the codes thus far, an analytic memo of first impressions can still be composed:

March 19, 2017
CODE CHOICES: THE EVERYDAY LANGUAGE OF ECONOMICS

After eyeballing the in vivo codes list, I noticed that variants of “CHEAP” appear most often. I recall a running joke between me and a friend of mine when we were shopping for sales. We’d say, “We’re not ‘cheap,’ we’re frugal.” There’s no formal economic or business language in this transcript—no terms such as “recession” or “downsizing”—just the everyday language of one person trying to cope during “SCARY TIMES” with “ANOTHER DING IN MY WALLET.” The participant notes that he’s always “LIVED KIND OF CHEAP” and is “NOT A BIG SPENDER” and, due to his employment, “NOT AS BAD OFF” as others in the country. Yet even with his middle class status, he’s still feeling the monetary pinch, dining at inexpensive “ALL-YOU-CAN-EAT” restaurants and worried about the rising price of peanut butter, observing that he’s “NOT PUTTING AS MUCH INTO SAVINGS” as he used to. Of all the codes, “ANOTHER DING IN MY WALLET” stands out to me, particularly because on the audio recording he sounded bitter and frustrated. It seems that he’s so concerned about “THE LITTLE THINGS” because of high veterinary and dental charges. The only way to cope with a “COUPLE OF THOUSAND” dollars worth of medical expenses is to find ways of trimming the excess in everyday facets of living: “IT ALL ADDS UP.”

Like process coding, in vivo codes could be clustered into similar categories, but another simple data analytic strategy is also possible.

Qualitative Data Analysis Strategy: To Outline

To outline in QDA is to hierarchically, processually, and/or temporally assemble such things as codes, categories, themes, assertions, propositions, and concepts into a coherent, text-based display. Traditional outlining formats and content provide not only templates for writing a report but also templates for analytic organization. This principle can be found in several computer-assisted qualitative data analysis software (CAQDAS) programs through their use of such functions as “hierarchies,” “trees,” and “nodes,” for example. Basic outlining is simply a way of arranging primary, secondary, and subsecondary items into a patterned display. For example, an organized listing of things in a home might consist of the following:

Large appliances

  Refrigerator

  Stove-top oven

  Microwave oven

Small appliances

  Coffee maker

Dining room

In QDA, outlining may include descriptive nouns or topics but, depending on the study, it may also involve processes or phenomena in extended passages, such as in vivo codes or themes.

The complexity of what we learn in the field can be overwhelming, and outlining is a way of organizing and ordering that complexity so that it does not become complicated. The cut-and-paste and tab functions of a text editing page enable you to arrange and rearrange the salient items from your preliminary coded analytic work into a more streamlined flow. By no means do I suggest that the intricate messiness of life can always be organized into neatly formatted arrangements, but outlining is an analytic act that stimulates deep reflection on both the interconnectedness and the interrelationships of what we study. As an example, here are the 30 in vivo codes generated from the initial transcript analysis, arranged in such a way as to construct five major categories:

Now that the codes have been rearranged into an outline format, an analytic memo is composed to expand on the rationale and constructed meanings in progress:

March 19, 2017
NETWORKS: EMERGENT CATEGORIES

The five major categories I constructed from the in vivo codes are: “SCARY TIMES,” “PRIORITY,” “ANOTHER DING IN MY WALLET,” “THE LITTLE THINGS,” and “LIVED KIND OF CHEAP.” One of the things that hit me today was that the reason he may be pinching pennies on smaller purchases is that he cannot control the larger ones he has to deal with. Perhaps the only way we can cope with or seem to have some sense of agency over major expenses is to cut back on the smaller ones that we can control. $1,000 for a dental bill? Skip lunch for a few days a week. Insulin medication to buy for a pet? Don’t buy a soft drink from a vending machine. Using this reasoning, let me try to interrelate and weave the categories together as they relate to this particular participant: During these scary economic times, he prioritizes his spending because there seems to be just one ding after another to his wallet. A general lifestyle of living cheaply and keeping an eye out for how to save money on the little things compensates for those major expenses beyond his control.
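
For analysts comfortable with plain data structures, the five categories named in this memo can also be held as a nested outline in code. The sub-placements below are hypothetical illustrations of the format only, not the author's actual outline.

```python
# Hypothetical nested outline: top-level in vivo categories with illustrative
# sub-codes. The placements are for demonstration of the format only.
outline = {
    '"SCARY TIMES"': ['"SPENDING MORE"', '"NOT PUTTING AS MUCH INTO SAVINGS"'],
    '"PRIORITY"': ['"HIGH MAINTENANCE"', '"PICK UP THE TAB"'],
    '"ANOTHER DING IN MY WALLET"': ['"COUPLE OF THOUSAND"', '"INSURANCE IS JUST WORTHLESS"'],
    '"THE LITTLE THINGS"': ['"IT ALL ADDS UP"', '"TWO-FOR-ONE"'],
    '"LIVED KIND OF CHEAP"': ['"CHEAP AND FILLING"', '"NOT A BIG SPENDER"'],
}

for category, codes in outline.items():
    print(category)
    for code in codes:
        print(f"  {code}")
```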

Qualitative Data Analysis Strategy: To Code—In Even More Ways

The process and in vivo coding examples thus far have demonstrated only two specific methods of 33 documented approaches (Saldaña, 2016 ). Which one(s) you choose for your analysis depends on such factors as your conceptual framework, the genre of qualitative research for your project, the types of data you collect, and so on. The following sections present four additional approaches available for coding qualitative data that you may find useful as starting points.

Descriptive Coding

Descriptive codes are primarily nouns that simply summarize the topic of a datum. This coding approach is particularly useful when you have different types of data gathered for one study, such as interview transcripts, field notes, open-ended survey responses, documents, and visual materials such as photographs. Descriptive codes not only help categorize but also index the data corpus’s basic contents for further analytic work. An example of an interview portion coded descriptively, taken from the participant living in tough economic times, follows to illustrate how the same data can be coded in multiple ways:

For initial analysis, descriptive codes are clustered into similar categories to detect such patterns as frequency (i.e., categories with the largest number of codes) and interrelationship (i.e., categories that seem to connect in some way). Keep in mind that descriptive coding should be used sparingly with interview transcript data because other coding methods will reveal richer participant dynamics.

Values Coding

Values coding identifies the values, attitudes, and beliefs of a participant, as shared by the individual and/or interpreted by the analyst. This coding method infers the “heart and mind” of an individual or group’s worldview as to what is important, perceived as true, maintained as opinion, and felt strongly. The three constructs are coded separately but are part of a complex interconnected system.

Briefly, a value (V) is what we attribute as important, be it a person, thing, or idea. An attitude (A) is the evaluative way we think and feel about ourselves, others, things, or ideas. A belief (B) is what we think and feel as true or necessary, formed from our “personal knowledge, experiences, opinions, prejudices, morals, and other interpretive perceptions of the social world” (Saldaña, 2016 , p. 132). Values coding explores intrapersonal, interpersonal, and cultural constructs, or ethos . It is an admittedly slippery task to code this way because it is sometimes difficult to discern what is a value, attitude, or belief since they are intricately interrelated. But the depth you can potentially obtain is rich. An example of values coding follows:

For analysis, categorize the codes for each of the three different constructs together (i.e., all values in one group, attitudes in a second group, and beliefs in a third group). Analytic memo writing about the patterns and possible interrelationships may reveal a more detailed and intricate worldview of the participant.

Dramaturgical Coding

Dramaturgical coding perceives life as performance and its participants as characters in a social drama. Codes are assigned to the data (i.e., a “play script”) that analyze the characters in action, reaction, and interaction. Dramaturgical coding of participants examines their objectives (OBJ) or wants, needs, and motives; the conflicts (CON) or obstacles they face as they try to achieve their objectives; the tactics (TAC) or strategies they employ to reach their objectives; their attitudes (ATT) toward others and their given circumstances; the particular emotions (EMO) they experience throughout; and their subtexts (SUB), or underlying and unspoken thoughts. The following is an example of dramaturgically coded data:

Not included in this particular interview excerpt are the emotions the participant may have experienced or talked about. His later line, “that’s another ding in my wallet,” would have been coded EMO: BITTER. A reader may not have inferred that specific emotion from seeing the line in print. But the interviewer, present during the event and listening carefully to the audio recording during transcription, noted that feeling in his tone of voice.

For analysis, group similar codes together (e.g., all objectives in one group, all conflicts in another group, all tactics in a third group) or string together chains of how participants deal with their circumstances to overcome their obstacles through tactics:

OBJ: SAVING MEAL MONEY → TAC: SKIPPING MEALS + COUPONS

Dramaturgical coding is particularly useful as preliminary work for narrative inquiry story development or arts-based research representations such as performance ethnography. The method explores how the individuals or groups manage problem solving in their daily lives.
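
One hedged way to manage these prefixed codes outside a word processor is to group them by their dramaturgical type. The code strings below follow the format described above; most are taken from this chapter's examples, and the CON entry is a hypothetical addition for illustration.

```python
# Group dramaturgical codes by their prefix (OBJ, CON, TAC, ATT, EMO, SUB).
from collections import defaultdict

dramaturgical_codes = [
    "OBJ: SAVING MEAL MONEY",
    "CON: MAJOR MEDICAL EXPENSES",  # hypothetical example, not from the transcript
    "TAC: SKIPPING MEALS + COUPONS",
    "EMO: BITTER",
]

grouped = defaultdict(list)
for code in dramaturgical_codes:
    prefix, label = code.split(": ", 1)
    grouped[prefix].append(label)

for prefix, labels in grouped.items():
    print(prefix, "->", labels)
```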

Versus Coding

Versus (VS) coding identifies the conflicts, struggles, and power issues observed in social action, reaction, and interaction as an X VS Y code, such as MEN VS WOMEN, CONSERVATIVES VS LIBERALS, FAITH VS LOGIC, and so on. Conflicts are rarely this dichotomous; they are typically nuanced and much more complex. But humans tend to perceive these struggles with an US VS THEM mindset. The codes can range from the observable to the conceptual and can be applied to data that show humans in tension with others, themselves, or ideologies.

What follows are examples of versus codes applied to the case study participant’s descriptions of his major medical expenses:

As an initial analytic tactic, group the versus codes into one of three categories: the Stakeholders , their Perceptions and/or Actions , and the Issues at stake. Examine how the three interrelate and identify the central ideological conflict at work as an X VS Y category. Analytic memos and the final write-up can detail the nuances of the issues.

Remember that what has been profiled in this section is a broad brushstroke description of just a few basic coding processes, several of which can be compatibly mixed and matched within a single analysis (see Saldaña’s 2016   The Coding Manual for Qualitative Researchers for a complete discussion). Certainly with additional data, more in-depth analysis can occur, but coding is only one approach to extracting and constructing preliminary meanings from the data corpus. What now follows are additional methods for qualitative analysis.

Qualitative Data Analysis Strategy: To Theme

To theme in QDA is to construct summative, phenomenological meanings from data through extended passages of text. Unlike codes, which are most often single words or short phrases that symbolically represent a datum, themes are extended phrases or sentences that summarize the manifest (apparent) and latent (underlying) meanings of data (Auerbach & Silverstein, 2003 ; Boyatzis, 1998 ). Themes, intended to represent the essences and essentials of humans’ lived experiences, can also be categorized or listed in superordinate and subordinate outline formats as an analytic tactic.

Below is the interview transcript example used in the previous coding sections. (Hopefully you are not too fatigued at this point with the transcript, but it is important to know how inquiry with the same data set can be approached in several different ways.) During the investigation of the ways middle-class Americans are influenced and affected by an economic recession, the researcher noticed that participants’ stories exhibited facets of what he labeled economic intelligence, or EI (based on the previously developed theories of Howard Gardner’s multiple intelligences and Daniel Goleman’s emotional intelligence). Notice how theming interprets what is happening through the use of two distinct phrases—ECONOMIC INTELLIGENCE IS (i.e., manifest or apparent meanings) and ECONOMIC INTELLIGENCE MEANS (i.e., latent or underlying meanings):

Unlike the 15 process codes and 30 in vivo codes in the previous examples, there are now 14 themes to work with. They could be listed in the order they appear, but one initial heuristic is to group them separately by “is” and “means” statements to detect any possible patterns (discussed later):

EI IS TAKING ADVANTAGE OF UNEXPECTED OPPORTUNITY

EI IS BUYING CHEAP

EI IS SAVING A FEW DOLLARS NOW AND THEN

EI IS SETTING PRIORITIES

EI IS FINDING CHEAPER FORMS OF ENTERTAINMENT

EI IS NOTICING PERSONAL AND NATIONAL ECONOMIC TRENDS

EI IS TAKING CARE OF ONE’S OWN HEALTH

EI MEANS THINKING BEFORE YOU ACT

EI MEANS SACRIFICE

EI MEANS KNOWING YOUR FLAWS

EI MEANS LIVING AN INEXPENSIVE LIFESTYLE

EI MEANS YOU CANNOT CONTROL EVERYTHING

EI MEANS KNOWING YOUR LUCK
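
The is/means sorting shown above can also be reproduced mechanically. The sketch below simply partitions theme statements by the connective they contain; only a few of the themes are listed, for brevity.

```python
# Partition theme statements into "is" and "means" groups, mirroring the
# initial grouping heuristic described above.
themes = [
    "EI IS BUYING CHEAP",
    "EI IS SETTING PRIORITIES",
    "EI MEANS SACRIFICE",
    "EI MEANS LIVING AN INEXPENSIVE LIFESTYLE",
]

is_themes = [t for t in themes if t.startswith("EI IS ")]
means_themes = [t for t in themes if t.startswith("EI MEANS ")]

print("IS statements:", is_themes)
print("MEANS statements:", means_themes)
```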

There are several ways to categorize the themes as preparation for analytic memo writing. The first is to arrange them in outline format with superordinate and subordinate levels, based on how the themes seem to take organizational shape and structure. Simply cutting and pasting the themes in multiple arrangements on a text editing page eventually develops a sense of order to them. For example:

A second approach is to categorize the themes into similar clusters and to develop different category labels or theoretical constructs . A theoretical construct is an abstraction that transforms the central phenomenon’s themes into broader applications but can still use “is” and “means” as prompts to capture the bigger picture at work:

Theoretical Construct 1: EI Means Knowing the Unfortunate Present

Supporting Themes:

Theoretical Construct 2: EI Is Cultivating a Small Fortune

Theoretical Construct 3: EI Means a Fortunate Future

What follows is an analytic memo generated from the cut-and-paste arrangement of themes into “is” and “means” statements, into an outline, and into theoretical constructs:

March 19, 2017
EMERGENT THEMES: FORTUNE/FORTUNATELY/UNFORTUNATELY

I first reorganized the themes by listing them in two groups: “is” and “means.” The “is” statements seemed to contain positive actions and constructive strategies for economic intelligence. The “means” statements held primarily a sense of caution and restriction with a touch of negativity thrown in. The first outline, with two major themes, LIVING AN INEXPENSIVE LIFESTYLE and YOU CANNOT CONTROL EVERYTHING, also had this same tone. This reminded me of the old children’s picture book, Fortunately/Unfortunately, and the theme of “fortune” as a motif for the three theoretical constructs came to mind. Knowing the Unfortunate Present means knowing what’s (most) important and what’s (mostly) uncontrollable in one’s personal economic life. Cultivating a Small Fortune consists of those small money-saving actions that, over time, become part of one’s lifestyle. A Fortunate Future consists of heightened awareness of trends and opportunities at micro and macro levels, with the understanding that health matters can idiosyncratically affect one’s fortune. These three constructs comprise this particular individual’s EI—economic intelligence.

Again, keep in mind that the examples for coding and theming were from one small interview transcript excerpt. The number of codes and their categorization would increase, given a longer interview and/or multiple interviews to analyze. But the same basic principles apply: codes and themes relegated into patterned and categorized forms are heuristics—stimuli for good thinking through the analytic memo-writing process on how everything plausibly interrelates. Methodologists vary in the number of recommended final categories that result from analysis, ranging anywhere from three to seven, with traditional grounded theorists prescribing one central or core category from coded work.

Qualitative Data Analysis Strategy: To Assert

To assert in QDA is to put forward statements that summarize particular fieldwork and analytic observations that the researcher believes credibly represent and transcend the experiences. Educational anthropologist Frederick Erickson ( 1986 ) wrote a significant and influential chapter on qualitative methods that outlined heuristics for assertion development . Assertions are declarative statements of summative synthesis, supported by confirming evidence from the data and revised when disconfirming evidence or discrepant cases require modification of the assertions. These summative statements are generated from an interpretive review of the data corpus and then supported and illustrated through narrative vignettes—reconstructed stories from field notes, interview transcripts, or other data sources that provide a vivid profile as part of the evidentiary warrant.

Coding or theming data can certainly precede assertion development as a way of gaining intimate familiarity with the data, but Erickson’s ( 1986 ) methods are an admittedly more intuitive yet still systematic heuristic for analysis. Erickson promotes analytic induction and exploration of and inferences about the data, based on an examination of the evidence and an accumulation of knowledge. The goal is not to look for “proof” to support the assertions, but to look for plausibility of inference-laden observations about the local and particular social world under investigation.

Assertion development is the writing of general statements, plus subordinate yet related ones called subassertions and a major statement called a key assertion that represents the totality of the data. One also looks for key linkages between them, meaning that the key assertion links to its related assertions, which then link to their respective subassertions. Subassertions can include particulars about any discrepant related cases or specify components of their parent assertions.

Excerpts from the interview transcript of our case study will be used to illustrate assertion development at work. By now, you should be quite familiar with the contents, so I will proceed directly to the analytic example. First, there is a series of thematically related statements the participant makes:

“Buy one package of chicken, get the second one free. Now that was a bargain. And I got some.”

“With Sweet Tomatoes I get those coupons for a few bucks off for lunch, so that really helps.”

“I don’t go to movies anymore. I rent DVDs from Netflix or Redbox or watch movies online—so much cheaper than paying over ten or twelve bucks for a movie ticket.”

Assertions can be categorized into low-level and high-level inferences . Low-level inferences address and summarize what is happening within the particulars of the case or field site—the micro . High-level inferences extend beyond the particulars to speculate on what it means in the more general social scheme of things—the meso or macro . A reasonable low-level assertion about the three statements above collectively might read, The participant finds several small ways to save money during a difficult economic period . A high-level inference that transcends the case to the meso level might read, Selected businesses provide alternatives and opportunities to buy products and services at reduced rates during a recession to maintain consumer spending.

Assertions are instantiated (i.e., supported) by concrete instances of action or participant testimony, whose patterns lead to more general description outside the specific field site. The author’s interpretive commentary can be interspersed throughout the report, but the assertions should be supported with the evidentiary warrant . A few assertions and subassertions based on the case interview transcript might read as follows (and notice how high-level assertions serve as the paragraphs’ topic sentences):

Selected businesses provide alternatives and opportunities to buy products and services at reduced rates during a recession to maintain consumer spending. Restaurants, for example, need to find ways during difficult economic periods when potential customers may be opting to eat inexpensively at home rather than spending more money by dining out. Special offers can motivate cash-strapped clientele to patronize restaurants more frequently. An adult male dealing with such major expenses as underinsured dental care offers: “With Sweet Tomatoes I get those coupons for a few bucks off for lunch, so that really helps.” The film and video industries also seem to be suffering from a double-whammy during the current recession: less consumer spending on higher-priced entertainment, resulting in a reduced rate of movie theater attendance (recently 39 percent of the American population, according to a CNN report); coupled with a media technology and business revolution that provides consumers less costly alternatives through video rentals and Internet viewing: “I don’t go to movies anymore. I rent DVDs from Netflix or Redbox or watch movies online—so much cheaper than paying over ten or twelve bucks for a movie ticket.”

To clarify terminology, any assertion that has an if–then or predictive structure to it is called a proposition since it proposes a conditional event. For example, this assertion is also a proposition: “Special offers can motivate cash-strapped clientele to patronize restaurants more frequently.” Propositions are the building blocks of hypothesis testing in the field and for later theory construction. Research not only documents human action but also can sometimes formulate statements that predict it. This provides a transferable and generalizable quality in our findings to other comparable settings and contexts. And to clarify terminology further, all propositions are assertions, but not all assertions are propositions.

Particularizability —the search for specific and unique dimensions of action at a site and/or the specific and unique perspectives of an individual participant—is not intended to filter out trivial excess but to magnify the salient characteristics of local meaning. Although generalizable knowledge is difficult to formulate in qualitative inquiry since each naturalistic setting will contain its own unique set of social and cultural conditions, there will be some aspects of social action that are plausibly universal or “generic” across settings and perhaps even across time.

To work toward this, Erickson advocates that the interpretive researcher look for “concrete universals” by studying actions at a particular site in detail and then comparing those actions to actions at other sites that have also been studied in detail. The exhibit or display of these generalizable features is to provide a synoptic representation, or a view of the whole. What the researcher attempts to uncover is what is both particular and general at the site of interest, preferably from the perspective of the participants. It is from the detailed analysis of actions at a specific site that these universals can be concretely discerned, rather than abstractly constructed as in grounded theory.

In sum, assertion development is a qualitative data analytic strategy that relies on the researcher’s intense review of interview transcripts, field notes, documents, and other data to inductively formulate, with reasonable certainty, composite statements that credibly summarize and interpret participant actions and meanings and their possible representation of and transfer into broader social contexts and issues.

Qualitative Data Analysis Strategy: To Display

To display in QDA is to visually present the processes and dynamics of human or conceptual action represented in the data. Qualitative researchers use not only language but also illustrations to both analyze and display the phenomena and processes at work in the data. Tables, charts, matrices, flow diagrams, and other models and graphics help both you and your readers cognitively and conceptually grasp the essence and essentials of your findings. As you have seen thus far, even simple outlining of codes, categories, and themes is one visual tactic for organizing the scope of the data. Rich text, font, and format features such as italicizing, bolding, capitalizing, indenting, and bullet pointing provide simple emphasis to selected words and phrases within the longer narrative.

Think display was a phrase coined by methodologists Miles and Huberman ( 1994 ) to encourage the researcher to think visually as data were collected and analyzed. The magnitude of text can be essentialized into graphics for at-a-glance review. Bins in various shapes and lines of various thicknesses, along with arrows suggesting pathways and direction, render the study a portrait of action. Bins can include the names of codes, categories, concepts, processes, key participants, and/or groups.

As a simple example, Figure 29.1 illustrates the three categories’ interrelationship derived from process coding. It displays what could be the apex of this interaction, LIVING STRATEGICALLY, and its connections to THINKING STRATEGICALLY, which influences and affects SPENDING STRATEGICALLY.

Figure 29.1. Three categories’ interrelationship derived from process coding.

Figure 29.2 represents a slightly more complex (if not playful) model, based on the five major in vivo codes/categories generated from analysis. The graphic is used as a way of initially exploring the interrelationship and flow from one category to another. The use of different font styles, font sizes, and line and arrow thicknesses is intended to suggest the visual qualities of the participant’s language and his dilemmas—a way of heightening in vivo coding even further.

Figure 29.2. In vivo categories in rich text display.

Accompanying graphics are not always necessary for a qualitative report. They can be very helpful for the researcher during the analytic stage as a heuristic for exploring how major ideas interrelate, but illustrations are generally included in published work when they will help supplement and clarify complex processes for readers. Photographs of the field setting or the participants (and only with their written permission) also provide evidentiary reality to the write-up and help your readers get a sense of being there.

Qualitative Data Analysis Strategy: To Narrate

To narrate in QDA is to create an evocative literary representation and presentation of the data in the form of creative nonfiction. All research reports are stories of one kind or another. But there is yet another approach to QDA that intentionally documents the research experience as story, in its traditional literary sense. Narrative inquiry serves to plot and story-line the participant’s experiences into what might be initially perceived as a fictional short story or novel. But the story is carefully crafted and creatively written to provide readers with an almost omniscient perspective about the participants’ worldview. The transformation of the corpus from database to creative nonfiction ranges from systematic transcript analysis to open-ended literary composition. The narrative, however, should be solidly grounded in and emerge from the data as a plausible rendering of social life.

The following is a narrative vignette based on interview transcript selections from the participant living through tough economic times:

Jack stood in front of the soft drink vending machine at work and looked almost worriedly at the selections. With both hands in his pants pockets, his fingers jingled the few coins he had inside them as he contemplated whether he could afford the purchase. Two dollars for a twenty-ounce bottle of Diet Coke. Two dollars. “I can practically get a two-liter bottle for that same price at the grocery store,” he thought. Then Jack remembered the upcoming dental surgery he needed—that would cost one thousand dollars—and the bottle of insulin and syringes he needed to buy for his diabetic, high maintenance cat—almost two hundred dollars. He sighed, took his hands out of his pockets, and walked away from the vending machine. He was skipping lunch that day anyway so he could stock up on dinner later at the cheap-but-filling all-you-can-eat Chinese buffet. He could get his Diet Coke there.

Narrative inquiry representations, like literature, vary in tone, style, and point of view. The common goal, however, is to create an evocative portrait of participants through the aesthetic power of literary form. A story does not always have to have a moral explicitly stated by its author. The reader reflects on personal meanings derived from the piece and how the specific tale relates to one’s self and the social world.

Qualitative Data Analysis Strategy: To Poeticize

To poeticize in QDA is to create an evocative literary representation and presentation of the data in poetic form. One approach to analyzing or documenting analytic findings is to strategically truncate interview transcripts, field notes, and other pertinent data into poetic structures. Like coding, poetic constructions capture the essence and essentials of data in a creative, evocative way. The elegance of the format attests to the power of carefully chosen language to represent and convey complex human experience.

In vivo codes (codes based on the actual words used by participants themselves) can provide imagery, symbols, and metaphors for rich category, theme, concept, and assertion development, in addition to evocative content for arts-based interpretations of the data. Poetic inquiry takes note of what words and phrases seem to stand out from the data corpus as rich material for reinterpretation. Using some of the participant’s own language from the interview transcript illustrated previously, a poetic reconstruction or “found poetry” might read as follows:

Scary Times

Scary times …
spending more
  (another ding in my wallet)
a couple of thousand
  (another ding in my wallet)
insurance is just worthless
  (another ding in my wallet)
pick up the tab
  (another ding in my wallet)
not putting as much into savings
  (another ding in my wallet)
It all adds up.

Think twice:
  don’t really need
  skip

Think twice, think cheap:
  coupons
  bargains
  two-for-one
  free

Think twice, think cheaper:
  stock up
  all-you-can-eat
  (cheap—and filling)

It all adds up.

Anna Deavere Smith, a verbatim theatre performer, attests that people speak in forms of “organic poetry” in everyday life. Thus, in vivo codes can provide core material for poetic representation and presentation of lived experiences, potentially transforming the routine and mundane into the epic. Some researchers also find the genre of poetry to be the most effective way to compose original work that reflects their own fieldwork experiences and autoethnographic stories.

Qualitative Data Analysis Strategy: To Compute

To compute in QDA is to employ specialized software programs for qualitative data management and analysis. The acronym for computer-assisted qualitative data analysis software is CAQDAS. There are diverse opinions among practitioners in the field about the utility of such specialized programs for qualitative data management and analysis. The software, unlike statistical computation, does not actually analyze data for you at higher conceptual levels. These CAQDAS software packages serve primarily as a repository for your data (both textual and visual) that enables you to code them, and they can perform such functions as calculating the number of times a particular word or phrase appears in the data corpus (a particularly useful function for content analysis) and can display selected facets after coding, such as possible interrelationships. Basic software such as Microsoft Word and Excel provides utilities that can store and, with some preformatting and strategic entry, organize qualitative data to enable the researcher’s analytic review. The following Internet addresses are listed to help in exploring selected CAQDAS packages and obtaining demonstration/trial software; video tutorials are available on the companies’ websites and on YouTube:

ATLAS.ti: http://www.atlasti.com

Dedoose: http://www.dedoose.com

HyperRESEARCH: http://www.researchware.com

MAXQDA: http://www.maxqda.com

NVivo: http://www.qsrinternational.com

QDA Miner: http://www.provalisresearch.com

Quirkos: http://www.quirkos.com

Transana: http://www.transana.com

V-Note: http://www.v-note.org

Some qualitative researchers attest that the software is indispensable for qualitative data management, especially for large-scale studies. Others feel that the learning curve of most CAQDAS programs is too lengthy to be of pragmatic value, especially for small-scale studies. From my own experience, if you have an aptitude for picking up quickly on the scripts and syntax of software programs, explore one or more of the packages listed. If you are a novice to qualitative research, though, I recommend working manually or “by hand” for your first project so you can focus exclusively on the data and not on the software.
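
As a modest example of the word- and phrase-counting that such packages automate, the following sketch tallies how often chosen phrases occur in a plain-text corpus. The file name and phrase list are hypothetical, and the code is an illustration rather than a substitute for a full CAQDAS workflow.

```python
# A minimal, hand-rolled alternative to a CAQDAS frequency report: count how
# often selected words or phrases appear in a plain-text data corpus.
from pathlib import Path

# Hypothetical inputs, for illustration only.
corpus = Path("interview_transcripts.txt").read_text(encoding="utf-8").lower()
phrases = ["cheap", "coupons", "another ding in my wallet"]

for phrase in phrases:
    print(f"{phrase!r}: {corpus.count(phrase.lower())} occurrence(s)")
```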

Qualitative Data Analysis Strategy: To Verify

To verify in QDA is to administer an audit of “quality control” to your analysis. After your data analysis and the development of key findings, you may be thinking to yourself, “Did I get it right?” “Did I learn anything new?” Reliability and validity are terms and constructs of the positivist quantitative paradigm that refer to the replicability and accuracy of measures. But in the qualitative paradigm, other constructs are more appropriate.

Credibility and trustworthiness (Lincoln & Guba, 1985 ) are two factors to consider when collecting and analyzing the data and presenting your findings. In our qualitative research projects, we must present a convincing story to our audiences that we “got it right” methodologically. In other words, the amount of time we spent in the field, the number of participants we interviewed, the analytic methods we used, the thinking processes evident to reach our conclusions, and so on should be “just right” to assure the reader that we have conducted our jobs soundly. But remember that we can never conclusively prove something; we can only, at best, convincingly suggest. Research is an act of persuasion.

Credibility in a qualitative research report can be established in several ways. First, citing the key writers of related works in your literature review is essential. Seasoned researchers will sometimes assess whether a novice has “done her homework” by reviewing the bibliography or references. You need not list everything that seminal writers have published about a topic, but their names should appear at least once as evidence that you know the field’s key figures and their work.

Credibility can also be established by specifying the particular data analysis methods you employed (e.g., “Interview transcripts were taken through two cycles of process coding, resulting in three primary categories”), through corroboration of data analysis with the participants themselves (e.g., “I asked my participants to read and respond to a draft of this report for their confirmation of accuracy and recommendations for revision”), or through your description of how data and findings were substantiated (e.g., “Data sources included interview transcripts, participant observation field notes, and participant response journals to gather multiple perspectives about the phenomenon”).

Statistician W. Edwards Deming is often credited with this cautionary advice about making a convincing argument: “Without data, you’re just another person with an opinion.” Thus, researchers can also support their findings with relevant, specific evidence by quoting participants directly and/or including field note excerpts from the data corpus. These excerpts serve both as illustrative examples for readers and as more credible testimony of what happened in the field.

Trustworthiness, or providing credibility to the writing, is established when we inform the reader of our research processes. Some make the case by stating the duration of fieldwork (e.g., “Forty-five clock hours were spent in the field”; “The study extended over a 10-month period”). Others put forth the amounts of data they gathered (e.g., “Sixteen individuals were interviewed”; “My field notes totaled 157 pages”). Sometimes trustworthiness is established when we are up front or confessional with the analytic or ethical dilemmas we encountered (e.g., “It was difficult to watch the participant’s teaching effectiveness erode during fieldwork”; “Analysis was stalled until I recoded the entire data corpus with a new perspective”).

The bottom line is that credibility and trustworthiness are matters of researcher honesty and integrity . Anyone can write that he worked ethically, rigorously, and reflexively, but only the writer will ever know the truth. There is no shame if something goes wrong with your research. In fact, it is more than likely the rule, not the exception. Work and write transparently to achieve credibility and trustworthiness with your readers.

The length of this chapter does not enable me to expand on other QDA strategies such as to conceptualize, theorize, and write. Yet there are even more subtle thinking strategies to employ throughout the research enterprise, such as to synthesize, problematize, and create. Each researcher has his or her own ways of working, and deep reflexivity (another strategy) on your own methodology and methods as a qualitative inquirer throughout fieldwork and writing provides you with metacognitive awareness of data analysis processes and possibilities.

Data analysis is one of the most elusive practices in qualitative research, perhaps because it is a backstage, behind-the-scenes, in-your-head enterprise. It is not that there are no models to follow. It is just that each project is contextual and case specific. The unique data you collect from your unique research design must be approached with your unique analytic signature. It truly is a learning-by-doing process, so accept that and leave yourself open to discovery and insight as you carefully scrutinize the data corpus for patterns, categories, themes, concepts, assertions, propositions, and possibly new theories through strategic analysis.

References

Auerbach, C. F., & Silverstein, L. B. (2003). Qualitative data: An introduction to coding and analysis. New York, NY: New York University Press.

Birks, M., & Mills, J. (2015). Grounded theory: A practical guide (2nd ed.). London, England: Sage.

Boyatzis, R. E. (1998). Transforming qualitative information: Thematic analysis and code development. Thousand Oaks, CA: Sage.

Bryant, A. (2017). Grounded theory and grounded theorizing: Pragmatism in research practice. New York, NY: Oxford University Press.

Bryant, A., & Charmaz, K. (Eds.). (2019). The Sage handbook of current developments in grounded theory. London, England: Sage.

Charmaz, K. (2014). Constructing grounded theory: A practical guide through qualitative analysis (2nd ed.). London, England: Sage.

Erickson, F. (1986). Qualitative methods in research on teaching. In M. C. Wittrock (Ed.), Handbook of research on teaching (3rd ed., pp. 119–161). New York, NY: Macmillan.

Galman, S. C. (2013). The good, the bad, and the data: Shane the lone ethnographer’s basic guide to qualitative data analysis. Walnut Creek, CA: Left Coast Press.

Geertz, C. (1983). Local knowledge: Further essays in interpretive anthropology. New York, NY: Basic Books.

Lincoln, Y. S., & Guba, E. G. (1985). Naturalistic inquiry. Newbury Park, CA: Sage.

Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis (2nd ed.). Thousand Oaks, CA: Sage.

Saldaña, J. (2016). The coding manual for qualitative researchers (3rd ed.). London, England: Sage.

Saldaña, J., & Omasta, M. (2018). Qualitative research: Analyzing life. Thousand Oaks, CA: Sage.

Stake, R. E. (1995). The art of case study research. Thousand Oaks, CA: Sage.

Stern, P. N., & Porr, C. J. (2011). Essentials of accessible grounded theory. Walnut Creek, CA: Left Coast Press.

Strauss, A. L. (1987). Qualitative analysis for social scientists. Cambridge, England: Cambridge University Press.

Sunstein, B. S., & Chiseri-Strater, E. (2012). FieldWorking: Reading and writing research (4th ed.). Boston, MA: Bedford/St. Martin’s.

Wertz, F. J., Charmaz, K., McMullen, L. M., Josselson, R., Anderson, R., & McSpadden, E. (2011). Five ways of doing qualitative analysis: Phenomenological psychology, grounded theory, discourse analysis, narrative research, and intuitive inquiry. New York, NY: Guilford Press.


Qualitative Data – Types, Methods and Examples


Qualitative Data

Definition:

Qualitative data is a type of data that is collected and analyzed in a non-numerical form, such as words, images, or observations. It is generally used to gain an in-depth understanding of complex phenomena, such as human behavior, attitudes, and beliefs.

Types of Qualitative Data

There are various types of qualitative data that can be collected and analyzed, including:

  • Interviews : These involve in-depth, face-to-face conversations with individuals or groups to gather their perspectives, experiences, and opinions on a particular topic.
  • Focus Groups: These are group discussions where a facilitator leads a discussion on a specific topic, allowing participants to share their views and experiences.
  • Observations : These involve observing and recording the behavior and interactions of individuals or groups in a particular setting.
  • Case Studies: These involve in-depth analysis of a particular individual, group, or organization, usually over an extended period.
  • Document Analysis : This involves examining written or recorded materials, such as newspaper articles, diaries, or public records, to gain insight into a particular topic.
  • Visual Data : This involves analyzing images or videos to understand people’s experiences or perspectives on a particular topic.
  • Online Data: This involves analyzing data collected from social media platforms, forums, or online communities to understand people’s views and opinions on a particular topic.

Qualitative Data Formats

Qualitative data can be collected and presented in various formats. Some common formats include:

  • Textual data: This includes written or transcribed data from interviews, focus groups, or observations. It can be analyzed using various techniques such as thematic analysis or content analysis.
  • Audio data: This includes recordings of interviews or focus groups, which can be transcribed and analyzed using software such as NVivo.
  • Visual data: This includes photographs, videos, or drawings, which can be analyzed using techniques such as visual analysis or semiotics.
  • Mixed media data : This includes data collected in different formats, such as audio and text. This can be analyzed using mixed methods research, which combines both qualitative and quantitative research methods.
  • Field notes: These are notes taken by researchers during observations, which can include descriptions of the setting, behaviors, and interactions of participants.

Qualitative Data Analysis Methods

Qualitative data analysis refers to the process of systematically analyzing and interpreting qualitative data to identify patterns, themes, and relationships. Here are some common methods of analyzing qualitative data:

  • Thematic analysis: This involves identifying and analyzing patterns or themes within the data. The data is coded into themes and subthemes, which are then organized into a coherent narrative (a minimal tallying sketch follows this list).
  • Content analysis: This involves analyzing the content of the data, such as the words, phrases, or images used. It involves identifying patterns and themes in the data and examining the relationships between them.
  • Discourse analysis: This involves analyzing the language and communication used in the data, such as the meaning behind certain words or phrases. It involves examining how the language constructs and shapes social reality.
  • Grounded theory: This involves developing a theory or framework based on the data. It involves identifying patterns and themes in the data and using them to develop a theory that explains the phenomenon being studied.
  • Narrative analysis : This involves analyzing the stories and narratives present in the data. It involves examining how the stories are constructed and how they contribute to the overall understanding of the phenomenon being studied.
  • Ethnographic analysis : This involves analyzing the culture and social practices present in the data. It involves examining how the cultural and social practices contribute to the phenomenon being studied.
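If coded segments have already been exported to a simple spreadsheet, a few lines of Python can tally how often each theme appears and how many participants mention it. This is only a minimal sketch; the file name and column names ("coded_segments.csv", "participant", "theme") are hypothetical, and the coding itself is still done by the researcher.

```python
from collections import Counter, defaultdict
import csv

# Hypothetical export: one row per coded segment, with columns
# "participant", "theme" and "excerpt" (names are assumptions).
theme_counts = Counter()
participants_per_theme = defaultdict(set)

with open("coded_segments.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        theme_counts[row["theme"]] += 1
        participants_per_theme[row["theme"]].add(row["participant"])

for theme, count in theme_counts.most_common():
    print(f"{theme}: {count} segments, "
          f"{len(participants_per_theme[theme])} participants")
```

A summary like this is only a starting point: the interpretation of what the themes mean still rests on reading the coded excerpts themselves.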

Qualitative Data Collection Guide

Here are some steps to guide the collection of qualitative data:

  • Define the research question : Start by clearly defining the research question that you want to answer. This will guide the selection of data collection methods and help to ensure that the data collected is relevant to the research question.
  • Choose data collection methods : Select the most appropriate data collection methods based on the research question, the research design, and the resources available. Common methods include interviews, focus groups, observations, document analysis, and participatory research.
  • Develop a data collection plan : Develop a plan for data collection that outlines the specific procedures, timelines, and resources needed for each data collection method. This plan should include details such as how to recruit participants, how to conduct interviews or focus groups, and how to record and store data.
  • Obtain ethical approval : Obtain ethical approval from an institutional review board or ethics committee before beginning data collection. This is particularly important when working with human participants to ensure that their rights and interests are protected.
  • Recruit participants: Recruit participants based on the research question and the data collection methods chosen. This may involve purposive sampling, snowball sampling, or random sampling.
  • Collect data: Collect data using the chosen data collection methods. This may involve conducting interviews, facilitating focus groups, observing participants, or analyzing documents.
  • Transcribe and store data : Transcribe and store the data in a secure location. This may involve transcribing audio or video recordings, organizing field notes, or scanning documents.
  • Analyze data: Analyze the data using appropriate qualitative data analysis methods, such as thematic analysis or content analysis.
  • Interpret findings: Interpret the findings of the data analysis in the context of the research question and the relevant literature. This may involve developing new theories or frameworks, or validating existing ones.
  • Communicate results: Communicate the results of the research in a clear and concise manner, using appropriate language and visual aids where necessary. This may involve writing a report, presenting at a conference, or publishing in a peer-reviewed journal.

Qualitative Data Examples

Some examples of qualitative data in different fields are as follows:

  • Sociology : In sociology, qualitative data is used to study social phenomena such as culture, norms, and social relationships. For example, a researcher might conduct interviews with members of a community to understand their beliefs and practices.
  • Psychology : In psychology, qualitative data is used to study human behavior, emotions, and attitudes. For example, a researcher might conduct a focus group to explore how individuals with anxiety cope with their symptoms.
  • Education : In education, qualitative data is used to study learning processes and educational outcomes. For example, a researcher might conduct observations in a classroom to understand how students interact with each other and with their teacher.
  • Marketing : In marketing, qualitative data is used to understand consumer behavior and preferences. For example, a researcher might conduct in-depth interviews with customers to understand their purchasing decisions.
  • Anthropology : In anthropology, qualitative data is used to study human cultures and societies. For example, a researcher might conduct participant observation in a remote community to understand their customs and traditions.
  • Health Sciences: In health sciences, qualitative data is used to study patient experiences, beliefs, and preferences. For example, a researcher might conduct interviews with cancer patients to understand how they cope with their illness.

Application of Qualitative Data

Qualitative data is used in a variety of fields and has numerous applications. Here are some common applications of qualitative data:

  • Exploratory research: Qualitative data is often used in exploratory research to understand a new or unfamiliar topic. Researchers use qualitative data to generate hypotheses and develop a deeper understanding of the research question.
  • Evaluation: Qualitative data is often used to evaluate programs or interventions. Researchers use qualitative data to understand the impact of a program or intervention on the people who participate in it.
  • Needs assessment: Qualitative data is often used in needs assessments to understand the needs of a specific population. Researchers use qualitative data to identify the most pressing needs of the population and develop strategies to address those needs.
  • Case studies: Qualitative data is often used in case studies to understand a particular case in detail. Researchers use qualitative data to understand the context, experiences, and perspectives of the people involved in the case.
  • Market research: Qualitative data is often used in market research to understand consumer behavior and preferences. Researchers use qualitative data to gain insights into consumer attitudes, opinions, and motivations.
  • Social and cultural research : Qualitative data is often used in social and cultural research to understand social phenomena such as culture, norms, and social relationships. Researchers use qualitative data to understand the experiences, beliefs, and practices of individuals and communities.

Purpose of Qualitative Data

The purpose of qualitative data is to gain a deeper understanding of social phenomena that cannot be captured by numerical or quantitative data. Qualitative data is collected through methods such as observation, interviews, and focus groups, and it provides descriptive information that can shed light on people’s experiences, beliefs, attitudes, and behaviors.

Qualitative data serves several purposes, including:

  • Generating hypotheses: Qualitative data can be used to generate hypotheses about social phenomena that can be further tested with quantitative data.
  • Providing context : Qualitative data provides a rich and detailed context for understanding social phenomena that cannot be captured by numerical data alone.
  • Exploring complex phenomena : Qualitative data can be used to explore complex phenomena such as culture, social relationships, and the experiences of marginalized groups.
  • Evaluating programs and interventions: Qualitative data can be used to evaluate the impact of programs and interventions on the people who participate in them.
  • Enhancing understanding: Qualitative data can be used to enhance understanding of the experiences, beliefs, and attitudes of individuals and communities, which can inform policy and practice.

When to use Qualitative Data

Qualitative data is appropriate when the research question requires an in-depth understanding of complex social phenomena that cannot be captured by numerical or quantitative data.

Here are some situations when qualitative data is appropriate:

  • Exploratory research : Qualitative data is often used in exploratory research to generate hypotheses and develop a deeper understanding of a research question.
  • Understanding social phenomena : Qualitative data is appropriate when the research question requires an in-depth understanding of social phenomena such as culture, social relationships, and experiences of marginalized groups.
  • Program evaluation: Qualitative data is often used in program evaluation to understand the impact of a program on the people who participate in it.
  • Needs assessment: Qualitative data is often used in needs assessments to understand the needs of a specific population.
  • Market research: Qualitative data is often used in market research to understand consumer behavior and preferences.
  • Case studies: Qualitative data is often used in case studies to understand a particular case in detail.

Characteristics of Qualitative Data

Here are some characteristics of qualitative data:

  • Descriptive : Qualitative data provides a rich and detailed description of the social phenomena under investigation.
  • Contextual : Qualitative data is collected in the context in which the social phenomena occur, which allows for a deeper understanding of the phenomena.
  • Subjective : Qualitative data reflects the subjective experiences, beliefs, attitudes, and behaviors of the individuals and communities under investigation.
  • Flexible : Qualitative data collection methods are flexible and can be adapted to the specific needs of the research question.
  • Emergent : Qualitative data analysis is often an iterative process, where new themes and patterns emerge as the data is analyzed.
  • Interpretive : Qualitative data analysis involves interpretation of the data, which requires the researcher to be reflexive and aware of their own biases and assumptions.
  • Non-standardized: Qualitative data collection methods are often non-standardized, which means that the data is not collected in a standardized or uniform way.

Advantages of Qualitative Data

Some advantages of qualitative data are as follows:

  • Richness : Qualitative data provides a rich and detailed description of the social phenomena under investigation, allowing for a deeper understanding of the phenomena.
  • Flexibility : Qualitative data collection methods are flexible and can be adapted to the specific needs of the research question, allowing for a more nuanced exploration of social phenomena.
  • Contextualization : Qualitative data is collected in the context in which the social phenomena occur, which allows for a deeper understanding of the phenomena and their cultural and social context.
  • Subjectivity : Qualitative data reflects the subjective experiences, beliefs, attitudes, and behaviors of the individuals and communities under investigation, allowing for a more holistic understanding of the phenomena.
  • New insights : Qualitative data can generate new insights and hypotheses that can be further tested with quantitative data.
  • Participant voice : Qualitative data collection methods often involve direct participation by the individuals and communities under investigation, allowing for their voices to be heard.
  • Ethical considerations: Qualitative data collection methods often prioritize ethical considerations such as informed consent, confidentiality, and respect for the autonomy of the participants.

Limitations of Qualitative Data

Here are some limitations of qualitative data:

  • Subjectivity : Qualitative data is subjective, and the interpretation of the data depends on the researcher’s own biases, assumptions, and perspectives.
  • Small sample size: Qualitative data collection methods often involve a small sample size, which limits the generalizability of the findings.
  • Time-consuming: Qualitative data collection and analysis can be time-consuming, as it requires in-depth engagement with the data and often involves iterative processes.
  • Limited statistical analysis: Qualitative data is often not suitable for statistical analysis, which limits the ability to draw quantitative conclusions from the data.
  • Limited comparability: Qualitative data collection methods are often non-standardized, which makes it difficult to compare findings across different studies or contexts.
  • Social desirability bias : Qualitative data collection methods often rely on self-reporting by the participants, which can be influenced by social desirability bias.
  • Researcher bias: The researcher’s own biases, assumptions, and perspectives can influence the data collection and analysis, which can limit the objectivity of the findings.



Data Analysis in Research: Types & Methods


What is data analysis in research?

Definition of research in data analysis: According to LeCompte and Schensul, research data analysis is a process used by researchers to reduce data to a story and interpret it to derive insights. The data analysis process helps reduce a large chunk of data into smaller, meaningful fragments.

Three essential things occur during the data analysis process. The first is data organization. The second is data reduction through summarization and categorization, which together help find patterns and themes in the data for easy identification and linking. The third is the analysis itself, which researchers carry out in both top-down and bottom-up fashion.


On the other hand, Marshall and Rossman describe data analysis as a messy, ambiguous, and time-consuming but creative and fascinating process through which a mass of collected data is brought to order, structure and meaning.

We can say that “the data analysis and data interpretation is a process representing the application of deductive and inductive logic to the research and data analysis.”

Why analyze data in research?

Researchers rely heavily on data as they have a story to tell or research problems to solve. It starts with a question, and data is nothing but an answer to that question. But, what if there is no question to ask? Well! It is possible to explore data even without a problem – we call it ‘Data Mining’, which often reveals some interesting patterns within the data that are worth exploring.

Regardless of the type of data researchers explore, their mission and their audience’s vision guide them to find the patterns that shape the story they want to tell. One of the essential things expected from researchers while analyzing data is to stay open and remain unbiased toward unexpected patterns, expressions, and results. Remember, sometimes data analysis tells the most unforeseen yet exciting stories that were not expected when initiating the analysis. Therefore, rely on the data you have at hand and enjoy the journey of exploratory research.


Types of data in research

Every kind of data describes something once a specific value is assigned to it. For analysis, you need to organize these values and process and present them in a given context to make them useful. Data can come in different forms; here are the primary data types.

  • Qualitative data: When the data presented consists of words and descriptions, we call it qualitative data. Although you can observe this data, it is subjective and harder to analyze in research, especially for comparison. Example: anything describing taste, experience, texture, or an opinion is qualitative data. This type of data is usually collected through focus groups, personal qualitative interviews, qualitative observation or open-ended questions in surveys.
  • Quantitative data: Any data expressed in numbers or numerical figures is called quantitative data. This type of data can be distinguished into categories, grouped, measured, calculated, or ranked. Example: questions about age, rank, cost, length, weight, scores, etc. all produce this type of data. You can present such data in graphical formats or charts, or apply statistical analysis methods to it. The OMS (Outcomes Measurement Systems) questionnaires in surveys are a significant source of numeric data.
  • Categorical data: This is data presented in groups. However, an item included in the categorical data cannot belong to more than one group. Example: a person describing their living style, marital status, smoking habit, or drinking habit in a survey provides categorical data. A chi-square test is a standard method used to analyze this data (see the sketch below).
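To illustrate the chi-square test mentioned in the last bullet, the sketch below runs scipy's chi-square test of independence on a small, invented contingency table of marital status by smoking habit; the counts are made up purely for demonstration.

```python
from scipy.stats import chi2_contingency

# Invented counts: rows = marital status, columns = smoker / non-smoker.
observed = [
    [30, 70],   # single
    [45, 105],  # married
]

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, p = {p_value:.3f}, dof = {dof}")
# A small p-value would suggest the two categorical variables are related.
```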


Data analysis in qualitative research

Data analysis in qualitative research works a little differently from numerical data, as qualitative data is made up of words, descriptions, images, objects, and sometimes symbols. Getting insight from such complex information is an involved process; hence it is typically used for exploratory research and data analysis.

Finding patterns in the qualitative data

Although there are several ways to find patterns in textual information, a word-based method is the most relied-upon and widely used technique for research and data analysis. Notably, the data analysis process in qualitative research is largely manual: researchers usually read the available data and find repetitive or commonly used words.

For example, while studying data collected from African countries to understand the most pressing issues people face, researchers might find  “food”  and  “hunger” are the most commonly used words and will highlight them for further analysis.
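A word-frequency pass like the one described above can be approximated in a few lines of Python. This is a minimal sketch assuming the transcripts sit in plain-text files; the file names and the tiny stop-word list are placeholders, not a recommended setup.

```python
import re
from collections import Counter

# Hypothetical transcript files and a tiny placeholder stop-word list.
STOPWORDS = {"the", "and", "a", "to", "of", "in", "is", "it", "that", "we"}
counts = Counter()

for path in ["interview_01.txt", "interview_02.txt"]:
    with open(path, encoding="utf-8") as f:
        words = re.findall(r"[a-z']+", f.read().lower())
    counts.update(w for w in words if w not in STOPWORDS)

# Words such as "food" or "hunger" would surface here if they dominate the talk.
print(counts.most_common(20))
```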


The keyword context is another widely used word-based technique. In this method, the researcher tries to understand the concept by analyzing the context in which the participants use a particular keyword.  

For example , researchers conducting research and data analysis for studying the concept of ‘diabetes’ amongst respondents might analyze the context of when and how the respondent has used or referred to the word ‘diabetes.’
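The keyword-in-context idea can be sketched in the same spirit: pull a window of words around every occurrence of the keyword so the surrounding context can be read and coded. The transcript string below is invented for illustration.

```python
def keyword_in_context(text, keyword, window=5):
    """Return each occurrence of `keyword` with `window` words on either side."""
    words = text.split()
    hits = []
    for i, w in enumerate(words):
        if w.lower().strip(".,!?") == keyword.lower():
            left = " ".join(words[max(0, i - window):i])
            right = " ".join(words[i + 1:i + 1 + window])
            hits.append(f"... {left} [{w}] {right} ...")
    return hits

# Invented snippet standing in for a real interview transcript.
transcript = "My doctor said my diabetes is under control, but diabetes still worries me."
for line in keyword_in_context(transcript, "diabetes"):
    print(line)
```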

The scrutiny-based technique is also one of the highly recommended text analysis methods used to identify patterns in qualitative data. Compare and contrast is the most widely used approach under this technique: it examines how specific pieces of text are similar to or different from each other.

For example: to find out the “importance of a resident doctor in a company,” the collected data is divided into people who think it is necessary to hire a resident doctor and those who think it is unnecessary. Compare and contrast is the best method for analyzing polls with single-answer question types.

Metaphors can be used to reduce the data pile and find patterns in it so that it becomes easier to connect data with theory.

Variable Partitioning is another technique used to split variables so that researchers can find more coherent descriptions and explanations from the enormous data.


Methods used for data analysis in qualitative research

There are several techniques to analyze the data in qualitative research, but here are some commonly used methods:

  • Content Analysis:  It is widely accepted and the most frequently employed technique for data analysis in research methodology. It can be used to analyze the documented information from text, images, and sometimes from the physical items. It depends on the research questions to predict when and where to use this method.
  • Narrative Analysis: This method is used to analyze content gathered from various sources such as personal interviews, field observation, and surveys. Most of the time, the stories or opinions shared by people are examined to find answers to the research questions.
  • Discourse Analysis:  Similar to narrative analysis, discourse analysis is used to analyze the interactions with people. Nevertheless, this particular method considers the social context under which or within which the communication between the researcher and respondent takes place. In addition to that, discourse analysis also focuses on the lifestyle and day-to-day environment while deriving any conclusion.
  • Grounded Theory:  When you want to explain why a particular phenomenon happened, then using grounded theory for analyzing quality data is the best resort. Grounded theory is applied to study data about the host of similar cases occurring in different settings. When researchers are using this method, they might alter explanations or produce new ones until they arrive at some conclusion.


Data analysis in quantitative research

Preparing data for analysis

The first stage in research and data analysis is to prepare the data for analysis so that nominal data can be converted into something meaningful. Data preparation consists of the phases below.

Phase I: Data Validation

Data validation is done to understand whether the collected data sample meets the pre-set standards or is a biased sample. It is divided into four stages:

  • Fraud: To ensure an actual human being records each response to the survey or the questionnaire
  • Screening: To make sure each participant or respondent is selected or chosen in compliance with the research criteria
  • Procedure: To ensure ethical standards were maintained while collecting the data sample
  • Completeness: To ensure that the respondent has answered all the questions in an online survey or, in an interview, that the interviewer asked all the questions devised in the questionnaire.

Phase II: Data Editing

More often than not, an extensive research data sample comes loaded with errors. Respondents sometimes fill in some fields incorrectly or skip them accidentally. Data editing is a process wherein the researchers confirm that the provided data is free of such errors. They need to conduct necessary checks, including outlier checks, to edit the raw data and make it ready for analysis.

Phase III: Data Coding

Out of all three, this is the most critical phase of data preparation, associated with grouping and assigning values to the survey responses. If a survey is completed with a sample size of 1,000, the researcher might create age brackets to distinguish the respondents by age. It then becomes easier to analyze small data buckets rather than deal with the massive data pile.
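Age bracketing of this kind is easy to reproduce with pandas if the survey responses live in a data frame. The column names and bin edges below are assumptions chosen only to illustrate the idea.

```python
import pandas as pd

# Hypothetical survey responses.
responses = pd.DataFrame({"respondent_id": range(1, 7),
                          "age": [19, 24, 37, 41, 58, 66]})

# Code age into brackets so the analysis works on groups, not raw values.
responses["age_bracket"] = pd.cut(
    responses["age"],
    bins=[17, 25, 40, 60, 100],
    labels=["18-25", "26-40", "41-60", "60+"],
)

print(responses.groupby("age_bracket", observed=True).size())
```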


Methods used for data analysis in quantitative research

After the data is prepared for analysis, researchers are open to using different research and data analysis methods to derive meaningful insights. Statistical analysis is the most favored way to analyze numerical data. In statistical analysis, distinguishing between categorical data and numerical data is essential, as categorical data involves distinct categories or labels, while numerical data consists of measurable quantities. The method is classified into two groups: first, ‘descriptive statistics’, used to describe data, and second, ‘inferential statistics’, which helps in comparing the data.

Descriptive statistics

This method is used to describe the basic features of versatile types of data in research. It presents the data in such a meaningful way that patterns in the data start making sense. Nevertheless, descriptive analysis does not support conclusions beyond the data analyzed or the hypotheses researchers have formulated so far. Here are a few major types of descriptive analysis methods.

Measures of Frequency

  • Count, Percent, Frequency
  • It is used to denote how often a particular event occurs.
  • Researchers use it when they want to showcase how often a response is given.

Measures of Central Tendency

  • Mean, Median, Mode
  • The method is widely used to demonstrate distribution by various points.
  • Researchers use this method when they want to showcase the most commonly or averagely indicated response.

Measures of Dispersion or Variation

  • Range, Variance, Standard deviation
  • The range is the difference between the highest and lowest scores.
  • The variance is the average squared difference between each observed score and the mean; the standard deviation is the square root of the variance.
  • It is used to identify the spread of scores by stating intervals.
  • Researchers use this method to show how spread out the data is and how strongly that spread affects the mean.

Measures of Position

  • Percentile ranks, Quartile ranks
  • It relies on standardized scores helping researchers to identify the relationship between different scores.
  • It is often used when researchers want to compare scores with the average count.
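The four families of measures above can be computed directly with numpy and the standard library's statistics module; the scores below are invented purely for illustration.

```python
import numpy as np
from statistics import mode

scores = np.array([55, 61, 61, 68, 72, 75, 79, 84, 90])

print("count:", scores.size)                                     # frequency
print("mean:", scores.mean(), "median:", np.median(scores), "mode:", mode(scores))
print("range:", scores.max() - scores.min())                     # dispersion
print("variance:", scores.var(ddof=1), "std dev:", scores.std(ddof=1))
print("quartiles:", np.percentile(scores, [25, 50, 75]))          # position
```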

For quantitative research, descriptive analysis often gives absolute numbers, but deeper analysis is needed to demonstrate the rationale behind those numbers. Nevertheless, it is necessary to think about the best method for research and data analysis suited to your survey questionnaire and the story researchers want to tell. For example, the mean is the best way to demonstrate students’ average scores in schools. It is better to rely on descriptive statistics when the researchers intend to keep the research or outcome limited to the provided sample without generalizing it. For example, when you want to compare the average votes cast in two different cities, descriptive statistics are enough.

Descriptive analysis is also called a ‘univariate analysis’ since it is commonly used to analyze a single variable.

Inferential statistics

Inferential statistics are used to make predictions about a larger population after research and data analysis of a sample representing that population. For example, you can ask some 100-odd audience members at a movie theater if they like the movie they are watching. Researchers then use inferential statistics on the collected sample to reason that about 80-90% of people like the movie.

Here are two significant areas of inferential statistics.

  • Estimating parameters: It takes statistics from the sample research data and demonstrates something about the population parameter.
  • Hypothesis test: It’s about sampling research data to answer the survey research questions. For example, researchers might be interested to understand if the new shade of lipstick recently launched is good or not, or if multivitamin capsules help children perform better at games.
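As a loose illustration of both branches, the sketch below estimates a confidence interval for the movie example and runs a one-sample t-test on some invented game scores; all numbers are fabricated, and scipy is assumed to be available.

```python
import math
from scipy import stats

# Estimating a parameter: suppose 85 of 100 sampled viewers liked the movie.
liked, n = 85, 100
p_hat = liked / n
se = math.sqrt(p_hat * (1 - p_hat) / n)
low, high = p_hat - 1.96 * se, p_hat + 1.96 * se
print(f"Share who liked the movie: {p_hat:.0%} (approx. 95% CI {low:.0%} to {high:.0%})")

# Hypothesis test: do these invented game scores exceed a benchmark of 70?
scores = [72, 75, 69, 80, 77, 74, 71, 78]
result = stats.ttest_1samp(scores, popmean=70)
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.3f}")
```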

These are sophisticated analysis methods used to showcase the relationship between different variables instead of describing a single variable. It is often used when researchers want something beyond absolute numbers to understand the relationship between variables.

Here are some of the commonly used methods for data analysis in research.

  • Correlation: When researchers are not conducting experimental or quasi-experimental research but are interested in understanding the relationship between two or more variables, they opt for correlational research methods.
  • Cross-tabulation: Also called contingency tables, cross-tabulation is used to analyze the relationship between multiple variables. Suppose the provided data has age and gender categories presented in rows and columns. A two-dimensional cross-tabulation helps for seamless data analysis and research by showing the number of males and females in each age category (see the sketch after this list).
  • Regression analysis: For understanding the strength of the relationship between two variables, researchers rely on the primary and commonly used regression analysis method, which is also a type of predictive analysis. In this method, you have an essential factor called the dependent variable, along with one or more independent variables. You undertake efforts to find out the impact of the independent variables on the dependent variable. The values of both independent and dependent variables are assumed to have been ascertained in an error-free, random manner.
  • Frequency tables: A frequency table shows how often each value or category occurs in the data, which makes it easy to see the most and least common responses at a glance.
  • Analysis of variance: This statistical procedure is used for testing the degree to which two or more groups vary or differ in an experiment. A considerable degree of variation means research findings were significant. In many contexts, ANOVA testing and variance analysis are similar.
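A contingency table like the one described under cross-tabulation can be produced with pandas.crosstab; the respondent data below is invented for illustration.

```python
import pandas as pd

# Invented respondent data.
df = pd.DataFrame({
    "gender": ["F", "M", "F", "M", "F", "M", "F", "M"],
    "age_group": ["18-25", "18-25", "26-40", "26-40", "41-60", "41-60", "60+", "60+"],
})

# Two-dimensional cross-tabulation: counts of respondents per cell.
print(pd.crosstab(df["age_group"], df["gender"]))
```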
Considerations in research data analysis

  • Researchers must have the necessary research skills to analyze and manipulate the data, and should be trained to demonstrate a high standard of research practice. Ideally, researchers should possess more than a basic understanding of the rationale for selecting one statistical method over another to obtain better data insights.
  • Usually, research and data analytics projects differ by scientific discipline; therefore, getting statistical advice at the beginning of analysis helps design a survey questionnaire, select data collection methods, and choose samples.


  • The primary aim of research data analysis is to derive insights that are unbiased. Any mistake in collecting data, selecting an analysis method, or choosing an audience sample, or approaching any of these with a biased mind, will lead to a biased inference.
  • No amount of sophistication in the analysis can rectify poorly defined objectives or outcome measurements. Whether the design is at fault or the intentions are unclear, a lack of clarity might mislead readers, so avoid the practice.
  • The motive behind data analysis in research is to present accurate and reliable data. As far as possible, avoid statistical errors, and find ways to deal with everyday challenges like outliers, missing data, data alteration, data mining, and developing graphical representations.

The sheer amount of data generated daily is staggering, especially now that data analysis has taken center stage. In 2018 alone, the total data supply amounted to 2.8 trillion gigabytes. Hence, it is clear that enterprises willing to survive in the hypercompetitive world must possess an excellent capability to analyze complex research data, derive actionable insights, and adapt to new market needs.


QuestionPro is an online survey platform that empowers organizations in data analysis and research and provides them a medium to collect data by creating appealing surveys.


What is Qualitative Data Analysis Software (QDA Software)?


Qualitative Data Analysis Software (QDA software) allows researchers to organize, analyze and visualize their data, finding the patterns in qualitative or unstructured data: interviews, surveys, field notes, videos, audio files, images, journal articles, web content, etc.

Quantitative vs. Qualitative Data Analysis

What is the difference between quantitative and qualitative data analysis? As the name implies, quantitative data analysis has to do with numbers. For example, any time you are doing statistical analysis, you are doing quantitative data analysis. Some examples of quantitative data analysis software are SPSS, STATA, SAS, and Lumivero’s own powerful statistics software, XLSTAT.

In contrast, qualitative analysis "helps you understand people’s perceptions and experiences by systematically coding and analyzing the data", as described in Qualitative vs Quantitative Research 101 . It tends to deal more with words than numbers. It can be useful when working with a lot of rich and deep data and when you aren’t trying to test something very specific. Some examples of qualitative data analysis software are MAXQDA, ATLAS.ti, Quirkos, and Lumivero’s NVivo, the leading tool for qualitative data analysis .

When would you use each one? Well, qualitative data analysis is often used for exploratory research or developing a theory, whereas quantitative is better if you want to test a hypothesis, find averages, and determine relationships between variables. With quantitative research you often want a large sample size to get relevant statistics. In contrast, qualitative research, because so much data in the form of text is involved, can have much smaller sample sizes and still yield valuable insights.

Of course, it’s not always so cut and dried, and many researchers end up taking a “mixed methods” approach, meaning that they combine both types of research. In this case they might use a combination of both types of software programs.

Learn how some qualitative researchers use QDA software for text analysis in the on-demand webinar Twenty-Five Qualitative Researchers Share How-To's for Data Analysis .


How is Qualitative Data Analysis Software Used for Research?

Qualitative data analysis software works with any qualitative research methodology a researcher uses. For example, a social scientist wanting to develop new concepts or theories may take a ‘grounded theory’ approach, while a researcher looking for ways to improve health policy or program design might use ‘evaluation methods’. QDA software analysis tools don't favor a particular methodology; they're designed to facilitate common qualitative techniques no matter what method you use.

NVivo can help you to manage, explore and find patterns in your data and conduct thematic and sentiment analysis, but it cannot replace your analytical expertise.

Qualitative Research as an Iterative Process

Handling qualitative and mixed methods data is not usually a step-by-step process. Instead, it tends to be an iterative process where you explore, code, reflect, memo, code some more, query and so on. For example, investigating an interesting theme with QDA software, like NVivo, typically means cycling through exploring, coding, memoing and querying rather than moving in a straight line.


How Do I Choose the Best Approach for My Research Project with QDA Software?

Every research project is unique — the way you organize and analyze the material depends on your methodology, data and research design.

Here are some example scenarios for handling different types of research projects in QDA software — these are just suggestions to get you up and running.

A study with interviews exploring stakeholder perception of a community arts program

Your files consist of unstructured interview documents. You would set up a case for each interview participant, then code to codes and cases. You could then explore your data with simple queries or charts and use memos to record your discoveries.


A study exploring community perceptions about climate change using autocoding with AI

Your files consist of structured, consistently formatted interviews (where each participant is asked the same set of questions). With AI, you could autocode the interviews and set up cases for each participant. Then code themes to query and visualize your data.


A literature review on adolescent depression

Your files consist of journal articles, books and web pages. You would classify your files before coding and querying them; and then you could critique each file in a memo. With Citavi integration in NVivo, you can import your Citavi references into NVivo.


A social media study of the language used by members of an online community

Your files consist of Facebook data captured with NCapture. You would import it as a dataset ready to code and query. Use memos to record your insights.


A quick analysis of a local government budget survey

Your file is a large dataset of survey responses. You would import it using the Survey Import Wizard, which prepares your data for analysis. As part of the import, choose to run automated insights with AI to identify and code to themes and sentiment so that you can quickly review results and report broad findings.


Ways to Get Started with Your Project with Qualitative Analysis Software

Since projects (and researchers) are unique there is no one 'best practice' approach to organizing and analyzing your data but there are some useful strategies to help you get up and running:

  • Start now - don't wait until you have collected all the data. Import your research design, grant application or thesis proposal.
  • Make a project journal to state your research questions and record your goals. Why are you doing the project? What is it about? What do you expect to find and why?
  • Make a  mind map  for your preliminary ideas. Show the relationships or patterns you expect to find in your data based on prior experience or preliminary reading.
  • Import your interviews, field notes, focus groups —organize these files into folders for easy access.
  • Set up an initial code structure based on your early reading and ideas—you could run a  Word Frequency query over your data to tease out the common themes for creating your code structure.
  • Set up  cases  for the people, places or other cases in your project.
  • Explore your material and  code themes as they emerge in your data mining —creating memos and describing your discoveries and interpretations.
  • To protect your work, get in the habit of making regular back-ups.

QDA Analysis Tools Help You Work Toward Outcomes that are Robust and Transparent

Using QDA software to organize and analyze your data also increases the 'transparency' of your research outcomes—for example, you can:

  • Demonstrate the evolution of your ideas in memos and maps.
  • Document your early preconceptions and biases (in a memo or map) and demonstrate how these have been acknowledged and tested.
  • Easily find illustrative quotes.
  • Always return to the original context of your coded material.
  • Save and revisit the queries and visualizations that helped you to arrive at your conclusions.

QDA software, like NVivo, can demonstrate the credibility of your findings in the following ways:

  • If you used NVivo for your literature review, run a  query  or create a  chart  to demonstrate how your findings compare with the views of other authors.
  • Was an issue or theme reported by more than one participant? Run a  Matrix Coding query  to see how many participants talked about a theme.
  • Were multiple methods used to collect the data (interviews, observations, surveys)—and are the findings supported across these text data and video data files? Run a Matrix Coding query to see how often a theme is reported across all your files.


  • If multiple researchers analyzed the material — were their findings consistent? Use coding stripes (or filter the contents in a code) to see how various team members have coded the material and run a Coding Comparison query to assess the level of agreement.
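Outside of NVivo, a comparable check on inter-coder agreement can be approximated with Cohen's kappa from scikit-learn. This is only a rough stand-in for NVivo's Coding Comparison query, and the two coders' labels below are fabricated.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical code assigned by each researcher to the same ten segments.
coder_a = ["access", "cost", "cost", "trust", "access",
           "trust", "cost", "access", "trust", "cost"]
coder_b = ["access", "cost", "trust", "trust", "access",
           "trust", "cost", "access", "cost", "cost"]

kappa = cohen_kappa_score(coder_a, coder_b)
print(f"Cohen's kappa: {kappa:.2f}")  # values near 1 indicate strong agreement
```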


QDA Software Integrations

Many qualitative analysis software options have integration with other software to enhance your research process. NVivo integrates or can be used with the following software:

  • NVivo Transcription to save you time and jump start your qualitative data analysis. Learn how in the on-demand webinar Transcription – Go Beyond the Words .
  • Reference management software, like Lumivero’s Citavi, for reference management and writing. By combining Citavi and NVivo, you can build complex searches for particular keywords, terms, and categories and use advanced search syntax, like wildcards, boolean operators, and regular expressions. This integration allows you to take your analyses beyond reference management by developing a central location to collect references and thoughts, analyze literature, and connect empirical data.
  • Statistical software, like Lumivero’s XLSTAT , SPSS, or STATA to export your queries from NVivo to run statistical analysis
  • Qualtrics, SurveyMonkey to import your survey results into NVivo to start analyzing.

Make Choosing QDA Software Easy —  Try NVivo Today!

It's tough choosing QDA software! Test out NVivo, the most cited qualitative data analysis tool, by requesting a free 14-day trial of NVivo to start improving your qualitative and mixed methods research today.


What is Qualitative Data Analysis?

Understanding qualitative data analysis is important for researchers seeking to uncover nuanced insights from non-numerical data. By exploring qualitative data analysis, you can grasp its importance in research, understand its methodologies, and determine when and how to apply it effectively to extract meaningful insights from qualitative data.

This article aims to provide a comprehensive guide to understanding qualitative data analysis, covering its significance, methodologies, steps, advantages, disadvantages, and applications.

Understanding Qualitative Data Analysis

Qualitative data analysis is the process of systematically examining and interpreting qualitative data (such as text, images, videos, or observations) to discover patterns, themes, and meanings within the data. Unlike quantitative data analysis, which focuses on numerical measurements and statistical techniques, qualitative data analysis emphasizes understanding the context, nuances, and subjective perspectives embedded in the data.

Importance of Qualitative Data Analysis

Qualitative data analysis is crucial because it goes beyond the cold, hard figures to provide a richer understanding of why and how things happen. It matters for several reasons:

  • Understanding complexity and unveiling the “why”: Quantitative data tells you “what” happened (e.g., sales figures), but qualitative analysis sheds light on the reasons behind it (e.g., customer comments on product features).
  • Contextual insight: Numbers don’t exist in a vacuum. Qualitative data provides context for quantitative findings, making the bigger picture clearer. Imagine high customer churn: interviews might reveal missing functionality or a confusing interface.
  • Uncovers emotions and opinions: Qualitative data taps into the human element. Surveys with open-ended questions or focus groups can reveal emotions, opinions, and motivations that can’t be captured by numbers alone.
  • Informs better decisions: By understanding the “why” and the “how” behind customer behavior or employee sentiment, companies can make more informed decisions about product development, marketing strategies, and internal processes.
  • Generates new ideas: Qualitative analysis can spark fresh ideas and hypotheses. For example, analyzing customer interviews may surface common themes that lead to entirely new product features.
  • Complements quantitative data: While both data types are valuable, they work best together. Imagine combining website traffic data (quantitative) with user comments (qualitative) to understand the user experience on a particular webpage.

In essence, qualitative data analysis bridges the gap between the what and the why, providing a nuanced understanding that empowers better decision making.

Steps to Perform Qualitative Data Analysis

To perform qualitative data analysis, follow the steps below.

1. Craft Clear Research Questions

Before diving into analysis, it is critical to define clear and specific research questions. These questions should articulate what you want to learn from the data and guide your analysis toward actionable insights. For instance, asking “How do employees perceive the organizational culture within our company?” focuses the analysis on understanding employees’ perceptions of the organizational culture within a particular company. By exploring employees’ perspectives, attitudes, and experiences related to organizational culture, researchers can uncover valuable insights into workplace dynamics, communication patterns, leadership styles, and employee satisfaction levels.

2. Gather Rich Customer Insights

There are numerous ways to collect qualitative data, each offering unique insights into customer perceptions and experiences.

  • User feedback: In-app surveys, app ratings, and social media comments provide direct feedback from users about their experiences with the product or service.
  • In-depth interviews: One-on-one interviews allow for deeper exploration of particular topics and offer rich, detailed insights into individuals’ perspectives and behaviors.
  • Focus groups: Facilitating group discussions allows the exploration of diverse viewpoints and lets participants build on each other’s ideas.
  • Review sites: Analyzing customer reviews on platforms like Amazon, Yelp, or app stores can reveal common pain points, satisfaction levels, and areas for improvement.
  • NPS follow-up questions: Following up on Net Promoter Score (NPS) surveys with open-ended questions allows customers to elaborate on their ratings and provides qualitative context for quantitative scores.

Efficient facts below is crucial for powerful analysis and interpretation.

  • Centralize: Gather all qualitative data, including recordings, notes, and transcripts, into a central repository for easy access and management.
  • Categorize by Research Question: Group data based on the specific research questions it addresses. This organizational structure helps maintain focus during analysis and ensures that insights are aligned with the research objectives. A minimal sketch of this kind of organization follows below.
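
To make this concrete, here is a minimal, hypothetical sketch in Python of centralizing transcripts and grouping them by research question. The folder name, research-question label, and record fields are illustrative assumptions rather than part of any particular tool or workflow.

```python
from dataclasses import dataclass, field
from pathlib import Path


@dataclass
class QualRecord:
    source: str              # e.g. "interview", "focus group", "open-ended survey"
    research_question: str   # which research question this material addresses
    text: str                # the raw qualitative material


@dataclass
class Repository:
    """A central store for all qualitative material in a project."""
    records: list = field(default_factory=list)

    def add(self, record: QualRecord) -> None:
        self.records.append(record)

    def by_question(self, question: str) -> list:
        # Categorize by research question: return only the matching records
        return [r for r in self.records if r.research_question == question]


# Hypothetical usage: pull plain-text interview transcripts into one repository
repo = Repository()
for path in Path("transcripts").glob("*.txt"):  # assumed folder of .txt transcripts
    repo.add(QualRecord(source="interview",
                        research_question="RQ1: perceptions of organizational culture",
                        text=path.read_text(encoding="utf-8")))

rq1_records = repo.by_question("RQ1: perceptions of organizational culture")
print(f"{len(rq1_records)} records address RQ1")
```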

The next step is coding. Coding is a systematic process of assigning labels or categories to segments of qualitative data in order to uncover underlying themes and patterns.

  • Theme Identification: Themes are overarching concepts or ideas that emerge from the data. During coding, researchers identify and label segments of data that relate to these themes, allowing the central concepts in the dataset to be identified.
  • Pattern Detection: Patterns refer to relationships or connections between different elements in the data. By analyzing coded segments, researchers can detect trends, repetitions, or cause-and-effect relationships, providing deeper insights into customer perceptions and behaviors. A toy keyword-based coding sketch follows this list.
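
Coding is usually done by a researcher, often with the help of dedicated software, but the idea of attaching theme labels to text segments can be illustrated with a toy, keyword-based sketch. The codebook and segments below are invented for illustration and are far cruder than real coding.

```python
# Toy codebook: theme label -> indicative keywords (invented for illustration)
CODEBOOK = {
    "workload": ["overtime", "deadline", "too much work", "stress"],
    "management support": ["manager", "supervisor", "feedback", "support"],
    "team culture": ["colleagues", "team", "trust", "collaboration"],
}


def code_segment(segment: str) -> list:
    """Return every theme whose keywords appear in the segment."""
    text = segment.lower()
    return [theme for theme, keywords in CODEBOOK.items()
            if any(keyword in text for keyword in keywords)]


segments = [
    "My manager gives very little feedback, so I never know where I stand.",
    "The deadlines mean constant overtime and a lot of stress.",
]

for segment in segments:
    print(code_segment(segment), "->", segment)
# In a real analysis, a researcher would review, merge, and refine these labels
# iteratively rather than rely on fixed keyword matches.
```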

Based on the identified themes and patterns, researchers can then formulate hypotheses and draw conclusions about customer experiences and preferences.

  • Hypothesis Formulation: Hypotheses are tentative explanations or predictions based on observed patterns in the data. Researchers formulate hypotheses to explain why certain themes or patterns emerge and to make predictions about their impact on customer behavior.
  • Validation: Researchers validate hypotheses by segmenting the data based on different criteria (e.g., demographic factors, usage patterns) and examining differences or relationships within the data. This process helps strengthen the validity of findings and provides evidence to support conclusions drawn from the qualitative analysis. A small sketch of this kind of comparison follows below.
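
As a rough illustration of the validation step, the sketch below compares how often a coded theme appears in two user segments. The segments, themes, and counts are all made up; in practice the coded data would come from the earlier coding step.

```python
from collections import Counter

# (user segment, themes found in one coded excerpt) -- invented example data
coded = [
    ("new users", ["confusing interface"]),
    ("new users", ["confusing interface", "missing features"]),
    ("long-time users", ["missing features"]),
    ("long-time users", ["pricing"]),
]


def theme_frequencies(group: str) -> Counter:
    """Count how often each theme appears within one user segment."""
    counts = Counter()
    for segment, themes in coded:
        if segment == group:
            counts.update(themes)
    return counts


# Hypothesis: "new users churn mainly because the interface is confusing"
print("new users:      ", theme_frequencies("new users"))
print("long-time users:", theme_frequencies("long-time users"))
# If "confusing interface" dominates only among new users, that lends support
# to the hypothesis; if not, the hypothesis should be revised.
```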

There are five commonly used methodologies in qualitative data analysis:

  • Thematic Analysis: Thematic analysis involves systematically identifying and analyzing recurring themes or patterns within qualitative data. Researchers begin by coding the data, breaking it down into meaningful segments, and then categorizing these segments based on shared characteristics. Through iterative analysis, themes are developed and refined, allowing researchers to gain insight into the underlying phenomena being studied.
  • Content Analysis: Content analysis focuses on analyzing textual data to identify and quantify specific patterns or themes. Researchers code the data based on predefined categories or themes, allowing systematic organization and interpretation of the content. By examining how frequently certain themes occur and how they are represented in the data, researchers can draw conclusions and insights relevant to their research objectives. A small frequency-counting sketch follows this list.
  • Narrative Analysis: Narrative analysis delves into the narrative or story within qualitative data, focusing on its structure, content, and meaning. Researchers examine the narrative to understand its context and perspective, exploring how individuals construct and communicate their experiences through storytelling. By examining the nuances and intricacies of the narrative, researchers can uncover underlying themes and gain a deeper understanding of the phenomena being studied.
  • Grounded Theory: Grounded theory is an iterative approach to developing and testing theoretical frameworks based on empirical data. Researchers collect, code, and analyze data without preconceived hypotheses, allowing theories to emerge from the data itself. Through constant comparison and theoretical sampling, researchers validate and refine theories, leading to a deeper understanding of the phenomenon under investigation.
  • Phenomenological Analysis: Phenomenological analysis aims to explore and understand the lived experiences and perspectives of individuals. Researchers analyze and interpret the meanings, essences, and structures of these experiences, identifying common themes and patterns across individual accounts. By immersing themselves in participants’ subjective experiences, researchers gain insight into the underlying phenomena from the participants’ perspectives, enriching our understanding of human behavior.
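
Of these approaches, content analysis lends itself most readily to simple quantification, so here is a minimal, hypothetical sketch of counting predefined categories across a handful of responses. The categories, indicator terms, and responses are invented for illustration.

```python
from collections import Counter

# Predefined content-analysis categories and their indicator terms (illustrative)
CATEGORIES = {
    "price": ["expensive", "cheap", "cost", "price"],
    "usability": ["easy", "intuitive", "confusing", "hard to use"],
    "support": ["support", "helpdesk", "response time"],
}

responses = [
    "The app is intuitive but far too expensive for what it does.",
    "Support never answered; the response time is terrible.",
    "Easy to use, fair price.",
]

counts = Counter()
for response in responses:
    text = response.lower()
    for category, terms in CATEGORIES.items():
        if any(term in text for term in terms):
            counts[category] += 1  # count each category at most once per response

print(counts)  # Counter({'price': 2, 'usability': 2, 'support': 1})
```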

Qualitative data analysis offers several advantages:

  • Richness and Depth: Qualitative data analysis allows researchers to explore complex phenomena in depth, capturing the richness and complexity of human experiences, behaviors, and social processes.
  • Flexibility: Qualitative methods offer flexibility in data collection and analysis, allowing researchers to adapt their approach based on emergent themes and evolving research questions.
  • Contextual Understanding: Qualitative analysis provides insight into the context and meaning of data, helping researchers understand the social, cultural, and historical factors that shape human behavior and interactions.
  • Subjective Perspectives: Qualitative methods enable researchers to explore subjective perspectives, beliefs, and experiences, offering a nuanced understanding of people’s thoughts, emotions, and motivations.
  • Theory Generation: Qualitative data analysis can lead to the generation of new theories or hypotheses, as researchers uncover patterns, themes, and relationships in the data that may not have been previously recognized.

It also comes with challenges:

  • Subjectivity: Qualitative data analysis is inherently subjective, as interpretations can be influenced by researchers’ biases, perspectives, and preconceptions.
  • Time-Intensive: Qualitative data analysis can be time-consuming, requiring extensive data collection, transcription, coding, and interpretation.
  • Generalizability: Findings from qualitative research may not be easily generalizable to larger populations, as the focus is often on understanding specific contexts and experiences rather than making statistical inferences.
  • Validity and Reliability: Ensuring the validity and reliability of qualitative findings can be difficult, as there are fewer standardized procedures for assessing and establishing rigor compared to quantitative research.
  • Data Management: Managing and organizing qualitative data, including transcripts, field notes, and multimedia recordings, can be complex and requires careful documentation and storage.

Qualitative data analysis is particularly well suited to the following situations:

  • Exploratory Research: Qualitative data analysis works well for exploratory research, where the aim is to generate hypotheses, theories, or insights into complex phenomena.
  • Understanding Context: Qualitative methods are valuable for understanding the context and meaning of data, particularly in studies where social, cultural, or historical factors are important.
  • Subjective Experiences: Qualitative analysis is ideal for exploring subjective experiences, beliefs, and perspectives, providing a deeper understanding of people’s thoughts, feelings, and behaviors.
  • Complex Phenomena: Qualitative methods are effective for studying complex phenomena that cannot easily be quantified or measured, allowing researchers to capture the richness and depth of human experiences and interactions.
  • Complementary to Quantitative Data: Qualitative data analysis can complement quantitative research by providing context, depth, and insight into the meanings behind numerical data, enriching our understanding of research findings.

Common fields of application include:

  • Social Sciences: Qualitative data analysis is widely used in the social sciences to understand human behavior, attitudes, and perceptions. Researchers employ qualitative methods to delve into the complexities of social interactions, cultural dynamics, and societal norms. By analyzing qualitative data such as interviews, observations, and textual sources, social scientists gain insight into the intricate nuances of human relationships, identity formation, and societal structures.
  • Psychology: In psychology, qualitative data analysis is instrumental in exploring and interpreting individual experiences, emotions, and motivations. Qualitative methods such as in-depth interviews, focus groups, and narrative analysis allow psychologists to delve deep into the subjective experiences of individuals. This approach helps uncover underlying meanings, beliefs, and emotions, shedding light on psychological processes, coping mechanisms, and personal narratives.
  • Anthropology: Anthropologists use qualitative data analysis to study cultural practices, beliefs, and social interactions within diverse groups and societies. Through ethnographic research methods such as participant observation and interviews, anthropologists immerse themselves in the cultural contexts of different groups. Qualitative analysis allows them to uncover the symbolic meanings, rituals, and social structures that shape cultural identity and behavior.
  • Qualitative Market Research: In market research, qualitative data analysis is essential for exploring consumer preferences, perceptions, and behaviors. Qualitative techniques such as focus groups, in-depth interviews, and ethnographic research allow market researchers to gain a deeper understanding of consumer motivations, decision-making processes, and brand perceptions. By analyzing qualitative data, marketers can identify emerging trends, uncover unmet needs, and inform product development and marketing strategies.
  • Healthcare: Qualitative data analysis plays an important role in healthcare research by investigating patient experiences, satisfaction, and healthcare practices. Researchers use qualitative methods such as interviews, observations, and patient narratives to explore the subjective experiences of individuals within healthcare settings. Qualitative analysis helps uncover patient perspectives on healthcare services, treatment outcomes, and quality of care, facilitating improvements in patient-centered care delivery and healthcare policy.

Qualitative data analysis brings depth, context, and understanding to research endeavors, enabling researchers to uncover rich insights and explore complex phenomena through systematic examination of non-numerical data.


  • Open access
  • Published: 09 May 2024

Getting an outsider’s perspective - sick-listed workers’ experiences with early follow-up sessions in the return to work process: a qualitative interview study

  • Martin Inge Standal 1 , 2 ,
  • Vegard Stolsmo Foldal 1 ,
  • Lene Aasdahl 1 , 3 ,
  • Egil A. Fors 1 &
  • Marit Solbjør 1  

BMC Health Services Research, volume 24, Article number: 609 (2024)

The aim of this study was to explore how early follow-up sessions (after 14 and 16 weeks of sick leave) with social insurance caseworkers were experienced by sick-listed workers, and how these sessions influenced their return-to-work process.

A qualitative interview study with sick-listed workers who completed two early follow-up sessions with caseworkers from the Norwegian Labor and Welfare Administration (NAV). Twenty-six individuals aged 30 to 60 years with a sick leave status of 50–100% participated in semi-structured interviews. The data was analyzed with thematic analysis.

Participants’ experiences of the early follow-up sessions could be categorized into three themes: (1) Getting an outsider’s perspective, (2) enhanced understanding of the framework for long term sick-leave, and (3) the empathic and personal face of the social insurance system. Meeting a caseworker enabled an outsider perspective that promoted critical reflection and calibration of their thoughts. This was experienced as a useful addition to the support many received from their informal network, such as friends, family, and co-workers. The meetings also enabled a greater understanding of their rights and duties, possibilities, and limitations regarding welfare benefits, while also displaying an unexpected empathic and understanding perspective from those working in the social insurance system.

For sick-listed individuals, receiving an early follow-up session from social insurance caseworkers was a positive experience that enhanced their understanding of their situation, and promoted reflection towards RTW. Thus, from the perspective of the sick-listed workers, early sessions with social insurance caseworkers could be a useful addition to the overall sickness absence follow-up.

Introduction

Returning to work (RTW) from long-term sick leave is a complex and multifaceted process [ 1 ]. Prolonged sick leave has been linked to poorer health [ 2 ] and is thought to increase the psychosocial obstacles to RTW [ 3 ]. Therefore, early RTW interventions have been suggested to be central to the RTW process [ 3 ]. Long-term sickness absence is often understood as sick leave beyond 4–8 weeks of work absence. Most workers return to work on their own within the first few months of absence [ 4 ], and interventions in the following weeks can improve the likelihood of RTW for those remaining [ 5 , 6 , 7 , 8 ]. Furthermore, in the context of long-term sick leave, interventions contributing to earlier RTW can be highly cost-effective [ 9 , 10 ].

In Norway, the responsibility for early sick-leave follow-up is shared between the general practitioner (GP), who certifies sick leave and assesses remaining work capability, and the employer, who should make accommodations at the workplace to facilitate RTW [ 11 ]. The employer has the main responsibility to assist their employees back to work, but many employers lack the resources to properly facilitate RTW [ 12 ], and GPs may not see RTW as one of their primary focuses [ 13 ]. Thus, the existing system for early RTW follow-up in Norway, which largely relies on the cooperation between employer and employee, may not be sufficient to promote RTW [ 14 ]. This means that more effort to promote RTW might be needed. For instance, in other legislative systems, RTW coordinators who assist other stakeholders and facilitate the RTW process are frequently used [ 15 , 16 ]. In Norway, there are no formal RTW coordinator roles, and the task of facilitating cooperation between stakeholders, such as the employer, healthcare services and the sick-listed, falls on social insurance caseworkers working in the Norwegian Labour and Welfare Administration (NAV). They have a counseling role in sickness absence follow-up by providing support for the employer and sick-listed worker, but they also act as a controller of eligibility for sickness benefits [ 17 ]. Ordinarily, there are few meeting points between the sick-listed worker and their NAV caseworker, and most sick-listed workers have their first meeting with NAV when they have been sick-listed for six months.

The impact of RTW coordinators is contested. A broad systematic review determined that RTW coordinators had little effect on RTW [ 18 ]. However, face-to-face meetings with RTW coordinators have also been shown to increase RTW rates [ 19 ]. Evidence from Norway suggests that meetings between NAV caseworkers, sick-listed individuals and other stakeholders at 26 weeks could be cost-beneficial for RTW [ 20 ]. Caseworkers reviewing possibilities and barriers to RTW has also been found to improve the caseworkers’ knowledge of the sick-listed’s situation and consequently to improve RTW rates in the following months [ 21 ]. Social insurance caseworkers could thus be in a position to provide additional case management and support in the earlier stages of sick leave. Researchers have also suggested that NAV should play a more active part in the earlier phases of long-term sick leave [ 22 ]. Similarly, caseworkers have also called for being involved earlier in the RTW process [ 23 ]. In their experience, the longer workers are on sick leave, the harder it is to facilitate RTW [ 14 ]. Moreover, sick-listed individuals in Norway also expect some form of NAV involvement in the early stage of long-term sick leave [ 24 ].

In a recent study, sick-listed workers experienced that early follow-up sessions where NAV caseworkers used motivational interviewing helped normalize their situation and improved their beliefs in their RTW plan [ 25 ]. Given the extensive resources required to implement and adopt motivational interviewing in a social insurance setting [ 23 ], it is also useful to know how early additional follow-up sessions without a guided focus are experienced, and how they could fit within the standard follow-up for workers on long-term sick leave.

Thus, the aim of this study was to investigate how sick-listed workers experienced early additional follow-up sessions with NAV and how they experienced the influence of the sessions on their RTW process.

Materials and methods

The present study was based on 26 semi-structured individual interviews with sick-listed workers participating in a randomized controlled trial (RCT). The aim of the RCT was to evaluate the effect of motivational interviewing as an instrument for caseworkers at NAV in facilitating RTW for sick-listed workers [ 26 ]. The early follow-up sessions, which this paper focuses on, served as an active control group.

The Norwegian welfare system and sickness absence follow-up

In Norway, employees are entitled to full wage benefits in the case of sickness absence, from the first day of absence to a maximum period of 52 weeks. Sick leave is in most cases certified by the individual’s general practitioner. During the first 16 days, the employer is responsible for the payment, while the rest is paid for by the National Insurance Scheme through NAV [ 27 ]. The employer must initiate a follow-up plan in cooperation with the employee before the end of the fourth week of sick leave and is responsible for arranging a meeting with the sick-listed worker within the seventh week of absence, including other stakeholders if relevant. If the employer facilitates work-related activities, the sick-listed worker is required to participate. NAV is responsible for arranging a meeting including the employer and the sick-listed worker at 26 weeks of sick leave. The attendance of the sick-listed worker’s GP is optional. However, the GP is obliged to attend if NAV deems it necessary for the coordination of the RTW process. This is the only obligatory meeting point between a sick listed worker and NAV. Additional meetings can also be held if one or more of the stakeholders find it necessary. Thus, the sick-listed worker may also ask for a meeting with NAV to coordinate a plan for RTW outside this schedule [ 27 ]. After 12 months of sick leave, it is possible to apply for the more long-term benefits, work assessment allowance and permanent disability pension.

The early follow-up sessions

The early follow-up sessions for this study were in addition to ordinary NAV follow-up and consisted of two counseling sessions held at 14 and 16 weeks of sick leave. The sessions, offered by a NAV caseworker, lasted a maximum of 60 min. During the first session, the caseworker aimed to map out the sick-listed worker’s work situation, their relationship to their employer, their RTW plan, treatment plans and work ability, in addition to informing the sick-listed worker about their rights and duties as sick-listed. The caseworkers also provided information about possible RTW measures available through NAV. The second session focused on following up on the topics discussed in the first session, as well as focusing on any changes in the sick-listed workers’ situation that might have occurred between the first and second session.

These sessions functioned as an active control group in the RCT and were designed to be similar to the motivational interviewing sessions provided in terms of dose and timing. Caseworkers providing the active control sessions were separate from those providing the motivational interviewing sessions and they received no formal motivational interviewing training. They were, however, recruited voluntarily to the study from the same NAV-office as those performing the motivational interviewing sessions. Caseworkers were not randomized to group in the RCT and thus joined knowing that they would provide early follow-up using their usual methods.

Study population and recruitment

The study population consisted of sick-listed workers who were enrolled in the RCT. Eligible participants were sick-listed workers aged 18–60 years old, living in central Norway, with any diagnosis. Their sick-leave status at the time of inclusion in the RCT was 50–100% for at least 8 weeks. Exclusion criteria were pregnancy-related sick leave, unemployment, and being self-employed. To be eligible to participate in this interview study, the sick-listed worker had to have been randomized to the active control group in the RCT and completed the early follow-up sessions. Eligible participants were identified by NAV, and contact info was forwarded to the researchers. A member of the project group invited the participants to take part in the research interview by phone. A total of 40 individuals were invited to participate in the interview study, of which 14 did not answer, declined the invitation, or did not show up at the interview. Twenty-six individuals participated in the interviews, including 19 women and 7 men aged 31–61. Participants showed diversity in their self-reported reasons for being sick-listed, with 11 having mental health disorders, 8 having musculoskeletal disorders, and 7 reporting other disorders.

Data collection

We conducted semi-structured individual interviews which allowed the participants to provide in-depth descriptions of their experiences. Interviews were based on an interview guide with five main questions concerning their experiences during sick leave, the RTW process, experiences of the two follow-up sessions, and whether these sessions led to any changes during their RTW process. The interviews were conducted between November 2018 and September 2019 and were audio recorded and transcribed verbatim. The duration of the interviews ranged from 35 min to 65 min.

Data analysis

For our data analysis, we used reflexive thematic analysis, which is a method for identifying, analyzing, and reporting patterns within qualitative data [ 28 ]. Thematic analysis is a flexible approach which allows researchers to interpret the data through a six-phase recursive process, moving back and forth between phases to build themes from codes. The first step of the analysis involved becoming familiar with the data [ 28 ], where transcripts of all interviews were read and re-read by authors VSF, MIS and MS to get an overall impression of the contents. Preliminary codes and patterns were identified as a start of the coding process. The second step of the analysis was the coding process, where items of interest related to the aim were coded by author VSF. These codes were then used to create core categories for further development of initial themes [ 28 ]. The third step was combining the codes into initial themes, a data-reducing process which allows interpretation from the researchers [ 28 ]. Initial themes were discussed among all authors. The fourth step was reviewing the generated themes and checking them against the coded data, in order to further expand or revise the developed themes [ 28 ]. When reviewing the generated themes against the coded data, the preliminary analysis indicated a tendency for participants who received good support and follow-up from their employer to consider the early follow-up sessions by NAV less useful than participants who lacked such support and follow-up. However, a coding of the interviews focusing on this aspect showed no clear tendency to favor the early follow-up sessions based on high or low employer support. Thus, the initial themes were further developed into the three main themes which are presented below. All authors had several meetings to discuss, define and refine the final themes in order to tell a coherent and compelling story about the data [ 28 ].
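
The analysis in this study was manual and interpretive rather than computational, but for readers unfamiliar with the bookkeeping behind thematic analysis, the purely illustrative sketch below shows one way coded excerpts can be grouped under candidate themes for review. All interview IDs, excerpts, codes, and theme labels are invented and are not taken from the study’s data.

```python
from collections import defaultdict

# Invented, simplified example -- not data, codes, or themes from this study.
# Each tuple: (interview id, excerpt, code assigned during the coding phase)
coded_excerpts = [
    ("I-01", "talking to someone neutral helped me think things through", "outside view"),
    ("I-02", "I finally understood the rules around my benefits", "system knowledge"),
    ("I-03", "the caseworker really listened to what I had to say", "feeling heard"),
    ("I-04", "she asked questions nobody else had asked me", "outside view"),
]

# Grouping related codes into candidate themes (roughly phases 3-5 of the process)
code_to_theme = {
    "outside view": "Candidate theme A: a new perspective on the situation",
    "system knowledge": "Candidate theme B: understanding the system",
    "feeling heard": "Candidate theme C: feeling acknowledged",
}

theme_to_excerpts = defaultdict(list)
for interview, excerpt, code in coded_excerpts:
    theme_to_excerpts[code_to_theme[code]].append((interview, excerpt))

# Reviewing themes against the coded data (phase 4) would mean re-reading these
# groupings and revising the code-to-theme mapping where it does not fit.
for theme, excerpts in theme_to_excerpts.items():
    print(theme)
    for interview, excerpt in excerpts:
        print(f"  {interview}: {excerpt}")
```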

All participants received written and oral information about the study and gave their written consent before the interview started. Participants were informed that participation was voluntary and that they could withdraw from the study at any time before the data had been anonymized and integrated into the analysis.

The study was approved by the Regional Committee for Medical and Health Research Ethics in Southeast Norway (No: 2016/2300).

Results

Overall, the participants had positive experiences with the content and timing of the first session. The second session, however, was frequently experienced as an unnecessary repetition of the first, as much of the content had already been covered. In the following, we present participants’ experiences of the early follow-up sessions as three themes: (1) getting an outsider’s perspective, (2) enhanced understanding of the framework for long-term sick leave, and (3) the empathic and personal face of the social insurance system.

Getting an outsider’s perspective

Participants described the meetings with a NAV caseworker as a positive experience that also challenged their current view of their situation and their RTW process. Meeting a NAV caseworker was experienced as an arena where they received guidance from an individual who examined their situation through an outsider’s perspective. NAV caseworkers provided support and encouragement, but also asked critical questions regarding their situation and their plans for RTW.

“… we talked primarily about my situation, and I felt like I was allowed to talk to someone unbiased, without you know, being limited in the conversation. And I felt like I could talk about those things important to me. […] it turned out to be a good dialogue where she pulled me further, and made me think about a couple of things” - Interview 3 .

The outside perspective was described as useful due to the participants’ context prior to the meeting, which was their everyday lives with friends, colleagues, family, GPs, and employers. This informal network was described as a significant source of support during the sick leave and served an important role as confidants to whom the sick-listed worker could talk about their difficult or confusing situation. The formal support from the employer varied: some experienced several supportive phone calls and meetings with the employer during their sick leave, while others had only had a single formal meeting. Having support from the employer was experienced as crucial for a good RTW process, and absence of support and a distant relationship to the employer led to a difficult RTW process with negative emotions and reduced belief in their RTW capabilities. Participants also experienced that talking freely with the employer could be difficult, and that they might be held accountable if they confided about difficulties in RTW. Thus, in contrast to the largely supportive informal network, and the restrained environment surrounding employer support, meeting the NAV caseworkers provided a useful outside perspective. When describing the early sessions compared to their overall sick-leave follow-up, participants described meeting NAV as a calibration of their thoughts that provided a new perspective compared to their other RTW supporters.

Enhanced understanding of the framework for long term sick leave

An important element of the first meeting was receiving information about rights and obligations as sick-listed, and the framework for future economic benefits. Receiving information about potential future loss of income and the possibility of receiving disability benefits was novel and useful for the participants. For some, this information led to new reflections on how being long-term sick-listed would have financial consequences, thereby providing another push for returning to work. For one participant, information about possible future loss of income provoked a feeling of panic and challenged her sense of identity.

“I remember that when he started talking about work assessment allowance, I panicked a bit. Because I couldn’t identify with that category. But at the same time, I thought, okay, it’s good information to have you know.” - Interview 2 .

Furthermore, the participants were happy with the agenda of the first meeting, where the NAV caseworkers focused on short-term as well as long-term plans for RTW and gave personal feedback on participants’ RTW plans. Included in the short- and long-term focus was receiving information from NAV about available RTW measures and interventions. Whether the sick-listed workers were planning a fast- or slow-paced RTW, they experienced that receiving support for their plans and ideas strengthened their belief in managing RTW. NAV caseworkers also presented different strategies relating to possible accommodations at work, such as adjusting workload, work tasks and working time. Information such as the possibility of adjusting their time spent at work and their sick-leave status enabled the sick-listed workers to reorient their perception towards returning to work.

“… in a way I hadn’t thought so carefully about when it’s smart to return and in what percentage. Because when I got that deal with the GP where I was still 100% sick-listed but could regulate it myself within 20% it was the first step to beginning to test myself.” - Interview 10 .

Participants received individually tailored information regarding the possibility of flexibility in the time spent at work and the amount of work they produced (i.e., sick leave percentage does not reflect hours spent at work, only the amount of work one does). This was highlighted as new and important information that was experienced as a contribution towards RTW.

The empathic and personal face of the social insurance system

All study participants had taken part in two sessions with a caseworker from NAV. Prior to these sessions, NAV had been perceived as difficult to get in touch with, and some feared that cooperation with NAV would be either difficult or absent. However, when meeting the NAV caseworker, their fears diminished and, to their surprise, they were met by supportive, accommodating, and friendly caseworkers.

“NAV got a face; a personal face and NAV was no longer the huge colossus. The anonymous colossus that no one understands that just spews rules you have to relate to, which can be very … I can react with fear, I get afraid. “Am I doing this right?” you know. Am I following all these rules that I do not understand? What happened when NAV suddenly became a person was that they were on my side. They helped me, and it was possible to talk to NAV. A nice person helped me instead of rules that try to hinder me that I have to follow.” – Interview 19 .

The early follow-up sessions were experienced as more relevant when compared with other follow-up from their employer or with later meetings with other caseworkers from NAV.

“I wished that the other later conversations and meetings [with NAV] was comprised of the same understanding and competence that this counselor had. So that is what I’m sitting here thinking, that this was a star example of how one should be met, you know.” – Interview 5 .

The positive experiences of the early follow-up sessions were due to the understanding atmosphere created by the caseworkers, who were perceived as genuinely interested in the participants’ situation, cooperative, and willing to reflect jointly on their RTW plan. Caseworkers asked questions about aspects of the participants’ lives that could be related to their situation as sick-listed workers, and they appeared attentive when listening. This led to the experience of being met as a whole person and contributed to the early follow-up sessions being experienced as an arena where they felt acknowledged and cared for.

“So, I came to NAV in high spirits and was well received and excellently informed and had a great conversation, really. Felt like I was to a psychologist, but that may be what I needed, and a neutral third-party that I feel listens to me. […] that is good medicine I think - that someone listens to what I say.” – Interview 6 .

Although some of the topics were considered quite personal, the sick-listed workers mostly experienced a respectful and reassuring dialogue with the caseworker. This personal and accommodating approach was overall positive for the participants, as the caseworkers matched their personality and behavior quite well. For several participants, the early follow-up sessions were considered almost therapeutic:

“You know, I experienced [the sessions] very positively. I met a counselor that displayed a lot of understanding and for me it was almost therapeutic to talk to her. I sat there and thought wow, either something has happened to NAV or this person is hand-picked for me.” – Interview 5 .

On the other hand, talking about health-related topics such as psychological well-being while being sick-listed could be emotionally straining. Some considered this therapeutic approach to the sessions out of place. When these participants experienced questions from the caseworker as too personal, they saw their caseworker as intrusive and prying into personal issues. Such situations emphasized the caseworkers’ position as representatives of the social insurance system, with its functions of control and surveillance.

Discussion

The results from this study showed that the participants experienced early follow-up sessions by social insurance caseworkers as positive. They described the value of receiving an outside view of their situation and practical information about being on sick leave, while at the same time being met with a supportive and respectful demeanor. These aspects were described as promoting reflection on their situation and their thoughts on RTW. The second session was, however, frequently experienced as superfluous and a repetition of the first session. This can also be seen in the results, where participants to a large degree described the benefits of simply meeting an understanding NAV caseworker who provides practical information and helps them reflect on their situation, which could be achieved through a single session.

The sick-listed workers who experienced good supportive contact in the current study considered this to be instrumental for their RTW process. Comparatively, some sick-listed workers experienced an absence of support and a distant relationship to their employer. Supportive contact with the employer and workplace has been found to be critical in preventing work disability [ 29 , 30 ] and important for facilitating RTW for sick-listed workers [ 31 ]. The negative impact of lack of workplace support on RTW has also been demonstrated previously [ 29 , 30 , 32 , 33 ]. In the present study, participants to a large degree experienced support from their surrounding network. However, the type of support received has been suggested to play a role, where validation- and empathy-based support may promote coping behaviors that are beneficial for RTW, while solicitousness could be detrimental through encouraging illness behavior [ 34 ]. Thus, an outside view of the situation at an early stage of sick leave may be sensible. The present study shows that, regardless of the support from other stakeholders, getting a second opinion was an exceedingly positive experience which provided an avenue for reflection upon their current situation and their plans going forward. Openness in the dialogue with caseworkers has also been identified as relevant to experiencing a fair and acceptable sick-leave process [ 35 ], and RTW coordinators arguably are in a position to provide an unbiased perspective on RTW plans, independent of the other stakeholders [ 36 ].

One of the benefits experienced in the present study was a greater understanding of the framework of sick leave. Social insurance literacy relates to the sick-listed individual’s understanding of the social insurance system, how to act on the information obtained, and why decisions surrounding their situation are being made [ 36 , 37 ]. As individuals rarely have thorough knowledge of the social insurance system prior to sick-listing, social insurance literacy is also concerned with how well the system enables them to understand the process [ 38 ]. Previous research has suggested that enhancing the workers’ understanding of the system could improve their feelings of legitimacy and fairness in the process [ 35 ], and the present study provides some insight into how RTW coordinators could be experienced as helpful in this regard. Participants also described the clear agenda, in which the RTW plan was discussed, as useful. Examining barriers and facilitators for RTW and creating and re-examining the RTW plan is considered crucial to facilitate the RTW process [ 36 ]. The RTW-coordinator has also previously been suggested to have an important role in ensuring joint understanding and communication surrounding expectations and the context of long-term sick leave [ 39 ]. Thus, findings suggest that providing information on the system while inviting the sick-listed workers to reflect on their situation was experienced positively and possibly increased their social insurance literacy. However, the results in this study could also partly be explained by the context. It is possible that by voluntarily enrolling caseworkers and sick-listed workers in a research trial, a more individualized atmosphere was created in contrast to a more standardized RTW-follow-up scheme.

Nonetheless, the experiences of the participants in the present study were largely positive, and participants experienced being met with respect and understanding. Müssener and colleagues [ 40 ] also concluded in their study that how sick-listed individuals are treated affects their self-confidence and their perception of their ability to RTW. They suggest that the structural prerequisites for the RTW professional, such as having a gatekeeper role compared to a supportive role, seem to impact their treatment of sick-listed people [ 40 ]. The potential of the RTW coordinator to establish a good and trustful relationship with emphasis on the sick-listed workers’ motivation and resources in the RTW process has also been found to be important for RTW [ 41 , 42 , 43 ]. The conflicting roles of social insurance officers, being both facilitators and an authority over benefits, could potentially hinder the development of this relationship [ 41 ]. As identified by Karlsson [ 36 ], interactions between social insurance caseworkers and clients were perceived as either supportive or mistrustful. In the present study, the results suggest that the NAV caseworkers may have had a stronger focus on the facilitator role, rather than the role of being gatekeepers of benefits.

In a recent study, we found that sick-listed workers experienced early follow-up sessions with NAV as positive and that the sessions increased their RTW self-efficacy when the caseworkers used motivational interviewing [ 25 ]. In the current study, the sick-listed workers met with NAV caseworkers who were not using motivational interviewing but rather their ordinary approach when assisting sick-listed individuals. However, the experiences of the participants were strikingly similar in these two studies. The caseworker and sick-listed worker engaged in cooperative reflection about when and how to RTW, which the sick-listed workers experienced as valuable support and feedback for their RTW process. There may be some parallels to research on clinical psychotherapy, where studies have shown that the method of therapy may not be as important as the characteristics of the therapist [ 44 , 45 ]. For instance, having interpersonal skills that enable a therapeutic alliance in which one can effectively promote a course of action and create belief in change is considered vital [ 46 ]. Thus, being met by an empathic and understanding caseworker may be beneficial, regardless of the approach to the sessions. The present study supports the notion that having an early face-to-face meeting with a NAV caseworker can be a positive experience in the RTW process for long-term sick-listed workers.

Whether positive experiences with the social insurance system translate into higher RTW rates is still debatable. On the one hand, a recent systematic review on RTW coordinators’ impact on RTW found that work absence duration and intervention costs were reduced when sick-listed workers had face-to-face contact with an RTW coordinator [ 19 ]. On the other hand, previous research has discussed the lock-in effect of programs through the social insurance service, which may lead to longer periods on sick leave [ 47 ]. Similarly, regular contact with the social insurance office has been shown to have a negative effect on RTW rates, which may indicate the risk of developing a ‘social insurance career’ [ 48 ]. In a previous study we found that sick-listed individuals also experienced that caseworkers frequently recommended a slower RTW pace than what was originally planned [ 25 ]. Furthermore, even though the experiences of early contact with NAV caseworkers in the present study were positive, no impact on RTW outcomes could be identified in the trial results [ 49 ].

Strengths and limitations

A strength of the current study was the use of semi-structured interviews. This allowed the participants to elaborate and describe their experience of the early follow-up sessions in relation to their RTW process. In order to explore and uncover different experiences and nuances of the early follow-up sessions, a broad exploratory approach was used with a heterogeneous sample. All analytical steps and preliminary findings were discussed with members of the research group to strengthen the interpretations, and final results were validated by all authors. The study also has some limitations. First, caseworkers performing the sessions volunteered to take part in the RCT and to undertake the follow-up sessions. They received no motivational interviewing training but were recruited from the same offices as those in the motivational interviewing group. This means there could be a selection effect, where caseworkers who were more interested in early follow-up were more likely to take part. Furthermore, there could be a spillover effect in the office, where caseworkers receiving motivational interviewing training passed on their knowledge to others in the office. We do, however, believe the impact of any spillover effect was small, as recruitment was from one of the largest NAV offices in Norway, and our previous study shows that extensive training in motivational interviewing was required to achieve beginning proficiency [ 23 ].

Some participants in the study may have failed to recall information and details from the early follow-up sessions, since the interviews were conducted several months (ranging from 1 to 6 months) after the intervention. Although none of the participants expressed any difficulties in the interviews, there is a risk that the sick-listed workers held back information if they feared there would be consequences for their benefits. The current study recruited participants from an RCT with a response rate of approximately 15%. From this sample, the current nested study had a response rate of 65%. This indicates a risk of selection bias, where participants agreeing to participate may have different characteristics from those declining. Such bias might reduce the variety of experiences of the early follow-up sessions that were reported.

Conclusions

Sick-listed workers considered additional early sessions with social insurance caseworkers a positive addition to ordinary RTW follow-up. Having these early face-to-face meetings with respectful and accommodating caseworkers, who also asked critical questions about participants’ situation, provided sick-listed workers with an outside perspective that enabled them to reflect on their situation. This was experienced as a useful addition to the largely supportive input from their friends, family and colleagues. Furthermore, the sessions provided the sick-listed workers with an arena for receiving practical information on the framework of sick-leave follow-up, such as rights, obligations, and possible strategies for RTW. This enabled them to adjust their plan towards RTW. Finally, having individual face-to-face sessions also changed participants’ perceptions of NAV from an anonymous entity to empathic and understanding individuals who seemed genuinely interested in assisting them back to work. Thus, from the perspective of the sick-listed individuals, early additional follow-up sessions were experienced as exceedingly positive and would be welcomed in addition to standard follow-up.

Data availability

To protect the anonymity of the participants, the datasets generated and analyzed during the current study are not publicly available. Redacted versions are available from the corresponding author upon reasonable request.

Abbreviations

GP: General practitioner

NAV: Norwegian Labor and Welfare Administration

RTW: Return to work

RCT: Randomized controlled trial

References

Andersen MF, Nielsen KM, Brinkmann S. Meta-synthesis of qualitative research on return to work among employees with common mental disorders. Scand J Work Environ Health. 2012;38(2):93–104.

Waddell G, Burton AK. Is work good for your health and wellbeing? London, UK: The Stationery Office; 2006.

Aylward SM. Overcoming barriers to recovery and return to work: towards behavioral and cultural change. In: Schultz I, Gatchel R, editors. Handbook of return to work. Boston, MA: Springer; 2016. https://doi.org/10.1007/978-1-4899-7627-7_7 .

McLeod CB, Reiff E, Maas E, Bültmann U. Identifying return-to-work trajectories using sequence analysis in a cohort of workers with work-related musculoskeletal disorders. Scand J Work Env Hea. 2018;44(2):147–55. https://doi.org/10.5271/sjweh.3701 .

Bültmann U, Sherson D, Olsen J, Hansen CL, Lund T, Kilsgaard J. Coordinated and tailored work rehabilitation: a randomized controlled trial with economic evaluation undertaken with workers on sick leave due to musculoskeletal disorders. J Occup Rehabil. 2009;19(1):81–93.

Palmer KT, Harris EC, Linaker C, Barker M, Lawrence W, Cooper C, Coggon D. Effectiveness of community-and workplace-based interventions to manage musculoskeletal-related sickness absence and job loss: a systematic review. Rheumatology. 2012;51(2):230–42. https://doi.org/10.1093/rheumatology/ker086 .

Roelen CA, Norder G, Koopmans PC, Van Rhenen W, Van Der Klink JJ, Bültmann U. Employees sick-listed with mental disorders: who returns to work and when? J Occup Rehabil. 2012;22(3):409–17. https://doi.org/10.1007/s10926-012-9363-3 .

Steenstra IA, Anema JR, Van Tulder MW, Bongers PM, De Vet HC, Van Mechelen W. Economic evaluation of a multi-stage return to work program for workers on sick-leave due to low back pain. J Occup Rehabil. 2006;16(4):557–78. https://doi.org/10.1007/s10926-006-9053-0 .

Dagenais S, Caro J, Haldeman S. A systematic review of low back pain cost of illness studies in the United States and internationally. Spine J. 2008;8(1):8–20. https://doi.org/10.1016/j.spinee.2007.10.005 .

van Duijn M, Eijkemans MJ, Koes BW, Koopmanschap MA, Burton KA, Burdorf A. The effects of timing on the cost-effectiveness of interventions for workers on sick leave due to low back pain. Occup Environ Med. 2010;67(11):744–50. https://doi.org/10.1136/oem.2009.049874 .

Norwegian Directorate of Health. Sykmelderveileder. Nasjonal veileder. [Guidance for sickness certification. National guideline]. https://www.helsedirektoratet.no/veiledere/sykmelderveileder . Accessed 19.03.2024.

Holmgren K, Ivanoff SD. Supervisors’ views on employer responsibility in the return to work process. A focus group study. J Occup Rehabil. 2007;17(1):93–106.

Mazza D, Brijnath B, Singh N, Kosny A, Ruseckaite R, Collie A. General practitioners and sickness certification for injury in Australia. BMC Fam Pract. 2015;16:100. https://doi.org/10.1186/s12875-015-0307-9 .

Ose SO, Dyrstad K, Brattlid I, Slettebak R, Jensberg H, Mandal R, Lippestad J, Pettersen I. Oppfølging av sykmeldte–fungerer dagens regime? [Follow-up of sick-listed – does today’s regime work?]. Trondheim, NO: SINTEF; 2013.

Shaw W, Hong QN, Pransky G, Loisel P. A literature review describing the role of return-to-work coordinators in trial programs and interventions designed to prevent workplace disability. J Occup Rehabil. 2008;18(1):2–15.

MacEachen E, McDonald E, Neiterman E, et al. Return to work for Mental Ill-Health: a scoping review exploring the impact and role of return-to-work coordinators. J Occup Rehabil. 2020;30:455–65. https://doi.org/10.1007/s10926-020-09873-3 .

Norwegian Ministry of Labour and Social Affairs. (2016). NAV i en ny tid – for arbeid og aktivitet. Meld. St. 33 (2015–2016). [NAV in a new age – for work and activity] Retrieved from https://www.regjeringen.no Accessed 19.03.2024.

Vogel N, Schandelmaier S, Zumbrunn T, Ebrahim S, de Boer WE, Busse JW, Kunz R. Return-to‐work coordination programmes for improving return to work in workers on sick leave. Cochrane Database Syst Rev. 2017(3). https://doi.org/10.1002/14651858.CD011618.pub2 .

Dol M, Varatharajan S, Neiterman E, McKnight E, Crouch M, McDonald E, Malachowski C, Dali N, Giau E, MacEachen E. Systematic review of the impact on return to work of return-to-work coordinators. J Occup Rehabil. 2021;31(4):675–98.

Markussen S, Røed K, Schreiner RC. Can compulsory dialogues nudge sick-listed workers back to work? Econ J. 2018;128(610):1276–303. https://doi.org/10.1111/ecoj.12468 .

Nossen JP, Brage S. Aktivitetskrav Og midlertidig stans av sykepenger - hvordan påvirkes sykefraværet? [Activity demands and temporary stop in paid sick leave – how is sickness absence affected?]. Arbeid Og Velferd. 2015;3.

Mandal R, Jakobsen Ofte H, Jensen C, Ose SO. Hvordan fungerer arbeidsavklaringspenger (AAP) som ytelse og ordning? [How does work assessment allowance work as a benefit and arrangement?]. Trondheim, Norway: SINTEF; 2015.

Foldal VS, Solbjør M, Standal MI, Fors EA, Hagen R, Bagøien G, et al. Barriers and facilitators for implementing motivational interviewing as a return to work intervention in a Norwegian Social Insurance setting: a mixed methods process evaluation. J Occup Rehabil. 2021;31(4):785–95.

Standal MI, Foldal VS, Hagen R, Aasdahl L, Johnsen R, Fors EA, et al. Health, work, and family strain – psychosocial experiences at the early stages of long-term sickness absence. Front Psychol. 2021;12:596073.

Foldal VS, Standal MI, Aasdahl L, Hagen R, Bagøien G, Fors EA, et al. Sick-listed workers’ experiences with motivational interviewing in the return to work process: a qualitative interview study. BMC Public Health. 2020;20(1):1–10.

Aasdahl L, Foldal VS, Standal MI, Hagen R, Johnsen R, Solbjør M, et al. Motivational interviewing in long-term sickness absence: study protocol of a randomized controlled trial followed by qualitative and economic studies. BMC Public Health. 2018;18(1):1–8.

Norwegian Labour and Welfare Administration. Sickness benefits for employees. 2023. Retrieved from https://www.nav.no/en/home/benefits-and-services/Sickness-benefit-for-employees . Accessed 19.03.2024.

Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol. 2006;3(2):77–101.

Shaw WS, Robertson MM, Pransky G, McLellan RK. Employee perspectives on the role of supervisors to prevent workplace disability after injuries. J Occup Rehabil. 2003;13(3):129–42.

Jansen J, van Ooijen R, Koning PWC, et al. The role of the employer in supporting work participation of workers with disabilities: a systematic literature review using an Interdisciplinary Approach. J Occup Rehabil. 2021;31:916–49. https://doi.org/10.1007/s10926-021-09978-3 .

Buys NJ, Selander J, Sun J. Employee experience of workplace supervisor contact and support during long-term sickness absence. Disabil Rehabil. 2019;41(7):808–14.

Buys N, Wagner S, Randall C, Harder H, Geisen T, Yu I, Hassler B, Howe C, Fraess-Phillips A. Disability management and organizational culture in Australia and Canada. Work. 2017;57(3):409–19.

Kristman VL, Shaw WS, Boot CR, Delclos GL, Sullivan MJ, Ehrhart MG. Researching complex and multi-level workplace factors affecting disability and prolonged sickness absence. J Occup Rehabil. 2016;26(4):399–416.

Reme SE. Common mental disorders and work: barriers and opportunities. In: Bültmann U, Siegrist J, editors. Handbook of disability, work and health. Handbook Series in Occupational Health Sciences, vol 1. Cham, CH: Springer; 2020. p. 467–81.

Karlsson E. Legitimacy and comprehensibility of work-related assessments and official decisions within the sickness insurance system [PhD dissertation]. Linköping: Linköping University Electronic Press; 2022. (Linköping University Medical Dissertations). https://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-183867 .

Kristman VL, Boot CR, Sanderson K, Sinden KE, Williams-Whitt K. Implementing best practice models of return to work. In: Bültmann U, Siegrist J, editors. Handbook of disability, work and health. Handbook Series in Occupational Health Sciences, vol 1. Cham, CH: Springer; 2020. p. 1–25.

Ståhl C, Karlsson EA, Sandqvist J, Hensing G, Brouwer S, Friberg E, MacEachen E. Social insurance literacy: a scoping review on how to define and measure it. Disabil Rehabil. 2021;43(12):1776–85. https://doi.org/10.1080/09638288.2019.1672111 .

Karlsson EA, Hellgren M, Sandqvist JL, et al. Social Insurance Literacy among the sick-listed—A study of clients’ comprehension and self-rated system comprehensibility of the Sickness Insurance System. J Occup Rehabil. 2024. https://doi.org/10.1007/s10926-023-10166-8 .

Corbière M, Mazaniello-Chézol M, Bastien MF, et al. Stakeholders’ role and actions in the return-to-work process of workers on sick-leave due to Common Mental disorders: a scoping review. J Occup Rehabil. 2020;30:381–419. https://doi.org/10.1007/s10926-019-09861-2 .

Müssener U, Ståhl C, Söderberg E. Does the quality of encounters affect return to work? Lay people describe their experiences of meeting various professionals during their rehabilitation process. Work. 2015;52(2):447–55.

Andersen MF, Nielsen K, Brinkmann S. How do workers with common mental disorders experience a multidisciplinary return-to-work intervention? A qualitative study. J Occup Rehabil. 2014;24(4):709–24.

Haugli L, Maeland S, Magnussen LH. What facilitates return to work? Patients experiences 3 years after occupational rehabilitation. J Occup Rehabil. 2011;21(4):573–81.

Scharf J, Angerer P, Müting G, Loerbroks A. Return to work after common mental disorders: a qualitative study exploring the expectations of the involved stakeholders. Int J Environ Res Public Health. 2020;17(18):6635.

Saxon D, Firth N, Barkham M. The relationship between therapist effects and therapy delivery factors: therapy modality, dosage, and non-completion. Adm Policy Ment Health Ment Health Serv Res. 2017;44(5):705–15.

Wampold BE, Bolt DM. Therapist effects: Clever ways to make them (and everything else) disappear. Psychother Res. 2006;16(02):184–7.

Anderson T, McClintock AS, Himawan L, Song X, Patterson CL. A prospective study of therapist facilitative interpersonal skills as a predictor of treatment outcome. J Consult Clin Psychol. 2016;84:57–66.

Røed K. Active social insurance. IZA J Labor Policy. 2012;1(1):8.

Landstad BJ, Wendelborg C, Hedlund M. Factors explaining return to work for long-term sick workers in Norway. Disabil Rehabil. 2009;31(15):1215–26.

Aasdahl L, Standal MI, Hagen R, Solbjør M, Bagøien G, Fossen H, Foldal VS, Bjørngaard JH, Rysstad T, Grotle M, Johnsen R. Effectiveness of ‘motivational interviewing’ on sick leave: a randomized controlled trial in a social insurance setting. Scand J Work Env Hea. 2023;49(7):477.

Acknowledgements

We thank the caseworkers at NAV and the participants of the study.

Funding

The study was funded by The Research Council of Norway (Grant number: 256633). The funding organization had no role in the planning, execution or analyses of the study.

Open access funding provided by Norwegian University of Science and Technology.

Author information

Authors and Affiliations

Faculty of Medicine and Health Sciences, Department of Public Health and Nursing, Norwegian University of Science and Technology (NTNU), Trondheim, Norway

Martin Inge Standal, Vegard Stolsmo Foldal, Lene Aasdahl, Egil A. Fors & Marit Solbjør

NTNU Social Research, Trondheim, Norway

Martin Inge Standal

Unicare Helsefort Rehabilitation Centre, Rissa, Norway

Lene Aasdahl


Contributions

MIS and VSF co-wrote the article. LA, EAF and MS contributed to the conception of the project. All authors designed the interview study. VSF analyzed and interpreted the data, and MIS, LA, EAF and MS contributed during the analysis process. The final categories were validated by all authors. VSF drafted the manuscript while MIS, LA, EAF and MS revised the manuscript. MIS finalized the article, and all authors revised the final version. The authors read and approved the final manuscript.

Corresponding author

Correspondence to Martin Inge Standal.

Ethics declarations

Ethics approval and consent to participate

The study was approved by the Regional Committees for Medical and Health Research Ethics in South East Norway (No: 2016/2300), and the trial was prospectively registered at clinicaltrials.gov NCT03212118 (registered July 11, 2017). The sick-listed workers were informed that the intervention was part of a research project and did not affect their rights or obligations as sick-listed workers. Written informed consent was obtained from all participants prior to conducting interviews. The study was performed in accordance with the Declaration of Helsinki and the guidelines of the Norwegian National Research Ethics Committee for medical and health research.

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary Material 1

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article

Cite this article

Standal, M.I., Foldal, V.S., Aasdahl, L. et al. Getting an outsider’s perspective - sick-listed workers’ experiences with early follow-up sessions in the return to work process: a qualitative interview study. BMC Health Serv Res 24, 609 (2024). https://doi.org/10.1186/s12913-024-11007-x


Received: 02 October 2023

Accepted: 18 April 2024

Published: 09 May 2024

DOI: https://doi.org/10.1186/s12913-024-11007-x


Keywords

  • Early follow-up
  • Interview study
  • Social insurance



COMMENTS

  1. Qualitative Data Analysis: What is it, Methods + Examples

    Qualitative data analysis is a systematic process of examining non-numerical data to extract meaning, patterns, and insights. In contrast to quantitative analysis, which focuses on numbers and statistical metrics, the qualitative study focuses on the qualitative aspects of data, such as text, images, audio, and videos.

  2. Qualitative Data Analysis Methods: Top 6 + Examples

    QDA Method #1: Qualitative Content Analysis. Content analysis is possibly the most common and straightforward QDA method. At the simplest level, content analysis is used to evaluate patterns within a piece of content (for example, words, phrases or images) or across multiple pieces of content or sources of communication. For example, a collection of newspaper articles or political speeches.

  3. Learning to Do Qualitative Data Analysis: A Starting Point

    For many researchers unfamiliar with qualitative research, determining how to conduct qualitative analyses is often quite challenging. Part of this challenge is due to the seemingly limitless approaches that a qualitative researcher might leverage, as well as simply learning to think like a qualitative researcher when analyzing data. From framework analysis (Ritchie & Spencer, 1994) to content ...

  4. What Is Qualitative Research?

    Qualitative research is the opposite of quantitative research, which involves collecting and analyzing numerical data for statistical analysis. Qualitative research is commonly used in the humanities and social sciences, in subjects such as anthropology, sociology, education, health sciences, history, etc. Qualitative research question examples

  5. Qualitative Data Analysis: Step-by-Step Guide (Manual vs ...

    Step 1: Gather your qualitative data and conduct research (Conduct qualitative research) The first step of qualitative research is to do data collection. Put simply, data collection is gathering all of your data for analysis. A common situation is when qualitative data is spread across various sources.

  6. PDF The SAGE Handbook of Qualitative Data Analysis

    The SAGE Handbook of Qualitative Data Analysis. Uwe Flick. Data analysis is the central step in qualitative research. Whatever the data are, it is their analysis that, in a decisive way, forms the outcomes of the research. Sometimes, data collection is limited to recording and documenting ...

  7. (PDF) Qualitative Data Analysis and Interpretation: Systematic Search

    Qualitative data analysis is concerned with transforming raw data by searching, evaluating, recognising, coding, mapping, exploring and describing patterns, trends, themes and categories in ...

  8. PDF A Step-by-Step Guide to Qualitative Data Analysis

    Step 1: Organizing the Data. "Valid analysis is immensely aided by data displays that are focused enough to permit viewing of a full data set in one location and are systematically arranged to answer the research question at hand." (Huberman and Miles, 1994, p. 432) The best way to organize your data is to go back to your interview guide.

  9. PDF 12 Qualitative Data, Analysis, and Design

    The goal of qualitative data analysis is to uncover emerging themes, patterns, concepts, insights, and understandings (Patton, 2002). ... for example, whereas quantitative researchers tend to value large sample sizes, manipulation ... as it does in the design and data collection phase. Qualitative research methods

  10. Data Analysis for Qualitative Research: 6 Step Guide

    How to analyze qualitative data from an interview. To analyze qualitative data from an interview, follow the same 6 steps for quantitative data analysis: Perform the interviews. Transcribe the interviews onto paper. Decide whether to either code analytical data (open, axial, selective), analyze word frequencies, or both.

  11. Qualitative Data Analysis 101 Tutorial: 6 Analysis Methods + Examples

    Learn about qualitative data analysis (QDA) and the 6 most popular qualitative data analysis methods in this simple tutorial. We explain qualitative content ...

  12. Data Analysis in Qualitative Research: A Brief Guide to Using Nvivo

    Data analysis in qualitative research is defined as the process of systematically searching and arranging the interview transcripts, observation notes, or other non-textual materials that the researcher accumulates to increase the understanding of the phenomenon.7 The process of analysing qualitative data predominantly involves coding or ...

  13. How to Analyze Qualitative Data?

    Qualitative data analysis is an important part of research and building greater understanding across fields for a number of reasons. First, cases for qualitative data analysis can be selected purposefully according to whether they typify certain characteristics or contextual locations. In other words, qualitative data permits deep immersion into a topic, phenomenon, or area of interest.

  14. 6 Qualitative Data Analysis Examples To Inspire you

    Here are six qualitative data analysis examples to inspire you to improve your own process: 1. Art.com. Art.com is an ecommerce company selling art prints. Their 100% happiness guarantee—they'll issue a full refund, no questions asked—shows their commitment to putting customers first.

  15. Qualitative data analysis: a practical example

    The aim of this paper is to equip readers with an understanding of the principles of qualitative data analysis and offer a practical example of how analysis might be undertaken in an interview-based study. Qualitative research is a generic term that refers to a group of methods, and ways of collecting and analysing data that are interpretative or explanatory in nature and focus on meaning ...

  16. How to Do Thematic Analysis

    When to use thematic analysis. Thematic analysis is a good approach to research where you're trying to find out something about people's views, opinions, knowledge, experiences or values from a set of qualitative data - for example, interview transcripts, social media profiles, or survey responses. Some types of research questions you might use thematic analysis to answer:

  17. Qualitative Research: Data Collection, Analysis, and Management

    Doing qualitative research is not easy and may require a complete rethink of how research is conducted, particularly for researchers who are more familiar with quantitative approaches. There are many ways of conducting qualitative research, and this paper has covered some of the practical issues regarding data collection, analysis, and management.

  18. Chapter 18. Data Analysis and Coding

    Qualitative Data-Analysis Samples. The following three passages are examples of how qualitative researchers describe their data-analysis practices. The first, by Harvey, is a useful example of how data analysis can shift the original research questions.

  19. How to use and assess qualitative research methods

    How to conduct qualitative research? Given that qualitative research is characterised by flexibility, openness and responsivity to context, the steps of data collection and analysis are not as separate and consecutive as they tend to be in quantitative research [13, 14]. As Fossey puts it: "sampling, data collection, analysis and interpretation are related to each other in a cyclical ...

  20. Qualitative Data Analysis Strategies

    This chapter provides an overview of selected qualitative data analysis strategies with a particular focus on codes and coding. Preparatory strategies for a qualitative research study and data management are first outlined. Six coding methods are then profiled using comparable interview data: process coding, in vivo coding, descriptive coding ...

  21. PDF Qualitative data analysis: a practical example

    Qualitative research is a generic term that refers to a group of methods, and ways of collecting and analysing data that are interpretative or explanatory in nature and focus on meaning. Data collection is undertaken in the natural setting, such as a clinic, hospital or a participant's home because qualitative methods seek to describe ...

  22. Qualitative Data

    Small sample size: Qualitative data collection methods often involve a small sample size, which limits the generalizability of the findings. Time-consuming: Qualitative data collection and analysis can be time-consuming, as it requires in-depth engagement with the data and often involves iterative processes.

  23. Qualitative vs. Quantitative Research

    When collecting and analyzing data, quantitative research deals with numbers and statistics, while qualitative research deals with words and meanings. Both are important for gaining different kinds of knowledge. Quantitative research. Quantitative research is expressed in numbers and graphs. It is used to test or confirm theories and assumptions.

  24. Data Analysis in Research: Types & Methods

    Data analysis in qualitative research works a little differently from numerical data, as qualitative data is made up of words, descriptions, images, objects, and sometimes symbols. Getting insight from such complex information is a complicated process. ... More often, an extensive research data sample comes loaded with errors ...

  25. Coding the Real World: Understanding Real-World Evidence in Healthcare

    While it's possible to analyze qualitative research in a variety of ways - for example, through discourse analysis or grounded theory methodologies - the Cerner Enviza team uses thematic analysis because it allows for a bottom-up approach that lets patient concerns or experiences emerge from the data.

  26. Thematic analysis

    In this video, I explain how to present and discuss findings in qualitative research, focusing on 4 common mistakes and how to fix them. #qualitativeresearch ...

  27. What is Qualitative Data Analysis Software (QDA Software)?

    Qualitative data analysis software works with any qualitative research methodology used by a researcher. For example, a social scientist wanting to develop new concepts or theories may take a 'grounded theory' approach.

  28. What is Qualitative Data Analysis?

    Understanding Qualitative Data Analysis. Qualitative data analysis is the process of systematically examining and interpreting qualitative data (such as text, images, video, or observations) to discover patterns, themes, and meanings within the data. Unlike quantitative data analysis, which focuses on numerical measurements and statistical techniques ...

  29. Patient medication management, understanding and adherence during the

    Study design. This qualitative longitudinal study, conducted from October 2020 to July 2021, used a qualitative descriptive methodology through four consecutive in-depth semi-structured interviews per participant at 3, 10, 30 and 60 days post-discharge, as illustrated in Fig. 1. Longitudinal qualitative research is characterized by qualitative data collection at different points in time ...

  30. Getting an outsider's perspective

    Data analysis. For our data analysis, we used reflexive thematic analysis, which is a method for identifying, analyzing, and reporting patterns within qualitative data. Thematic analysis is a flexible approach that allows researchers to interpret the data through a six-phase recursive process, moving back and forth between phases to build ...
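
Several of the entries above describe counting words or phrases across a body of content and applying codes to transcript segments (see, for example, items 2, 10 and 12). As a rough, purely illustrative sketch - not taken from any of the sources listed - the short Python example below shows what a simple keyword-based coding pass and word-frequency count might look like. The excerpts, the codebook and the code names are all hypothetical, and real qualitative coding is an interpretive process that a script like this can only support, never replace.

    # Minimal, hypothetical sketch: keyword-based coding and word-frequency
    # counting for a handful of interview excerpts. Everything here (excerpts,
    # codebook, code names) is made up for illustration only.
    from collections import Counter
    import re

    excerpts = [
        "The caseworker really listened to me and explained my options.",
        "I felt pressure to return to work before I was ready.",
        "Talking to someone outside my workplace gave me a new perspective.",
    ]

    # Hypothetical codebook: code name -> keywords that trigger it.
    codebook = {
        "feeling_heard": ["listened", "understood"],
        "pressure_to_return": ["pressure", "before i was ready"],
        "outsider_perspective": ["outside", "new perspective"],
    }

    def code_excerpt(text, codebook):
        """Return the codes whose keywords appear in the (lower-cased) excerpt."""
        lowered = text.lower()
        return [code for code, keywords in codebook.items()
                if any(keyword in lowered for keyword in keywords)]

    # Tally how often each code occurs across the excerpts.
    code_counts = Counter()
    for excerpt in excerpts:
        code_counts.update(code_excerpt(excerpt, codebook))

    # Simple word-frequency count across all excerpts (basic content analysis).
    words = re.findall(r"[a-z']+", " ".join(excerpts).lower())
    word_counts = Counter(words)

    print("Code frequencies:", dict(code_counts))
    print("Most common words:", word_counts.most_common(5))

In practice, this kind of coding is usually done by hand or in dedicated QDA software such as NVivo, and any automated keyword matching would only be a first pass to be checked against the researcher's own reading of the data.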