Qualitative Data Analysis: What is it, Methods + Examples

Explore qualitative data analysis with diverse methods and real-world examples. Uncover the nuances of human experiences with this guide.

In a world rich with information and narrative, understanding the deeper layers of human experiences requires a unique vision that goes beyond numbers and figures. This is where the power of qualitative data analysis comes to light.

In this blog, we’ll learn about qualitative data analysis, explore its methods, and provide real-life examples showcasing its power in uncovering insights.

What is Qualitative Data Analysis?

Qualitative data analysis is a systematic process of examining non-numerical data to extract meaning, patterns, and insights.

In contrast to quantitative analysis, which focuses on numbers and statistical metrics, the qualitative study focuses on the qualitative aspects of data, such as text, images, audio, and videos. It seeks to understand every aspect of human experiences, perceptions, and behaviors by examining the data’s richness.

Companies frequently conduct this analysis on customer feedback. You can collect qualitative data from reviews, complaints, chat messages, interactions with support centers, customer interviews, case notes, or even social media comments. This kind of data holds the key to understanding customer sentiments and preferences in a way that goes beyond mere numbers.

Importance of Qualitative Data Analysis

Qualitative data analysis plays a crucial role in your research and decision-making process across various disciplines. Let’s explore some key reasons that underline the significance of this analysis:

In-Depth Understanding

It enables you to explore complex and nuanced aspects of a phenomenon, delving into the ‘how’ and ‘why’ questions. This method provides you with a deeper understanding of human behavior, experiences, and contexts that quantitative approaches might not capture fully.

Contextual Insight

You can use this analysis to give context to numerical data. It will help you understand the circumstances and conditions that influence participants’ thoughts, feelings, and actions. This contextual insight becomes essential for generating comprehensive explanations.

Theory Development

You can generate or refine hypotheses via qualitative data analysis. As you analyze the data attentively, you can form hypotheses, concepts, and frameworks that will drive your future research and contribute to theoretical advances.

Participant Perspectives

When performing qualitative research, you can highlight participant voices and opinions. This approach is especially useful for understanding marginalized or underrepresented people, as it allows them to communicate their experiences and points of view.

Exploratory Research

The analysis is frequently used at the exploratory stage of your project. It assists you in identifying important variables, developing research questions, and designing quantitative studies that will follow.

Types of Qualitative Data

When conducting qualitative research, you can use several qualitative data collection methods, and you will come across many types of qualitative data, each offering unique insights into your study topic. These data types add new perspectives and angles to your understanding and analysis.

Interviews and Focus Groups

Interviews and focus groups will be among your key methods for gathering qualitative data. Interviews are one-on-one talks in which participants can freely share their thoughts, experiences, and opinions.

Focus groups, on the other hand, are discussions in which members interact with one another, resulting in dynamic exchanges of ideas. Both methods provide rich qualitative data and direct access to participant perspectives.

Observations and Field Notes

Observations and field notes are another useful sort of qualitative data. You can immerse yourself in the research environment through direct observation, carefully documenting behaviors, interactions, and contextual factors.

These observations are recorded in your field notes, providing a complete picture of the environment and the behaviors you’re researching. This data type is especially valuable for understanding behaviors in their natural setting.

Textual and Visual Data

Textual and visual data include a wide range of resources that can be qualitatively analyzed. Documents, written narratives, and transcripts from various sources, such as interviews or speeches, are examples of textual data.

Photographs, films, and even artwork provide a visual layer to your research. These forms of data allow you to investigate what is spoken and the underlying emotions, details, and symbols expressed by language or pictures.

When to Choose Qualitative Data Analysis over Quantitative Data Analysis

As you begin your research journey, understanding why qualitative data analysis matters will guide your approach to studying complex phenomena. Analyzing qualitative data yields insights that complement quantitative methodologies, giving you a broader understanding of your study topic.

It is critical to know when to use qualitative analysis over quantitative procedures. You might prefer qualitative data analysis when:

  • Complexity Reigns: When your research questions involve deep human experiences, motivations, or emotions, qualitative research excels at revealing these complexities.
  • Exploration is Key: Qualitative analysis is ideal for exploratory research. It will assist you in understanding a new or poorly understood topic before formulating quantitative hypotheses.
  • Context Matters: If you want to understand how context affects behaviors or results, qualitative data analysis provides the depth needed to grasp these relationships.
  • Unanticipated Findings: When your study provides surprising new viewpoints or ideas, qualitative analysis helps you to delve deeply into these emerging themes.
  • Subjective Interpretation is Vital: When it comes to understanding people’s subjective experiences and interpretations, qualitative data analysis is the way to go.

You can make informed decisions regarding the right approach for your research objectives if you understand the importance of qualitative analysis and recognize the situations where it shines.

Qualitative Data Analysis Methods and Examples

Exploring various qualitative data analysis methods will provide you with a wide collection for making sense of your research findings. Once the data has been collected, you can choose from several analysis methods based on your research objectives and the data type you’ve collected.

There are five main methods for analyzing qualitative data. Each method takes a distinct approach to identifying patterns, themes, and insights within your qualitative data. They are:

Method 1: Content Analysis

Content analysis is a systematic technique for analyzing textual or visual data in a structured manner. In this method, you categorize qualitative data by splitting it into manageable units and manually assigning codes to each unit.

As you go, you’ll notice recurring codes and patterns that allow you to draw conclusions about the content. This method is particularly useful for detecting common ideas, concepts, or themes in your data without losing the context.

Steps to Do Content Analysis

Follow these steps when conducting content analysis:

  • Collect and Immerse: Begin by collecting the necessary textual or visual data. Immerse yourself in this data to fully understand its content, context, and complexities.
  • Assign Codes and Categories: Assign codes to relevant data sections that systematically represent major ideas or themes. Arrange comparable codes into groups that cover the major themes.
  • Analyze and Interpret: Develop a structured framework from the categories and codes. Then, evaluate the data in the context of your research question, investigate relationships between categories, discover patterns, and draw meaning from these connections.
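The coding and counting steps above can be sketched in a few lines of Python. This is a minimal illustration only; the codebook keywords and category names are hypothetical, and real content analysis relies on researcher judgment rather than simple keyword matching:

```python
from collections import Counter

# Hypothetical codebook: each code is defined by keywords to look for.
CODEBOOK = {
    "price": ["price", "expensive", "cheap", "cost"],
    "quality": ["quality", "durable", "broke"],
    "customer service": ["support", "service", "helpful", "rude"],
}

def assign_codes(text):
    """Assign every matching code to one unit of text (step 2)."""
    lowered = text.lower()
    return [code for code, words in CODEBOOK.items()
            if any(w in lowered for w in words)]

def analyze(units):
    """Count code frequencies across all units (step 3)."""
    counts = Counter()
    for unit in units:
        counts.update(assign_codes(unit))
    return counts

feedback = [
    "Great quality but the price is too high.",
    "Support was rude and the product broke.",
]
print(analyze(feedback))
```

The frequency table produced here is the structured framework described in step 3: it lets you compare how often each category appears and investigate relationships between them.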

Benefits & Challenges

There are various advantages to using content analysis:

  • Structured Approach: It offers a systematic approach to dealing with large data sets and ensures consistency throughout the research.
  • Objective Insights: This method promotes objectivity, which helps to reduce potential biases in your study.
  • Pattern Discovery: Content analysis can help uncover hidden trends, themes, and patterns that are not always obvious.
  • Versatility: You can apply content analysis to various data formats, including text, internet content, images, etc.

However, keep in mind the challenges that arise:

  • Subjectivity: Even with the best attempts, a certain bias may remain in coding and interpretation.
  • Complexity: Analyzing huge data sets requires time and great attention to detail.
  • Contextual Nuances: Content analysis may not capture all of the contextual richness that qualitative data analysis highlights.

Example of Content Analysis

Suppose you’re conducting market research and looking at customer feedback on a product. As you collect relevant data and analyze feedback, you’ll see repeating codes like “price,” “quality,” “customer service,” and “features.” These codes are organized into categories such as “positive reviews,” “negative reviews,” and “suggestions for improvement.”

According to your findings, themes such as “price” and “customer service” stand out and show that pricing and customer service greatly impact customer satisfaction. This example highlights the power of content analysis for obtaining significant insights from large textual data collections.

Method 2: Thematic Analysis

Thematic analysis is a well-structured procedure for identifying and analyzing recurring themes in your data. As you become more engaged in the data, you’ll generate codes or short labels representing key concepts. These codes are then organized into themes, providing a consistent framework for organizing and comprehending the substance of the data.

The analysis allows you to organize complex narratives and perspectives into meaningful categories, which will allow you to identify connections and patterns that may not be visible at first.

Steps to Do Thematic Analysis

Follow these steps when conducting a thematic analysis:

  • Code and Group: Start by immersing yourself in the data and assigning initial codes to notable segments. Group related codes together to form initial themes.
  • Analyze and Report: Analyze the data within each theme to derive relevant insights. Organize the topics into a consistent structure and explain your findings, along with data extracts that represent each theme.
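The grouping step above can be sketched as follows. The code labels and theme names here are hypothetical, and in practice a researcher builds the code-to-theme mapping iteratively rather than fixing it up front:

```python
# Hypothetical mapping from initial codes to broader themes.
THEMES = {
    "Factors Influencing Job Satisfaction": ["work-life balance", "career growth"],
    "Impact on Work Engagement": ["colleague relationships", "motivation"],
}

def theme_for(code):
    """Return the theme a code belongs to, or None if unassigned."""
    for theme, codes in THEMES.items():
        if code in codes:
            return theme
    return None

def group_codes(coded_segments):
    """Group coded data segments under their themes for reporting."""
    grouped = {}
    for segment, code in coded_segments:
        theme = theme_for(code)
        if theme:
            grouped.setdefault(theme, []).append(segment)
    return grouped

segments = [
    ("I can rarely switch off after hours", "work-life balance"),
    ("My team keeps me motivated", "motivation"),
]
print(group_codes(segments))
```

The grouped output mirrors the final reporting step: each theme carries the data extracts that represent it.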

Thematic analysis has various benefits:

  • Structured Exploration: It is a method for identifying patterns and themes in complex qualitative data.
  • Comprehensive knowledge: Thematic analysis promotes an in-depth understanding of the complications and meanings of the data.
  • Application Flexibility: This method can be customized to various research situations and data kinds.

However, challenges may arise, such as:

  • Interpretive Nature: Interpreting qualitative data in thematic analysis is vital, and it is critical to manage researcher bias.
  • Time-consuming: The study can be time-consuming, especially with large data sets.
  • Subjectivity: The selection of codes and topics might be subjective.

Example of Thematic Analysis

Assume you’re conducting a thematic analysis on job satisfaction interviews. Following your immersion in the data, you assign initial codes such as “work-life balance,” “career growth,” and “colleague relationships.” As you organize these codes, you’ll notice themes develop, such as “Factors Influencing Job Satisfaction” and “Impact on Work Engagement.”

Further investigation reveals the tales and experiences included within these themes and provides insights into how various elements influence job satisfaction. This example demonstrates how thematic analysis can reveal meaningful patterns and insights in qualitative data.

Method 3: Narrative Analysis

Narrative analysis focuses on the stories that people share. You’ll examine the narratives in your data, looking at how stories are constructed and the meanings they convey. This method is excellent for understanding how people make sense of their experiences through storytelling.

Steps to Do Narrative Analysis

The following steps are involved in narrative analysis:

  • Gather and Analyze: Start by collecting narratives, such as first-person tales, interviews, or written accounts. Analyze the stories, focusing on the plot, feelings, and characters.
  • Find Themes: Look for recurring themes or patterns in various narratives. Think about the similarities and differences between these topics and personal experiences.
  • Interpret and Extract Insights: Contextualize the narratives within their larger context. Accept the subjective nature of each narrative and analyze the narrator’s voice and style. Extract insights from the tales by diving into the emotions, motivations, and implications communicated by the stories.
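One small, programmatic slice of the theme-finding step — spotting recurring emotional cues across sentences — can be sketched like this. The emotion lexicon is a hypothetical stand-in for a researcher’s close reading, not a substitute for it:

```python
import re

# Tiny hypothetical emotion lexicon, for illustration only.
EMOTIONS = {
    "hope": ["hope", "dream", "future"],
    "loss": ["lost", "loss", "left behind"],
}

def tag_sentences(narrative):
    """Label each sentence with any emotions its wording suggests."""
    sentences = re.split(r"(?<=[.!?])\s+", narrative.strip())
    tagged = []
    for s in sentences:
        low = s.lower()
        found = [e for e, cues in EMOTIONS.items() if any(c in low for c in cues)]
        tagged.append((s, found))
    return tagged

story = "We lost our home. But I still dream of a better future."
for sentence, emotions in tag_sentences(story):
    print(emotions, "-", sentence)
```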

There are various advantages to narrative analysis:

  • Deep Exploration: It lets you look deeply into people’s personal experiences and perspectives.
  • Human-Centered: This method prioritizes the human perspective, allowing individuals to express themselves.

However, difficulties may arise, such as:

  • Interpretive Complexity: Analyzing narratives requires dealing with the complexities of meaning and interpretation.
  • Time-consuming: Because of the richness and complexities of tales, working with them can be time-consuming.

Example of Narrative Analysis

Assume you’re conducting narrative analysis on refugee interviews. As you read the stories, you’ll notice common themes of resilience, loss, and hope. The narratives provide insight into the obstacles that refugees face, their strengths, and the aspirations that guide them.

The analysis can provide a deeper insight into the refugees’ experiences and the broader social context they navigate by examining the narratives’ emotional subtleties and underlying meanings. This example highlights how narrative analysis can reveal important insights into human stories.

Method 4: Grounded Theory Analysis

Grounded theory analysis is an iterative and systematic approach that allows you to create theories directly from data without being limited by pre-existing hypotheses. With an open mind, you collect data and generate early codes and labels that capture essential ideas or concepts within the data.

As you progress, you refine these codes and increasingly connect them, eventually developing a theory based on the data. Grounded theory analysis is a dynamic process for developing new insights and hypotheses based on details in your data.

Steps to Do Grounded Theory Analysis

Grounded theory analysis requires the following steps:

  • Initial Coding: First, immerse yourself in the data, producing initial codes that represent major concepts or patterns.
  • Categorize and Connect: Using axial coding, organize the initial codes into categories, establishing relationships and connections between them.
  • Build the Theory: Focus on creating a core category that connects the codes and themes. Regularly refine the theory by comparing and integrating new data, ensuring that it evolves organically from the data.
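The “compare and integrate” loop at the heart of grounded theory can be caricatured in code. Here a crude word-overlap heuristic stands in for the researcher’s judgment about whether two codes belong together; both the heuristic and the sample codes are illustrative only:

```python
def similar(a, b):
    """Crude similarity: codes share at least one word (a stand-in for researcher judgment)."""
    return bool(set(a.lower().split()) & set(b.lower().split()))

def merge_codes(codes):
    """Constant comparison: fold each new code into an existing group if similar."""
    groups = []
    for code in codes:
        for group in groups:
            if any(similar(code, member) for member in group):
                group.append(code)
                break
        else:
            groups.append([code])
    return groups

codes = ["communication barriers", "team communication",
         "leadership roles", "team dynamics"]
print(merge_codes(codes))
```

Each pass over new data refines the groups, mirroring how a core category gradually emerges from repeated comparison.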

Grounded theory analysis has various benefits:

  • Theory Generation: It provides a one-of-a-kind opportunity to generate hypotheses straight from data and promotes new insights.
  • In-depth Understanding: The analysis allows you to deeply analyze the data and reveal complex relationships and patterns.
  • Flexible Process: This method is customizable and ongoing, which allows you to enhance your research as you collect additional data.

However, challenges might arise with:

  • Time and Resources: Because grounded theory analysis is a continuous process, it requires a large commitment of time and resources.
  • Theoretical Development: Creating a grounded theory demands a thorough grounding in qualitative data analysis and theoretical concepts.
  • Interpretation of Complexity: Interpreting and incorporating a newly developed theory into existing literature can be intellectually hard.

Example of Grounded Theory Analysis

Assume you’re performing a grounded theory analysis on workplace collaboration interviews. As you open code the data, you will discover notions such as “communication barriers,” “team dynamics,” and “leadership roles.” Axial coding demonstrates links between these notions, emphasizing the significance of efficient communication in developing collaboration.

Through selective coding, you develop the core category “Integrated Communication Strategies,” which unifies the emerging themes.

This theory-driven category serves as the framework for understanding how numerous aspects contribute to effective team collaboration. This example shows how grounded theory analysis allows you to generate a theory directly from the inherent nature of the data.

Method 5: Discourse Analysis

Discourse analysis focuses on language and communication. You’ll look at how language produces meaning and how it reflects power relations, identities, and cultural influences. This approach examines not only what is said but how it is said: the words, phrasing, and larger context of communication.

The analysis is particularly valuable when investigating power dynamics, identities, and cultural influences encoded in language. By evaluating the language used in your data, you can identify underlying assumptions, cultural standards, and how individuals negotiate meaning through communication.

Steps to Do Discourse Analysis

Conducting discourse analysis entails the following steps:

  • Select Discourse: For analysis, choose language-based data such as texts, speeches, or media content.
  • Analyze Language: Immerse yourself in the conversation, examining language choices, metaphors, and underlying assumptions.
  • Discover Patterns: Recognize the dialogue’s reoccurring themes, ideologies, and power dynamics. To fully understand the effects of these patterns, put them in their larger context.
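A classic low-tech aid for examining language choices is a keyword-in-context (concordance) listing. The sketch below shows each occurrence of a chosen term with a few words of surrounding context; the sample text is invented:

```python
import re

def concordance(text, term, width=3):
    """Show each occurrence of a term with `width` words of context on each side."""
    words = re.findall(r"\w+", text.lower())
    hits = []
    for i, w in enumerate(words):
        if w == term:
            left = " ".join(words[max(0, i - width):i])
            right = " ".join(words[i + 1:i + 1 + width])
            hits.append(f"{left} [{term}] {right}")
    return hits

coverage = ("The battle between the parties escalated. "
            "Analysts framed the battle as a test of leadership.")
for line in concordance(coverage, "battle"):
    print(line)
```

Scanning a term’s contexts this way makes recurring framings, such as conflict metaphors, easier to spot before deeper interpretation.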

There are various advantages of using discourse analysis:

  • Understanding Language: It provides an extensive understanding of how language builds meaning and influences perceptions.
  • Uncovering Power Dynamics: The analysis reveals how power dynamics appear via language.
  • Cultural Insights: This method identifies cultural norms, beliefs, and ideologies stored in communication.

However, the following challenges may arise:

  • Complexity of Interpretation: Language analysis involves navigating multiple levels of nuance and interpretation.
  • Subjectivity: Interpretation can be subjective, so controlling researcher bias is important.
  • Time-Intensive: Discourse analysis can take a lot of time because careful linguistic study is required in this analysis.

Example of Discourse Analysis

Consider doing discourse analysis on media coverage of a political event. You notice repeating linguistic patterns in news articles that depict the event as a conflict between opposing parties. Through deconstruction, you can expose how this framing supports particular ideologies and power relations.

You can illustrate how language choices influence public perceptions and contribute to building the narrative around the event by analyzing the speech within the broader political and social context. This example shows how discourse analysis can reveal hidden power dynamics and cultural influences on communication.

How to Do Qualitative Data Analysis with the QuestionPro Research Suite

QuestionPro is a popular survey and research platform that offers tools for collecting and analyzing qualitative and quantitative data. Follow these general steps for conducting qualitative data analysis using the QuestionPro Research Suite:

  • Collect Qualitative Data: Set up your survey to capture qualitative responses. It might involve open-ended questions, text boxes, or comment sections where participants can provide detailed responses.
  • Export Qualitative Responses: Export the responses once you’ve collected qualitative data through your survey. QuestionPro typically allows you to export survey data in various formats, such as Excel or CSV.
  • Prepare Data for Analysis: Review the exported data and clean it if necessary. Remove irrelevant or duplicate entries to ensure your data is ready for analysis.
  • Code and Categorize Responses: Segment and label data, letting new patterns emerge naturally, then develop categories through axial coding to structure the analysis.
  • Identify Themes: Analyze the coded responses to identify recurring themes, patterns, and insights. Look for similarities and differences in participants’ responses.
  • Generate Reports and Visualizations: Utilize the reporting features of QuestionPro to create visualizations, charts, and graphs that help communicate the themes and findings from your qualitative research.
  • Interpret and Draw Conclusions: Interpret the themes and patterns you’ve identified in the qualitative data. Consider how these findings answer your research questions or provide insights into your study topic.
  • Integrate with Quantitative Data (if applicable): If you’re also conducting quantitative research using QuestionPro, consider integrating your qualitative findings with quantitative results to provide a more comprehensive understanding.
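A minimal sketch of the export-and-code portion of this workflow: reading a CSV of open-ended responses and tallying keyword-based codes. The column name, codebook, and sample rows are all hypothetical, and an in-memory string stands in for the exported file:

```python
import csv
import io
from collections import Counter

# Stand-in for an exported survey CSV; the column name is hypothetical.
export = io.StringIO(
    "respondent,open_feedback\n"
    "1,Checkout was slow but support was helpful\n"
    "2,Love the new dashboard\n"
)

# Hypothetical keyword-based codebook.
CODES = {"usability": ["slow", "confusing"], "support": ["support", "helpful"]}

counts = Counter()
for row in csv.DictReader(export):
    text = row["open_feedback"].lower()
    for code, cues in CODES.items():
        if any(c in text for c in cues):
            counts[code] += 1

print(counts)
```

The resulting counts feed naturally into the reporting step, where charts of code frequencies communicate the themes you identified.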

Qualitative data analysis is vital in uncovering various human experiences, views, and stories. If you’re ready to transform your research journey and apply the power of qualitative analysis, now is the moment to do it. Book a demo with QuestionPro today and begin your journey of exploration.


Qualitative Data Analysis: Step-by-Step Guide (Manual vs. Automatic)

When we conduct qualitative research, need to explain changes in metrics, or want to understand people's opinions, we turn to qualitative data. Qualitative data is typically generated through:

  • Interview transcripts
  • Surveys with open-ended questions
  • Contact center transcripts
  • Texts and documents
  • Audio and video recordings
  • Observational notes

Compared to quantitative data, which captures structured information, qualitative data is unstructured and has more depth. It can answer our questions, help formulate hypotheses, and build understanding.

It's important to understand the differences between quantitative data and qualitative data. But unfortunately, analyzing qualitative data is difficult. While tools like Excel, Tableau, and Power BI crunch and visualize quantitative data with ease, there are few mainstream tools for analyzing qualitative data. The majority of qualitative data analysis still happens manually.

That said, there are two new trends that are changing this. First, there are advances in natural language processing (NLP) which is focused on understanding human language. Second, there is an explosion of user-friendly software designed for both researchers and businesses. Both help automate the qualitative data analysis process.

In this post we want to teach you how to conduct a successful qualitative data analysis. There are two primary approaches: manual and automatic. We'll guide you through the steps of a manual analysis, then look at how software solutions powered by NLP can automate the process.

More businesses are switching to fully-automated analysis of qualitative customer data because it is cheaper, faster, and just as accurate. Primarily, businesses purchase subscriptions to feedback analytics platforms so that they can understand customer pain points and sentiment.


We’ll take you through 5 steps to conduct a successful qualitative data analysis. Within each step, we will highlight the key differences between the manual and automated approaches. Here's an overview of the steps:

The 5 steps to doing qualitative data analysis

  • Gathering and collecting your qualitative data
  • Organizing and connecting your qualitative data
  • Coding your qualitative data
  • Analyzing the qualitative data for insights
  • Reporting on the insights derived from your analysis

What is Qualitative Data Analysis?

Qualitative data analysis is a process of gathering, structuring and interpreting qualitative data to understand what it represents.

Qualitative data is non-numerical and unstructured. Qualitative data generally refers to text, such as open-ended responses to survey questions or user interviews, but also includes audio, photos and video.

Businesses often perform qualitative data analysis on customer feedback. And within this context, qualitative data generally refers to verbatim text data collected from sources such as reviews, complaints, chat messages, support centre interactions, customer interviews, case notes or social media comments.

How is qualitative data analysis different from quantitative data analysis?

Understanding the differences between quantitative and qualitative data is important. When it comes to analyzing data, Qualitative Data Analysis serves a very different role from Quantitative Data Analysis. But what sets them apart?

Qualitative Data Analysis dives into the stories hidden in non-numerical data such as interviews, open-ended survey answers, or notes from observations. It uncovers the ‘whys’ and ‘hows’ giving a deep understanding of people’s experiences and emotions.

Quantitative Data Analysis, on the other hand, deals with numerical data, using statistics to measure differences, identify preferred options, and pinpoint root causes of issues. It steps back to address questions like "how many" or "what percentage" to offer broad insights we can apply to larger groups.

In short, Qualitative Data Analysis is like a microscope, helping us understand specific detail. Quantitative Data Analysis is like a telescope, giving us a broader perspective. Both are important, working together to decode data for different objectives.

Qualitative Data Analysis methods

Once all the data has been captured, there are a variety of analysis techniques available and the choice is determined by your specific research objectives and the kind of data you’ve gathered.  Common qualitative data analysis methods include:

Content Analysis

This is a popular approach to qualitative data analysis, and other techniques, such as thematic analysis, may fit within its broad scope. Content analysis is used to identify the patterns that emerge from text by grouping content into words, concepts, and themes, and it is useful for quantifying the relationships between grouped content. The Columbia School of Public Health has a detailed breakdown of content analysis.

Narrative Analysis

Narrative analysis focuses on the stories people tell and the language they use to make sense of them. It is particularly useful in qualitative research where customer stories are used to gain a deep understanding of customers’ perspectives on a specific issue. A narrative analysis might, for example, summarize the outcomes of a focused case study.

Discourse Analysis

Discourse analysis is used to get a thorough understanding of the political, cultural and power dynamics that exist in specific situations.  The focus of discourse analysis here is on the way people express themselves in different social contexts. Discourse analysis is commonly used by brand strategists who hope to understand why a group of people feel the way they do about a brand or product.

Thematic Analysis

Thematic analysis is used to deduce the meaning behind the words people use. This is accomplished by discovering repeating themes in text. These meaningful themes reveal key insights into data and can be quantified, particularly when paired with sentiment analysis . Often, the outcome of thematic analysis is a code frame that captures themes in terms of codes, also called categories. So the process of thematic analysis is also referred to as “coding”. A common use-case for thematic analysis in companies is analysis of customer feedback.
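Pairing a code frame with sentiment, as described above, can be sketched as follows. The code frame and sentiment cue words here are hypothetical; production feedback-analytics tools use trained sentiment models rather than word lists:

```python
# Hypothetical code frame and sentiment cue words, for illustration only.
CODE_FRAME = {"delivery": ["late", "delivery", "shipping"],
              "pricing": ["price", "expensive"]}
POSITIVE = {"great", "love", "fast"}
NEGATIVE = {"late", "expensive", "bad"}

def code_with_sentiment(comment):
    """Tag a comment with matching themes from the code frame plus a crude sentiment."""
    words = set(comment.lower().split())
    themes = [t for t, cues in CODE_FRAME.items() if words & set(cues)]
    if words & NEGATIVE:
        sentiment = "negative"
    elif words & POSITIVE:
        sentiment = "positive"
    else:
        sentiment = "neutral"
    return themes, sentiment

print(code_with_sentiment("delivery was late again"))
```

Aggregating these (theme, sentiment) pairs across a feedback dataset is what lets thematic results be quantified, e.g. "40% of negative comments mention delivery."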

Grounded Theory

Grounded theory is a useful approach when little is known about a subject. It starts by formulating a theory around a single data case, which means the theory is "grounded" in actual data rather than being speculative. Additional cases are then examined to see whether they are relevant and can extend the original theory.


Challenges of Qualitative Data Analysis

While Qualitative Data Analysis offers rich insights, it comes with its challenges. Each unique QDA method has its unique hurdles. Let’s take a look at the challenges researchers and analysts might face, depending on the chosen method.

  • Time and Effort (Narrative Analysis): Narrative analysis, which focuses on personal stories, demands patience. Sifting through lengthy narratives to find meaningful insights can be time-consuming and requires dedicated effort.
  • Being Objective (Grounded Theory): Grounded theory, building theories from data, faces the challenges of personal biases. Staying objective while interpreting data is crucial, ensuring conclusions are rooted in the data itself.
  • Complexity (Thematic Analysis): Thematic analysis involves identifying themes within data, a process that can be intricate. Categorizing and understanding themes can be complex, especially when each piece of data varies in context and structure. Thematic Analysis software can simplify this process.
  • Generalizing Findings (Narrative Analysis): Narrative analysis, dealing with individual stories, makes drawing broad challenging. Extending findings from a single narrative to a broader context requires careful consideration.
  • Managing Data (Thematic Analysis): Thematic analysis involves organizing and managing vast amounts of unstructured data, like interview transcripts. Managing this can be a hefty task, requiring effective data management strategies.
  • Skill Level (Grounded Theory): Grounded theory demands specific skills to build theories from the ground up. Finding or training analysts with these skills poses a challenge, requiring investment in building expertise.

Benefits of qualitative data analysis

Qualitative Data Analysis (QDA) is like a versatile toolkit, offering a tailored approach to understanding your data. The benefits it offers are as diverse as the methods. Let’s explore why choosing the right method matters.

  • Tailored Methods for Specific Needs: QDA isn't one-size-fits-all. Depending on your research objectives and the type of data at hand, different methods offer unique benefits. If you want emotive customer stories, narrative analysis paints a strong picture. When you want to explain a score, thematic analysis reveals insightful patterns.
  • Flexibility with Thematic Analysis: Thematic analysis is like a chameleon in the toolkit of QDA. It adapts well to different types of data and research objectives, making it a top choice for almost any qualitative analysis.
  • Deeper Understanding, Better Products: QDA helps you dive into people's thoughts and feelings. This deep understanding helps you build products and services that truly match what people want, ensuring satisfied customers.
  • Finding the Unexpected: Qualitative data often reveals surprises that quantitative data misses. QDA offers new ideas and perspectives, surfacing insights you might otherwise overlook.
  • Building Effective Strategies: Insights from QDA are like strategic guides. They help businesses craft plans that match people's desires.
  • Creating Genuine Connections: Understanding people's experiences lets businesses connect on a real level. This genuine connection helps build trust and loyalty, priceless for any business.

How to do Qualitative Data Analysis: 5 steps

Now we are going to show how you can do your own qualitative data analysis. We will guide you through this process step by step. As mentioned earlier, you will learn how to do qualitative data analysis manually, and also automatically using modern qualitative data and thematic analysis software.

To get the best value from the analysis and research process, it’s important to be very clear about the nature and scope of the question that’s being researched. This will help you select the data collection channels that are most likely to help you answer your question.

Depending on whether you are a business looking to understand customer sentiment or an academic surveying a school, your approach to qualitative data analysis will be unique.

Once you’re clear, there’s a sequence to follow. And, though there are differences in the manual and automatic approaches, the process steps are mostly the same.

The use case for our step-by-step guide is a company looking to collect customer feedback data and analyze it in order to improve their customer experience. By analyzing the feedback, the company derives insights about their business and their customers. You can follow these same steps regardless of the nature of your research. Let’s get started.

Step 1: Gather your qualitative data and conduct research (Conduct qualitative research)

The first step of qualitative research is data collection. Put simply, data collection is gathering all of your data for analysis. A common situation is when qualitative data is spread across various sources.

Classic methods of gathering qualitative data

Most companies use traditional methods for gathering qualitative data: conducting interviews with research participants, running surveys, and running focus groups. This data is typically stored in documents, CRMs, databases and knowledge bases. It’s important to examine which data is available and needs to be included in your research project, based on its scope.

Using your existing qualitative feedback

As it becomes easier for customers to engage across a range of different channels, companies are gathering increasingly large amounts of both solicited and unsolicited qualitative feedback.

Most organizations have now invested in Voice of Customer programs , support ticketing systems, chatbot and support conversations, emails and even customer Slack chats.

These new channels provide companies with new ways of getting feedback, and also allow the collection of unstructured feedback data at scale.

The great thing about this data is that it contains a wealth of valuable insights and that it’s already there! When you have a new question about user behavior or your customers, you don’t need to create a new research study or set up a focus group. You can find most answers in the data you already have.

Typically, this data is stored in third-party solutions or a central database, but there are ways to export it or connect to a feedback analysis solution through integrations or an API.

Utilize untapped qualitative data channels

There are many online qualitative data sources you may not have considered. For example, you can find useful qualitative data in social media channels like Twitter or Facebook. Online forums, review sites, and online communities such as Discourse or Reddit also contain valuable data about your customers, or research questions.

If you are considering performing a qualitative benchmark analysis against competitors - the internet is your best friend, and review analysis is a great place to start. Gathering feedback in competitor reviews on sites like Trustpilot, G2, Capterra, Better Business Bureau or on app stores is a great way to perform a competitor benchmark analysis.

Customer feedback analysis software often has integrations into social media and review sites, or you could use a solution like DataMiner to scrape the reviews.

G2.com reviews of the product Airtable. You could pull reviews from G2 for your analysis.

Step 2: Connect & organize all your qualitative data

Now you have all this qualitative data, but there’s a problem: the data is unstructured. Before feedback can be analyzed and assigned any value, it needs to be organized in a single place. Why is this important? Consistency!

If all data is easily accessible in one place and analyzed in a consistent manner, you will have an easier time summarizing and making decisions based on this data.

The manual approach to organizing your data

The classic method of structuring qualitative data is to plot all the raw data you’ve gathered into a spreadsheet.

Typically, research and support teams would share large Excel sheets and different business units would make sense of the qualitative feedback data on their own. Each team collects and organizes the data in a way that best suits them, which means the feedback tends to be kept in separate silos.

An alternative and a more robust solution is to store feedback in a central database, like Snowflake or Amazon Redshift .

Keep in mind that when you organize your data in this way, you are often preparing it to be imported into another software. If you go the route of a database, you would need to use an API to push the feedback into a third-party software.

Computer-assisted qualitative data analysis software (CAQDAS)

Traditionally within the manual analysis approach (but not always), qualitative data is imported into CAQDAS software for coding.

In the early 2000s, CAQDAS software was popularised by developers such as ATLAS.ti, NVivo and MAXQDA and eagerly adopted by researchers to assist with the organizing and coding of data.  

The benefits of using computer-assisted qualitative data analysis software:

  • Assists in organizing your data
  • Opens you up to exploring different interpretations of your data analysis
  • Allows you to share your dataset more easily and enables group collaboration (including secondary analysis)

However, you still need to code the data, uncover the themes and do the analysis yourself. Therefore, it is still a manual approach.

The user interface of CAQDAS software 'NVivo'

Organizing your qualitative data in a feedback repository

Another solution to organizing your qualitative data is to upload it into a feedback repository, where it can be unified with your other data and easily searched and tagged. There are a number of software solutions that act as a central repository for your qualitative research data. Here are a couple of solutions that you could investigate:

  • Dovetail: Dovetail is a research repository with a focus on video and audio transcriptions. You can tag your transcriptions within the platform for theme analysis. You can also upload your other qualitative data such as research reports, survey responses, support conversations (conversational analytics), and customer interviews. Dovetail acts as a single, searchable repository and makes it easier to collaborate with other people around your qualitative research.
  • EnjoyHQ: EnjoyHQ is another research repository with similar functionality to Dovetail. It boasts a more sophisticated search engine, but it has a higher starting subscription cost.

Organizing your qualitative data in a feedback analytics platform

If you have a lot of qualitative customer or employee feedback, from the likes of customer surveys or employee surveys, you will benefit from a feedback analytics platform. A feedback analytics platform is software that automates the process of both sentiment analysis and thematic analysis. Companies use the integrations offered by these platforms to directly tap into their qualitative data sources (review sites, social media, survey responses, etc.). The data collected is then organized and analyzed consistently within the platform.

If you have data prepared in a spreadsheet, it can also be imported into feedback analytics platforms.

Once all this rich data has been organized within the feedback analytics platform, it is ready to be coded and themed, within the same platform. Thematic is a feedback analytics platform that offers one of the largest libraries of integrations with qualitative data sources.

Some of qualitative data integrations offered by Thematic

Step 3: Coding your qualitative data

Your feedback data is now organized in one place. Either within your spreadsheet, CAQDAS, feedback repository or within your feedback analytics platform. The next step is to code your feedback data so we can extract meaningful insights in the next step.

Coding is the process of labelling and organizing your data in such a way that you can then identify themes in the data, and the relationships between these themes.

To simplify the coding process, you will take small samples of your customer feedback data, come up with a set of codes, or categories capturing themes, and label each piece of feedback, systematically, for patterns and meaning. Then you will take a larger sample of data, revising and refining the codes for greater accuracy and consistency as you go.

If you choose to use a feedback analytics platform, much of this process will be automated and accomplished for you.

The terms used to describe different categories of meaning (‘theme’, ‘code’, ‘tag’, ‘category’, etc.) can be confusing, as they are often used interchangeably. For clarity, this article will use the term ‘code’.

To code means to identify key words or phrases and assign them to a category of meaning. “I really hate the customer service of this computer software company” would be coded as “poor customer service”.

How to manually code your qualitative data

  1. Decide whether you will use deductive or inductive coding. Deductive coding is when you create a list of predefined codes and then assign them to the qualitative data. Inductive coding is the opposite: you create codes based on the data itself, labelling them as you go. You need to weigh up the pros and cons of each coding method and select the most appropriate.
  2. Read through the feedback data to get a broad sense of what it reveals. Now it’s time to start assigning your first set of codes to statements and sections of text.
  3. Keep repeating step 2, adding new codes and revising the code descriptions as often as necessary. Once it has all been coded, go through everything again to be sure there are no inconsistencies and that nothing has been overlooked.
  4. Create a code frame to group your codes. The coding frame is the organizational structure of all your codes. There are two commonly used types of coding frames: flat and hierarchical. A hierarchical code frame will make it easier for you to derive insights from your analysis.
  5. Based on the number of times a particular code occurs, you can now see the common themes in your feedback data. This is insightful! If ‘bad customer service’ is a common code, it’s time to take action.

We have a detailed guide dedicated to manually coding your qualitative data .

Example of a hierarchical coding frame in qualitative data analysis

Using software to speed up manual coding of qualitative data

An Excel spreadsheet is still a popular method for coding. But various software solutions can help speed up this process. Here are some examples.

  • CAQDAS / NVivo - CAQDAS software has built-in functionality that allows you to code text within their software. You may find the interface the software offers easier for managing codes than a spreadsheet.
  • Dovetail/EnjoyHQ - You can tag transcripts and other textual data within these solutions. As they are also repositories you may find it simpler to keep the coding in one platform.
  • IBM SPSS - SPSS is a statistical analysis software that may make coding easier than in a spreadsheet.
  • Ascribe - Ascribe’s ‘Coder’ is a coding management system. Its user interface will make it easier for you to manage your codes.

Automating the qualitative coding process using thematic analysis software

In solutions which speed up the manual coding process, you still have to come up with valid codes and often apply codes manually to pieces of feedback. But there are also solutions that automate both the discovery and the application of codes.

Advances in machine learning have now made it possible to read, code and structure qualitative data automatically. This type of automated coding is offered by thematic analysis software .

Automation makes it far simpler and faster to code the feedback and group it into themes. By incorporating natural language processing (NLP) into the software, the AI looks across sentences and phrases to identify common themes and meaningful statements. Some automated solutions detect repeating patterns and assign codes to them; others make you train the AI by providing examples. You could say that the AI learns the meaning of the feedback on its own.
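To illustrate the “discovery” side of automated coding, here is a toy frequency-based sketch that surfaces recurring phrases as candidate themes. This is only an illustration of the idea; production tools use far more sophisticated NLP models:

```python
from collections import Counter
import re

def candidate_themes(responses, min_count=2):
    """Surface recurring two-word phrases as candidate themes.
    A frequency-based sketch, not how any specific product's NLP works."""
    stopwords = {"the", "a", "an", "is", "was", "and", "to", "of", "it", "i"}
    bigrams = Counter()
    for text in responses:
        words = [w for w in re.findall(r"[a-z']+", text.lower())
                 if w not in stopwords]
        bigrams.update(zip(words, words[1:]))
    return [" ".join(b) for b, n in bigrams.most_common() if n >= min_count]

responses = [
    "The delivery time was far too long",
    "Very slow delivery time, not happy",
    "Delivery time needs to improve",
]
print(candidate_themes(responses))  # ['delivery time']
```

The phrase “delivery time” recurs across responses, so it emerges as a theme without anyone defining it in advance.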

Thematic automates the coding of qualitative feedback regardless of source. There’s no need to set up themes or categories in advance. Simply upload your data and wait a few minutes. You can also manually edit the codes to further refine their accuracy.  Experiments conducted indicate that Thematic’s automated coding is just as accurate as manual coding .

Paired with sentiment analysis and advanced text analytics - these automated solutions become powerful for deriving quality business or research insights.

You could also build your own, if you have the resources!

The key benefits of using an automated coding solution

Automated analysis can often be set up fast and there’s the potential to uncover things that would never have been revealed if you had given the software a prescribed list of themes to look for.

Because the model applies a consistent rule to the data, it captures phrases or statements that a human eye might have missed.

Complete and consistent analysis of customer feedback enables more meaningful findings. Leading us into step 4.

Step 4: Analyze your data: Find meaningful insights

Now we are going to analyze our data to find insights. This is where we start to answer our research questions. Keep in mind that step 4 and step 5 (tell the story) have some overlap. This is because creating visualizations is both part of the analysis process and part of reporting.

The task of uncovering insights is to scour through the codes that emerge from the data and draw meaningful correlations from them. It is also about making sure each insight is distinct and has enough data to support it.

Part of the analysis is to establish how much each code relates to different demographics and customer profiles, and identify whether there’s any relationship between these data points.

Manually create sub-codes to improve the quality of insights

If your code frame only has one level, you may find that your codes are too broad to be able to extract meaningful insights. This is where it is valuable to create sub-codes to your primary codes. This process is sometimes referred to as meta coding.

Note: If you take an inductive coding approach, you can create sub-codes as you are reading through your feedback data and coding it.

While time-consuming, this exercise will improve the quality of your analysis. Here is an example of what sub-codes could look like.

Example of sub-codes

You need to carefully read your qualitative data to create quality sub-codes. But as you can see, the depth of analysis is greatly improved. By calculating the frequency of these sub-codes you can get insight into which  customer service problems you can immediately address.
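Once feedback is coded with codes and sub-codes, the sub-code frequencies are easy to tabulate. A minimal sketch, using hypothetical (code, sub-code) pairs:

```python
from collections import Counter

# Hypothetical coded feedback: each item carries a (code, sub-code) pair.
coded_feedback = [
    ("customer service", "long wait times"),
    ("customer service", "unhelpful answers"),
    ("customer service", "long wait times"),
    ("billing", "unexpected charges"),
]

# Count sub-codes within each top-level code: the most frequent sub-code
# points to the specific problem to address first.
by_code = {}
for code, sub_code in coded_feedback:
    by_code.setdefault(code, Counter())[sub_code] += 1

for code, subs in by_code.items():
    print(code, dict(subs))
```

Here “long wait times” appears twice under “customer service”, so that is the first problem to tackle.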

Correlate the frequency of codes to customer segments

Many businesses use customer segmentation . And you may have your own respondent segments that you can apply to your qualitative analysis. Segmentation is the practice of dividing customers or research respondents into subgroups.

Segments can be based on:

  • Demographics
  • Any other data type that you care to segment by

It is particularly useful to see the occurrence of codes within your segments. If one of your customer segments is considered unimportant to your business, but they are the cause of nearly all customer service complaints, it may be in your best interest to focus attention elsewhere. This is a useful insight!
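Cross-tabulating code occurrence by segment is straightforward once both fields are attached to each piece of feedback. A minimal sketch with hypothetical records:

```python
from collections import Counter

# Hypothetical records: coded feedback tagged with a customer segment.
records = [
    {"segment": "enterprise", "code": "poor customer service"},
    {"segment": "free tier", "code": "poor customer service"},
    {"segment": "free tier", "code": "poor customer service"},
    {"segment": "enterprise", "code": "missing features"},
]

# Count how often each (segment, code) pair occurs.
crosstab = Counter((r["segment"], r["code"]) for r in records)
print(crosstab[("free tier", "poor customer service")])  # 2
```

A quick scan of the counts shows which segments drive which complaints, which is exactly the insight described above.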

Manually visualizing coded qualitative data

There are formulas you can use to visualize key insights in your data. The formulas below are especially useful if you are measuring a score alongside your feedback.

If you are collecting a metric alongside your qualitative data, this is a key visualization. Impact answers the question: “What’s the impact of a code on my overall score?”. Using Net Promoter Score (NPS) as an example, first you need to:

  • Calculate overall NPS (A)
  • Calculate NPS in the subset of responses that do not contain that code (B)
  • Subtract B from A

Then you can use this simple formula to calculate code impact on NPS.

Visualizing qualitative data: Calculating the impact of a code on your score
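The calculation can be sketched in a few lines of Python. The scores and codes below are hypothetical; impact is overall NPS (A) minus the NPS of responses without the code (B):

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(s >= 9 for s in scores)
    detractors = sum(s <= 6 for s in scores)
    return 100 * (promoters - detractors) / len(scores)

# Hypothetical responses: (score, codes assigned to the comment)
responses = [
    (10, {"great onboarding"}),
    (9, set()),
    (3, {"poor customer service"}),
    (2, {"poor customer service"}),
    (8, set()),
]

def code_impact(responses, code):
    overall = nps([s for s, _ in responses])                           # A
    without = nps([s for s, codes in responses if code not in codes])  # B
    return round(overall - without, 1)                                 # A - B

print(code_impact(responses, "poor customer service"))  # -66.7
```

The large negative value indicates this code is dragging the overall score down.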

You can then visualize this data using a bar chart.

You can download our CX toolkit - it includes a template to recreate this.

Trends over time

This analysis can help you answer questions like: “Which codes are linked to decreases or increases in my score over time?”

We need to compare two sequences of numbers: NPS over time and code frequency over time. Using Excel, calculate the correlation between the two sequences, which can be either positive (the more often the code occurs, the higher the NPS, see picture below) or negative (the more often the code occurs, the lower the NPS).

Now you need to plot code frequency against the absolute value of code correlation with NPS. Here is the formula:

Analyzing qualitative data: Calculate which codes are linked to increases or decreases in my score
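The same correlation can be computed outside Excel. Here is a small self-contained Pearson correlation over hypothetical monthly figures; you would then plot code frequency against the absolute value of this result, as described above:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation between two equal-length number sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical monthly series: overall NPS and mentions of one code
nps_by_month = [42, 38, 35, 30, 28, 25]
code_frequency = [5, 9, 12, 15, 18, 22]   # e.g. "slow delivery" mentions

r = pearson(nps_by_month, code_frequency)
print(round(r, 2))  # -0.99: the more mentions, the lower the NPS
```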

The visualization could look like this:

Visualizing qualitative data trends over time

These are two examples, but there are more. For a third manual formula, and to learn why word clouds are not an insightful form of analysis, read our visualizations article .

Using a text analytics solution to automate analysis

Automated text analytics solutions enable codes and sub-codes to be pulled out of the data automatically. This makes it far faster and easier to identify what’s driving negative or positive results. And to pick up emerging trends and find all manner of rich insights in the data.

Another benefit of AI-driven text analytics software is its built-in capability for sentiment analysis, which provides the emotive context behind your feedback and other qualitative text data.

Thematic provides text analytics that goes further by allowing users to apply their expertise on business context to edit or augment the AI-generated outputs.

Since the move away from manual research is generally about reducing the human element, adding human input to the technology might sound counter-intuitive. However, this is mostly to make sure important business nuances in the feedback aren’t missed during coding. The result is a higher accuracy of analysis. This is sometimes referred to as augmented intelligence .

Codes displayed by volume within Thematic. You can 'manage themes' to introduce human input.

Step 5: Report on your data: Tell the story

The last step of analyzing your qualitative data is to report on it, to tell the story. At this point, the codes are fully developed and the focus is on communicating the narrative to the audience.

A coherent outline of the qualitative research, the findings and the insights is vital for stakeholders to discuss and debate before they can devise a meaningful course of action.

Creating graphs and reporting in Powerpoint

Typically, qualitative researchers take the tried and tested approach of distilling their report into a series of charts, tables and other visuals which are woven into a narrative for presentation in Powerpoint.

Using visualization software for reporting

With data transformation and APIs, the analyzed data can be shared with data visualisation software, such as Power BI, Tableau, Google Data Studio or Looker. Power BI and Tableau are among the most preferred options.

Visualizing your insights inside a feedback analytics platform

Feedback analytics platforms, like Thematic, incorporate visualisation tools that intuitively turn key data and insights into graphs. This removes the time-consuming work of constructing charts to visually identify patterns and creates more time to focus on building a compelling narrative that highlights the insights, in bite-size chunks, for executive teams to review.

Using a feedback analytics platform with visualization tools means you don’t have to use a separate product for visualizations. You can export graphs into Powerpoints straight from the platforms.

Two examples of qualitative data visualizations within Thematic

Conclusion - Manual or Automated?

There are those who remain deeply invested in the manual approach - because it’s familiar, because they’re reluctant to spend money and time learning new software, or because they’ve been burned by the overpromises of AI.  

For projects that involve small datasets, manual analysis makes sense. For example, if the objective is simply to quantify a simple question like “Do customers prefer X concepts to Y?”, and the findings are being extracted from a small set of focus groups and interviews, sometimes it’s easier to just read them.

However, as new generations come into the workplace, it’s technology-driven solutions that feel more comfortable and practical. And the merits are undeniable.  Especially if the objective is to go deeper and understand the ‘why’ behind customers’ preference for X or Y. And even more especially if time and money are considerations.

The ability to collect a free flow of qualitative feedback data at the same time as the metric means AI can cost-effectively scan, crunch, score and analyze a ton of feedback from one system in one go. And time-intensive processes like focus groups, or coding, that used to take weeks, can now be completed in a matter of hours or days.

But aside from the ever-present business case to speed things up and keep costs down, there are also powerful research imperatives for automated analysis of qualitative data: namely, accuracy and consistency.

Finding insights hidden in feedback requires consistency, especially in coding.  Not to mention catching all the ‘unknown unknowns’ that can skew research findings and steering clear of cognitive bias.

Some say that without manual data analysis researchers won’t get an accurate “feel” for the insights. However, the larger data sets are, the harder it is to sort through and organize feedback that has been pulled from different places. And the more difficult it is to stay on course, the greater the risk of drawing incorrect or incomplete conclusions.

Though the process steps for qualitative data analysis have remained pretty much unchanged since psychologist Paul Felix Lazarsfeld paved the path a hundred years ago, the impact digital technology has had on types of qualitative feedback data and the approach to the analysis are profound.  

If you want to try an automated feedback analysis solution on your own qualitative data, you can get started with Thematic .


Qualitative Data Analysis Methods 101:

The “big 6” methods + examples.

By: Kerryn Warren (PhD) | Reviewed By: Eunice Rautenbach (D.Tech) | May 2020 (Updated April 2023)

Qualitative data analysis methods. Wow, that’s a mouthful. 

If you’re new to the world of research, qualitative data analysis can look rather intimidating. So much bulky terminology and so many abstract, fluffy concepts. It certainly can be a minefield!

Don’t worry – in this post, we’ll unpack the most popular analysis methods , one at a time, so that you can approach your analysis with confidence and competence – whether that’s for a dissertation, thesis or really any kind of research project.

Qualitative data analysis methods

What (exactly) is qualitative data analysis?

To understand qualitative data analysis, we need to first understand qualitative data – so let’s step back and ask the question, “what exactly is qualitative data?”.

Qualitative data refers to pretty much any data that’s “not numbers”. In other words, it’s not the stuff you measure using a fixed scale or complex equipment, nor do you analyse it using complex statistics or mathematics.

So, if it’s not numbers, what is it?

Words, you guessed? Well… sometimes , yes. Qualitative data can, and often does, take the form of interview transcripts, documents and open-ended survey responses – but it can also involve the interpretation of images and videos. In other words, qualitative isn’t just limited to text-based data.

So, how’s that different from quantitative data, you ask?

Simply put, qualitative research focuses on words, descriptions, concepts or ideas – while quantitative research focuses on numbers and statistics. Qualitative research investigates the “softer side” of things to explore and describe, while quantitative research focuses on the “hard numbers”, to measure differences between variables and the relationships between them. If you’re keen to learn more about the differences between qual and quant, we’ve got a detailed post over here.

qualitative data analysis vs quantitative data analysis

So, qualitative analysis is easier than quantitative, right?

Not quite. In many ways, qualitative data can be challenging and time-consuming to analyse and interpret. At the end of your data collection phase (which itself takes a lot of time), you’ll likely have many pages of text-based data or hours upon hours of audio to work through. You might also have subtle nuances of interactions or discussions that have danced around in your mind, or that you scribbled down in messy field notes. All of this needs to work its way into your analysis.

Making sense of all of this is no small task and you shouldn’t underestimate it. Long story short – qualitative analysis can be a lot of work! Of course, quantitative analysis is no piece of cake either, but it’s important to recognise that qualitative analysis still requires a significant investment in terms of time and effort.


In this post, we’ll explore qualitative data analysis by looking at some of the most common analysis methods we encounter. We’re not going to cover every possible qualitative method and we’re not going to go into heavy detail – we’re just going to give you the big picture. That said, we will of course include links to loads of extra resources so that you can learn more about whichever analysis method interests you.

Without further delay, let’s get into it.

The “Big 6” Qualitative Analysis Methods 

There are many different types of qualitative data analysis, all of which serve different purposes and have unique strengths and weaknesses . We’ll start by outlining the analysis methods and then we’ll dive into the details for each.

The 6 most popular methods (or at least the ones we see at Grad Coach) are:

  • Content analysis
  • Narrative analysis
  • Discourse analysis
  • Thematic analysis
  • Grounded theory (GT)
  • Interpretive phenomenological analysis (IPA)

Let’s take a look at each of them…

QDA Method #1: Qualitative Content Analysis

Content analysis is possibly the most common and straightforward QDA method. At the simplest level, content analysis is used to evaluate patterns within a piece of content (for example, words, phrases or images) or across multiple pieces of content or sources of communication. For example, a collection of newspaper articles or political speeches.

With content analysis, you could, for instance, identify the frequency with which an idea is shared or spoken about – like the number of times a Kardashian is mentioned on Twitter. Or you could identify patterns of deeper underlying interpretations – for instance, by identifying phrases or words in tourist pamphlets that highlight India as an ancient country.

Because content analysis can be used in such a wide variety of ways, it’s important to go into your analysis with a very specific question and goal, or you’ll get lost in the fog. With content analysis, you’ll group large amounts of text into codes, summarise these into categories, and possibly even tabulate the data to calculate the frequency of certain concepts or variables. Because of this, content analysis provides a small splash of quantitative thinking within a qualitative method.
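To make the coding-and-tabulating idea concrete, here’s a minimal Python sketch. Everything in it – the pamphlet snippets, the codes and their keyword lists – is invented for illustration, and real content analysis relies on careful interpretive coding rather than simple keyword matching.

```python
from collections import Counter
import re

# Hypothetical mini-corpus of tourist pamphlet snippets (invented for illustration)
documents = [
    "Discover India, an ancient land of timeless heritage.",
    "Explore ancient temples and centuries-old traditions.",
    "A heritage walk through one of the world's oldest civilisations.",
]

# A simple coding frame: each code maps to keywords that signal it
coding_frame = {
    "antiquity": {"ancient", "timeless", "centuries-old", "oldest"},
    "heritage": {"heritage", "traditions", "temples"},
}

# Tabulate how often each code appears across the corpus
counts = Counter()
for doc in documents:
    words = set(re.findall(r"[a-z\-]+", doc.lower()))
    for code, keywords in coding_frame.items():
        counts[code] += len(words & keywords)

print(dict(counts))  # frequency of each code across all documents
```

In practice, a researcher would assign codes by reading each text closely; the tabulation step at the end is where the "small splash of quantitative thinking" comes in.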

Naturally, while content analysis is widely useful, it’s not without its drawbacks. One of the main issues with content analysis is that it can be very time-consuming, as it requires lots of reading and re-reading of the texts. Also, because of its multidimensional focus on both qualitative and quantitative aspects, it is sometimes accused of losing important nuances in communication.

Content analysis also tends to concentrate on a very specific timeline and doesn’t take into account what happened before or after that timeline. This isn’t necessarily a bad thing though – just something to be aware of. So, keep these factors in mind if you’re considering content analysis. Every analysis method has its limitations, so don’t be put off by these – just be aware of them!

QDA Method #2: Narrative Analysis 

As the name suggests, narrative analysis is all about listening to people telling stories and analysing what that means . Since stories serve a functional purpose of helping us make sense of the world, we can gain insights into the ways that people deal with and make sense of reality by analysing their stories and the ways they’re told.

You could, for example, use narrative analysis to explore whether how something is said is important. For instance, the narrative of a prisoner trying to justify their crime could provide insight into their view of the world and the justice system. Similarly, analysing the ways entrepreneurs talk about the struggles in their careers, or cancer patients telling stories of hope, could provide powerful insights into their mindsets and perspectives. Simply put, narrative analysis is about paying attention to the stories that people tell – and more importantly, the way they tell them.

Of course, the narrative approach has its weaknesses, too. Sample sizes are generally quite small due to the time-consuming process of capturing narratives. Because of this, along with the multitude of social and lifestyle factors which can influence a subject, narrative analysis can be quite difficult to reproduce in subsequent research. This means that it’s difficult to test the findings of such research.

Similarly, researcher bias can have a strong influence on the results here, so you need to be particularly careful about the potential biases you can bring into your analysis when using this method. Nevertheless, narrative analysis is still a very useful qualitative analysis method – just keep these limitations in mind and be careful not to draw broad conclusions.


QDA Method #3: Discourse Analysis 

Discourse is simply a fancy word for written or spoken language or debate. So, discourse analysis is all about analysing language within its social context – in other words, analysing language (such as a conversation or a speech) within the culture and society in which it takes place. For example, you could analyse how a janitor speaks to a CEO, or how politicians speak about terrorism.

To truly understand these conversations or speeches, the culture and history of those involved in the communication are important factors to consider. For example, a janitor might speak more casually with a CEO in a company that emphasises equality among workers. Similarly, a politician might speak more about terrorism if there was a recent terrorist incident in the country.

So, as you can see, by using discourse analysis, you can identify how culture, history or power dynamics (to name a few) affect the way concepts are spoken about. If your research aims and objectives involve understanding culture or power dynamics, discourse analysis can be a powerful method.

Because there are so many social influences on how we speak to each other, the potential uses of discourse analysis are vast. Of course, this also means it’s important to have a very specific research question (or questions) in mind when analysing your data and looking for patterns and themes, or you might end up going down a winding rabbit hole.

Discourse analysis can also be very time-consuming, as you need to sample the data to the point of saturation – in other words, until no new information and insights emerge. But this is, of course, part of what makes discourse analysis such a powerful technique. So, keep these factors in mind when considering this QDA method.
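The idea of sampling to saturation can be illustrated with a small sketch: keep coding batches of data and stop once a batch contributes no new codes. The transcripts and the keyword-based “coding” below are invented stand-ins for real interpretive coding, which of course involves far more judgement than matching words.

```python
# Hypothetical "coding" step: one code per distinctive keyword found
def extract_codes(transcript):
    keywords = {"power", "equality", "fear", "security"}  # invented codes
    return {word for word in transcript.lower().split() if word in keywords}

# Invented interview snippets, grouped into sampling batches
batches = [
    ["We talk a lot about power here.", "There is real equality between roles."],
    ["People mention fear after the incident.", "Security dominates every speech."],
    ["Power and security come up again.", "Equality is mentioned once more."],
]

seen_codes = set()
for i, batch in enumerate(batches, start=1):
    new_codes = set()
    for transcript in batch:
        new_codes |= extract_codes(transcript) - seen_codes
    if not new_codes:  # nothing new emerged from this batch: saturation reached
        print(f"Saturation reached at batch {i}")
        break
    seen_codes |= new_codes
```

Here, the third batch repeats codes already seen, so sampling stops – that is the saturation criterion in miniature.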

QDA Method #4: Thematic Analysis

Thematic analysis looks at patterns of meaning in a data set – for example, a set of interviews or focus group transcripts. But what exactly does that… mean? Well, thematic analysis takes bodies of data (which are often quite large) and groups them according to similarities – in other words, themes. These themes help us make sense of the content and derive meaning from it.

Let’s take a look at an example.

With thematic analysis, you could analyse 100 online reviews of a popular sushi restaurant to find out what patrons think about the place. By reviewing the data, you would then identify the themes that crop up repeatedly within the data – for example, “fresh ingredients” or “friendly wait staff”.
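As a toy illustration of that review example, here’s a minimal Python sketch that assigns codes to reviews and rolls the codes up into themes. The reviews, keywords and theme groupings are all invented, and the keyword matching merely stands in for the researcher’s interpretive coding.

```python
# Hypothetical restaurant reviews (invented for illustration)
reviews = [
    "The fish was incredibly fresh and the staff were so friendly.",
    "Friendly waiters, but the rice was a little dry.",
    "Fresh ingredients every time. The service felt warm and welcoming.",
]

# Step 1: assign codes to each review using simple keyword matching
code_keywords = {
    "fresh_food": ["fresh"],
    "friendly_staff": ["friendly", "welcoming", "warm"],
    "food_quality_issue": ["dry", "stale"],
}

coded = []
for review in reviews:
    text = review.lower()
    codes = {code for code, kws in code_keywords.items()
             if any(kw in text for kw in kws)}
    coded.append(codes)

# Step 2: group related codes into broader themes
themes = {
    "fresh ingredients": {"fresh_food"},
    "friendly wait staff": {"friendly_staff"},
}

# Count how many reviews touch on each theme
theme_counts = {theme: sum(bool(codes & members) for codes in coded)
                for theme, members in themes.items()}
print(theme_counts)
```

The output would tell you how many of the reviews touch on each theme – the kind of summary a thematic analysis might end with.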

So, as you can see, thematic analysis can be pretty useful for finding out about people’s experiences, views, and opinions. Therefore, if your research aims and objectives involve understanding people’s experiences or views of something, thematic analysis can be a great choice.

Since thematic analysis is a bit of an exploratory process, it’s not unusual for your research questions to develop, or even change, as you progress through the analysis. While this is somewhat natural in exploratory research, it can also be seen as a disadvantage, as it means that the data needs to be re-reviewed each time a research question is adjusted. In other words, thematic analysis can be quite time-consuming – but for good reason. So, keep this in mind if you choose to use thematic analysis for your project, and budget extra time for unexpected adjustments.


QDA Method #5: Grounded theory (GT) 

Grounded theory is a powerful qualitative analysis method where the intention is to create a new theory (or theories) using the data at hand, through a series of “tests” and “revisions”. Strictly speaking, GT is more of a research design than an analysis method, but we’ve included it here as it’s often referred to as a method.

What’s most important with grounded theory is that you go into the analysis with an open mind and let the data speak for itself – rather than dragging existing hypotheses or theories into your analysis. In other words, your analysis must develop from the ground up (hence the name). 

Let’s look at an example of GT in action.

Assume you’re interested in developing a theory about what factors influence students to watch a YouTube video about qualitative analysis. Using grounded theory, you’d start with this general overarching question about the given population (i.e., graduate students). First, you’d approach a small sample – for example, five graduate students in a department at a university. Ideally, this sample would be reasonably representative of the broader population. You’d interview these students to identify what factors lead them to watch the video.

After analysing the interview data, a general pattern could emerge. For example, you might notice that graduate students are more likely to watch a video about qualitative methods if they are just starting on their dissertation journey, or if they have an upcoming test about research methods.

From here, you’ll look for another small sample – for example, five more graduate students in a different department – and see whether this pattern holds true for them. If not, you’ll look for commonalities and adapt your theory accordingly. As this process continues, the theory would develop. As we mentioned earlier, what’s important with grounded theory is that the theory develops from the data – not from some preconceived idea.

So, what are the drawbacks of grounded theory? Well, some argue that there’s a tricky circularity to grounded theory. For it to work, in principle, you should know as little as possible regarding the research question and population, so that you reduce the bias in your interpretation. However, in many circumstances, it’s also thought to be unwise to approach a research question without knowledge of the current literature . In other words, it’s a bit of a “chicken or the egg” situation.

Regardless, grounded theory remains a popular (and powerful) option. Naturally, it’s a very useful method when you’re researching a topic that is completely new or has very little existing research about it, as it allows you to start from scratch and work your way from the ground up.


QDA Method #6: Interpretive Phenomenological Analysis (IPA)

Interpretive. Phenomenological. Analysis. IPA . Try saying that three times fast…

Let’s just stick with IPA, okay?

IPA is designed to help you understand the personal experiences of a subject (for example, a person or group of people) concerning a major life event, an experience or a situation. This event or experience is the “phenomenon” that makes up the “P” in IPA. Such phenomena may range from relatively common events – such as motherhood, or being involved in a car accident – to those which are extremely rare – for example, someone’s personal experience in a refugee camp. So, IPA is a great choice if your research involves analysing people’s personal experiences of something that happened to them.

It’s important to remember that IPA is subject-centred. In other words, it’s focused on the experiencer. This means that, while you’ll likely use a coding system to identify commonalities, it’s important not to lose the depth of experience or meaning by trying to reduce everything to codes. Also, keep in mind that since your sample size will generally be very small with IPA, you often won’t be able to draw broad conclusions about the generalisability of your findings. But that’s okay, as long as it aligns with your research aims and objectives.

Another thing to be aware of with IPA is personal bias . While researcher bias can creep into all forms of research, self-awareness is critically important with IPA, as it can have a major impact on the results. For example, a researcher who was a victim of a crime himself could insert his own feelings of frustration and anger into the way he interprets the experience of someone who was kidnapped. So, if you’re going to undertake IPA, you need to be very self-aware or you could muddy the analysis.


How to choose the right analysis method

In light of all of the qualitative analysis methods we’ve covered so far, you’re probably asking yourself the question, “How do I choose the right one?”

Much like all the other methodological decisions you’ll need to make, selecting the right qualitative analysis method largely depends on your research aims, objectives and questions. In other words, the best tool for the job depends on what you’re trying to build. For example:

  • Perhaps your research aims to analyse the use of words and what they reveal about the intention of the storyteller and the cultural context of the time.
  • Perhaps your research aims to develop an understanding of the unique personal experiences of people that have experienced a certain event, or
  • Perhaps your research aims to develop insight regarding the influence of a certain culture on its members.

As you can probably see, each of these research aims is distinctly different, and therefore a different analysis method would be suitable for each one. For example, narrative analysis would likely be a good option for the first aim, while grounded theory wouldn’t be as relevant.

It’s also important to remember that each method has its own set of strengths, weaknesses and general limitations. No single analysis method is perfect. So, depending on the nature of your research, it may make sense to adopt more than one method (this is called triangulation). Keep in mind though that this will of course be quite time-consuming.

As we’ve seen, all of the qualitative analysis methods we’ve discussed make use of coding and theme-generating techniques, but the intent and approach of each analysis method differ quite substantially. So, it’s very important to come into your research with a clear intention before you decide which analysis method (or methods) to use.

Start by reviewing your research aims, objectives and research questions to assess what exactly you’re trying to find out – then select a qualitative analysis method that fits. Never pick a method just because you like it or have experience using it – your analysis method (or methods) must align with your broader research aims and objectives.


Let’s recap on QDA methods…

In this post, we looked at six popular qualitative data analysis methods:

  • First, we looked at content analysis , a straightforward method that blends a little bit of quant into a primarily qualitative analysis.
  • Then we looked at narrative analysis , which is about analysing how stories are told.
  • Next up was discourse analysis – which is about analysing conversations and interactions.
  • Then we moved on to thematic analysis – which is about identifying themes and patterns.
  • From there, we explored grounded theory – which is about starting with a specific question and using the data alone to build a theory in response to that question.
  • And finally, we looked at IPA – which is about understanding people’s unique experiences of a phenomenon.

Of course, these aren’t the only options when it comes to qualitative data analysis, but they’re a great starting point if you’re dipping your toes into qualitative research for the first time.

If you’re still feeling a bit confused, consider our private coaching service, where we hold your hand through the research process to help you develop your best work.





What Is Qualitative Research? | Methods & Examples

Published on June 19, 2020 by Pritha Bhandari. Revised on September 5, 2024.

Qualitative research involves collecting and analyzing non-numerical data (e.g., text, video, or audio) to understand concepts, opinions, or experiences. It can be used to gather in-depth insights into a problem or generate new ideas for research.

Qualitative research is the opposite of quantitative research, which involves collecting and analyzing numerical data for statistical analysis.

Qualitative research is commonly used in the humanities and social sciences, in subjects such as anthropology, sociology, education, health sciences, and history. Some examples of qualitative research questions:

  • How does social media shape body image in teenagers?
  • How do children and adults interpret healthy eating in the UK?
  • What factors influence employee retention in a large organization?
  • How is anxiety experienced around the world?
  • How can teachers integrate social issues into science curriculums?

Table of contents

  • Approaches to qualitative research
  • Qualitative research methods
  • Qualitative data analysis
  • Advantages of qualitative research
  • Disadvantages of qualitative research
  • Other interesting articles
  • Frequently asked questions about qualitative research

Qualitative research is used to understand how people experience the world. While there are many approaches to qualitative research, they tend to be flexible and focus on retaining rich meaning when interpreting data.

Common approaches include grounded theory, ethnography , action research , phenomenological research, and narrative research. They share some similarities, but emphasize different aims and perspectives.

Qualitative research approaches

  • Grounded theory: Researchers collect rich data on a topic of interest and develop theories.
  • Ethnography: Researchers immerse themselves in groups or organizations to understand their cultures.
  • Action research: Researchers and participants collaboratively link theory to practice to drive social change.
  • Phenomenological research: Researchers investigate a phenomenon or event by describing and interpreting participants’ lived experiences.
  • Narrative research: Researchers examine how stories are told to understand how participants perceive and make sense of their experiences.

Note that qualitative research is at risk for certain research biases including the Hawthorne effect , observer bias , recall bias , and social desirability bias . While not always totally avoidable, awareness of potential biases as you collect and analyze your data can prevent them from impacting your work too much.


Each of the research approaches involves using one or more data collection methods. These are some of the most common qualitative methods:

  • Observations: recording what you have seen, heard, or encountered in detailed field notes.
  • Interviews: personally asking people questions in one-on-one conversations.
  • Focus groups: asking questions and generating discussion among a group of people.
  • Surveys: distributing questionnaires with open-ended questions.
  • Secondary research: collecting existing data in the form of texts, images, audio or video recordings, etc.

For example, if you were researching a company’s culture, you might combine several of these methods:

  • You take field notes with observations and reflect on your own experiences of the company culture.
  • You distribute open-ended surveys to employees across all the company’s offices by email to find out if the culture varies across locations.
  • You conduct in-depth interviews with employees in your office to learn about their experiences and perspectives in greater detail.

Qualitative researchers often consider themselves “instruments” in research because all observations, interpretations and analyses are filtered through their own personal lens.

For this reason, when writing up your methodology for qualitative research, it’s important to reflect on your approach and to thoroughly explain the choices you made in collecting and analyzing the data.

Qualitative data can take the form of texts, photos, videos and audio. For example, you might be working with interview transcripts, survey responses, fieldnotes, or recordings from natural settings.

Most types of qualitative data analysis share the same five steps:

  • Prepare and organize your data. This may mean transcribing interviews or typing up fieldnotes.
  • Review and explore your data. Examine the data for patterns or repeated ideas that emerge.
  • Develop a data coding system. Based on your initial ideas, establish a set of codes that you can apply to categorize your data.
  • Assign codes to the data. For example, in qualitative survey analysis, this may mean going through each participant’s responses and tagging them with codes in a spreadsheet. As you go through your data, you can create new codes to add to your system if necessary.
  • Identify recurring themes. Link codes together into cohesive, overarching themes.
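Steps 3–5 can be sketched in a few lines of Python. The survey responses and codebook below are invented for illustration, and the keyword matching merely stands in for the researcher’s interpretive judgement when assigning codes; note how unmatched responses are flagged so the coding system can grow, as step 4 describes.

```python
# Steps 1-2 (prepare, review) are assumed done: responses are already transcribed.
# All responses, codes and keywords here are invented for illustration.
responses = [
    "I stay because my manager really supports my development.",
    "Flexible hours let me balance work and family life.",
    "My manager and my team make me feel supported.",
    "The cafeteria food is great.",
]

# Step 3: develop an initial coding system (codebook)
codebook = {
    "manager support": ["manager", "support"],
    "flexibility": ["flexible", "hours"],
}

# Step 4: assign codes; flag unmatched responses so new codes can be added
tagged = []
for response in responses:
    text = response.lower()
    codes = [code for code, kws in codebook.items()
             if any(kw in text for kw in kws)]
    tagged.append({"response": response, "codes": codes or ["UNCODED"]})

# Step 5: link related codes into an overarching theme and summarise
themes = {"supportive workplace": {"manager support", "flexibility"}}
for theme, members in themes.items():
    n = sum(1 for row in tagged if members & set(row["codes"]))
    print(f"{theme}: mentioned in {n} of {len(tagged)} responses")
```

The fourth response matches no existing code, so it surfaces as “UNCODED” – the cue to return to step 3 and extend the codebook.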

There are several specific approaches to analyzing qualitative data. Although these methods share similar processes, they emphasize different concepts.

Qualitative data analysis approaches

  • Content analysis: to describe and categorize common words, phrases, and ideas in qualitative data. For example, a market researcher could perform content analysis to find out what kind of language is used in descriptions of therapeutic apps.
  • Thematic analysis: to identify and interpret patterns and themes in qualitative data. For example, a psychologist could apply thematic analysis to travel blogs to explore how tourism shapes self-identity.
  • Textual analysis: to examine the content, structure, and design of texts. For example, a media researcher could use textual analysis to understand how news coverage of celebrities has changed in the past decade.
  • Discourse analysis: to study communication and how language is used to achieve effects in specific contexts. For example, a political scientist could use discourse analysis to study how politicians generate trust in election campaigns.

Qualitative research often tries to preserve the voice and perspective of participants and can be adjusted as new research questions arise. Qualitative research is good for:

  • Flexibility

The data collection and analysis process can be adapted as new ideas or patterns emerge. They are not rigidly decided beforehand.

  • Natural settings

Data collection occurs in real-world contexts or in naturalistic ways.

  • Meaningful insights

Detailed descriptions of people’s experiences, feelings and perceptions can be used in designing, testing or improving systems or products.

  • Generation of new ideas

Open-ended responses mean that researchers can uncover novel problems or opportunities that they wouldn’t have thought of otherwise.

Researchers must consider practical and theoretical limitations in analyzing and interpreting their data. Qualitative research suffers from:

  • Unreliability

The real-world setting often makes qualitative research unreliable because of uncontrolled factors that affect the data.

  • Subjectivity

Due to the researcher’s primary role in analyzing and interpreting data, qualitative research cannot be replicated. The researcher decides what is important and what is irrelevant in data analysis, so interpretations of the same data can vary greatly.

  • Limited generalizability

Small samples are often used to gather detailed data about specific contexts. Despite rigorous analysis procedures, it is difficult to draw generalizable conclusions because the data may be biased and unrepresentative of the wider population.

  • Labor-intensive

Although software can be used to manage and record large amounts of text, data analysis often has to be checked or performed manually.

If you want to know more about statistics, methodology, or research bias, make sure to check out some of our other articles with explanations and examples.

  • Chi square goodness of fit test
  • Degrees of freedom
  • Null hypothesis
  • Discourse analysis
  • Control groups
  • Mixed methods research
  • Non-probability sampling
  • Quantitative research
  • Inclusion and exclusion criteria

Research bias

  • Rosenthal effect
  • Implicit bias
  • Cognitive bias
  • Selection bias
  • Negativity bias
  • Status quo bias

Quantitative research deals with numbers and statistics, while qualitative research deals with words and meanings.

Quantitative methods allow you to systematically measure variables and test hypotheses. Qualitative methods allow you to explore concepts and experiences in more detail.

There are five common approaches to qualitative research :

  • Grounded theory involves collecting data in order to develop new theories.
  • Ethnography involves immersing yourself in a group or organization to understand its culture.
  • Narrative research involves interpreting stories to understand how people make sense of their experiences and perceptions.
  • Phenomenological research involves investigating phenomena through people’s lived experiences.
  • Action research links theory and practice in several cycles to drive innovative changes.

Data collection is the systematic process by which observations or measurements are gathered in research. It is used in many different contexts by academics, governments, businesses, and other organizations.

There are various approaches to qualitative data analysis , but they all share five steps in common:

  • Prepare and organize your data.
  • Review and explore your data.
  • Develop a data coding system.
  • Assign codes to the data.
  • Identify recurring themes.
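The five steps above can be sketched as a minimal data pipeline. This is only an illustration of the workflow's shape: the interview excerpts and the keyword-based "coding system" are hypothetical, and real qualitative coding is interpretive rather than mechanical keyword matching.

```python
from collections import defaultdict

# Steps 1-2: prepare and review the data (hypothetical interview excerpts).
segments = [
    "I felt nobody at the clinic listened to me",
    "The nurses explained everything patiently",
    "I was left waiting with no information",
]

# Step 3: a simple coding system (illustrative only).
coding_system = {
    "listened": "not_heard",
    "waiting": "not_heard",
    "explained": "supported",
    "patiently": "supported",
}

# Step 4: assign codes to the segments they appear in.
coded = defaultdict(set)
for i, seg in enumerate(segments):
    for keyword, code in coding_system.items():
        if keyword in seg:
            coded[code].add(i)

# Step 5: codes recurring across two or more segments hint at themes.
recurring = sorted(code for code, segs in coded.items() if len(segs) >= 2)
print(recurring)  # ['not_heard']
```

In practice the researcher, not a lookup table, decides what each segment means; the value of a structure like this is only in keeping track of which codes attach to which parts of the data.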

The specifics of each step depend on the focus of the analysis. Some common approaches include textual analysis , thematic analysis , and discourse analysis .


Bhandari, P. (2024, September 05). What Is Qualitative Research? | Methods & Examples. Scribbr. Retrieved September 9, 2024, from https://www.scribbr.com/methodology/qualitative-research/


Pritha Bhandari




Data Analysis in Qualitative Research

Ravindran, Vinitha 1,

1 College of Nursing, CMC, Vellore, Tamil Nadu, India

Address for correspondence: Dr. Vinitha Ravindran, College of Nursing, CMC, Vellore, Tamil Nadu, India. E-Mail: [email protected]

This is an open access journal, and articles are distributed under the terms of the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 License, which allows others to remix, tweak, and build upon the work non-commercially, as long as appropriate credit is given and the new creations are licensed under the identical terms.

Data analysis in qualitative research is an iterative and complex process. The focus of analysis is to bring out the tacit meanings that people attach to their actions and responses related to a phenomenon. Although qualitative data analysis software packages are available, the researcher is the primary instrument, attempting to bring out these meanings through deep engagement with the data and with the individuals who share their stories. Although different qualitative methods suggest different approaches, the basic steps of content analysis, which include preparing the data, reading and reflecting, coding, categorising and developing themes, are integral to all of them. The analysis process moves the researcher from describing the phenomenon to conceptualising and abstracting themes without losing the voice of the participants, which the findings must continue to represent.

INTRODUCTION

Qualitative data analysis appears simple to those who have limited knowledge of the qualitative research approach, but for the seasoned qualitative researcher it is one of the most difficult tasks. According to Thorne,[ 1 ] it is the complex and elusive part of the qualitative research process. Several challenges inherent in the research approach make the analysis process demanding. The first challenge is converting the data from visual or auditory recordings into textual data. As the qualitative approach generates data through the sharing of experiences, it becomes fundamentally necessary to record data rather than to write down accounts as the stories are shared. Essential data that may become apparent or be uncovered when reflecting on audiotaped interviews may be missed or overlooked if interviews are not recorded.[ 2 ] Although field notes are written, they usually augment the experiences conveyed by participants rather than serving as the primary data source. The researcher therefore needs to spend effort and time recording data as well as transcribing it into texts that can be analysed.

The second challenge is managing the sheer quantity of textual data. One hour of interview may produce 20–40 pages of text. Even with the small number of participants typical of qualitative research, the researcher may have many pages of data that need coding and analysis. Although software packages such as NVivo and Atlas-ti are available, they only help to organise, sort and categorise data; they do not give meaning to the text.[ 3 ] The researcher has to read, reflect, compare and analyse the data, and the categories and themes have to be brought forth by the researcher. The third challenge is carrying out data generation and data analysis at the same time. Concurrent data generation and analysis is a predominant feature of qualitative research, in which an iterative or cyclic method of data collection and analysis is emphasised. This means that as the researcher collects data, the analysis process is also initiated; the researcher does not wait until data collection is complete to begin analysis.[ 2 ] The iterative process helps the researcher focus on emerging concepts and categories in subsequent interviews and observations. It also enables the researcher to address gaps in the data and to gather information to saturate those gaps in subsequent contacts with earlier or new research participants. Sufficient time and resources are needed to sustain the iterative process throughout the research.

The above challenges are mentioned at the beginning of this article not to discourage researchers but to emphasise the complexity of data analysis, which has to be seriously considered by anyone interested in doing qualitative research. In addition to these general challenges, data analysis in qualitative research also varies between approaches and designs. There is also a flexibility and fluidity that allows the researcher to choose different approaches to analysis, either one specific approach or a combination of approaches.[ 4 ] The framework for the analysis should, however, be made explicit at the outset. In qualitative research, the researcher is a bricoleur (a weaver of stories) who is creating a bricolage.[ 5 ]

CHARACTERISTICS OF DATA ANALYSIS

In qualitative data analysis:

  • The researcher attempts to understand the meaning behind the actions and behaviours of participants
  • The researcher becomes the instrument to generate data and ask analytical questions
  • Emphasis is given to the quality and depth of narration about a phenomenon rather than the number of study participants
  • The context and a holistic view of the participants' experience are stressed
  • The researcher is sensitive to the influence he/she has on the interpretation of data
  • Analytical themes are presented as findings rather than quantified variables.

Process of data analysis

Qualitative data analysis can be both deductive and inductive. The deductive process, in which there is an attempt to establish causal relationships, although associated with quantitative research, can also be applied in qualitative research as a deductive explanatory process or deductive category application.[ 6 ] When the researcher's interest is in specific aspects of the phenomenon, and the research question is focused rather than general, a deductive approach to analysis may be used. For example, in a study by Manoranjitham et al.,[ 7 ] focus group discussions were conducted to identify perceptions of suicide in terms of causes, methods of attempting suicide, impact of suicide and availability of support as perceived by family and community members and health-care professionals. Focused questions were asked to elicit information on what people thought about these aspects of suicide. The answers from participants in the focus groups were coded under each question, which was treated as a category, and the responses and the number of responders were elaborated under each question as perceived by the participants. The deductive process in qualitative data analysis keeps the researcher at a descriptive level, where the results stay close to the participants' accounts, rather than moving to a more interpretive or conceptual level. This process is often used when qualitative research forms part of a mixed methods approach or of a larger research study.

In contrast, the inductive process, which is the hallmark of qualitative data analysis, involves asking questions of the in-depth and vast data that have been generated from different sources regarding a phenomenon.[ 2 , 4 ] The inductive process is applicable to all qualitative research in which the research question is more explorative and overarching, aiming to understand the phenomenon in people's lives. For example, in Rempel et al.'s[ 8 ] study on parenting children with life-threatening congenital heart disease, the researchers explored the process of parenting children with a lethal heart condition. Volumes of data generated through individual interviews with parents and grandparents were inductively analysed to understand the 'facets of parenting' children with heart disease. Inductive analysis encourages researchers to rise above describing what participants say about their experience to interpretive conceptualisation and abstraction. The processes of deduction and induction in qualitative data analysis are depicted in Figure 1.

[Figure 1: The processes of deduction and induction in qualitative data analysis]

GENERAL STEPS IN DATA ANALYSIS

Although different analytical processes are proposed by different researchers, there are generally four basic steps in qualitative data analysis. These steps are similar to what is generally known as qualitative content analysis.[ 4 , 9 ] In any qualitative approach, the analysis starts with the steps of content analysis, which generally ends at an interpretive-descriptive level. Further analysis to raise the data to abstraction may be needed in some approaches, such as grounded theory.

  • Preparation of data
  • Reading and reflecting
  • Coding, categorising and memoing
  • Developing themes/conceptual models or theory.

Preparation of data

As already discussed, the inductive process in qualitative research begins when data collection starts. Each recorded data set from individual interviews, focus groups or conversations should first be transcribed and edited. The researcher may decide on units of data to be analysed, which further helps in organising.[ 10 ] A unit can be the whole interview from one individual, the interview transcripts from one family, or the data from different individuals connected within a case (as in a case study). On some occasions, the unit may consist of all answers to one question or one aspect of the phenomenon. Many researchers do not form such units at the beginning of the analysis, which is also acceptable. The essential aspect of preparation is to ensure that participants' accounts are truly represented in the transcription. Researchers who have a large amount of content will need assistance with transcription: one hour of interview may take 4–6 h to transcribe,[ 2 ] and a professional transcriber will do a better job than a researcher who may spend a long time transcribing volumes of data. However, the researcher has to edit the transcription by listening to the audiotaped version and adding words and connotations that were missed, to maintain accuracy.[ 11 ] Another important point is to transcribe and prepare the data as soon as interviews are completed; this facilitates the iterative process of data collection and analysis. All data, including field notes, should be organised with date, time and an identification number or pseudonym for easy retrieval.[ 2 ] Assigning numbers or pseudonyms helps to maintain the confidentiality of the participants.
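The organisation described above can be sketched as a small data store. This is a minimal illustration, not part of the study: the pseudonyms, dates and helper functions are all hypothetical.

```python
import datetime

# A flat list standing in for the researcher's transcript store.
transcripts = []

def add_transcript(pseudonym, interviewed_on, text):
    """Register an edited transcript under a pseudonym, never a real name."""
    transcripts.append({
        "pseudonym": pseudonym,
        "date": interviewed_on,
        "text": text,
    })

def get_transcript(pseudonym):
    """Retrieve a transcript by pseudonym for iterative analysis."""
    return next(t for t in transcripts if t["pseudonym"] == pseudonym)

add_transcript("Family-01", datetime.date(2010, 6, 20), "edited transcript text")
add_transcript("Family-02", datetime.date(2010, 6, 24), "edited transcript text")

print(get_transcript("Family-02")["date"].isoformat())  # 2010-06-24
```

Storing date and pseudonym with every transcript keeps retrieval easy while protecting participant confidentiality.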

Reading the data as a whole and reflecting on what the participants are sharing gives an initial understanding of the narrative. Reflection may start at the time of the interview itself; however, reading and rereading the transcribed text gives an understanding of the context, situations, events and actions related to the phenomenon of interest before the data are analysed for concepts and themes.[ 12 ] Reading and reflection help the researcher to become immersed in the data, understand the perspectives of participants and decide on an analytical framework for further analysis.[ 13 ] As the texts are read, the researcher may jot down points or questions that are striking or unusual, or that do or do not support assumptions. Such reflective notes assist the researcher in deciding on questions to ask in further interviews, or in looking for similarities or differences in interview texts from other participants. These initial reflections do not complete the analysis; rather, they provide a platform from which the analysis can develop. An example of initial reflections from a study on the home care of children with chronic illness is given below.

Reflections-family 1 interview

'This family has a lot of issues related to home care. Their conversation is a list of complaints about the system and the personnel. Even though it appears that help is being rendered for support of child at home, nothing seems to satisfy the parents. The conversation revolves around how they have not been given their due in terms of material and personnel support rather than about their sick child or the siblings.

After a while, it became tedious for me to read this transcript as I resent the complaints (which I should not do I suppose). I wonder how other families perceive home care.'

The initial reflections also help the researcher to understand his/her position as a researcher and the assumptions he/she brings to the study. They create awareness of one's own professional and personal prejudices, which may influence the interpretation of data.

For the analysis to progress further, the researcher has to decide on an organised way of sorting and categorising the data to come to an understanding of the phenomenon or of the concepts embedded in it. Researchers may choose to analyse only the manifest content, in a descriptive qualitative study, or may move further to look for latent content in an analytical qualitative study.[ 4 ] Manifest content analysis includes looking for specific words or phrases used by the participants and accounting for how many have expressed the same or similar words/phrases in the data; it looks at what is obvious. Latent content analysis, on the other hand, involves coding and categorising to identify patterns and themes that are implicit in the data.
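Manifest content analysis, because it counts what is obvious, is the one step that lends itself to a direct computational sketch. The responses and phrases below are hypothetical, chosen only to show the counting mechanics.

```python
from collections import Counter

# Hypothetical open-ended responses; the phrases counted are illustrative.
responses = [
    "We had no support at home after discharge.",
    "There was no support from the local clinic.",
    "The dressing changes were painful for my child.",
]

# Manifest content analysis: count how many responses contain each phrase.
phrases = ["no support", "painful"]
counts = Counter(
    phrase
    for text in responses
    for phrase in phrases
    if phrase in text.lower()
)
print(counts["no support"], counts["painful"])  # 2 1
```

Latent content analysis cannot be reduced to counting in this way, since the patterns it seeks are implicit and require the researcher's interpretation.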

Coding

Coding is an essential first step in sorting and organising data.[ 4 ] Codes are labels given to phrases, expressions, behaviours, images and sentences as the researcher goes through the data.[ 13 ] They can be 'in vivo' codes or interpretive codes. When participants' exact expressions are themselves used as codes, they are called 'in vivo' codes.[ 14 ] If the researcher interprets the expression or behaviour of the participant depicted in the text, the resulting codes are called interpretive codes. In the grounded theory method, different levels of coding are suggested. The first level, 'open coding', involves sifting through the initial data line by line and creating in vivo or interpretive codes. Questions such as 'What is this person saying or doing?' and 'What is happening here?' help in the initial coding of data. Initial coding may reveal gaps in the data or raise questions,[ 15 , 16 ] which help the researcher to locate the sources from which further data are to be collected. The second level, known as 'focused or selective coding', is used in subsequent interviews. Focused coding involves using the most frequent or most significant earlier codes to sift through large amounts of data. Focused codes are more directed, selective and conceptual and are employed to raise the sorting of data to an analytical level.[ 17 ] The first level of coding can be done manually or by using qualitative software packages. In other types of content analysis, the different levels of coding may not be followed; instead, the researcher engages in interpretive coding as the text is read. In a grounded theory study on parenting children with burn injury, open codes such as scolded, accused, unwanted, guilt, nonsupport, difficult to care, terrible pain, blaming oneself and tired came up as the data were coded [ Table 1 ]. These codes gave the researcher an initial insight into the traumatic experiences that parents undergo when caring for their burn-injured children. As the texts were coded, the researcher attempted to further understand the struggles of the parents in successive interviews with other families.
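The bookkeeping behind coding can be sketched as follows. This is a hypothetical illustration, not the study's data: the excerpts, pseudonyms and the distinction flag are invented to show how in vivo and interpretive codes might be recorded.

```python
# Each coding links a code label to an excerpt of transcript text.
codings = []

def code_segment(transcript_id, excerpt, code, in_vivo=False):
    """Attach an 'in vivo' or interpretive code to a transcript excerpt."""
    codings.append({
        "transcript": transcript_id,
        "excerpt": excerpt,
        "code": code,
        "in_vivo": in_vivo,
    })

# "terrible pain" reuses the speaker's own words, so it is an in vivo code;
# "blaming oneself" is the researcher's interpretive label.
code_segment("Family-01", "they said it was my fault", "blaming oneself")
code_segment("Family-01", "it was terrible pain", "terrible pain", in_vivo=True)
code_segment("Family-02", "no one came to help us", "nonsupport")

codes_for_family_01 = [c["code"] for c in codings if c["transcript"] == "Family-01"]
print(codes_for_family_01)  # ['blaming oneself', 'terrible pain']
```

Recording the excerpt alongside the code preserves the link back to the participant's words, which matters later when categories are checked against the data.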

[Table 1]

Categorising

Categorising involves grouping similar codes together and formulating an understandable set within which related data can be clubbed. A category is 'a collection of similar data sorted into the same place' – the 'what' – developed using content analysis and by developing trajectories and relationships over time.[ 18 ] It is a group of content that shares commonality. Data can generally be categorised when the researcher realises that the same codes, or codes that are relatively similar, are emerging from the data. When categories are developed based on codes, they can still be at a descriptive level or can be at an abstract level.[ 10 ] By developing categories, a conceptual coding structure can be formulated. At this level, there is no need to continue line-by-line coding; instead, the researcher uses the coding structure to sort the data. In other words, the parts of the data that best fit the categories and codes are grouped appropriately from across the data sets. The grouping of data into categories is enabled by comparing and contrasting data from different sources or individuals.[ 19 ] As constant comparison continues,[ 15 ] questions such as 'What is different between the accounts of two families?' and 'What is similar?' help in grouping data into categories. As the researcher compares data, questions such as 'what if' may come up, which propel the researcher to return to participants to learn more, or even to purposively include participants who will answer the question. The data under each category should be read again to ensure that they appropriately represent the category.[ 4 ] Qualitative software packages are very useful for sorting and organising data from this level onwards. Any part of the data that does not fit into any category needs to be newly coded, and the new codes should be added; these emerging codes may later fit into a category or form new categories. All data are thus accounted for during this phase of analysis.
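The sorting mechanics described above can be sketched with a coding structure. The mapping below loosely echoes the burn-study category names from the text, but the data rows and the "does not fit" example are hypothetical.

```python
from collections import defaultdict

# A coding structure: each code is assigned to a category.
code_to_category = {
    "scolded": "facing blame",
    "accused": "facing blame",
    "blaming oneself": "facing blame",
    "terrible pain": "enduring the burn",
    "difficult to care": "enduring the burn",
}

coded_data = [
    ("Family-01", "blaming oneself"),
    ("Family-02", "terrible pain"),
    ("Family-03", "accused"),
    ("Family-03", "lost our savings"),  # fits no category yet
]

# Sort data into categories; anything that does not fit is set aside so it
# can be newly coded, ensuring all data are accounted for.
categories = defaultdict(list)
uncategorised = []
for source, code in coded_data:
    if code in code_to_category:
        categories[code_to_category[code]].append((source, code))
    else:
        uncategorised.append((source, code))

print(len(categories["facing blame"]), uncategorised)
```

The `uncategorised` list mirrors the rule in the text: data that fit no category are not discarded but trigger new codes, which may later form new categories.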

As the analysis and grouping of further data continue, the researcher may rearrange data within categories or come up with subcategories.[ 4 ] The researcher may also go from data to codes, to subcategories, which can then be abstracted into categories.[ 10 ] In the burn study, similar codes that were repeated in many transcripts were grouped together. Grouping these codes helped in developing subcategories such as physical trauma, emotional trauma, self-blame and shame. The subcategories were then grouped to develop meaningful categories such as facing blame and enduring the burn [ Table 2 ]. Creating categories thus assists the researcher in moving from describing the phenomenon to interpretation and abstraction.

[Table 2]

Memoing

Memoing is 'the researcher's record of analysis, thoughts, interpretations, questions and directions for further data collection' (pp 110).[ 20 ] Memos are written elaborations of thoughts on the data, codes and categories.[ 17 ] Simply put, memoing is writing down the reflections and ideas that arise from the data as analysis progresses. As data are coded, the researcher writes down his/her thoughts on the codes and their relationships as they occur. Memo-writing is an ongoing process, and memos lead to abstraction and theorising for the write-up of ideas.[ 15 ] Initial or early memos help in exploring and filling out the initial qualitative codes; they help the researcher to understand what the participants are saying or doing and what is happening. Advanced memos support the emergence of categories and identify the beliefs and assumptions that underpin them. Memos also help in looking at the categories and the data from different vantage points.[ 21 ] One of the early memos from the burn study is given below as an example.

Extensive wound

24 June, 2010, 10 pm – After coding interview texts from three families.

'I am struck by the enormity of a burn injury. I realize that family members cannot do many things for the child at home after discharge of a severely burned child because the injury is so big that even some clinics and doctors who are not familiar with burn care cannot manage care. These children need continuous attention of the health care professionals. They need professional assistance with dressing. They need professional assistance with splints and gadgets, and therapies. The injury is extensive that it is difficult for family members to do many things on their own. It is very hard, very hard for the parents to take up a role of the caregiver for children with burns because it involves large wound which has not healed or is in the process of early healing and the child suffers severe pain. The post burn care is very different from caring for other children with chronic illness or congenital defects which most often does not involve pain. The child's suffering makes it easy for the parents to view them as vulnerable. Yet the parents do their best. They try to follow the Health Care Professionals advice, they try to go for follow-up, but it seems simply not enough. I think the parents are doing all that they can within the context of severe injury, lack of finances, lack of resources in home town, or blame and ridicule from neighbors and others…'

Stopping to memo helps the researcher to reflect on the data, move towards developing themes and models, and lay the groundwork for the later discussion of findings. Memos should include the time, date, place and context in which they were written.

Developing themes, conceptual models and theory

Developing themes involves 'threading together the underlying meaning' that runs through all the categories. It is the interpretation of the latent content in the texts.[ 10 ] Theming involves integrating all the categories and explicating the relationships among them.[ 4 ] In coding and categorising, the researcher deconstructs or divides the data to understand feelings, behaviours and actions. In the theming phase, the researcher tries to reconnect the deconstructed parts by understanding the implicit meaning that links the behaviours, actions and reactions related to a phenomenon. To identify a theme, the grounded theorist asks: what is the core issue with which the participants are dealing? The phenomenologist asks about the central essence or structure of the lived experience related to the phenomenon of interest. The ethnographer may look at the cultural themes that link the categories. The researcher generally comes up with one to three themes;[ 4 ] too many categories or themes may indicate that the analysis was closed prematurely and that the researcher needs to further interpret and conceptualise the data.[ 4 ] In the study on parenting children with burn injury, the researcher came up with the theme of 'Double Trauma', which explicated the experiences of parents living the burn with their children while also enduring blame, within the context of both the hospital and the home [ Table 2 ].

In phenomenology and ethnography, the analysis may end with identifying themes. In other approaches, such as grounded theory and interpretive description, the analysis may progress further to developing a theory or conceptual model. Identifying the core category/variable from the coding activity, memos and constant comparisons is the first step in moving towards theory development in grounded theory.[ 15 ] The core category is the main theme that the researcher identifies in the data. The next step in grounded theory is to identify the basic social process (BSP). The BSP evolves from understanding how participants are dealing with the core issue: in real-world situations, individuals develop their own strategies and processes to deal with the core issue, and identifying this process is the stepping stone to theory development. In the burn study, the theme 'Double Trauma' was the core category, and parenting involved a dual process of 'embracing the survival' and 'enduring the blame'.[ 22 , 23 ] A conceptual model was developed based on these processes.

PITFALLS IN QUALITATIVE ANALYSIS

Large data sets for analysis

As already explained, the amount of textual data and field notes from observations and other sources in qualitative research can become overwhelming if data analysis is not initiated concurrently with data collection/generation. Coding a large body of text is tedious and takes much of the researcher's time. Postponing analysis to the end of data collection also prevents the researcher from becoming focused in subsequent interviews and from filling gaps in the data during further collection. Deferring data analysis should therefore be avoided.

Premature closure

The researcher should not hasten to conclude the analysis upon developing categories or themes. Doing so may lead to 'premature closure' of the research and the danger that the participants' experiences are misunderstood or incompletely understood.[ 15 ] Qualitative data analysis involves in-depth interaction with the data and an understanding of the nuances in the experiences and the meanings behind actions. The researcher continues to generate data until all the categories are saturated, which means that the categories are mutually exclusive and can be explained from all aspects or angles.[ 21 ] In the burn study, although the table in this article appears simple, the codes and categories were developed from larger data sets representing multiple participant interviews and field notes. The category 'facing blame' was brought forth through parents' accounts of experiencing blame, in almost all the families, in one or multiple ways: from family members, health-care professionals, strangers and the child itself. The researcher needs to be reflexive and to carry out data generation and analysis iteratively until no new information is forthcoming from the data. Drawing conclusions too soon, otherwise known as making 'inferential leaps', will prevent the researcher from getting the whole picture of the phenomenon.[ 2 ]

Interpretation of meanings

During the analysis process, as the researcher interprets and conceptualises the participants' experiences, he/she delves into the tacit meanings of the actions and feelings expressed by participants or observed in various situations. The researcher endeavours to keep the interpretations as close to the participants' accounts as possible. It should be understood, however, that the meanings are co-constructed by the participant and the researcher in a collaborative effort, which is a hallmark of qualitative research.[ 2 ] In this process of co-construction, the researcher should be cautious not to lose the voice of the participants. Discussion with peers at every step of the analysis, or checks on codes and categories by others in the research team, may help to avoid this problem.
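One simple way such a peer check on codes can be quantified is raw percent agreement between two coders over the same segments. This sketch is illustrative only: the code assignments are hypothetical, and published studies often prefer chance-corrected statistics such as Cohen's kappa over raw agreement.

```python
# Codes assigned by two coders to the same four transcript segments.
coder_a = ["facing blame", "enduring the burn", "facing blame", "nonsupport"]
coder_b = ["facing blame", "enduring the burn", "nonsupport", "nonsupport"]

# Percent agreement: the share of segments where the coders chose the same code.
agreements = sum(a == b for a, b in zip(coder_a, coder_b))
percent_agreement = agreements / len(coder_a)
print(percent_agreement)  # 0.75
```

Disagreements found this way are not failures; they flag the segments the team should discuss so that the final interpretation stays close to the participants' accounts.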

Qualitative data analysis is a complex process that demands a great deal of reading, thinking and reflection on the part of the researcher. It is time-consuming, as the researcher has to be constantly engaged with the texts to tease out their hidden meanings. Despite the differences between qualitative methods, coding, categorising and developing themes are essential phases of data analysis in most of them. Researchers should avoid premature conclusions and ensure that the findings are comprehensively represented by participants' accounts. Qualitative data analysis is an iterative process.

Financial support and sponsorship

Conflicts of interest

There are no conflicts of interest.


Keywords: Categories; codes; data analysis; qualitative research; theme



The Ultimate Guide to Qualitative Research - Part 2: Handling Qualitative Data



What is qualitative data analysis?


Analyzing qualitative data is the next step after you have collected data using qualitative data collection methods. The qualitative analysis process aims to identify themes and patterns that emerge across the data.


In simplified terms, qualitative research methods involve non-numerical data collection followed by an explanation based on the attributes of the data. For example, if you are asked to explain in qualitative terms a thermal image displayed in multiple colors, you would explain the color differences rather than the heat's numerical value. If you have a large amount of data (e.g., from group discussions or observations of real-life situations), the next step is to transcribe and prepare the raw data for subsequent analysis.

Researchers can conduct studies fully based on qualitative methodology, or researchers can preface a quantitative research study with a qualitative study to identify issues that were not originally envisioned but are important to the study. Quantitative researchers may also collect and analyze qualitative data following their quantitative analyses to better understand the meanings behind their statistical results.

Conducting qualitative research can especially help build an understanding of how and why certain outcomes were achieved (in addition to what was achieved). For example, qualitative data analysis is often used for policy and program evaluation research since it can answer certain important questions more efficiently and effectively than quantitative approaches.


Qualitative data analysis can also answer important questions about the relevance, unintended effects, and impact of programs, such as:

  • Were expectations reasonable?
  • Did processes operate as expected?
  • Were key players able to carry out their duties?
  • Were there any unintended effects of the program?

The importance of qualitative data analysis

Qualitative approaches have the advantage of allowing for more diversity in responses and the capacity to adapt to new developments or issues during the research process itself. While qualitative data analysis can be demanding and time-consuming to conduct, many fields of research utilize qualitative software tools that have been specifically developed to provide more succinct, cost-efficient, and timely results.


Qualitative data analysis is an important part of research and building greater understanding across fields for a number of reasons. First, cases for qualitative data analysis can be selected purposefully according to whether they typify certain characteristics or contextual locations. In other words, qualitative data permits deep immersion into a topic, phenomenon, or area of interest. Rather than seeking generalizability to the population the sample of participants represents, qualitative research aims to construct an in-depth and nuanced understanding of the research topic.

Secondly, the role or position of the researcher in qualitative data analysis is given greater critical attention. This is because, in qualitative data analysis, the possibility of the researcher taking a 'neutral' or transcendent position is seen as more problematic in practical and/or philosophical terms. Hence, qualitative researchers are often exhorted to reflect on their role in the research process and make this clear in the analysis.


Thirdly, while qualitative data analysis can take a wide variety of forms, it largely differs from quantitative research in the focus on language, signs, experiences, and meaning. In addition, qualitative approaches to analysis are often holistic and contextual rather than analyzing the data in a piecemeal fashion or removing the data from its context. Qualitative approaches thus allow researchers to explore inquiries from directions that could not be accessed with only numerical quantitative data.

Establishing research rigor

Systematic and transparent approaches to the analysis of qualitative data are essential for rigor. For example, many qualitative research methods require researchers to carefully code data and discern and document themes in a consistent and credible way.


Perhaps the most traditional division in the way qualitative and quantitative research have been used in the social sciences is for qualitative methods to be used for exploratory purposes (e.g., to generate new theory or propositions) or to explain puzzling quantitative results, while quantitative methods are used to test hypotheses .


After you’ve collected relevant data, what is the best way to look at it? As always, it will depend on your research question. For instance, if you employed an observational research method to learn about a group’s shared practices, an ethnographic approach could be appropriate to explain the various dimensions of culture. If you collected textual data to understand how people talk about something, then a discourse analysis approach might help you generate key insights about language and communication.


The qualitative data coding process involves iterative categorization and recategorization, ensuring the evolution of the analysis to best represent the data. The procedure typically concludes with the interpretation of patterns and trends identified through the coding process.

To start off, let’s look at two broad approaches to data analysis.

Deductive analysis

Deductive analysis is guided by pre-existing theories or ideas. It starts with a theoretical framework, which is then used to code the data. The researcher can thus use this theoretical framework to interpret their data and answer their research question.

The key steps include coding the data based on the predetermined concepts or categories and using the theory to guide the interpretation of patterns among the codings. Deductive analysis is particularly useful when researchers aim to verify or extend an existing theory within a new context.

Inductive analysis

Inductive analysis involves the generation of new theories or ideas based on the data. The process starts without any preconceived theories or codes, and patterns, themes, and categories emerge out of the data.


The researcher codes the data to capture any concepts or patterns that seem interesting or important to the research question. These codes are then compared and linked, leading to the formation of broader categories or themes. The main goal of inductive analysis is to allow the data to 'speak for itself' rather than imposing pre-existing expectations or ideas onto the data.

Deductive and inductive approaches can be seen as sitting on opposite poles, and all research falls somewhere within that spectrum. Most often, qualitative data analysis approaches blend both deductive and inductive elements to contribute to the existing conversation around a topic while remaining open to potential unexpected findings. To help you make informed decisions about which qualitative data analysis approach fits with your research objectives, let's look at some of the common approaches for qualitative data analysis.

Content analysis

Content analysis is a research method used to identify patterns and themes within qualitative data. This approach involves systematically coding and categorizing specific aspects of the content in the data to uncover trends and patterns. An often important part of content analysis is quantifying frequencies and patterns of words or characteristics present in the data.

It is a highly flexible technique that can be adapted to various data types, including text, images, and audiovisual content. While content analysis can be exploratory in nature, it is also common to use pre-established theories and follow a more deductive approach to categorizing and quantifying the qualitative data.
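The quantifying step of content analysis can be sketched in a few lines of code. The coding frame below (categories and their indicator words) is purely hypothetical, for illustration only; a real study would derive its frame from theory or from the data itself.

```python
import re
from collections import Counter

# Hypothetical coding frame mapping each category to indicator words
# (categories and word lists are assumptions for illustration).
coding_frame = {
    "price":   {"price", "cost", "expensive", "cheap"},
    "quality": {"quality", "durable", "broke", "reliable"},
    "support": {"support", "help", "staff", "service"},
}

def code_responses(responses, frame):
    """Count indicator-word hits per category across all responses."""
    counts = Counter({category: 0 for category in frame})
    for response in responses:
        words = set(re.findall(r"[a-z']+", response.lower()))
        for category, indicators in frame.items():
            counts[category] += len(words & indicators)
    return counts

responses = [
    "The price was fair but the quality was poor",
    "Great support staff, though a bit expensive",
]
print(code_responses(responses, coding_frame))
```

The output shows how often each category surfaces in the data, which is exactly the kind of frequency table a deductive content analysis reports.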


Thematic analysis

Thematic analysis is a method used to identify, analyze, and report patterns or themes within the data. This approach moves beyond counting explicit words or phrases and also focuses on identifying implicit concepts and themes within the data.


Researchers conduct detailed coding of the data to ascertain repeated themes or patterns of meaning. Codes can be categorized into themes, and the researcher can analyze how the themes relate to one another. Thematic analysis is flexible in terms of the research framework, allowing for both inductive (data-driven) and deductive (theory-driven) approaches. The outcome is a rich, detailed, and complex account of the data.

Grounded theory

Grounded theory is a systematic qualitative research methodology that is used to inductively generate theory that is 'grounded' in the data itself. Analysis takes place simultaneously with data collection, and researchers iterate between data collection and analysis until a comprehensive theory is developed.

Grounded theory is characterized by simultaneous data collection and analysis, the development of theoretical codes from the data, purposeful sampling of participants, and the constant comparison of data with emerging categories and concepts. The ultimate goal is to create a theoretical explanation that fits the data and answers the research question.

Discourse analysis

Discourse analysis is a qualitative research approach that emphasizes the role of language in social contexts. It involves examining communication and language use beyond the level of the sentence, considering larger units of language such as texts or conversations.


Discourse analysts typically investigate how social meanings and understandings are constructed in different contexts, emphasizing the connection between language and power. It can be applied to texts of all kinds, including interviews, documents, case studies, and social media posts.

Phenomenological research

Phenomenological research focuses on exploring how human beings make sense of an experience and delves into the essence of this experience. It strives to understand people's perceptions, perspectives, and understandings of a particular situation or phenomenon.


It involves in-depth engagement with participants, often through interviews or conversations, to explore their lived experiences. The goal is to derive detailed descriptions of the essence of the experience and to interpret what insights or implications this may bear on our understanding of this phenomenon.


Whatever your data analysis approach, start with ATLAS.ti

Qualitative data analysis done quickly and intuitively with ATLAS.ti. Download a free trial today.

Now that we've summarized the major approaches to data analysis, let's look at the broader process of research and data analysis. Suppose you need to do some research to find answers to any kind of research question, be it an academic inquiry, business problem, or policy decision. In that case, you need to collect some data. There are many methods of collecting data: you can collect primary data yourself by conducting interviews, focus groups, or a survey, for instance. Another option is to use secondary data sources. These are data previously collected for other projects, historical records, reports, statistics – basically everything that exists already and can be relevant to your research.


The data you collect should always be a good fit for your research question. For example, if you are interested in how many people in your target population like your brand compared to others, it is no use to conduct interviews or a few focus groups. The sample will be too small to get a representative picture of the population. If your questions are about "how many…", "what is the spread…" etc., you need to conduct quantitative research. If you are interested in why people like different brands, their motives, and their experiences, then conducting qualitative research can provide you with the answers you are looking for.

Let's describe the important steps involved in conducting research.

Step 1: Planning the research

As the saying goes: "Garbage in, garbage out." Suppose you find out after you have collected data that

  • you talked to the wrong people
  • asked the wrong questions
  • a couple of focus group sessions would have yielded better results because of the group interaction, or
  • a survey including a few open-ended questions sent to a larger group of people would have been sufficient and required less effort.

Think thoroughly about sampling, the questions you will be asking, and in which form. If you conduct a focus group or an interview, you are the research instrument, and your data collection will only be as good as you are. If you have never done it before, seek some training and practice. If you have other people do it, make sure they have the skills.


Step 2: Preparing the data

When you conduct focus groups or interviews, think about how to transcribe them. Do you want to run them online or offline? If online, check out which tools can serve your needs, both in terms of functionality and cost. For any audio or video recordings, you can consider using automatic transcription software or services. Automatically generated transcripts can save you time and money, but they still need to be checked. If you don't do this yourself, make sure that you instruct the person doing it on how to prepare the data.

  • How should the final transcript be formatted for later analysis?
  • Which names and locations should be anonymized?
  • What kind of speaker IDs to use?

What about survey data? Some survey programs will immediately provide basic descriptive-level analysis of the responses. ATLAS.ti will support you with the analysis of the open-ended questions. For this, you need to export your data as an Excel file. ATLAS.ti's survey import wizard will guide you through the process.

Other kinds of data such as images, videos, audio recordings, text, and more can be imported to ATLAS.ti. You can organize all your data into groups and write comments on each source of data to maintain a systematic organization and documentation of your data.


Step 3: Exploratory data analysis

You can run a few simple exploratory analyses to get to know your data. For instance, you can create a word list or word cloud of all your text data or compare and contrast the words in different documents. You can also let ATLAS.ti find relevant concepts for you. There are many tools available that can automatically code your text data, so you can also use these codings to explore your data and refine your coding.
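Comparing and contrasting the words in different documents, as described above, amounts to building a word list per document and looking at the differences. A rough sketch of the idea (the two example documents are invented for illustration):

```python
import re
from collections import Counter

def word_counts(text):
    """Build a simple word-frequency list for one document."""
    return Counter(re.findall(r"[a-z']+", text.lower()))

# Invented example documents standing in for two interview transcripts
doc_a = "Customers praised the helpful support team and fast replies"
doc_b = "Customers complained about slow replies and unhelpful support"

counts_a, counts_b = word_counts(doc_a), word_counts(doc_b)

# Vocabulary unique to each document highlights how the two sources differ
only_in_a = set(counts_a) - set(counts_b)
only_in_b = set(counts_b) - set(counts_a)
print(sorted(only_in_a))
print(sorted(only_in_b))
```

Dedicated tools do this with far more polish (stopword removal, lemmatization, word clouds), but the underlying comparison is the same.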


For instance, you can get a feeling for the sentiments expressed in the data. Who is more optimistic, pessimistic, or neutral in their responses? ATLAS.ti can auto-code the positive, negative, and neutral sentiments in your data. Naturally, you can also simply browse through your data and highlight relevant segments that catch your attention or attach codes to begin condensing the data.
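The auto-coding of sentiment mentioned above relies on sophisticated models, but the basic idea can be sketched with a tiny hand-made lexicon. The word lists below are assumptions purely for illustration; real sentiment tools use far larger vocabularies and context-aware models.

```python
# Minimal lexicon-based sentiment tagger; the word lists are illustrative only.
POSITIVE = {"good", "great", "love", "helpful", "optimistic"}
NEGATIVE = {"bad", "poor", "hate", "frustrating", "pessimistic"}

def sentiment(text):
    """Tag a response as positive, negative, or neutral by counting lexicon hits."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("The new process was great and very helpful"))
print(sentiment("A frustrating and poor experience"))
```

Even this toy version shows why sentiment codes are a useful first pass: they let you sort respondents into optimistic, pessimistic, and neutral groups before reading in depth.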


Step 4: Build a code system

Whether you start with auto-coding or manual coding, after having generated some first codes, you need to get some order in your code system to develop a cohesive understanding. You can build your code system by sorting codes into groups and creating categories and subcodes. As this process requires reading and re-reading your data, you will become very familiar with your data. Counting on a tool like ATLAS.ti qualitative data analysis software will support you in the process and make it easier to review your data, modify codings if necessary, change code labels, and write operational definitions to explain what each code means.


Step 5: Query your coded data and write up the analysis

Once you have coded your data, it is time to take the analysis a step further. When using software for qualitative data analysis, it is easy to compare and contrast subsets in your data, such as groups of participants or sets of themes.


For instance, you can query the various opinions of female vs. male respondents. Is there a difference between consumers from rural or urban areas or among different age groups or educational levels? Which codes occur together throughout the data set? Are there relationships between various concepts, and if so, why?
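Outside dedicated software, this kind of subgroup query is a simple cross-tabulation of coded segments. A sketch with pandas, where the coded segments and group labels are hypothetical example data:

```python
import pandas as pd

# Hypothetical coded segments: one row per coded quotation (illustrative data).
segments = pd.DataFrame({
    "respondent_group": ["female", "female", "male", "male", "male"],
    "code": ["price concern", "trust", "price concern", "price concern", "trust"],
})

# Cross-tabulate code frequency by group to compare subsets of the data.
table = pd.crosstab(segments["code"], segments["respondent_group"])
print(table)
```

The resulting table answers questions like "do male and female respondents raise price concerns at different rates?" at a glance; the same approach works for rural vs. urban, age groups, or education levels.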

Step 6: Data visualization

Data visualization brings your data to life. It is a powerful way of seeing patterns and relationships in your data. For instance, diagrams allow you to see how your codes are distributed across documents or specific subpopulations in your data.


Exploring coded data on a canvas, moving around code labels in a virtual space, linking codes and other elements of your data set, and thinking about how they are related and why – all of these will advance your analysis and spur further insights. Visuals are also great for communicating results to others.

Step 7: Data presentation

The final step is to summarize the analysis in a written report . You can now put together the memos you have written about the various topics, select some salient quotes that illustrate your writing, and add visuals such as tables and diagrams. If you follow the steps above, you will already have all the building blocks, and you just have to put them together in a report or presentation.

When preparing a report or a presentation, keep your audience in mind. Does your audience better understand numbers than long sections of detailed interpretations? If so, add more tables, charts, and short supportive data quotes to your report or presentation. If your audience loves a good interpretation, add your full-length memos and walk your audience through your conceptual networks and illustrative data quotes.


Qualitative data analysis begins with ATLAS.ti

For tools that can make the most out of your data, check out ATLAS.ti with a free trial.

Analyst Answers

Data & Finance for Work & Life


Data Analysis for Qualitative Research: 6 Step Guide

Data analysis for qualitative research is not intuitive. This is because qualitative data stands in opposition to traditional data analysis methodologies: while data analysis is concerned with quantities, qualitative data is by definition unquantified. But there is an easy, methodical approach that anyone can use to get reliable results when performing data analysis for qualitative research. The process consists of 6 steps that I’ll break down in this article:

  • Perform interviews (if necessary)
  • Gather all documents and transcribe any non-paper records
  • Decide whether to code analytical data, analyze word frequencies, or both
  • Decide what interpretive angle you want to take: content analysis, narrative analysis, discourse analysis, framework analysis, and/or grounded theory
  • Compile your data in a spreadsheet using document saving techniques (Windows and Mac)
  • Identify trends in words, themes, metaphors, natural patterns, and more

To complete these steps, you will need:

  • Microsoft Word
  • Microsoft Excel
  • Internet access

You can get the free Intro to Data Analysis eBook to cover the fundamentals and ensure strong progression in all your data endeavors.

What is qualitative research?

Qualitative research is not the same as quantitative research. In short, qualitative research is the interpretation of non-numeric data. It usually aims at drawing conclusions that explain why a phenomenon occurs, rather than simply showing that it occurs. Here’s a great quote from a nursing magazine about quantitative vs. qualitative research:

“A traditional quantitative study… uses a predetermined (and auditable) set of steps to confirm or refute [a] hypothesis. In contrast, qualitative research often takes the position that an interpretive understanding is only possible by way of uncovering or deconstructing the meanings of a phenomenon. Thus, a distinction between explaining how something operates (explanation) and why it operates in the manner that it does (interpretation) may be [an] effective way to distinguish quantitative from qualitative analytic processes involved in any particular study.” (bold added) (EBN)

Learn to Interpret Your Qualitative Data

This article explains what data analysis is and how to do it. To learn how to interpret the results, visualize, and write an insightful report, sign up for our handbook below.


Step 1a: Data collection methods and techniques in qualitative research: interviews and focus groups

Step 1 is collecting the data that you will need for the analysis. If you are not performing any interviews or focus groups to gather data, then you can skip this step. It’s for people who need to go into the field and collect raw information as part of their qualitative analysis.

Since the whole point of an interview, and of qualitative analysis in general, is to understand a research question better, you should start by making sure you have a specific, refined research question. Whether you’re a researcher by trade or a data analyst working on a one-time project, you must know specifically what you want to understand in order to get results.

Good research questions are specific enough to guide action but open enough to leave room for insight and growth. Examples of good research questions include:

  • Good : To what degree does living in a city impact the quality of a person’s life? (open-ended, complex)
  • Bad : Does living in a city impact the quality of a person’s life? (closed, simple)

Once you understand the research question, you need to develop a list of interview questions. These questions should likewise be open-ended and provide liberty of expression to the responder. They should support the research question in an active way without prejudicing the response. Examples of good interview questions include:

  • Good : Tell me what it’s like to live in a city versus in the country. (open, not leading)
  • Bad : Don’t you prefer the city to the country because there are more people? (closed, leading)

Some additional helpful tips include:

  • Begin each interview with a neutral question to get the person relaxed
  • Limit each question to a single idea
  • If you don’t understand, ask for clarity
  • Do not pass any judgements
  • Do not spend more than 15 minutes on an interview, lest the quality of responses drop

Focus groups

The alternative to interviews is focus groups. Focus groups are a great way for you to get an idea for how people communicate their opinions in a group setting, rather than a one-on-one setting as in interviews.

In short, focus groups are gatherings of small groups of people from representative backgrounds who receive instruction, or “facilitation,” from a focus group leader. Typically, the leader will ask questions to stimulate conversation, reformulate questions to bring the discussion back to focus, and prevent the discussion from turning sour or giving way to bad faith.

Focus group questions should be open-ended like their interview neighbors, and they should stimulate some degree of disagreement. Disagreement often leads to valuable information about differing opinions, as people tend to say what they mean if contradicted.

However, focus group leaders must be careful not to let disagreements escalate, as anger can make people lie to be hurtful or simply to win an argument. And lies are not helpful in data analysis for qualitative research.

Step 1b: Tools for qualitative data collection

When it comes to data analysis for qualitative analysis, the tools you use to collect data should align to some degree with the tools you will use to analyze the data.

As mentioned in the intro, you will be focusing on analysis techniques that only require the traditional Microsoft suite programs: Microsoft Excel and Microsoft Word. At the same time, you can source supplementary tools from various websites, like Text Analyzer and WordCounter.

In short, the tools for qualitative data collection that you need are Excel and Word, as well as web-based free tools like Text Analyzer and WordCounter. These online tools are helpful in the quantitative part of your qualitative research.

Step 2: Gather all documents & transcribe non-written docs

Once you have your interviews and/or focus group transcripts, it’s time to decide if you need other documentation. If you do, you’ll need to gather it all into one place first, then develop a strategy for how to transcribe any non-written documents.

When do you need documentation other than interviews and focus groups? Two situations usually call for documentation. First , if you have little funding , then you can’t afford to run expensive interviews and focus groups.

Second , social science researchers typically focus on documents since their research questions are less concerned with subject-oriented data, while hard science and business researchers typically focus on interviews and focus groups because they want to know what people think, and they want to know today.

Non-written records

Other factors at play include the type of research, the field, and the specific research goal. For those who need documentation, including non-written records, there are some steps to follow:

  • Put all hard copy source documents into a sealed binder (I use plastic paper holders with elastic seals).
  • If you are sourcing directly from printed books or journals, then you will need to digitize them by scanning them and making the text readable by the computer. To do so, turn all PDFs into Word documents using online tools such as PDF to Word Converter. This process is never foolproof, and it may be a source of error in the data collection, but it’s part of the process.
  • If you are sourcing online documents, try as often as possible to get computer-readable PDF documents that you can easily copy/paste or convert. Locked PDFs are essentially a lost cause.
  • Transcribe any audio files into written documents. There are free online tools available to help with this, such as 360converter. If you run a test through the system, you’ll see that the output is not 100%. The best way to use this tool is as a first-draft generator. You can then correct and complete it with old-fashioned, direct transcription.

Step 3: Decide on the type of qualitative research

Before step 3 you should have collected your data, transcribed it all into written-word documents, and compiled it in one place. Now comes the interesting part. You need to decide what you want to get out of your research by choosing an analytic angle, or type of qualitative research.

The available types of qualitative research are as follows. Each of them takes a unique angle that you must choose to get what information you want from the analysis . In addition, each of them has a different impact on the data analysis for qualitative research (coding vs word frequency) that we use.

  • Content analysis
  • Narrative analysis
  • Discourse analysis
  • Framework analysis, and/or
  • Grounded theory

From a high level, content, narrative, and discourse analysis are actionable independent tactics, whereas framework analysis and grounded theory are ways of honing and applying the first three.

Content analysis

  • Definition: Content analysis is identifying and labelling themes of any kind within a text.
  • Focus: Identifying any kind of pattern in written text, transcribed audio, or transcribed video. This could be thematic, word repetition, or idea repetition. Most often, the patterns we find are ideas that make up an argument.
  • Goal: To simplify, standardize, and quickly reference ideas from any given text. Content analysis is a way to pull the main ideas from huge documents for comparison. In this way, it’s more a means to an end.
  • Pros: The huge advantage of content analysis is that you can quickly process huge amounts of text using the simple coding and word frequency techniques we will look at below. To use a metaphor, it is to qualitative source documents what SparkNotes are to books.
  • Cons: The downside to content analysis is that it’s quite general. If you have a very specific, narrative research question, then tracing “any and all ideas” will not be very helpful to you.
Narrative analysis

  • Definition: Narrative analysis is the reformulation and simplification of interview answers or documentation into small narrative components to identify story-like patterns.
  • Focus: Understanding the text based on its narrative components as opposed to themes or other qualities.
  • Goal: To reference the text from an angle closer to the nature of texts in order to obtain further insights.
  • Pros: Narrative analysis is very useful for getting perspective on a topic on which you’re extremely limited. It can be easy to get tunnel vision when you’re digging for themes and ideas from a reason-centric perspective. Turning to a narrative approach will help you stay grounded. More importantly, it helps reveal different kinds of trends.
  • Cons: Narrative analysis adds another layer of subjectivity to the instinctive nature of qualitative research. Many see it as too dependent on the researcher to hold any critical value.
Discourse analysis

  • Definition: Discourse analysis is the textual analysis of naturally occurring speech. Any oral expression must be transcribed before undergoing legitimate discourse analysis.
  • Focus: Understanding ideas and themes through language communicated orally rather than pre-processed on paper.
  • Goal: To obtain insights from an angle outside the traditional content analysis of text.
  • Pros: Provides a considerable advantage in some areas of study for understanding how people communicate an idea, versus the idea itself. For example, discourse analysis is important in political campaigning. People rarely vote for the candidate who most closely corresponds to their beliefs, but rather for the person they like the most.
  • Cons: As with narrative analysis, discourse analysis is more subjective in nature than content analysis, which focuses on ideas and patterns. Some do not consider it rigorous enough to be considered a legitimate subset of qualitative analysis, but these people are few.

Framework analysis

  • Definition: Framework analysis is a kind of qualitative analysis that includes 5 ordered steps: coding, indexing, charting, mapping, and interpreting. In most ways, framework analysis is a synonym for qualitative analysis. The significant difference is the importance it places on the perspective used in the analysis.
  • Focus: Understanding patterns in themes and ideas.
  • Goal: Creating one specific framework for looking at a text.
  • Pros: Framework analysis is helpful when the researcher clearly understands what he/she wants from the project, as it’s a limiting approach. Since each of its steps has defined parameters, framework analysis is very useful for teamwork.
  • Cons: It can lead to tunnel vision.
Grounded theory

  • Definition: The use of content, narrative, and discourse analysis to examine a single case, in the hopes that discoveries from that case will lead to a foundational theory used to examine other like cases.
  • Focus: A vast approach using multiple techniques in order to establish patterns.
  • Goal: To develop a foundational theory.
  • Pros: When successful, grounded theories can revolutionize entire fields of study.
  • Cons: It’s very difficult to establish grounded theories, and there’s an enormous amount of risk involved.

Step 4: Coding, word frequency, or both

Coding in data analysis for qualitative research is the process of writing 2-5 word codes that summarize at least one paragraph of text (not writing computer code). This allows researchers to keep track of and analyze those codes. On the other hand, word frequency is the process of counting the presence and orientation of words within a text, which makes it the quantitative element in qualitative data analysis.

Video example of coding for data analysis in qualitative research

In short, coding in the context of data analysis for qualitative research follows 2 steps (video below):

  • Reading through the text one time
  • Adding 2-5 word summaries each time a significant theme or idea appears

Let’s look at a brief example of how to code for qualitative research in this video:

Click here for a link to the source text. 1

Example of word frequency processing

And word frequency is the process of finding a specific word or identifying the most common words through 3 steps:

  • Decide if you want to find 1 word or identify the most common ones
  • Use Word’s “Replace” function to find a word or phrase
  • Use Text Analyzer to find the most common terms
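Both word-frequency tasks (counting one specific keyword and ranking the most common terms) can also be reproduced in a few lines of Python, which is handy when Word or an online analyzer isn't at hand. The sample text below is invented for illustration, echoing the melanoma/KIT example used later:

```python
import re
from collections import Counter

# Invented sample text standing in for a transcribed document
text = ("Melanoma research increasingly links KIT mutations to treatment response. "
        "KIT testing is therefore recommended before treatment.")

tokens = re.findall(r"[a-zA-Z]+", text.lower())

# Task 1: count one specific keyword (what the Word Replace trick reports)
kit_count = tokens.count("kit")

# Task 2: rank the most common terms (what Text Analyzer reports)
top_terms = Counter(tokens).most_common(3)

print(kit_count)
print(top_terms)
```

A real analysis would usually filter out stopwords ("is", "to", "the") before ranking, but the mechanics are the same.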

Here’s another look at word frequency processing and how to do it. Let’s look at the same example above, but from a quantitative perspective.

Imagine we are already familiar with melanoma and KITs, and we want to analyze the text based on these keywords. One thing we can do is look for these words using the Replace function in Word:

  • Locate the search bar
  • Click replace
  • Type in the word
  • See the total results


Another option is to use an online text analyzer. This approach won’t help us find a specific word, but it will help us discover the top-performing phrases and words. All you need to do is put in a link to a target page or paste in a text. I pasted the abstract from our source text, and what turned up was as expected.
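Both flavors of word frequency, counting one keyword and surfacing the most common terms, can also be done in a few lines of Python instead of in Word or an online tool. This is a minimal sketch using a hypothetical stand-in for the abstract text:

```python
from collections import Counter
import re

# Hypothetical stand-in for the abstract; paste your own transcript here
text = (
    "Melanoma incidence is rising. KIT mutations occur in a subset of "
    "melanoma cases, and KIT inhibitors may benefit melanoma patients."
)

# Lowercase and split into words, dropping punctuation
words = re.findall(r"[a-z]+", text.lower())

# (1) Count one specific keyword, like Word's Replace trick
melanoma_count = words.count("melanoma")  # 3

# (2) Identify the most common terms, like an online text analyzer
stopwords = {"is", "in", "a", "of", "and", "may"}
freq = Counter(w for w in words if w not in stopwords)
top_terms = freq.most_common(2)  # [('melanoma', 3), ('kit', 2)]
```

The stopword list is deliberately tiny here; for real analysis you would use a fuller list so filler words don't dominate the counts.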

Step 5: Compile your data in a spreadsheet

After you have some coded data in the Word document, you need to get it into Excel for analysis. This process requires saving the Word doc with an .htm extension, which turns it into a web page. Once you have the web page, it’s as simple as opening it, scrolling to the bottom, and copying/pasting the comments, or codes, into an Excel document.

You will need to wrangle the data slightly in order to make it readable in Excel.
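If you prefer to skip the .htm round-trip, one alternative is to write the coded pairs straight to a CSV file, which Excel opens natively. A minimal Python sketch, with hypothetical codes:

```python
import csv

# Hypothetical coded data: (paragraph number, code) pairs from the coding pass
codes = [
    (1, "pre-visit anxiety"),
    (2, "poor communication"),
    (3, "financial barrier"),
    (3, "poor communication"),
]

# Write a two-column sheet that Excel (or any spreadsheet app) opens directly
with open("codes.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["paragraph", "code"])
    writer.writerows(codes)
```

One row per code (rather than per paragraph) keeps the sheet easy to sort, filter, and count in the next step.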

Step 6: Identify trends & analyze!

There are thousands of different ways to analyze qualitative data, and in most situations the best technique depends on the information you want to get out of the research.

Nevertheless, there are a few go-to techniques. The most important of these is occurrences : counting the number of times each of our codes appears. In this way, it’s very similar to word frequency (discussed above).

A few other options include:

  • Ranking each code on a set of relevant criteria and clustering
  • Pure cluster analysis
  • Causal analysis
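Counting occurrences, the go-to technique above, reduces to a frequency count over the code column of the spreadsheet. A minimal Python sketch with hypothetical codes:

```python
from collections import Counter

# Hypothetical codes pulled from the spreadsheet's "code" column
codes = [
    "poor communication", "financial barrier", "poor communication",
    "pre-visit anxiety", "poor communication", "financial barrier",
]

# Count occurrences and rank codes from most to least frequent
occurrences = Counter(codes)
ranked = occurrences.most_common()
# [('poor communication', 3), ('financial barrier', 2), ('pre-visit anxiety', 1)]
```

The ranked list is also a natural starting point for the other options: the counts can serve as one of the criteria when ranking or clustering codes.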

We cover different types of analysis like this on the website, so be sure to check out other articles on the home page .

How to analyze qualitative data from an interview

To analyze qualitative data from an interview , follow the same steps used for qualitative data analysis in general:

  • Perform the interviews
  • Transcribe the interviews onto paper
  • Decide whether to code analytical data (open, axial, selective), analyze word frequencies, or both
  • Compile your data in a spreadsheet using document saving techniques (for Windows and Mac)
  • Identify trends in the codes and analyze them

About the Author

Noah is the founder & Editor-in-Chief at AnalystAnswers. He is a transatlantic professional and entrepreneur with 5+ years of corporate finance and data analytics experience, as well as 3+ years in consumer financial products and business software. He started AnalystAnswers to provide aspiring professionals with accessible explanations of otherwise dense finance and data concepts. Noah believes everyone can benefit from an analytical mindset in a growing digital world. When he's not busy at work, Noah likes to explore new European cities, exercise, and spend time with friends and family.


Research-Methodology

Qualitative Data Analysis

Qualitative data refers to non-numeric information such as interview transcripts, notes, video and audio recordings, images and text documents. Qualitative data analysis can be divided into the following five categories:

1. Content analysis . This refers to the process of categorizing verbal or behavioural data to classify, summarize and tabulate the data.

2. Narrative analysis . This method involves the reformulation of stories presented by respondents, taking into account the context of each case and the different experiences of each respondent. In other words, narrative analysis is the revision of primary qualitative data by the researcher.

3. Discourse analysis . A method of analysis of naturally occurring talk and all types of written text.

4. Framework analysis . This is a more advanced method that consists of several stages such as familiarization, identifying a thematic framework, coding, charting, mapping and interpretation.

5. Grounded theory . This method of qualitative data analysis starts with an analysis of a single case to formulate a theory. Then, additional cases are examined to see if they contribute to the theory.

Qualitative data analysis can be conducted through the following three steps:

Step 1: Developing and Applying Codes . Coding can be explained as categorization of data. A ‘code’ can be a word or a short phrase that represents a theme or an idea. All codes need to be assigned meaningful titles. A wide range of non-quantifiable elements such as events, behaviours, activities, meanings etc. can be coded.

There are three types of coding:

  • Open coding . The initial organization of raw data to try to make sense of it.
  • Axial coding . Interconnecting and linking the categories of codes.
  • Selective coding . Formulating the story through connecting the categories.

Coding can be done manually or using qualitative data analysis software such as NVivo, Atlas ti 6.0, HyperRESEARCH 2.8, Max QDA and others.

When coding manually you can use folders, filing cabinets, wallets etc. to gather together materials that are examples of similar themes or analytic ideas. The manual method of coding in qualitative data analysis is rightly considered labour-intensive, time-consuming and outdated.

In computer-based coding, on the other hand, physical files and cabinets are replaced with computer based directories and files. When choosing software for qualitative data analysis you need to consider a wide range of factors such as the type and amount of data you need to analyse, time required to master the software and cost considerations.

Moreover, it is important to get confirmation from your dissertation supervisor prior to application of any specific qualitative data analysis software.

The following table contains examples of research titles, elements to be coded and identification of relevant codes:

Research title: Born or bred: revising The Great Man theory of leadership in the 21st century
Element to be coded: Leadership practice
Codes: Born leaders; Made leaders; Leadership effectiveness

Research title: A study into advantages and disadvantages of various entry strategies to the Chinese market
Element to be coded: Market entry strategies
Codes: Wholly-owned subsidiaries; Joint-ventures; Franchising; Exporting; Licensing

Research title: Impacts of CSR programs and initiatives on brand image: a case study of Coca-Cola Company UK
Element to be coded: Activities, phenomenon
Codes: Philanthropy; Supporting charitable causes; Ethical behaviour; Brand awareness; Brand value

Research title: An investigation into the ways of customer relationship management in the mobile marketing environment
Element to be coded: Tactics
Codes: Viral messages; Customer retention; Popularity of social networking sites

Qualitative data coding

Step 2: Identifying themes, patterns and relationships . Unlike quantitative methods, in qualitative data analysis there are no universally applicable techniques that can be applied to generate findings. The analytical and critical thinking skills of the researcher play a significant role in data analysis in qualitative studies. Therefore, no qualitative study can be repeated to generate the same results.

Nevertheless, there is a set of techniques that you can use to identify common themes, patterns and relationships within responses of sample group members in relation to codes that have been specified in the previous stage.

Specifically, the most popular and effective methods of qualitative data interpretation include the following:

  • Word and phrase repetitions – scanning primary data for words and phrases most commonly used by respondents, as well as words and phrases used with unusual emotion;
  • Primary and secondary data comparisons – comparing the findings of interviews/focus groups/observations/any other qualitative data collection method with the findings of the literature review and discussing differences between them;
  • Search for missing information – discussing which aspects of the issue were not mentioned by respondents, although you expected them to be mentioned;
  • Metaphors and analogues – comparing primary research findings to phenomena from a different area and discussing similarities and differences.

Step 3: Summarizing the data . At this last stage you need to link research findings to hypotheses or to the research aim and objectives. When writing the data analysis chapter, you can use noteworthy quotations from the transcripts in order to highlight major themes within the findings and possible contradictions.

It is important to note that the process of qualitative data analysis described above is general and different types of qualitative studies may require slightly different methods of data analysis.

My e-book, The Ultimate Guide to Writing a Dissertation in Business Studies: a step by step approach contains a detailed, yet simple explanation of qualitative data analysis methods. The e-book explains all stages of the research process starting from the selection of the research area to writing personal reflection. Important elements of dissertations such as research philosophy, research approach, research design, methods of data collection and data analysis are explained in simple words.

John Dudovskiy


Observational Methods and Qualitative Data Analysis

  • Conducting Research
  • Data Collection and Analysis

Qualitative Research Methods in Psychology


After delving into two primary research traditions, ethnographic inquiry and case study, this four-hour course concludes with an examination of qualitative data analysis.

The course begins with observation, a naturalistic qualitative inquiry technique where the researcher witnesses, participates in, and/or experiences the phenomenon under investigation. Developed by 20th-century anthropologists, these methods involve direct engagement in the setting of interest.

We explore ethnography’s potential to uncover cultural phenomena from rituals and traditions of groups like sports teams to understanding different ethnic cultures. The course reviews four ethnographic genres: classical, mainstream, public, and postmodern, each with distinct approaches to fieldwork and data presentation.

Next, the course examines case study methodology, which investigates a phenomenon bounded by time and place. Case studies can focus on individuals, interventions, organizations, or systems. The course highlights the importance of bounding the case to ensure a focused analysis.

The final segment covers qualitative data analysis, introducing coding techniques like initial and focused coding. Grounded theorists use these methods to develop theories grounded in participants’ lived experiences. The course emphasizes the iterative nature of qualitative research, where data collection and analysis occur simultaneously, allowing for constant comparison and refinement of codes.

Throughout the course, students will compare the purpose, focus, data gathering, and analysis methods of ethnographic inquiry and case study. They will also explore the role of the researcher as an instrument of data collection, highlighting the necessary skills, knowledge, and abilities. By the end, students will have a comprehensive understanding of how to apply these research traditions and analytic techniques to their own studies.

Learning objectives

  • Describe the utility of observation, including the researcher’s positionality in the observation space.
  • Identify the distinctive characteristics of ethnography and case study.
  • Explore the kinds of research questions that ethnographic and case study researchers seek to answer.
  • Describe qualitative coding, analysis, and integration techniques.
  • Apply basic qualitative data analysis (QDA) techniques.

This program does not offer CE credit.

Data analysis in qualitative research

Volume 3, Issue 3

  • Sally Thorne , RN, PhD
  • School of Nursing, University of British Columbia Vancouver, British Columbia, Canada

https://doi.org/10.1136/ebn.3.3.68


Unquestionably, data analysis is the most complex and mysterious of all of the phases of a qualitative project, and the one that receives the least thoughtful discussion in the literature. For neophyte nurse researchers, many of the data collection strategies involved in a qualitative project may feel familiar and comfortable. After all, nurses have always based their clinical practice on learning as much as possible about the people they work with, and detecting commonalities and variations among and between them in order to provide individualised care. However, creating a database is not sufficient to conduct a qualitative study. In order to generate findings that transform raw data into new knowledge, a qualitative researcher must engage in active and demanding analytic processes throughout all phases of the research. Understanding these processes is therefore an important aspect not only of doing qualitative research, but also of reading, understanding, and interpreting it.

For readers of qualitative studies, the language of analysis can be confusing. It is sometimes difficult to know what the researchers actually did during this phase and to understand how their findings evolved out of the data that were collected or constructed. Furthermore, in describing their processes, some authors use language that accentuates this sense of mystery and magic. For example, they may claim that their conceptual categories “emerged” from the data 1 —almost as if they left the raw data out overnight and awoke to find that the data analysis fairies had organised the data into a coherent new structure that explained everything! In this EBN notebook, I will try to help readers make sense of some of the assertions that are made about qualitative data analysis so that they can develop a critical eye for when an analytical claim is convincing and when it is not.

Qualitative data

Qualitative data come …


J Prev Med Public Health. v.56(2); 2023 Mar. PMC10111102

Qualitative Research in Healthcare: Data Analysis

1 Department of Preventive Medicine, Ulsan University Hospital, University of Ulsan College of Medicine, Ulsan, Korea

2 Ulsan Metropolitan City Public Health Policy’s Institute, Ulsan, Korea

Hyeran Jung

3 Department of Preventive Medicine, University of Ulsan College of Medicine, Seoul, Korea

Qualitative research methodology has been applied with increasing frequency in various fields, including in healthcare research, where quantitative research methodology has traditionally dominated, with an empirically driven approach involving statistical analysis. Drawing upon artifacts and verbal data collected from in-depth interviews or participatory observations, qualitative research examines the comprehensive experiences of research participants who have experienced salient yet unappreciated phenomena. In this study, we review 6 representative qualitative research methodologies in terms of their characteristics and analysis methods: consensual qualitative research, phenomenological research, qualitative case study, grounded theory, photovoice, and content analysis. We mainly focus on specific aspects of data analysis and the description of results, while also providing a brief overview of each methodology’s philosophical background. Furthermore, since quantitative researchers have criticized qualitative research methodology for its perceived lack of validity, we examine various validation methods of qualitative research. This review article intends to assist researchers in employing an ideal qualitative research methodology and in reviewing and evaluating qualitative research with proper standards and criteria.

INTRODUCTION

Researchers should select the research methodology best suited for their study. Quantitative research, which is based on empiricism and positivism, has long been the mainstream research methodology in most scientific fields. In recent years, however, increasing attempts have been made to use qualitative research methodology in various research fields, either combined with quantitative research methodology or as a stand-alone research method. Unlike quantitative research, which performs statistical analyses using the results derived in numerical form through investigations or experiments, qualitative research uses various qualitative analysis methods based on verbal data obtained through participatory observations or in-depth interviews. Qualitative research is advantageous when researching topics that involve research participants’ in-depth experiences and perceptions, topics that are important but have not yet drawn sufficient attention, and topics that should be reviewed from a new perspective.

However, qualitative research remains relatively rare in healthcare research, with quantitative research still predominating as the mainstream research practice [ 1 ]. Consequently, there is a lack of understanding of qualitative research, its characteristics, and its procedures in healthcare research. The low level of awareness of qualitative research can lead to the denigration of its results. Therefore, it is essential not only for researchers conducting qualitative research to have a correct understanding of various qualitative research methods, but also for peer researchers who review research proposals, reports, and papers to properly understand the procedures and advantages/disadvantages of qualitative research.

In our previous review paper, we explored the characteristics of qualitative research in comparison to quantitative research and its usefulness in healthcare research [ 2 ]. Specifically, we conducted an in-depth review of the general qualitative research process, selection of research topics and problems, selection of theoretical frameworks and methods, literature analysis, and selection of research participants and data collection methods [ 2 ]. This review article is dedicated to data analysis and the description of results, which may be considered the core of qualitative research, in different qualitative research methods in greater detail, along with the criteria for evaluating the validity of qualitative research. This review article is expected to offer insights into selecting and implementing the qualitative research methodology best suited for a given research topic and evaluating the quality of research.

IN-DEPTH REVIEW OF QUALITATIVE RESEARCH METHODS

This section is devoted to the in-depth review of 6 qualitative research methodologies (consensual qualitative research, phenomenological research, qualitative case study, grounded theory, photovoice, and content analysis), focusing on their characteristics and concrete analysis processes. Table 1 summarizes the characteristics of each methodology.

Characteristics and key analytical approaches of each qualitative research methodology

Qualitative research methods | Key analytical approaches | Advantages | Limitations

  • Consensual qualitative research
  • Phenomenological research
  • Qualitative case study
  • Grounded theory
  • Photovoice
  • Qualitative content analysis

Consensual Qualitative Research

Consensual qualitative research (CQR) was developed by Professor Clara Hill of the University of Maryland [ 3 ]. It emphasizes consensus within a research team (or analysis team) to address the problem of low objectivity being likely to occur when conducting qualitative research. This method seeks to maintain scientific rigor by deriving analysis results through team consensus, asserting the importance of ethical issues, trust, and the role of culture. In CQR, researchers are required to verify each conclusion whenever it is drawn by checking it against the original data.

Building a solid research team is the first step in conducting CQR. Most importantly, each team member should have resolute initiative and clear motivations for joining the research team. In general, at least 3 main team members are needed for data analysis, with 1 or 2 advisors (or auditors) reviewing their work. Researchers without experience in CQR should first receive prior education and training on its procedures and then team up with team members experienced in CQR. Furthermore, as is the case with other types of qualitative research, CQR attaches great importance to ensuring the objectivity of research by sharing prejudices, pre-understanding, and expectations of the research topic among the team members.

CQR is performed in 4 sequential steps: the initial stage, intra-case analysis stage, cross-analysis stage, and manuscript writing stage [ 4 ]. First, in the initial stage, the pre-formed team of researchers selects a research topic, performs a literature review, develops an interview guideline, and conducts pilot interviews. Research participants who fit the research topic are recruited using inclusion and exclusion criteria for selecting suitable participants. Then, interviews are conducted according to the interview guideline, recorded, and transcribed. The transcripts are sent to the interviewees for review. During this process, researchers could make slight modifications to explore the research topic better.

Second, in the intra-case analysis stage, domains and subdomains are developed based on the initial interview guideline. The initial domains and subdomains are used to analyze 1 or 2 interviews, and afterward the domains and subdomains are modified to reflect the analysis results. Core ideas are also created through interview analysis and are coded into domains and subdomains. The advisors review the domains, subdomains, and core ideas and provide suggestions for improvement. The remaining interviews are then analyzed according to the revised domains, subdomains, and core ideas.

Third, in the cross-analysis stage, the core ideas from the interview analysis are categorized according to the domains and subdomains. In this process, repeated team discussions are encouraged to revise domains and subdomains and place the core ideas that do not lend themselves well to categorization into a miscellaneous category. The frequency of occurrence of each domain is then calculated for each interview case. In general, a domain is classified as a general category when it appears in all cases, a typical category when it appears in more than half of the cases, and a variant category when it appears in fewer than half of the cases [ 5 ]. However, the criteria for frequency counting may slightly differ from study to study. The advisors should also review the results of the cross-analysis stage, and the main analysis team revises the analysis results based on those comments.
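The general/typical/variant rule above is easy to apply mechanically. A small sketch, assuming the cutoffs as described (all cases, more than half, fewer than half; exactly half is treated as variant here) and hypothetical domain counts:

```python
def cqr_label(count, n_cases):
    """Label a domain by how many interview cases it appears in."""
    if count == n_cases:
        return "general"   # appears in all cases
    if count > n_cases / 2:
        return "typical"   # appears in more than half of the cases
    return "variant"       # appears in half or fewer of the cases

# Hypothetical domain counts across 10 interview cases
domain_counts = {"coping strategies": 10, "family support": 7, "financial strain": 3}
labels = {d: cqr_label(c, 10) for d, c in domain_counts.items()}
# {'coping strategies': 'general', 'family support': 'typical', 'financial strain': 'variant'}
```

As the source notes, the exact frequency cutoffs differ slightly from study to study, so the thresholds in `cqr_label` would be adjusted to match the conventions a given research team adopts.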

Fourth, the intra-case analysis and cross-analysis results are described in the manuscript writing stage. It is essential to present a clear and convincing narrative to the audience [ 5 ], and it is thus recommended to revise and formulate the manuscript based on team discussions and advisor opinions. However, CQR does not guarantee that different research teams would reach similar conclusions, and the CQR research team dynamics strongly affect conflict-resolution issues during the consensus-building process [ 3 ].

As examined above, despite its limitations, the salient feature of CQR is its rigorous process for ensuring the objectivity of analysis results compared to other qualitative research methods. In addition, it is an accessible method for quantitative researchers because it explains the analysis results in terms of the frequency of domain occurrences. CQR can be a suitable research methodology to persuade researchers who are hesitant to accept the results of qualitative research. Although CQR is still rarely used in healthcare research, some studies have applied it to investigate topics of interest [ 6 , 7 ].

Phenomenological Research

Phenomenological research (PR) is, as its name suggests, qualitative research based on the phenomenological principle. The term “phenomenological principle” is based on Husserlian phenomenology, which seeks the essence (inner core) and the meaning of people’s lived experiences [ 8 ]. According to Husserl, it is necessary to go “back to the things themselves” (in German: zurück zu den Sachen selbst ) and accurately explore the essence of experience. Diverse reflective attitudes based on the phenomenological principle are required to understand “ Sachen ” without expectations and prejudices [ 9 ]. Thus, the purpose of PR using Husserl’s phenomenological principle can be understood as an inquiry into the essence of experience.

The process of PR aiming to fulfill this purpose differs among various schools and scholars. The Husserlian, Heideggerian, and Utrecht schools had major impacts on PR [ 10 ]. Representative Husserlian scholars who further developed the PR process include Amedeo Giorgi and Paul Colaizzi. Giorgi, who pioneered the field of phenomenological psychology, collected data through in-depth interviews and divided the analysis process into 4 steps [ 11 ]. Colaizzi, who was one of Giorgi’s students, proposed a more complex process from data collection to analysis [ 12 , 13 ]. Representative Heideggerian scholars are Patricia Benner, who introduced an interpretive phenomenological qualitative research method to the field of nursing on the subject of clinical placement of nursing students but did not fully clarify its specific procedure [ 14 ], and Nancy Diekelmann [ 15 ] and Nancy Diekelmann and David Allen [ 16 ], who emphasized the role of the team in the analysis process and proposed the 7-step method of analysis. Max Van Manen, a Dutch-born Canadian scholar, is a representative Utrecht School scholar who proposed a 6-step data collection and analysis process and emphasized the importance of phenomenological description [ 8 ]. As a scholar with no affiliation with any specific school, Adrian Van Kaam [ 17 ], an existentialist psychologist, developed an experiential PR method using descriptive texts. Despite differences in data collection and analysis processes, the common denominator of these approaches is a fundamentally phenomenological attitude and the goal of exploring the essence of experience.

In general, the process of phenomenological qualitative analysis can be divided into 5 steps based on the phenomenological attitude [ 18 ]: step 1, reading the data repeatedly to get a sense of the whole and gauge the meanings of the data; step 2, categorizing and clustering the data by meaning unit; step 3, writing analytically by meaning unit in a descriptive, reflective, and hermeneutic manner; step 4, deriving essential factors and thematizing while writing; and step 5, deriving the essential experiential structure by identifying the relationships between essential experiential factors. During the entire process, researchers must embrace the attitudes of “reduction” and “imaginative variation.” The term “reduction” reflects the thought of accepting the meaning of experience in the way it manifests itself [ 19 ]. An attitude of phenomenological reduction is required to recover freshness and curiosity about the research object through non-judgment, bracketing, and epoché , which assist to minimize the effects of researchers’ prejudices of research topic during the analysis process. An attitude of imaginative variation is required to diversify the meanings pertaining to data and view them as diametric opposites.

As described above, PR is characterized more by emphasizing the researcher’s constant reflection and interpretation/recording of the experience, seeking to explore its very essence, than by being conducted according to a concrete procedure. Based on these characteristics, PR in healthcare research has been applied to various topics, including research on the meaning of health behaviors such as drinking and smoking in various cultures since the 1970s [ 20 , 21 ], information and education needs of patients with diabetes [ 22 ], pain in cancer patients [ 23 ], and the experiences of healthcare students and professionals in patient safety activities [ 24 , 25 ].

Qualitative Case Study

Although case studies have long been conducted in various academic fields, they began to be recognized as a qualitative research method in the 1980s [ 26 ], with case study publications by researchers such as Merriam [ 27 ], Stake [ 28 ], Yin [ 29 ], and Hays [ 30 ]. Case studies include both quantitative and qualitative strategies and can also be used with other qualitative research methods. In general, a qualitative case study (QCS) is a research method adopted to understand the complexity of a case, derive its meaning, and identify the process of change over time [ 27 ]. To achieve these goals, a QCS collects in-depth data using various information sources from rich contexts and explores one or more bounded systems [ 31 ].

A case, which is the core of a case study, has delimitation [ 28 ], contextuality [ 29 ], specificity [ 30 ], complexity [ 32 ], and newness [ 27 ]. The definition of a case study differs among scholars, but they agree that a case to be studied should have boundaries that distinguish it from other cases. Therefore, a case can be a person, a group, a program, or an event and can also be a single or complex case [ 28 ]. The types of QCSs are classified by the scale of the bounded system and the purpose of case analysis. From the latter perspective, Stake [ 28 ] divided case studies into intrinsic and instrumental case studies.

A QCS is conducted in 5 steps [ 33 ]. Stage 1 is the research design stage, where an overall plan is established for case selection, research question setting, research time and cost allocation, and the report format of research outcomes [ 28 ]. Yin [ 33 ] noted that 4 types of case studies could be designed based on the number of cases (single or multiple cases) and the number of analysis units (holistic design for a single unit or embedded design for multiple units). These types are called single holistic design, single embedded design, multiple holistic design, and multiple embedded design. Stage 2 is the preparation stage for data collection. The skills and qualifications required for the researcher are reviewed, prior training of researchers takes place, a protocol is developed, candidate cases are screened, and a pilot case study is conducted. Stage 3 is data collection. Data are collected from the data sources commonly used in case studies, such as documents, archival records, interviews, direct observations, participatory observations, and physical artifacts [ 33 ]. Other data sources for case studies include films, photos, videotapes, and life history studies [ 34 ]. The data collection period may vary depending on the research topic and the need for additional data collection during the analysis process. Stage 4 is the data analysis stage. The case is described in detail based on the collected data, and the data for concrete topics are analyzed [ 28 ]. With no prescribed method related to data collection and analysis for a case study, a general data analysis procedure is followed, and the choice of analysis method differs among researchers. In a multiple-case study, the meaning of the cases is interpreted by performing intra-case and inter-case analyses. The last stage is the interpretation stage, in which the researcher reports the meaning of the case—that is, the lessons learned from the case [ 35 ].

Compared to other qualitative research methods, QCSs have no prescribed procedure, which may prove challenging in the actual research process. However, when the researcher seeks an in-depth understanding of a bound system clearly distinguished from other cases, a QCS can be an appropriate approach. Based on the characteristics mentioned above, QCSs in healthcare research have been mainly conducted on unique cases or cases that should be known in detail, such as the experience of rare diseases [ 36 ], victims of medical malpractice [ 37 ], complications due to home birth [ 38 ], and post-stroke gender awareness of women of childbearing age [ 39 ].

Grounded Theory

Grounded theory (GT) is a research approach used to establish facts about an unfamiliar social phenomenon or to reach a new understanding of a particular phenomenon [ 40 ]. GT involves the most systematic research process among all qualitative research methods [ 41 ]. Its most salient feature is generating a theory by collecting various data from research subjects and analyzing the relationships between the central phenomenon and each category through an elaborate analysis process. GT is well suited to understanding the social and psychological structure of a specific object or social phenomenon, rather than to testing a framework or hypothesis [ 42 ].

GT was first introduced in 1967 by Strauss and Glaser. Their views subsequently diverged and each scholar separately developed different GT methods. Glaser’s GT focused on the natural emergence of categories and theories based on positivism [ 40 , 43 ]. Strauss, who was influenced by symbolic interactionism and pragmatism, teamed up with Corbin and systematically presented the techniques and procedures of the GT process [ 44 ]. Since then, various GT techniques have been developed [ 45 ]; Charmaz’s GT is based on constructivism [ 43 ].

Researchers using GT should collect data based on theoretical sampling and theoretical saturation. Theoretical sampling refers to selecting additional data using the theoretical concepts encountered in collecting and analyzing data, and theoretical saturation occurs when no new categories are expected to appear [ 40 ]. Researchers must also possess theoretical sensitivity—that is, the ability to react sensitively to the collected data and gain insight into them [ 40 ]. An analysis is performed through the constant comparative method, wherein researchers constantly compare the collected data and discover similarities and differences to understand the relationships between phenomena, concepts, and categories.

Among the different types of GT research designs, the one proposed by Strauss and Corbin is divided into 3 stages. Stage 1 is open coding; the concepts are derived from the data through a line-by-line data analysis, and the initial categorization occurs. Stage 2 is axial coding; the interrelationships among the categories derived from open coding are schematized in line with the structural framework defined as a paradigm. The major components of the paradigm are causal conditions, context, intervening conditions, action/interaction strategies, and consequences. Stage 3 is selective coding; the core category is first derived, the relationships between subcategories and concepts are identified, and the narrative outline is described. Lastly, the process is presented in a visual mode, whereupon a theoretical model is built and integrated. In contrast, Glaser’s analysis method involves theoretical coding that weaves practical concepts into hypotheses or theories instead of axial coding [ 46 ]. Currently, Strauss and Corbin’s GT method is the most widely used one [ 47 ], and given that different terms are used among scholars, it is crucial to accurately understand the meaning of a term in context instead of solely focusing on the term itself [ 48 ].

The most salient feature of GT is that it seeks to generate a new theory from data inductively through its analytical framework. This framework enables an understanding of the interaction experience and its structure [ 40 ]. Furthermore, these characteristics make GT more accessible to quantitative researchers than other qualitative research methods [ 43 ], which has resulted in its broader application in healthcare research. GT has been used to explore a wide range of research topics, such as asthma patients’ experiences of disease management [ 48 ], the experiences of cancer patients or their families [ 49 , 50 ], and the experiences of caregivers of patients with cognitive disorders and dementia [ 51 ].

Photovoice

Photovoice, a research methodology initiated by Wang and Burris [ 52 ], has been used to highlight the experiences and perspectives of marginalized people through photos. In other words, photos and their narratives are at the heart of photovoice; this method is designed to make marginalized voices heard. Because it uses photos to bring to the fore the experiences of participants who have lived a marginalized life, photovoice requires the active engagement of the participants. In other research methods, the participants play an essential role in the data collection stage (interviews and topic-related materials such as diaries and doodles) and the research validation stage (participants’ review). In contrast, in photovoice research, which is classified as participatory action research, participants’ dynamic engagement is essential throughout the study process—from the data collection and analysis procedure to exhibition and policy development [ 53 ].

Specifically, the photovoice research design is as follows [ 54 , 55 ]: First, policymakers or community stakeholders, who will likely bring about practical improvements on the research topic, are recruited. Second, participants with a wealth of experience on a research topic are recruited. In this stage, it should be borne in mind that the drop-out rate is high because participants’ active involvement is required, and the process is relatively time-consuming. Third, the participants are provided with information on the purpose and process of photovoice research, and they are educated on research ethics and the potential risks. Fourth, consent is obtained from the participants for research participation and the use of their photos. Fifth, a brainstorming session is held to create a specific topic within the general research topic. Sixth, researchers select a type of camera and educate the participants on the camera and photo techniques. The characteristics of the camera function (e.g., autofocus and manual focus) should be considered when selecting a camera type (e.g., mobile phone camera, disposable camera, or digital camera). Seventh, participants are given time to take pictures for discussion. Eighth, a discussion is held on the photos provided by the participants. The collected data are managed and analyzed in 3 sub-steps: (1) participants’ photo selection (selecting a photo considered more meaningful or important than other photos); (2) contextualization (analyzing the selected photo and putting the meanings attached to the photo into context); and (3) codifying (categorizing similar photos and meanings among the data collected and summarizing them in writing). In sub-step 2, the “SHOWeD” question skill could be applied to facilitate the discussion [ 56 ]: “What do you See here? What’s really Happening here? How does this relate to Our lives? Why does this situation, concern, or strength Exist? What can we Do about it?” Ninth, the participants’ summarized experiences related to their respective photos are shared and presented. This process is significant because it provides the participants with an opportunity to exhibit their photos and improve the related topics’ conditions. It is recommended that policymakers or community stakeholders join the roundtable to reflect on the outcomes and discuss their potential involvement to improve the related topics.

Based on the characteristics described above, photovoice has been used in healthcare research since the early 2000s to reveal the experiences of marginalized people, such as the lives of Black lesbian, gay, bisexual, transgender and questioning people [ 57 ] and women with acquired immunodeficiency syndrome [ 58 ], and in studies on community health issues, such as the health status of indigenous women living in a remote community [ 59 ], the quality of life of breast cancer survivors living in rural areas [ 60 ], and healthy eating habits of rural youth [ 61 ].

Qualitative Content Analysis

Content analysis is a research method that can use both qualitative and quantitative methods to derive valid inferences from data [ 62 ]. It can use a wide range of data covering a long period and diverse fields [ 63 ]. It helps compare objects, identify a specific person’s characteristics or hidden intentions, or analyze a specific era’s characteristics [ 64 ]. Quantitative content analysis categorizes research data and analyzes the relationships between the derived categories using statistical methods [ 65 ]. In contrast, qualitative content analysis (QCA) uses data coding to identify categories’ extrinsic and intrinsic meanings. The parallelism of these aspects contributes to establishing the validity of conclusions in content analysis [ 63 ].

Historically, mass media, such as newspapers and news programs, served as the driving force behind the development of content analysis. As interest in mass media content dealing with particular events and issues increased, content analysis was increasingly used in research analyzing mass media. In particular, it was also used in various forms to analyze propaganda content during World War II. The subsequent emergence of computer technology led to the revival of various types of content analysis research [ 66 ].

QCA is largely divided into conventional, directed, and summative approaches [ 67 ]. First, conventional content analysis is an inductive method for deriving categories from data without using preconceived categories. Key concepts are derived via the coding process by repeatedly reading and analyzing the data collected through open-ended questions. Categorization is then performed by sorting the coded data while checking similarities and differences. Second, directed content analysis uses key concepts or categories extracted from existing theories or studies as the initial coding categories. Unlike conventional content analysis, directed content analysis is closer to a deductive method and is anchored in a more structured process. Summative content analysis, the third approach, not only counts the frequency of keywords or content, but also evaluates their contextual usage and provides qualitative interpretations. It is used to understand the context of a word, along with the frequency of its occurrence, and thus to find the range of meanings that a word can have.
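The counting side of summative content analysis can be illustrated with a brief keyword-in-context sketch. This is only a hypothetical illustration: the `keyword_in_context` function and the transcript snippets are invented for this example and are not part of any procedure cited above.

```python
import re

def keyword_in_context(texts, keyword, window=3):
    """Count occurrences of `keyword` and collect each occurrence
    with `window` words of surrounding context."""
    contexts = []
    for text in texts:
        words = re.findall(r"\w+", text.lower())
        for i, word in enumerate(words):
            if word == keyword:
                lo, hi = max(0, i - window), i + window + 1
                contexts.append(" ".join(words[lo:hi]))
    return len(contexts), contexts

# Hypothetical interview excerpts
transcripts = [
    "I feel hope when my family visits, and hope keeps me going.",
    "There was little hope after the diagnosis.",
]
count, contexts = keyword_in_context(transcripts, "hope")  # count is 3
```

Here the frequency count answers "how often," while the collected context windows support the qualitative question of in what sense the word is used.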

Since there is no concrete set procedure, the content analysis procedure varies among researchers. Some of the typical processes are a 3-step process (preparation, organizing, reporting) proposed by Elo and Kyngäs [ 68 ], a 4-step process (formulating research questions, sampling, coding, analyzing) presented by White and Marsh [ 69 ], and a 6-step process proposed by Krippendorff [ 66 ].

The 6-step content analysis research process proposed by Krippendorff [ 66 ] is as follows: Step 1, unitizing, is a process in which the researcher selects a scheme for classifying the data of interest for data collection and analysis. Step 2, sampling, involves selecting a conceptually representative sample population. In Step 3, recording/coding, the researcher records materials that are difficult to preserve, such as verbal statements, in a way that allows repeated review. Step 4, reducing, refers to simplifying the data into a manageable format using statistical techniques or summaries. Step 5, abductively inferring, involves inferring a phenomenon in the context of a situation to understand the contextual phenomenon while analyzing the data. In Step 6, narrating, the research outcomes are presented in a narrative accessible to the audience. These 6 steps are not subject to a sequential order and may go through a cyclical or iterative process [ 63 ].

As examined above, content analysis is used in several fields due to its advantages of embracing both qualitative and quantitative aspects and processing comprehensive data [ 62 , 70 ]. In recognition of its research potential, the public health field is also increasingly using content analysis research, as exemplified by suicide-related social media content analysis [ 71 ], an analysis of children’s books in association with breast cancer [ 72 ], and an analysis of patients’ medical records [ 73 ].

VALIDATION OF QUALITATIVE RESEARCH

The validation of qualitative research begins when a researcher attempts to persuade others that the research results are worthy of attention [ 35 ]. Researchers have advanced arguments in many different ways, ranging from the rationale for validity in qualitative research to the assessment terms and their meanings [ 74 ]. We explain the validity of qualitative research, focusing on the argument advanced by Guba and Lincoln [ 75 ]. They emphasized that the evaluation of qualitative research is a socio-political process—namely, a researcher should assume the role of a mediator of the judgment process, not that of the judge [ 75 ]. Specifically, Lincoln and Guba [ 75 ] proposed trustworthiness as a validity criterion, comprising four components: credibility, transferability, dependability, and confirmability.

First, credibility is a concept that corresponds to internal validity in quantitative research. To enhance the credibility of qualitative research, a “member check” is used to directly assess whether the reality of the research participants is well-reflected in the raw data, transcripts, and analysis categories [ 76 , 77 ]. Second, transferability corresponds to external validity or generalizability in quantitative research. To enhance the transferability of qualitative research, researchers must describe the data collection and analysis processes in detail and provide thick data on the overall research process, including research participants and the context and culture of research [ 77 , 78 ]. Transferability can also be enhanced by checking whether the analysis results elicit similar feelings in those who have not participated in the study but share similar experiences. Third, dependability corresponds to reliability in quantitative research and is associated with data stability. To enhance the dependability of qualitative research, it is common for multiple researchers to perform the analysis independently; alternatively, if one researcher has performed the analysis, another researcher reviews the analysis results. Furthermore, a qualitative researcher must provide a detailed and transparent description of the entire research process so that other researchers, internal or external, can evaluate whether the researcher has adequately proceeded with the overall research process. Fourth, confirmability corresponds to objectivity in quantitative research. Bracketing, a process of disclosing and discussing the researcher’s pre-understanding that may affect the research process from the beginning to the end, is conducted to enhance the confirmability of qualitative research. The results of bracketing should be included in the study results so that readers can also track the possible influence [ 77 ].
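When two researchers code the same material independently to support dependability, their codes are sometimes compared quantitatively before the joint review. The sketch below shows one simple such comparison, percent agreement; the function name and the codes are hypothetical, and this statistic is an optional aid rather than a step prescribed by Lincoln and Guba.

```python
def percent_agreement(codes_a, codes_b):
    """Fraction of analysis units to which two coders assigned the same code."""
    if len(codes_a) != len(codes_b):
        raise ValueError("both coders must rate the same units")
    matches = sum(a == b for a, b in zip(codes_a, codes_b))
    return matches / len(codes_a)

# Hypothetical codes assigned by two independent coders to five interview excerpts
coder_1 = ["stigma", "support", "support", "fear", "support"]
coder_2 = ["stigma", "support", "fear", "fear", "support"]
agreement = percent_agreement(coder_1, coder_2)  # 0.8: coders disagree on one unit
```

Low agreement flags categories whose definitions need discussion; it complements, rather than replaces, the joint review of analysis results described above.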

However, regarding the validity of a qualitative study, it is necessary to consider the research topic, the target audience, and research costs. Caution should also be exercised with the proposed criteria, because their presentation varies among scholars and researchers. Apart from the methods discussed above, other methods are used to enhance the validity of qualitative research, such as prolonged involvement, persistent observation, triangulation, and peer debriefing. In prolonged involvement, a researcher depicts the core of a phenomenon while staying at the study site long enough to build rapport with the participants and ask a sufficient number of questions. In persistent observation, a researcher repeatedly reviews and observes data resources until the factors closest to the research topic are identified, giving depth to the study. Triangulation is used to check whether the same results are drawn by a team of researchers who conduct a study using various resources, including individual interviews, talks, and field notes, and discuss their respective analysis processes and results. Lastly, in peer debriefing, research results are discussed with colleagues who have not participated in the study from the beginning to the end, but are well-informed about the research topic or phenomenon [ 76 , 78 ].

CONCLUSION

This review article examines the characteristics and analysis processes of 6 different qualitative research methodologies. Additionally, a detailed overview of various validation methods for qualitative research is provided. However, a few limitations should be considered when novice qualitative researchers follow the steps in this article. First, as each qualitative research methodology has extensive and unique research approaches and analysis procedures, it should be kept in mind that the priority of this article was to highlight each methodology’s most exclusive elements, which essentially comprise the core of its identity. Its scope unfortunately does not include the step-by-step details of individual methodologies; for this information, it would be necessary to review the references included in the section dedicated to each methodology. Another limitation is that this article does not concentrate on the direct comparison of the methodologies, which might benefit novice researchers in the process of selecting an adequate methodology for their research topic. Instead, this review article emphasizes the advantages and limitations of each methodology. Nevertheless, this review article is expected to help researchers considering employing qualitative research methodologies in the field of healthcare select an optimal method and conduct a qualitative study properly. It is sincerely hoped that this review article, along with the previous one, will encourage many researchers in the healthcare domain to use qualitative research methodologies.

Ethics Statement

Approval from the institutional review board was not obtained as this study is a review article.

ACKNOWLEDGEMENTS

CONFLICT OF INTEREST

The authors have no conflicts of interest associated with the material presented in this paper.

AUTHOR CONTRIBUTIONS

Conceptualization: Ock M. Literature review: Im D, Pyo J, Lee H, Jung H, Ock M. Funding acquisition: None. Writing – original draft: Im D, Pyo J, Lee H, Jung H, Ock M. Writing – review & editing: Im D, Pyo J, Lee H, Jung H, Ock M.


  • Open access
  • Published: 08 September 2024

Longitudinal analysis of teacher self-efficacy evolution during a STEAM professional development program: a qualitative case study

  • Haozhe Jiang   ORCID: orcid.org/0000-0002-7870-0993 1 ,
  • Ritesh Chugh   ORCID: orcid.org/0000-0003-0061-7206 2 ,
  • Xuesong Zhai   ORCID: orcid.org/0000-0002-4179-7859 1 , 3 ,
  • Ke Wang 4 &
  • Xiaoqin Wang 5 , 6  

Humanities and Social Sciences Communications, volume 11, Article number: 1162 (2024)


Despite the widespread advocacy for the integration of arts and humanities (A&H) into science, technology, engineering, and mathematics (STEM) education on an international scale, teachers face numerous obstacles in practically integrating A&H into STEM teaching (IAT). To tackle the challenges, a comprehensive five-stage framework for teacher professional development programs focussed on IAT has been developed. Through the use of a qualitative case study approach, this study outlines the shifts in a participant teacher’s self-efficacy following their exposure to each stage of the framework. The data obtained from interviews and reflective analyses were analyzed using a seven-stage inductive method. The findings have substantiated the significant impact of a teacher professional development program based on the framework on teacher self-efficacy, evident in both individual performance and student outcomes observed over eighteen months. The evolution of teacher self-efficacy in IAT should be regarded as an open and multi-level system, characterized by interactions with teacher knowledge, skills and other entrenched beliefs. Building on our research findings, an enhanced model of teacher professional learning is proposed. The revised model illustrates that professional learning for STEAM teachers should be conceived as a continuous and sustainable process, characterized by the dynamic interaction among teaching performance, teacher knowledge, and teacher beliefs. The updated model further confirms the inseparable link between teacher learning and student learning within STEAM education. This study contributes to the existing body of literature on teacher self-efficacy, teacher professional learning models and the design of IAT teacher professional development programs.


Introduction

In the past decade, there has been a surge in the advancement and widespread adoption of Science, Technology, Engineering, and Mathematics (STEM) education on a global scale (Jiang et al. 2021 ; Jiang et al. 2022 ; Jiang et al. 2023 ; Jiang et al. 2024a , b ; Zhan et al. 2023 ; Zhan and Niu 2023 ; Zhong et al. 2022 ; Zhong et al. 2024 ). Concurrently, there has been a growing chorus of advocates urging the integration of Arts and Humanities (A&H) into STEM education (e.g., Alkhabra et al. 2023 ; Land 2020 ; Park and Cho 2022 ; Uştu et al. 2021 ; Vaziri and Bradburn 2021 ). STEM education is frequently characterized by its emphasis on logic and analysis; however, it may be perceived as deficient in emotional and intuitive elements (Ozkan and Umdu Topsakal 2021 ). Through the integration of Arts and Humanities (A&H), the resulting STEAM approach has the potential to become more holistic, incorporating both rationality and emotional intelligence (Ozkan and Umdu Topsakal 2021 ). Many studies have confirmed that A&H can help students increase interest and develop their understanding of the contents in STEM fields, and thus, A&H can attract potential underrepresented STEM learners such as female students and minorities (Land 2020 ; Park and Cho 2022 ; Perignat and Katz-Buonincontro 2019 ). Despite the increasing interest in STEAM, the approaches to integrating A&H, which represent fundamentally different disciplines, into STEM are theoretically and practically ambiguous (Jacques et al. 2020 ; Uştu et al. 2021 ). Moreover, studies have indicated that the implementation of STEAM poses significant challenges, with STEM educators encountering difficulties in integrating A&H into their teaching practices (e.g., Boice et al. 2021 ; Duong et al. 2024 ; Herro et al. 2019 ; Jacques et al. 2020 ; Park and Cho 2022 ; Perignat and Katz-Buonincontro 2019 ). Hence, there is a pressing need to provide STEAM teachers with effective professional training.

Motivated by this gap, this study proposes a novel five-stage framework tailored for teacher professional development programs specifically designed to facilitate the integration of A&H into STEM teaching (IAT). Following the establishment of this framework, a series of teacher professional development programs were implemented. To explain the framework, a qualitative case study is employed, focusing on examining a specific teacher professional development program’s impact on a pre-service teacher’s self-efficacy. The case narratives, with a particular focus on the pre-service teacher’s changes in teacher self-efficacy, are organized chronologically, delineating stages before and after each stage of the teacher professional development program. More specifically, meaningful vignettes of the pre-service teacher’s learning and teaching experiences during the teacher professional development program are offered to help understand the five-stage framework. This study contributes to understanding teacher self-efficacy, teacher professional learning models, and the design of IAT teacher professional development programs.

Theoretical background

The conceptualization of STEAM education

STEM education can be interpreted through various lenses (e.g., Jiang et al. 2021 ; English 2016 ). As Li et al. (2020) claimed, on the one hand, STEM education can be defined as individual STEM disciplinary-based education (i.e., science education, technology education, engineering education and mathematics education). On the other hand, STEM education can also be defined as interdisciplinary or cross-disciplinary education where individual STEM disciplines are integrated (Jiang et al. 2021 ; English 2016 ). In this study, we view STEM education as individual disciplinary-based education in science, technology, engineering and mathematics, each considered separately (English 2016 ).

STEAM education emerged as a new pedagogy during the Americans for the Arts-National Policy Roundtable discussion in 2007 (Perignat and Katz-Buonincontro 2019 ). This pedagogy was born out of the necessity to enhance students’ engagement, foster creativity, stimulate innovation, improve problem-solving abilities, and cultivate employability skills such as teamwork, communication and adaptability (Perignat and Katz-Buonincontro 2019 ). In particular, within the framework of STEAM education, the ‘A’ should be viewed as a broad concept that represents arts and humanities (A&H) (Herro and Quigley 2016 ; de la Garza 2021 , Park and Cho 2022 ). This conceptualization emphasizes the need to include humanities subjects alongside arts (Herro and Quigley 2016 ; de la Garza 2021 ; Park and Cho 2022 ). Sanz-Camarero et al. ( 2023 ) listed some important fields of A&H, including physical arts, fine arts, manual arts, sociology, politics, philosophy, history, psychology and so on.

In general, STEM education does not necessarily entail the inclusion of all STEM disciplines collectively (Ozkan and Umdu Topsakal 2021 ), and this principle also applies to STEAM education (Gates 2017 ; Perignat and Katz-Buonincontro 2019 ; Quigley et al. 2017 ; Smith and Paré 2016 ). As an illustration, Smith and Paré ( 2016 ) described a STEAM activity in which pottery (representing A&H) and mathematics were integrated, while other STEAM elements such as science, technology and engineering were not included. In our study, STEAM education is conceptualized as an interdisciplinary approach that involves the integration of one or more components of A&H into one or more STEM school subjects within educational activities (Ozkan and Umdu Topsakal 2021 ; Vaziri and Bradburn 2021 ). Notably, interdisciplinary collaboration entails integrating one or more elements from arts and humanities (A&H) with one or more STEM school subjects, cohesively united by a shared theme while maintaining their distinct identities (Perignat and Katz-Buonincontro 2019 ).

In our teacher professional development programs, we help mathematics, technology, and science pre-service teachers integrate one component of A&H into their disciplinary-based teaching practices. For instance, we help mathematics teachers integrate history (a component of A&H) into mathematics teaching. In other words, in our study, integrating A&H into STEM teaching (IAT) can be defined as integrating one component of A&H into the teaching of one of the STEM school subjects. The components of A&H and the STEM school subject are brought together under a common theme, but each of them remains discrete. Engineering is not taught as an individual subject in the K-12 curriculum in mainland China. Therefore, A&H is not integrated into engineering teaching in our teacher professional development programs.

Self-efficacy and teacher self-efficacy

Self-efficacy was initially introduced by Bandura ( 1977 ) as a key concept within his social cognitive theory. Bandura ( 1997 ) defined self-efficacy as “people’s beliefs about their capabilities to produce designated levels of performance that exercise influence over events that affect their lives” (p. 71). Based on Bandura’s ( 1977 ) theory, Tschannen-Moran et al. ( 1998 ) defined the concept of teacher self-efficacy as “a teacher’s belief in her or his ability to organize and execute the courses of action required to successfully accomplish a specific teaching task in a particular context” (p. 233). Blonder et al. ( 2014 ) pointed out that this definition implicitly included teachers’ judgment of their ability to bring about desired outcomes in terms of students’ engagement and learning. Moreover, OECD ( 2018 ) defined teacher self-efficacy as “the beliefs that teachers have of their ability to enact certain teaching behavior that influences students’ educational outcomes, such as achievement, interest, and motivation” (p. 51). This definition explicitly included two dimensions: teachers’ judgment of the ability related to their teaching performance (i.e., enacting certain teaching behavior) and their influence on student outcomes.

It is argued that teacher self-efficacy should not be regarded as a general or overarching construct (Zee et al. 2017 ; Zee and Koomen 2016 ). Particularly, in the performance-driven context of China, teachers always connect their beliefs in their professional capabilities with the educational outcomes of their students (Liu et al. 2018 ). Therefore, we operationally conceptualize teacher self-efficacy as having two dimensions: self-efficacy in individual performance and student outcomes (see Table 1 ).

Most importantly, given its consistent association with actual teaching performance and student outcomes (Bray-Clark and Bates 2003 ; Kelley et al. 2020 ), teacher self-efficacy is widely regarded as a pivotal indicator of teacher success (Kelley et al. 2020 ). Moreover, the enhancement of teaching self-efficacy reflects the effectiveness of teacher professional development programs (Bray-Clark and Bates 2003 ; Kelley et al. 2020 ; Wong et al. 2022 ; Zhou et al. 2023 ). For instance, Zhou et al. ( 2023 ) claimed that in STEM teacher education, effective teacher professional development programs should bolster teachers’ self-efficacy “in teaching the content in the STEM discipline” (p. 2).

It has been documented that teachers frequently experience diminished confidence and comfort when teaching subject areas beyond their expertise (Kelley et al. 2020 ; Stohlmann et al. 2012 ). This diminished confidence extends to their self-efficacy in implementing interdisciplinary teaching approaches, such as integrated STEM teaching and IAT (Kelley et al. 2020 ). For instance, Geng et al. ( 2019 ) found that STEM teachers in Hong Kong exhibited low levels of self-efficacy, with only 5.53% of teachers rating their overall self-efficacy in implementing STEM education as higher than a score of 4 out of 5. Additionally, Hunter-Doniger and Sydow ( 2016 ) found that teachers may experience apprehension and lack confidence when incorporating A&H elements into the classroom context, particularly within the framework of IAT. Considering the critical importance of teacher self-efficacy in STEM and STEAM education (Kelley et al. 2020 ; Zakariya 2020 ; Zhou et al. 2023 ), it is necessary to explore effective measures, frameworks and teacher professional development programs to help teachers improve their self-efficacy regarding interdisciplinary teaching (e.g., IAT).

Teacher professional learning models

The relationship between teachers’ professional learning and students’ outcomes (such as achievements, skills and attitudes) has been a subject of extensive discussion and research for many years (Clarke and Hollingsworth 2002 ). For instance, Clarke and Hollingsworth ( 2002 ) proposed and validated the Interconnected Model of Professional Growth, which illustrates that teacher professional development is influenced by the interaction among four interconnected domains: the personal domain (teacher knowledge, beliefs and attitudes), the domain of practice (professional experimentation), the domain of consequence (salient outcomes), and the external domain (sources of information, stimulus or support). Sancar et al. ( 2021 ) emphasized that teachers’ professional learning or development never occurs independently. In practice, this process is inherently intertwined with many variables, including student outcomes, in various ways (Sancar et al. 2021 ). However, many current teacher professional development programs exclude real in-class teaching and fail to establish a comprehensive link between teachers’ professional learning and student outcomes (Cai et al. 2020 ; Sancar et al. 2021 ). Sancar et al. ( 2021 ) claimed that exploring the complex relationships between teachers’ professional learning and student outcomes should be grounded in monitoring and evaluating real in-class teaching, rather than relying on teachers’ self-assessment. It is essential to understand these relationships from a holistic perspective within the context of real classroom teaching (Sancar et al. 2021 ). However, as Sancar et al. ( 2021 ) pointed out, such efforts in teacher education are often considered inadequate, and in the field of STEAM education this inadequacy is even more pronounced.

Cai et al. ( 2020 ) proposed a teacher professional learning model where student outcomes are emphasized. This model was developed based on Cai ( 2017 ), Philipp ( 2007 ) and Thompson ( 1992 ). It has also been used and justified in a series of teacher professional development programs (e.g., Calabrese et al. 2024 ; Hwang et al. 2024 ; Marco and Palatnik 2024 ; Örnek and Soylu 2021 ). The model posits that teachers typically increase their knowledge and modify their beliefs through professional teacher learning, subsequently improving their classroom instruction, enhancing teaching performance, and ultimately fostering improved student learning outcomes (Cai et al. 2020 ). Notably, this model can be updated in several aspects. Firstly, prior studies have demonstrated the interplay between teacher knowledge and beliefs (e.g., Basckin et al. 2021 ; Taimalu and Luik 2019 ). This indicates that the increase in teacher knowledge and the change in teacher beliefs may not be parallel; the two processes can be intertwined. Secondly, the Interconnected Model of Professional Growth highlights that the personal domain and the domain of practice are interconnected (Clarke and Hollingsworth 2002 ). Liu et al. ( 2022 ) also confirmed that improvements in classroom instruction may, in turn, influence teacher beliefs. This necessitates a reconsideration of the relationships between classroom instruction, teacher knowledge and teacher beliefs in Cai et al.’s ( 2020 ) model. Thirdly, the Interconnected Model of Professional Growth also exhibits the connections between the domain of consequence and the personal domain (Clarke and Hollingsworth 2002 ). Hence, the improvement of learning outcomes may not signify the end of teacher learning. For instance, students’ learning feedback may be a vital source of teacher self-efficacy (Bandura 1977 ). Therefore, the improvement of student outcomes may, in turn, affect teacher beliefs.
The aforementioned arguments highlight the need for an updated model that integrates Cai et al.’s ( 2020 ) teacher professional learning model with Clarke and Hollingsworth’s ( 2002 ) Interconnected Model of Professional Growth. This integration may provide a holistic view of the teacher’s professional learning process, especially within the complex contexts of STEAM teacher education.

The framework for teacher professional development programs of integrating arts and humanities into STEM teaching

In this section, we present a framework for IAT teacher professional development programs, aiming to address the practical challenges associated with STEAM teaching implementation. Our framework incorporates the five features of effective teacher professional development programs outlined by Archibald et al. ( 2011 ), Cai et al. ( 2020 ), Darling-Hammond et al. ( 2017 ), Desimone and Garet ( 2015 ) and Roth et al. ( 2017 ). These features include: (a) alignment with shared goals (e.g., school, district, and national policies and practice), (b) emphasis on core content and modeling of teaching strategies for the content, (c) collaboration among teachers within a community, (d) adequate opportunities for active learning of new teaching strategies, and (e) embedded follow-up and continuous feedback. It is worth noting that two concepts, namely community of practice and lesson study, have been incorporated into our framework. Below, we delineate how these features are reflected in our framework.

(a) The Chinese government has issued a series of policies to facilitate STEAM education in K-12 schools (Jiang et al. 2021 ; Li and Chiang 2019 ; Lyu et al. 2024 ; Ro et al. 2022 ). The new curriculum standards released in 2022 mandate that all K-12 teachers implement interdisciplinary teaching, including STEAM education. Our framework for teacher professional development programs, which aims to help teachers integrate A&H into STEM teaching, closely aligns with these national policies and practices supporting STEAM education in K-12 schools.

(b) The core content of the framework is IAT. Specifically, as A&H is a broad concept, we divide it into several subcomponents, such as history, culture, and visual and performing arts (e.g., drama). We are implementing a series of teacher professional development programs to help mathematics, technology and science pre-service teachers integrate these subcomponents of A&H into their teaching Footnote 2 . Notably, pre-service teachers often lack teaching experience, making it challenging to master and implement new teaching strategies. Therefore, our framework provides five step-by-step stages designed to help them effectively model the teaching strategies of IAT.

(c) Our framework advocates for collaboration among teachers within a community of practice. Specifically, a community of practice is “a group of people who share an interest in a domain of human endeavor and engage in a process of collective learning that creates bonds between them” (Wenger et al. 2002 , p. 1). A teacher community of practice can be considered a group of teachers “sharing and critically observing their practices in growth-promoting ways” (Näykki et al. 2021 , p. 497). Long et al. ( 2021 ) claimed that in a teacher community of practice, members collaboratively share their teaching experiences and work together to address teaching problems. Our community of practice includes three types of members. (1) Mentors: These are professors and experts with rich experience in helping pre-service teachers practice IAT. (2) Pre-service teachers: Few have teaching experience before the teacher professional development programs. (3) In-service teachers: All in-service teachers are senior teachers with rich teaching experience. All the members work closely together to share and improve their IAT practice. Moreover, our community includes not only mentors and in-service teachers but also pre-service teachers. We encourage pre-service teachers to collaborate with experienced in-service teachers in various ways, such as developing IAT lesson plans, writing IAT case reports and so on. In-service teachers can provide cognitive and emotional support and share their practical knowledge and experience, which may significantly benefit the professional growth of pre-service teachers (Alwafi et al. 2020 ).

(d) Our framework offers pre-service teachers various opportunities to engage in lesson study, allowing them to actively design and implement IAT lessons. Based on the key points of effective lesson study outlined by Akiba et al. ( 2019 ), Ding et al. ( 2024 ), and Takahashi and McDougal ( 2016 ), our lesson study incorporates the following seven features. (1) Study of IAT materials: Pre-service teachers are required to study relevant IAT materials under the guidance of mentors. (2) Collaboration on lesson proposals: Pre-service teachers should collaborate with in-service teachers to develop comprehensive lesson proposals. (3) Observation and data collection: During the lesson, pre-service teachers are required to carefully observe and collect data on student learning and development. (4) Reflection and analysis: Pre-service teachers use the collected data to reflect on the lesson and their teaching effects. (5) Lesson revision and reteaching: If needed, pre-service teachers revise and reteach the lesson based on their reflections and data analysis. (6) Mentor and experienced in-service teacher involvement: Mentors and experienced in-service teachers, as knowledgeable others, are involved throughout the lesson study process. (7) Collaboration on reporting: Pre-service teachers collaborate with in-service teachers to draft reports and disseminate the results of the lesson study. Specifically, recognizing that pre-service teachers often lack teaching experience, we do not require them to complete all the steps of lesson study independently at once. Instead, we guide them through the lesson study process in a step-by-step manner, allowing them to gradually build their IAT skills and confidence. For instance, in Stage 1, pre-service teachers primarily focus on studying IAT materials. In Stage 2, they develop lesson proposals, observe and collect data, and draft reports. However, the implementation of IAT lessons is carried out by in-service teachers. 
This approach prevents pre-service teachers from experiencing failures due to their lack of teaching experience. In Stage 3, pre-service teachers implement, revise, and reteach IAT lessons, experiencing the lesson study process within a simulated environment. In Stage 4, pre-service teachers engage in lesson study in an actual classroom environment. However, their focus is limited to one micro-course during each lesson study session. It is not until the fifth stage that they experience a complete lesson study in an actual classroom environment.

(e) Our teacher professional development programs incorporate assessments specifically designed to evaluate pre-service teachers’ IAT practices. We use formative assessments to measure their understanding and application of IAT strategies. Pre-service teachers receive ongoing and timely feedback from peers, mentors, in-service teachers, and students, which helps them continuously refine their IAT practices throughout the program. Recognizing that pre-service teachers often have limited contact with real students and may not fully understand students’ learning needs, processes and outcomes, our framework requires them to actively collect and analyze student feedback. By doing so, they can make informed improvements to their instructional practice based on student feedback.

After undergoing three rounds of theoretical and practical testing and revision over the past five years, we have successfully finalized the optimization of the framework design (Zhou 2021 ). Throughout each cycle, we collected feedback from both participants and researchers on at least three occasions. Subsequently, we analyzed this feedback and iteratively refined the framework. For example, we enlisted the participation of in-service teachers to enhance the implementation of STEAM teaching, extended practice time through micro-teaching sessions, and introduced a stage of micro-course development within the framework to provide more opportunities for pre-service teachers to engage with real teaching situations. In this process, we continuously improved the coherence between each stage of the framework, ensuring that they mutually complement one another. The five-stage framework is described as follows.

Stage 1 Literature study

Pre-service teachers are provided with a series of reading materials from A&H. On a weekly basis, two pre-service teachers are assigned to present their readings and reflections to the entire group, followed by critical discussions thereafter. Mentors and all pre-service teachers discuss and explore strategies for translating the original A&H materials into viable instructional resources suitable for classroom use. Subsequently, pre-service teachers select topics of personal interest for further study under mentor guidance.

Stage 2 Case learning

Given that pre-service teachers have no teaching experience, collaborative efforts between in-service teachers and pre-service teachers are undertaken to design IAT lesson plans. Subsequently, the in-service teachers implement these plans. Throughout this process, pre-service teachers are afforded opportunities to engage in lesson plan implementation. Figure 1 illustrates the role of pre-service teachers in case learning. In the first step, pre-service teachers read about materials related to A&H, select suitable materials, and report their ideas on IAT lesson design to mentors, in-service teachers, and fellow pre-service teachers.

Figure 1: The role of pre-service teachers in case learning. Note: A&H refers to arts and humanities.

In the second step, they liaise with the in-service teachers responsible for implementing the lesson plan, discussing the integration of A&H into teaching practices. Pre-service teachers then analyze student learning objectives aligned with curriculum standards, collaboratively designing the IAT lesson plan with in-service teachers. Subsequently, pre-service teachers present lesson plans for feedback from mentors and other in-service teachers.

In the third step, pre-service teachers observe the lesson plan’s implementation, gathering and analyzing feedback from students and in-service teachers using an inductive approach (Merriam 1998 ). Feedback includes opinions on the roles and values of A&H, perceptions of the teaching effect, and recommendations for lesson plan implementation and modification. The second and third steps may iterate multiple times to refine the IAT lesson plan. In the fourth step, pre-service teachers consolidate all data, including various versions of teaching instructions, classroom videos, feedback, and discussion notes, composing reflection notes. Finally, pre-service teachers collaborate with in-service teachers to compile the IAT case report and submit it for publication.

Stage 3 Micro-teaching

Figure 2 illustrates the role of pre-service teachers in micro-teaching. Before entering the micro-classrooms Footnote 3 , all the discussions and communications occur within the pre-service teacher group, excluding mentors and in-service teachers. After designing the IAT lesson plan, pre-service teachers take turns implementing 40-min lesson plans in a simulated micro-classroom setting. Within this simulated environment, one pre-service teacher acts as the teacher, while others, including mentors, in-service teachers, and other fellow pre-service teachers, assume the role of students Footnote 4 . Following the simulated teaching, the implementer reviews the video of their session and self-assesses their performance. Subsequently, the implementer receives feedback from other pre-service teachers, mentors, and in-service teachers. Based on this feedback, the implementer revisits steps 2 and 3, revising the lesson plan and conducting the simulated teaching again. This iterative process typically repeats at least three times until the mentors, in-service teachers, and other pre-service teachers are satisfied with the implementation of the revised lesson plan. Finally, pre-service teachers complete reflection notes and submit a summary of their reflections on the micro-teaching experience. Each pre-service teacher is required to choose at least three topics and undergo at least nine simulated teaching sessions.

Figure 2: The role of pre-service teachers in micro-teaching.

Stage 4 Micro-course development

While pre-service teachers may not have the opportunity to execute the whole lesson plans in real classrooms, they can design and create five-minute micro-courses Footnote 5 before class, subsequently presenting these videos to actual students. The process of developing micro-courses closely mirrors that of developing IAT cases in the case learning stage (see Fig. 1 ). However, in Step 3, pre-service teachers assume dual roles, not only as observers of IAT lesson implementation but also as implementers of a five-minute IAT micro-course.

Stage 5 Classroom teaching

Pre-service teachers undertake the implementation of IAT lesson plans independently, a process resembling micro-teaching (see Fig. 2 ). However, pre-service teachers engage with real school students in partner schools Footnote 6 instead of simulated classrooms. Furthermore, they collect feedback not only from the mentors, in-service teachers, and fellow pre-service teachers but also from real students.

To provide our readers with a better understanding of the framework, we provide meaningful vignettes of a pre-service teacher’s learning and teaching experiences in one of the teacher professional development programs based on the framework. In addition, we choose teacher self-efficacy as an indicator to assess the framework’s effectiveness, detailing the pre-service teacher’s changes in teacher self-efficacy.

Research design

Research method

Teacher self-efficacy can be measured both quantitatively and qualitatively (Bandura 1986 , 1997 ; Lee and Bobko 1994 ; Soprano and Yang 2013 ; Unfried et al. 2022 ). However, researchers and theorists in the area of teacher self-efficacy have called for more qualitative and longitudinal studies (Klassen et al. 2011 ). As some critiques stated, most studies were based on correlational and cross-sectional data obtained from self-report surveys, and qualitative studies of teacher efficacy were overwhelmingly neglected (Henson 2002 ; Klassen et al. 2011 ; Tschannen-Moran et al. 1998 ; Xenofontos and Andrews 2020 ). There is an urgent need for more longitudinal studies to shed light on the development of teacher efficacy (Klassen et al. 2011 ; Xenofontos and Andrews 2020 ).

This study utilized a longitudinal qualitative case study methodology to delve deeply into the context (Jiang et al. 2021 ; Corden and Millar 2007 ; Dicks et al. 2023 ; Henderson et al. 2012 ; Matusovich et al. 2010 ; Shirani and Henwood 2011 ), presenting details grounded in real-life situations and analyzing the inner relationships, rather than generalizing findings about the changes in a large group of pre-service teachers’ self-efficacy.

Participant

This study forms a component of a broader multi-case research initiative examining teachers’ professional learning in the STEAM teacher professional development programs in China (Jiang et al. 2021 ; Wang et al. 2018 ; Wang et al. 2024 ). Within this context, one participant, Shuitao (pseudonym), is selected and reported in this current study. Shuitao was a first-year graduate student at a first-tier Normal university in Shanghai, China. Normal universities specialize in teacher education. Her graduate major was mathematics curriculum and instruction. Teaching practice courses are offered to students in this major exclusively during their third year of study. The selection of Shuitao was driven by three primary factors. Firstly, Shuitao attended the entire teacher professional development program and actively engaged in nearly all associated activities. Table 2 illustrates the timeline of the five stages in which Shuitao was involved. Secondly, her undergraduate major was applied mathematics, which was not related to mathematics teaching Footnote 7 . She possessed no prior teaching experience and had not undergone any systematic study of IAT before her involvement in the teacher professional development program. Thirdly, her other master’s courses during her first two years of study focused on mathematics education theory and did not include IAT Footnote 8 . Additionally, she scarcely participated in any other teaching practice outside of the teacher professional development program. As a pre-service teacher, Shuitao harbored a keen interest in IAT. Furthermore, she discovered that she possessed fewer teaching skills compared to her peers who had majored in education during their undergraduate studies. Hence, she had a strong desire to enhance her teaching skills. Consequently, Shuitao decided to participate in our teacher professional development program.

Shuitao was grouped with three other first-year graduate students during the teacher professional development program. She actively collaborated with them at every stage of the program. For instance, they advised each other on their IAT lesson designs, observed each other’s IAT practice and offered constructive suggestions for improvement.

Research questions

Shuitao was a mathematics pre-service teacher who participated in one of our teacher professional development programs, focusing on integrating history into mathematics teaching (IHT) Footnote 9 . Notably, this teacher professional development program was designed based on our five-stage framework for teacher professional development programs of IAT. To examine the impact of this teacher professional development program on Shuitao’s self-efficacy related to IHT, this case study addresses the following research questions:

What changes in Shuitao’s self-efficacy in individual performance regarding integrating history into mathematics teaching (SE-IHT-IP) may occur through participation in the teacher professional development program?

What changes in Shuitao’s self-efficacy in student outcomes regarding integrating history into mathematics teaching (SE-IHT-SO) may occur through participation in the teacher professional development program?

Data collection and analysis

Before Shuitao joined the teacher professional development program, a one-hour preliminary interview was conducted to guide her in narrating her psychological and cognitive state regarding IHT.

During the teacher professional development program, follow-up unstructured interviews were conducted once a month with Shuitao. All discussions in the development of IHT cases were recorded, Shuitao’s teaching and micro-teaching were videotaped, and the reflection notes, journals, and summary reports written by Shuitao were collected.

After completing the teacher professional development program, Shuitao participated in a semi-structured three-hour interview. The objectives of this interview were twofold: to reassess her self-efficacy and to explore the relationship between her self-efficacy changes and each stage of the teacher professional development program.

Interview data, discussions, reflection notes, journals, summary reports, videos, and analysis records were archived and transcribed before, during, and after the teacher professional development program.

In this study, we primarily utilized data from seven interviews: one conducted before the teacher professional development program, five conducted after each stage of the program, and one conducted upon completion of the program. Additionally, we reviewed Shuitao’s five reflective notes, which were written after each stage, as well as her final summary report that encompassed the entire teacher professional development program.

Merriam’s ( 1998 ) approach to coding data and an inductive approach to retrieving possible concepts and themes were employed in a stepwise process. Considering theoretical underpinnings is common when interpreting qualitative data (Strauss and Corbin 1990 ). First, a list based on our conceptual framework of teacher self-efficacy (see Table 1 ) was developed; the list included two codes (i.e., SE-IHT-IP and SE-IHT-SO). Second, all data were sorted chronologically and read and reread to be better understood. Third, texts were coded using multi-colored highlighting and comment balloons. Fourth, the data were examined for groups of meanings, themes, and behaviors, and the ways these groups were connected within the conceptual framework of teacher self-efficacy were confirmed. Fifth, after comparing, confirming, and modifying, the selective codes were extracted and mapped onto the two categories of the conceptual framework. Accordingly, changes in SE-IHT-IP and SE-IHT-SO at the five stages of the teacher professional development program were identified, yielding the preliminary findings (Strauss and Corbin 1990 ). In reality, in Shuitao’s narratives, SE-IHT-IP and SE-IHT-SO were frequently intertwined. Our coding process differentiated between the two, enabling a more distinct understanding of how these two aspects of teacher self-efficacy evolved over time and helping us address the two research questions.
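The bookkeeping behind this coding procedure can be sketched as follows; the stage labels and excerpts below are hypothetical illustrations for exposition, not the study’s actual data:

```python
from collections import defaultdict

# Hypothetical coded segments: (stage, code, excerpt).
# The two codes come from the study's conceptual framework of teacher self-efficacy.
segments = [
    ("before", "SE-IHT-IP", "How can I design and implement IHT lesson plans?"),
    ("before", "SE-IHT-SO", "My IHT will have a limited impact on student outcomes."),
    ("stage1", "SE-IHT-IP", "I can complete the critical first step of IHT well."),
    ("stage2", "SE-IHT-SO", "I saw the tangible impact of IHT cases on students."),
]

# Sort excerpts chronologically by stage and map each onto one of the two
# codes, mirroring the chronological sorting and selective-coding steps.
by_stage = defaultdict(lambda: defaultdict(list))
for stage, code, excerpt in segments:
    by_stage[stage][code].append(excerpt)

# Tally of coded segments per code at each stage, in chronological order.
for stage in ("before", "stage1", "stage2"):
    counts = {code: len(excerpts) for code, excerpts in by_stage[stage].items()}
    print(stage, counts)
```

Grouping by stage first and code second makes the trajectory of each self-efficacy dimension directly comparable across the program’s stages.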

Reliability and validity

Two researchers independently analyzed the data to establish inter-rater reliability, which reached kappa = 0.959. Stake ( 1995 ) suggested that the most critical assertions in a study require the greatest effort toward confirmation. In this study, three methods served this purpose and helped ensure the validity of the findings. The first was to substantiate statements about the changes in self-efficacy by revisiting each transcript to confirm whether the participant explicitly acknowledged the changes (Yin 2003 ); such checks were repeated throughout the analysis. The second was to confirm patterns in the data by examining whether Shuitao’s statements were replicated in separate interviews (Morris and Usher 2011 ). The third was to present the preliminary conclusions to Shuitao and afford her the opportunity to provide feedback on the data and conclusions, in order to ascertain whether we accurately grasped the true intentions of her statements and whether our subjective interpretations inadvertently influenced our analysis. Additionally, data from diverse sources underwent analysis by at least two researchers, with all researchers reaching consensus on each finding.
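For readers unfamiliar with the statistic, Cohen’s kappa corrects the two coders’ raw agreement for the agreement expected by chance. A minimal sketch follows; it is illustrative only (not the authors’ actual computation), and the segment labels are hypothetical:

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa: chance-corrected agreement between two coders."""
    assert len(coder_a) == len(coder_b) and coder_a
    n = len(coder_a)
    # Observed proportion of segments on which the two coders agree.
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected chance agreement, from each coder's marginal label frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    p_e = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Two coders labeling the same four transcript segments:
a = ["SE-IHT-IP", "SE-IHT-IP", "SE-IHT-SO", "SE-IHT-SO"]
b = ["SE-IHT-IP", "SE-IHT-SO", "SE-IHT-SO", "SE-IHT-SO"]
print(round(cohens_kappa(a, b), 3))  # prints 0.5
```

A kappa near 0.959, as reported here, indicates almost perfect agreement under common interpretive benchmarks.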

As each stage of our teacher professional development programs spanned a minimum of three months, numerous statements regarding the enhancement of Shuitao’s self-efficacy in IHT were documented. Notably, what we present here offers only a concise overview of the findings derived from our qualitative analysis. The changes in Shuitao’s SE-IHT-IP and SE-IHT-SO are organized chronologically, delineating the periods before and during the teacher professional development program.

Before the teacher professional development program: “I have no confidence in IHT”

Before the teacher professional development program, Shuitao frequently expressed her lack of confidence in IHT. On the one hand, Shuitao expressed considerable apprehension about her individual performance in IHT. “How can I design and implement IHT lesson plans? I do not know anything [about it]…” With a sense of doubt, confusion and anxiety, Shuitao voiced her lack of confidence in her ability to design and implement an IHT case that would meet the requirements of the curriculum standards. Regarding the reasons for her lack of confidence, Shuitao attributed it to her insufficient theoretical knowledge and practical experience in IHT:

I do not know the basic approaches to IHT that I could follow… it is very difficult for me to find suitable historical materials… I am very confused about how to organize [historical] materials logically around the teaching goals and contents… [Furthermore,] I am [a] novice, [and] I have no IHT experience.

On the other hand, Shuitao articulated very low confidence in the efficacy of her IHT on student outcomes:

I think my IHT will have a limited impact on student outcomes… I do not know any specific effects [of history] other than making students interested in mathematics… In fact, I always think it is difficult for [my] students to understand the history… If students cannot understand [the history], will they feel bored?

This statement suggests that Shuitao did not fully grasp the significance of IHT. In fact, she knew little about the educational significance of history for students, and she harbored no belief that her IHT approach could positively impact students. In sum, her SE-IHT-SO was very low.

After stage 1: “I can do well in the first step of IHT”

After Stage 1, Shuitao indicated a slight improvement in her confidence in IHT. She attributed this improvement to her acquisition of theoretical knowledge in IHT, the approaches for selecting history-related materials, and an understanding of the educational value of history.

One of Shuitao’s primary concerns about implementing IHT before the teacher professional development program was the challenge of sourcing suitable history-related materials. However, after Stage 1, Shuitao explicitly affirmed her capability in this aspect. She shared, as an example, her experience of organizing history-related materials on logarithms.

Recognizing the significance of suitable history-related materials in effective IHT implementation, Shuitao acknowledged that conducting literature studies significantly contributed to enhancing her confidence in undertaking this initial step. Furthermore, she expressed increased confidence in designing IHT lesson plans by utilizing history-related materials aligned with teaching objectives derived from the curriculum standards. In other words, her SE-IHT-IP was enhanced. She said:

After experiencing multiple discussions, I gradually know more about what kinds of materials are essential and should be emphasized, what kinds of materials should be adapted, and what kinds of materials should be omitted in the classroom instructions… I have a little confidence to implement IHT that could meet the requirements [of the curriculum standards] since now I can complete the critical first step [of IHT] well…

However, despite the improvement in her confidence in IHT following Stage 1, Shuitao also expressed some concerns. She articulated uncertainty regarding her performance in the subsequent stages of the teacher professional development program. Consequently, her confidence in IHT experienced only a modest increase.

After stage 2: “I participate in the development of IHT cases, and my confidence is increased a little bit more”

Following Stage 2, Shuitao reported further increased confidence in IHT. She attributed this growth to two main factors. Firstly, she successfully developed several instructional designs for IHT through collaboration with in-service teachers. These collaborative experiences enabled her to gain a deeper understanding of IHT approaches and enhance her pedagogical content knowledge in this area, consequently bolstering her confidence in her ability to perform effectively. Secondly, Shuitao observed the tangible impact of IHT cases on students in real classroom settings, which reinforced her belief in the efficacy of IHT. These experiences instilled in her a greater sense of confidence in her capacity to positively influence her students through her implementation of IHT. Shuitao remarked that she gradually understood how to integrate suitable history-related materials into her instructional designs (e.g., employing a genetic approach; see Footnote 10), considering it the second important step of IHT. She shared her experience of developing an IHT instructional design on the concept of logarithms. After creating several iterations of IHT instructional designs, Shuitao emphasized that her SE-IHT-IP had strengthened. She expressed confidence in her ability to apply these IHT approaches, as well as the pedagogical content knowledge of IHT acquired through practical experience, in her future teaching endeavors. The following is an excerpt from the interview:

I learned some effective knowledge, skills, techniques and approaches [to IHT]… By employing these approaches, I thought I could [and] I had the confidence to integrate the history into instructional designs very well… For instance, [inspired] by the genetic approach, we designed a series of questions and tasks based on the history of logarithms. The introduction of the new concept of logarithms became very natural, and it perfectly met the requirements of our curriculum standards, [which] asked students to understand the necessity of learning the concept of logarithms…

Shuitao actively observed the classroom teaching conducted by her cooperating in-service teacher. She helped her cooperating in-service teacher collect and analyze students’ feedback. Discussions then ensued on how to improve the instructional designs based on this feedback, and the refined IHT instructional designs were subsequently re-implemented by the in-service teacher. After three rounds of developing IHT cases, Shuitao became increasingly convinced of the significance and efficacy of integrating history into teaching practices, as evidenced by the following excerpt:

The impacts of IHT on students are visible… For instance, more than 93% of the students mentioned in the open-ended questionnaires that they became more interested in mathematics because of the [historical] story of Napier… For another example, according to the results of our surveys, more than 75% of the students stated that they knew log_a(M + N) = log_a(M) × log_a(N) was wrong because of history… I have a little bit more confidence in the effects of my IHT on students.
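As a gloss from us rather than the participant, the identity the students learned to reject in the excerpt above can be contrasted with the correct logarithm law:

```latex
% Misconception (false in general):
%   \log_a(M+N) = \log_a M \times \log_a N
% Correct product rule, which the history of logarithms motivates:
\[
  \log_a(MN) = \log_a M + \log_a N, \qquad a > 0,\ a \neq 1,\ M, N > 0.
\]
% By contrast, \log_a(M+N) admits no comparable decomposition.
```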

This excerpt highlights that Shuitao’s SE-IHT-SO was enhanced. She attributed this enhancement to her realization of the compelling nature of history and her belief in her ability to effectively leverage its power to positively influence her students’ cognitive and emotional development. This also underscores the importance of reinforcing pre-service teachers’ awareness of the significance of history. Nonetheless, Shuitao elucidated that she still retained concerns regarding the effectiveness of her IHT implementation. Her following statement shed light on why her self-efficacy experienced only a marginal increase in this stage:

Knowing how to do it successfully and doing it successfully in practice are two totally different things… I can develop IHT instructional designs well, but I have no idea whether I can implement them well and whether I can introduce the history professionally in practice… My cooperating in-service teacher has a long history of teaching mathematics and has gained rich experience in educational practices… If I cannot acquire some required teaching skills and capabilities, I still cannot influence my students powerfully.

After stage 3: “Practice makes perfect, and my SE-IHT-IP is steadily enhanced after a hit”

After successfully developing IHT instructional designs, the next critical step was implementing them. Drawing from her observations of her cooperating in-service teacher’s IHT implementations and discussions with other pre-service teachers, Shuitao developed her own IHT lesson plans. In Stage 3, she conducted simulated teaching sessions and evaluated her teaching performance ten times (see Footnote 11). Shuitao claimed that her SE-IHT-IP steadily improved over the course of these sessions. According to Shuitao, two main processes in Stage 3 facilitated this steady enhancement of SE-IHT-IP.

On the one hand, through the repeated implementation of simulated teaching sessions, Shuitao’s teaching proficiency and fluency markedly improved. Shuitao first described the importance of teaching proficiency and fluency:

Since the detailed history is not included in our curriculum standards and textbooks, if I use my historical materials in class, I have to teach more contents than traditional teachers. Therefore, I have to teach proficiently so that teaching pace becomes a little faster than usual… I have to teach fluently so as to use each minute efficiently in my class. Otherwise, I cannot complete the teaching tasks required [by curriculum standards].

As Shuitao said, at the beginning of Stage 3, her self-efficacy even decreased because she lacked teaching proficiency and fluency and was unable to complete the required teaching tasks:

In the first few times of simulated teaching, I always needed to think for a second about what I should say next when I finish one sentence. I also felt very nervous when I stood in the front of the classrooms. This made my narration of the historical story between Briggs and Napier not fluent at all. I paused many times to look for some hints on my notes… All these made me unable to complete the required teaching tasks… My [teaching] confidence took a hit.

Shuitao quoted the proverb, “practice makes perfect”, and she emphasized that it was repeated practice that improved her teaching proficiency and fluency:

I thought I had no other choice but to practice IHT repeatedly… [At the end of Stage 3,] I could naturally remember most words that I should say when teaching the topics that I selected… My teaching proficiency and fluency was improved through my repeated review of my instructional designs and implementation of IHT in the micro-classrooms… With the improvement [of my teaching proficiency and fluency], I could complete the teaching tasks, and my confidence was increased as well.

In addition, Shuitao mentioned that through this kind of self-exploration in simulated teaching practice, her teaching skills and capabilities (e.g., blackboard writing, language organization, etc.) improved. This process greatly helped enhance her SE-IHT-IP.

On the other hand, Shuitao’s simulated teaching was assessed by herself, her mentors, in-service teachers, and fellow pre-service teachers. This comprehensive evaluation process played a pivotal role in enhancing her individual performance and self-efficacy. Reflecting on this aspect, Shuitao articulated the following sentiments in one of her reflection reports:

By watching the videos, conducting self-assessment, and collecting feedback from others, I can understand what I should improve or emphasize in my teaching. [Then,] I think my IHT can better meet the requirements [of curriculum standards]… I think my teaching performance is getting better and better.

After stage 4: “My micro-courses influenced students positively, and my SE-IHT-SO is steadily enhanced”

In Stage 4, Shuitao commenced by creating 5-min micro-course videos. Subsequently, she played these videos in her cooperating in-service teacher’s authentic classroom settings and collected student feedback. Each micro-course was played at the end of the in-service teacher’s lesson (see Footnote 12). Shuitao wrote in her reflections that the micro-course on logarithms helped students better understand the nature of mathematics:

According to the results of our surveys, many students stated that they knew the development and evolution of the concept of logarithms is a long process and many mathematicians from different countries have contributed to the development of the concept of logarithms… This indicated that my micro-course helped students better understand the nature of mathematics… My micro-course about the history informed students that mathematics is an evolving and human subject and helped them understand the dynamic development of the [mathematics] concept…

Meanwhile, Shuitao’s micro-course positively influenced some students’ beliefs about mathematics. As evident from the quote below, integrating historical context into mathematics teaching transformed students’ perception of the subject, which also boosted Shuitao’s confidence.

Some students’ responses were very exciting… [O]ne [typical] response stated, he always regarded mathematics as abstract, boring, and dreadful subject; but after seeing the photos of mathematicians and great men and learning the development of the concept of logarithms through the micro-course, he found mathematics could be interesting. He wanted to learn more the interesting history… Students’ such changes made me confident.

Furthermore, during post-class interviews, several students told Shuitao that they recognized the significance of the concept of logarithms, attributing this realization to the insights provided by the prominent figures featured in the micro-courses. They also conveyed their intention to exert greater effort in mastering the subject matter. This feedback led Shuitao to believe that her IHT had the potential to positively influence students’ attitudes towards learning mathematics.

In summary, Stage 4 marked Shuitao’s first opportunity to directly impact students through her IHT in authentic classroom settings. Despite implementing only brief 5-min micro-courses integrating history during each session, the effectiveness of her short IHT implementation was validated by student feedback. Shuitao unequivocally expressed that students actively engaged with her micro-courses and that these sessions positively influenced them, including attitudes and motivation toward mathematics learning, understanding of mathematics concepts, and beliefs regarding mathematics. These collective factors contributed to a steady enhancement of her confidence in SE-IHT-SO.

After stage 5: “My overall self-efficacy is greatly enhanced”

Following Stage 5, Shuitao reported a significant increase in her overall confidence in IHT, attributing it to gaining mastery through successful implementations of IHT in real classroom settings. On the one hand, Shuitao successfully designed and executed her IHT lesson plans, consistently achieving the teaching objectives mandated by curriculum standards. This significantly enhanced her SE-IHT-IP. On the other hand, as Shuitao’s IHT implementation directly influenced her students, her confidence in SE-IHT-SO experienced considerable improvement.

According to Bandura ( 1997 ), mastery experience is the most powerful source of self-efficacy. Shuitao’s statements confirmed this. As she claimed, her enhanced SE-IHT-IP in Stage 5 mainly came from the experience of successful implementations of IHT in real classrooms:

[Before the teacher professional development program,] I had no idea about implementing IHT… Now, I successfully implemented IHT in senior high school [classrooms] many times… I can complete the teaching tasks and even better completed the teaching objectives required [by the curriculum standards]… The successful experience greatly enhances my confidence to perform well in my future implementation of IHT… Yeah, I think the successful teaching practice experience is the strongest booster of my confidence.

At the end of Stage 5, Shuitao’s mentors and in-service teachers gave her highly positive evaluations. For instance, after Shuitao’s IHT implementation of the concept of logarithms, all mentors and in-service teachers consistently provided feedback that her teaching illustrated the necessity of learning the concept of logarithms and met the requirements of the curriculum standards very well. This kind of verbal persuasion (Bandura 1997 ) enhanced her SE-IHT-IP.

Similarly, Shuitao’s successful experience of influencing students positively through IHT, as one kind of mastery experience, powerfully enhanced her SE-IHT-SO. She described her changes in SE-IHT-SO as follows:

I could not imagine my IHT could be so influential [before]… But now, my IHT implementation directly influenced students in so many aspects… When I witnessed students’ real changes in various cognitive and affective aspects, my confidence was greatly improved.

Shuitao described the influence of her IHT implementation of the concept of logarithms on her students, grounding this depiction in the outcomes of surveys she conducted following the implementation. She asserted that these results filled her with excitement and confidence regarding her future implementation of IHT.

In summary, following Stage 5 of the teacher professional development program, Shuitao experienced a notable enhancement in her overall self-efficacy, primarily attributed to her successful practical experience in authentic classroom settings during this stage.

A primary objective of our teacher professional development programs is to equip pre-service teachers with the skills and confidence needed to implement IAT effectively. Our findings show that one such program significantly augmented a participant’s TSE-IHT across two dimensions: individual performance and student outcomes. Considering the pressing need to provide STEAM teachers with effective professional training (e.g., Boice et al. 2021 ; Duong et al. 2024 ; Herro et al. 2019 ; Jacques et al. 2020 ; Park and Cho 2022 ; Perignat and Katz-Buonincontro 2019 ), the proposed five-stage framework holds significant promise in both theoretical and practical realms. Furthermore, this study offers a viable solution to the prevalent issue of low teacher self-efficacy in interdisciplinary teaching, including IAT, which is critical in STEAM education (Zhou et al. 2023 ). This study holds the potential to make unique contributions to the existing literature on teacher self-efficacy, teacher professional learning models, and the design of teacher professional development programs of IAT.

Firstly, this study enhances our understanding of the development of teacher self-efficacy. Our findings further confirm the complexity of this development. On the one hand, the observed enhancement of the participant’s teacher self-efficacy did not occur swiftly but unfolded gradually through a protracted, incremental process. Moreover, it is noteworthy that the participant’s self-efficacy exhibited fluctuations, underscoring that the augmentation of teacher self-efficacy is neither straightforward nor linear. On the other hand, the study elucidated that the augmentation of teacher self-efficacy constitutes an intricate, multi-level system that interacts with teacher knowledge, skills, and other beliefs. This finding resonates with prior research on teacher self-efficacy (Morris et al. 2017 ; Xenofontos and Andrews 2020 ). For example, our study revealed that Shuitao’s enhancement of SE-IHT-SO was interwoven with her continuous comprehension of the significance of the A&H in classroom settings. Similarly, the participant progressively acknowledged the educational value of A&H in classroom contexts in tandem with the stepwise enhancement of her SE-IHT-SO. Factors such as the participant’s pedagogical content knowledge of IHT, instructional design, and teaching skills were also identified as pivotal components of SE-IHT-IP. This finding corroborates Morris and Usher’s ( 2011 ) assertion that sustained improvements in self-efficacy stem from developing teachers’ skills and knowledge. As SE-IHT-IP was bolstered, the participant’s related teaching skills and content knowledge also improved.

Methodologically, many researchers advocate for qualitative investigations into self-efficacy (e.g., Philippou and Pantziara 2015; Klassen et al. 2011 ; Wyatt 2015 ; Xenofontos and Andrews 2020 ). While acknowledging limitations in sample scope and the generalizability of the findings, this study offers a longitudinal perspective on the stage-by-stage development of teacher self-efficacy and its interactions with different factors (i.e., teacher knowledge, skills, and beliefs), often ignored by quantitative studies. Considering that studies of self-efficacy have been predominantly quantitative, typically drawing on survey techniques and pre-determined scales (Xenofontos and Andrews, 2020 ; Zhou et al. 2023 ), this study highlights the need for greater attention to qualitative studies so that more cultural, situational and contextual factors in the development of self-efficacy can be captured.

Our study provides valuable practical implications for enhancing pre-service teachers’ self-efficacy. We conceptualize teacher self-efficacy in two primary dimensions: individual performance and student outcomes. On the one hand, pre-service teachers can enhance their teaching qualities, boosting their self-efficacy in individual performance. The adage “practice makes perfect” underscores the necessity of ample teaching practice opportunities for pre-service teachers who lack prior teaching experience. Engaging in consistent and reflective practice helps them develop confidence in their teaching qualities. On the other hand, pre-service teachers should focus on positive feedback from their students, reinforcing their self-efficacy in student outcomes. Positive student feedback serves as an affirmation of their teaching effectiveness and encourages continuous improvement. Furthermore, our findings highlight the significance of mentors’ and peers’ positive feedback as critical sources of teacher self-efficacy. Mentors and peers play a pivotal role in the professional growth of pre-service teachers by actively encouraging them and recognizing their teaching achievements. Constructive feedback from experienced mentors and supportive peers fosters a collaborative learning environment and bolsters the self-confidence of pre-service teachers. Additionally, our research indicates that pre-service teachers’ self-efficacy may fluctuate. Therefore, mentors should be prepared to help pre-service teachers manage teaching challenges and setbacks, and alleviate any teaching-related anxiety. Mentors can help pre-service teachers build resilience and maintain a positive outlook on their teaching journey through emotional support and guidance. Moreover, a strong correlation exists between teacher self-efficacy and teacher knowledge and skills. Enhancing pre-service teachers’ knowledge base and instructional skills is crucial for bolstering their overall self-efficacy.

Secondly, this study also responds to the appeal to understand teachers’ professional learning from a holistic perspective and to interrelate teachers’ professional learning process with student outcome variables (Sancar et al. 2021 ), and thus contributes to the understanding of the complexity of STEAM teachers’ professional learning. On the one hand, we have confirmed Cai et al.’s ( 2020 ) teacher professional learning model in a new context, namely STEAM teacher education. Throughout the teacher professional development program, the pre-service teacher, Shuitao, demonstrated an augmentation in her knowledge, encompassing both content knowledge and pedagogical understanding concerning IHT. Moreover, her beliefs regarding IHT transformed as a result of her engagement in teacher learning across the five stages. This facilitated her in executing effective IHT teaching and improving her students’ outcomes. On the other hand, notably, in our studies (including this current study and some follow-up studies), student feedback is a pivotal tool to assist teachers in discerning the impact they are effectuating. This enables pre-service teachers to grasp the actual efficacy of their teaching efforts and subsequently contributes significantly to the augmentation of their self-efficacy. Such steps have seldom been conducted in prior studies (e.g., Cai et al. 2020 ), where student outcomes are often perceived solely as the results of teachers’ instruction rather than as sources informing teacher beliefs. Additionally, this study has validated both the interaction between teaching performance and teacher beliefs and that between teacher knowledge and teacher beliefs. These aspects were overlooked in Cai et al.’s ( 2020 ) model.
More importantly, while Clarke and Hollingsworth’s ( 2002 ) Interconnected Model of Professional Growth illustrates the connections between the domain of consequence and the personal domain, as well as between the personal domain and the domain of practice, it does not adequately clarify the complex relationships among the factors within the personal domain (e.g., the interaction between teacher knowledge and teacher beliefs). Therefore, our study also supplements Clarke and Hollingsworth’s ( 2002 ) model by addressing these intricacies. Based on our findings, an updated model of teacher professional learning has been proposed, as shown in Fig. 3 . This expanded model indicates that teacher learning should be an ongoing and sustainable process, with the enhancement of student learning not marking the conclusion of teacher learning, but rather serving as the catalyst for a new phase of learning. In this sense, we advocate for further research to investigate the tangible impacts of teacher professional development programs on students and how those impacts stimulate subsequent cycles of teacher learning.

figure 3

Note: Paths in blue were proposed by Cai et al. ( 2020 ), and paths in yellow are proposed and verified in this study.

Thirdly, in light of the updated model of teacher professional learning (see Fig. 3 ), this study provides insights into the design of teacher professional development programs of IAT. According to Huang et al. ( 2022 ), to date, very few studies have set goals to “develop a comprehensive understanding of effective designs” for STEM (or STEAM) teacher professional development programs (p. 15). To fill this gap, this study proposes a novel and effective five-stage framework for teacher professional development programs of IAT. This framework provides a possible and feasible solution to the challenges of STEAM teacher professional development programs’ design and planning, and teachers’ IAT practice (Boice et al. 2021 ; Herro et al. 2019 ; Jacques et al. 2020 ; Park and Cho 2022 ; Perignat and Katz-Buonincontro 2019 ).

Specifically, our five-stage framework incorporates at least six important features. Firstly, teacher professional development programs should focus on specific STEAM content. Given the expansive nature of STEAM, teacher professional development programs cannot feasibly encompass all facets of its contents. Consistent with recommendations by Cai et al. ( 2020 ), Desimone et al. ( 2002 ) and Garet et al. ( 2001 ), an effective teacher professional development program should prioritize content focus. Our five-stage framework is centered on IAT. Throughout an 18-month duration, each pre-service teacher is limited to selecting one subcomponent of A&H, such as history, for integration into their subject teaching (i.e., mathematics teaching, technology teaching or science teaching) within one teacher professional development program. Secondly, in response to the appeals that teacher professional development programs should shift from emphasizing teaching and instruction to emphasizing student learning (Cai et al. 2020 ; Calabrese et al. 2024 ; Hwang et al. 2024 ; Marco and Palatnik 2024 ; Örnek and Soylu 2021 ), our framework requires pre-service teachers to pay close attention to the effects of IAT on student learning outcomes, and to use students’ feedback as the basis for improving their instruction. Thirdly, prior studies found that teacher education with a preference for theory led to pre-service teachers’ dissatisfaction with the quality of teacher professional development programs and hindered the development of pre-service teachers’ teaching skills and beliefs, which also widened the gap between theory and practice (Hennissen et al. 2017 ; Ord and Nuttall 2016 ). In this regard, our five-stage framework connects theory and teaching practice closely. In particular, pre-service teachers can experience the values of IAT not only through theoretical learning but also through diverse teaching practices.
Fourthly, we build a teacher community of practice tailored for pre-service teachers. Additionally, we aim to encourage greater participation of in-service teachers in such teacher professional development programs designed for pre-service educators in STEAM teacher education. By engaging in such programs, in-service teachers can offer valuable teaching opportunities for pre-service educators and contribute their insights and experiences from teaching practice. Importantly, pre-service teachers stand to gain from the in-service teachers’ familiarity with textbooks, subject matter expertise, and better understanding of student dynamics. Fifthly, our five-stage framework lasts for an extended period, spanning 18 months. This duration ensures that pre-service teachers engage in a sustained and comprehensive learning journey. Lastly, our framework facilitates a practical understanding of “integration” by offering detailed, sequential instructions for blending two disciplines in teaching. For example, our teacher professional development programs prioritize systematic learning of pedagogical theories and simulated teaching experiences before pre-service teachers embark on real STEAM teaching endeavors. This approach is designed to mitigate the risk of unsuccessful experiences during initial teaching efforts, thereby safeguarding pre-service teachers’ teacher self-efficacy. Considering the complexity of “integration” in interdisciplinary teaching practices, including IAT (Han et al. 2022 ; Ryu et al. 2019 ), we believe detailed stage-by-stage and step-by-step instructions are crucial components of relevant pre-service teacher professional development programs. Notably, this aspect, emphasizing structural instructional guidance, has not been explicitly addressed in prior research (e.g., Cai et al. 2020 ). 
Figure 4 illustrates the six important features of an effective teacher professional development program outlined in this study, encompassing both established elements and the novel additions proposed herein.

figure 4

Note: STEAM refers to science, technology, engineering, arts and humanities, and mathematics.

The successful implementation of this framework is also related to the Chinese teacher education system and cultural background. For instance, the Chinese government has promoted many university-school collaboration initiatives, encouraging in-service teachers to provide guidance and practical opportunities for pre-service teachers (Lu et al. 2019 ). Influenced by Confucian values emphasizing altruism, many experienced in-service teachers in China are eager to assist pre-service teachers, helping them better realize their teaching career aspirations. It is reported that experienced in-service teachers in China show significantly higher motivation than their international peers when mentoring pre-service teachers (Lu et al. 2019 ). Therefore, for the successful implementation of this framework in other countries, it is crucial for universities to forge close collaborative relationships with K-12 schools and actively involve K-12 teachers in pre-service teacher education.

Notably, approximately 5% of our participants dropped out midway because they found the IAT practice too challenging or felt overwhelmed by the number of required tasks in the program. Consequently, we are exploring options to simplify this framework in future iterations.

Without minimizing the limitations of this study, it is important to recognize that a qualitative longitudinal case study can be a useful means of shedding light on the development of a pre-service STEAM teacher’s self-efficacy. However, this methodology did not allow for a pre-post or a quasi-experimental design, and the effectiveness of our five-stage framework could not be confirmed quantitatively. In the future, conducting more experimental or design-based studies could further validate the effectiveness of our framework and broaden our findings. Furthermore, future studies should incorporate triangulation methods and utilize multiple data sources to enhance the reliability and validity of the findings. Meanwhile, owing to space limitations, we could only report the changes in Shuitao’s SE-IHT-IP and SE-IHT-SO here, and we could not describe the teacher self-efficacy of other participants regarding IAT. While nearly all of the pre-service teachers experienced an improvement in their teacher self-efficacy concerning IAT upon participating in our teacher professional development programs, the processes of their change were not entirely uniform. We will need to report the specific findings of these variations in the future. Further studies are also needed to explore the factors contributing to these variations. Moreover, following this study, we are implementing more teacher professional development programs of IAT. Future studies can explore the impact of this framework on additional aspects of pre-service STEAM teachers’ professional development. This will help gain a more comprehensive understanding of its effectiveness and potential areas for further improvement. Additionally, our five-stage framework was initially developed and implemented within the Chinese teacher education system. Future research should investigate how this framework can be adapted in other educational systems and cultural contexts.

The impetus behind this study stems from the burgeoning discourse advocating for the integration of A&H disciplines into STEM education on a global scale (e.g., Land 2020; Park and Cho 2022; Uştu et al. 2021; Vaziri and Bradburn 2021). Concurrently, there is pervasive concern regarding the challenges teachers face in implementing STEAM approaches, particularly in the context of IAT practices (e.g., Boice et al. 2021; Herro et al. 2019; Jacques et al. 2020; Park and Cho 2022; Perignat and Katz-Buonincontro 2019). To tackle this challenge, we first proposed a five-stage framework for teacher professional development programs of IAT and then implemented a series of such programs based on it. Drawing on the recommendations of Bray-Clark and Bates (2003), Kelley et al. (2020) and Zhou et al. (2023), we selected teacher self-efficacy as a key metric for examining the effectiveness of the five-stage framework. Through a qualitative longitudinal case study, we scrutinized the influence of a specific teacher professional development program on the self-efficacy of a single pre-service teacher over an 18-month period. Our findings revealed a notable enhancement in teacher self-efficacy regarding both individual performance and student outcomes. This enhancement did not occur swiftly but unfolded gradually through a prolonged, incremental process. Building on these findings, we proposed an updated model of teacher learning, which illustrates that teacher learning should be viewed as a continuous and sustainable process in which teaching performance, teacher beliefs, and teacher knowledge dynamically interact with one another. The updated model also confirms that teacher learning is inherently intertwined with student learning in STEAM education. Furthermore, this study summarizes effective design features of STEAM teacher professional development programs.

Data availability

The datasets generated and/or analyzed during this study are not publicly available due to general data protection regulations, but are available from the corresponding author on reasonable request.

In their review article, Morris et al. ( 2017 ) equated “teaching self-efficacy” and “teacher self-efficacy” as synonymous concepts. This perspective is also adopted in this study.

An effective teacher professional development program should have specific, focused, and clear content rather than broad and scattered topics. Therefore, each pre-service teacher could only choose to integrate one subcomponent of A&H into their teaching in a given teacher professional development program. For instance, Shuitao, a mathematics pre-service teacher, participated in one program focused on integrating history into mathematics teaching; she did not explore the integration of other subcomponents of A&H into her teaching during her graduate studies.

In the micro-classrooms, multi-angle, multi-point high-definition video recorders are set up to record the teaching process.

In micro-teaching, mentors, in-service teachers, and other fellow pre-service teachers take on the roles of students.

In China, teachers can video-record one section of a lesson and play it in formal classes, a practice known as a micro-course. For instance, in one teacher professional development program on integrating history into mathematics teaching, micro-courses covered various mathematics concepts, methods, ideas, history-related materials, and related topics. Typically, teachers use these micro-courses to broaden students' views, foster inquiry-based learning, and cultivate critical thinking skills. Such initiatives play an important role in improving teaching quality.

Many university-school collaboration initiatives in China focus on pre-service teachers’ practicum experiences (Lu et al. 2019 ). Our teacher professional development program is also supported by many K-12 schools in Shanghai. Personal information in videos is strictly protected.

In China, students are not required to pursue a graduate major that matches their undergraduate major. Most participants in our teacher professional development programs did not pursue undergraduate degrees in education-related fields.

Shuitao’s university reserves Wednesday afternoons for students to engage in various programs or clubs, as classes are not scheduled during this time. Similarly, our teacher professional development program activities are planned for Wednesday afternoons to avoid overlapping with participants’ other coursework commitments.

History is one of the most important components of A&H (Park and Cho 2022 ).

To learn more about the genetic approach (i.e., the genetic principle), see Jankvist (2009).

For the assessment process, see Fig. 2 .

Shuitao’s cooperating in-service teacher taught the concept of logarithms in Stage 2. In Stage 4, the teaching objective of her cooperating in-service teacher’s review lesson was to help students review the concept of logarithms to prepare students for the final exam.

Akiba M, Murata A, Howard C, Wilkinson B, Fabrega J (2019) Race to the top and lesson study implementation in Florida: District policy and leadership for teacher professional development. In: Huang R, Takahashi A, daPonte JP (eds) Theory and practice of lesson study in mathematics, pp. 731–754. Springer, Cham. https://doi.org/10.1007/978-3-030-04031-4_35

Alkhabra YA, Ibrahem UM, Alkhabra SA (2023) Augmented reality technology in enhancing learning retention and critical thinking according to STEAM program. Humanit Soc Sci Commun 10:174. https://doi.org/10.1057/s41599-023-01650-w

Alwafi EM, Downey C, Kinchin G (2020) Promoting pre-service teachers’ engagement in an online professional learning community: Support from practitioners. J Professional Cap Community 5(2):129–146. https://doi.org/10.1108/JPCC-10-2019-0027

Archibald S, Coggshall JG, Croft A, Goe L (2011) High-quality professional development for all teachers: effectively allocating resources. National Comprehensive Center for Teacher Quality, Washington, DC

Bandura A (1977) Self-efficacy: Toward a unifying theory of behavioral change. Psychological Rev 84:191–215. https://doi.org/10.1016/0146-6402(78)90002-4

Bandura A (1986) The explanatory and predictive scope of self-efficacy theory. J Soc Clin Psychol 4:359–373. https://doi.org/10.1521/jscp.1986.4.3.359

Bandura A (1997) Self-efficacy: The exercise of control. Freeman, New York

Basckin C, Strnadová I, Cumming TM (2021) Teacher beliefs about evidence-based practice: A systematic review. Int J Educ Res 106:101727. https://doi.org/10.1016/j.ijer.2020.101727

Bray-Clark N, Bates R (2003) Self-efficacy beliefs and teacher effectiveness: Implications for professional development. Prof Educ 26(1):13–22

Blonder R, Benny N, Jones MG (2014) Teaching self-efficacy of science teachers. In: Evans R, Luft J, Czerniak C, Pea C (eds), The role of science teachers’ beliefs in international classrooms: From teacher actions to student learning, Sense Publishers, Rotterdam, Zuid-Holland, pp. 3–16

Boice KL, Jackson JR, Alemdar M, Rao AE, Grossman S, Usselman M (2021) Supporting teachers on their STEAM journey: A collaborative STEAM teacher training program. Educ Sci 11(3):105. https://doi.org/10.3390/educsci11030105

Cai J (2017) Longitudinally investigating the effect of teacher professional development on instructional practice and student learning: A focus on mathematical problem posing. The University of Delaware, Newark, DE

Cai J, Chen T, Li X, Xu R, Zhang S, Hu Y, Zhang L, Song N (2020) Exploring the impact of a problem-posing workshop on elementary school mathematics teachers’ conceptions on problem posing and lesson design. Int J Educ Res 102:101404. https://doi.org/10.1016/j.ijer.2019.02.004

Calabrese JE, Capraro MM, Viruru R (2024) Semantic structure and problem posing: Preservice teachers’ experience. School Sci Math. https://doi.org/10.1111/ssm.12648

Clarke D, Hollingsworth H (2002) Elaborating a model of teacher professional growth. Teach Teach Educ 18(8):947–967. https://doi.org/10.1016/S0742-051X(02)00053-7

Corden A, Millar J (2007) Time and change: A review of the qualitative longitudinal research literature for social policy. Soc Policy Soc 6(4):583–592. https://doi.org/10.1017/S1474746407003910

Darling-Hammond L, Hyler ME, Gardner M (2017) Effective teacher professional development. Learning Policy Institute, Palo Alto, CA

de la Garza A (2021) Internationalizing the curriculum for STEAM (STEM+ Arts and Humanities): From intercultural competence to cultural humility. J Stud Int Educ 25(2):123–135. https://doi.org/10.1177/1028315319888468

Desimone LM, Garet MS (2015) Best practices in teachers’ professional development in the United States. Psychol, Soc, Educ 7(3):252–263

Desimone LM, Porter AC, Garet MS, Yoon KS, Birman BF (2002) Effects of professional development on teachers’ instruction: Results from a three-year longitudinal study. Educ Evaluation Policy Anal 24(2):81–112. https://doi.org/10.3102/01623737024002081

Dicks SG, Northam HL, van Haren FM, Boer DP (2023) The bereavement experiences of families of potential organ donors: a qualitative longitudinal case study illuminating opportunities for family care. Int J Qualitative Stud Health Well-being 18(1):1–24. https://doi.org/10.1080/17482631.2022.2149100

Ding M, Huang R, Pressimone Beckowski C, Li X, Li Y (2024) A review of lesson study in mathematics education from 2015 to 2022: implementation and impact. ZDM Math Educ 56:87–99. https://doi.org/10.1007/s11858-023-01538-8

Duong NH, Nam NH, Trung TT (2024) Factors affecting the implementation of STEAM education among primary school teachers in various countries and Vietnamese educators: comparative analysis. Education 3–13. https://doi.org/10.1080/03004279.2024.2318239

English LD (2016) STEM education K-12: Perspectives on integration. Int J STEM Educ 3:3. https://doi.org/10.1186/s40594-016-0036-1

Garet MS, Porter AC, Desimone L, Birman BF, Yoon KS (2001) What makes professional development effective? Results from a national sample of teachers. Am Educ Res J 38(4):915–945. https://doi.org/10.3102/00028312038004915

Gates AE (2017) Benefits of a STEAM collaboration in Newark, New Jersey: Volcano simulation through a glass-making experience. J Geosci Educ 65(1):4–11. https://doi.org/10.5408/16-188.1

Geng J, Jong MSY, Chai CS (2019) Hong Kong teachers’ self-efficacy and concerns about STEM education. Asia-Pac Educ Researcher 28:35–45. https://doi.org/10.1007/s40299-018-0414-1

Han J, Kelley T, Knowles JG (2022) Building a sustainable model of integrated stem education: investigating secondary school STEM classes after an integrated STEM project. Int J Technol Design Educ. https://doi.org/10.1007/s10798-022-09777-8

Henderson S, Holland J, McGrellis S, Sharpe S, Thomson R (2012) Storying qualitative longitudinal research: Sequence, voice and motif. Qualitative Res 12(1):16–34. https://doi.org/10.1177/1468794111426232

Hennissen P, Beckers H, Moerkerke G (2017) Linking practice to theory in teacher education: A growth in cognitive structures. Teach Teach Educ 63:314–325. https://doi.org/10.1016/j.tate.2017.01.008

Henson RK (2002) From adolescent angst to adulthood: Substantive implications and measurement dilemmas in the development of teacher efficacy research. Educ Psychologist 37:137–150. https://doi.org/10.1207/S15326985EP3703_1

Herro D, Quigley C (2016) Innovating with STEAM in middle school classrooms: remixing education. Horizon 24(3):190–204. https://doi.org/10.1108/OTH-03-2016-0008

Herro D, Quigley C, Cian H (2019) The challenges of STEAM instruction: Lessons from the field. Action Teach Educ 41(2):172–190. https://doi.org/10.1080/01626620.2018.1551159

Huang B, Jong MSY, Tu YF, Hwang GJ, Chai CS, Jiang MYC (2022) Trends and exemplary practices of STEM teacher professional development programs in K-12 contexts: A systematic review of empirical studies. Comput Educ 189:104577. https://doi.org/10.1016/j.compedu.2022.104577

Hunter-Doniger T, Sydow L (2016) A journey from STEM to STEAM: A middle school case study. Clearing House 89(4-5):159–166. https://doi.org/10.1080/00098655.2016.1170461

Hwang S, Xu R, Yao Y, Cai J (2024) Learning to teach through problem posing: A teacher’s journey in a networked teacher−researcher partnership. J Math Behav 73:101120. https://doi.org/10.1016/j.jmathb.2023.101120

Jacques LA, Cian H, Herro DC, Quigley C (2020) The impact of questioning techniques on STEAM instruction. Action Teach Educ 42(3):290–308. https://doi.org/10.1080/01626620.2019.1638848

Jankvist UT (2009) A categorization of the “whys” and “hows” of using history in mathematics education. Educ Stud Math 71:235–261. https://doi.org/10.1007/s10649-008-9174-9

Jiang H, Chugh R, Turnbull D, Wang X, Chen S (2023) Modeling the impact of intrinsic coding interest on STEM career interest: evidence from senior high school students in two large Chinese cities. Educ Inf Technol 28:2639–2659. https://doi.org/10.1007/s10639-022-11277-0

Jiang H, Chugh R, Turnbull D, Wang X, Chen S (2024a) Exploring the effects of technology-related informal mathematics learning activities: A structural equation modeling analysis. Int J Sci Math Educ . Advance online publication. https://doi.org/10.1007/s10763-024-10456-4

Jiang H, Islam AYMA, Gu X, Guan J (2024b) How do thinking styles and STEM attitudes have effects on computational thinking? A structural equation modeling analysis. J Res Sci Teach 61:645–673. https://doi.org/10.1002/tea.21899

Jiang H, Turnbull D, Wang X, Chugh R, Dou Y, Chen S (2022) How do mathematics interest and self-efficacy influence coding interest and self-efficacy? A structural equation modeling analysis. Int J Educ Res 115:102058. https://doi.org/10.1016/j.ijer.2022.102058

Jiang H, Wang K, Wang X, Lei X, Huang Z (2021) Understanding a STEM teacher’s emotions and professional identities: A three-year longitudinal case study. Int J STEM Educ 8:51. https://doi.org/10.1186/s40594-021-00309-9

Kelley TR, Knowles JG, Holland JD, Han J (2020) Increasing high school teachers self-efficacy for integrated STEM instruction through a collaborative community of practice. Int J STEM Educ 7:14. https://doi.org/10.1186/s40594-020-00211-w

Klassen RM, Tze VMC, Betts SM, Gordon KA (2011) Teacher efficacy research 1998–2009: Signs of progress or unfulfilled promise? Educ Psychol Rev 23(1):21–43. https://doi.org/10.1007/s10648-010-9141-8

Land M (2020) The importance of integrating the arts into STEM curriculum. In: Stewart AJ, Mueller MP, Tippins DJ (eds), Converting STEM into STEAM programs, pp. 11–19. Springer. https://doi.org/10.1007/978-3-030-25101-7_2

Lee C, Bobko P (1994) Self-efficacy beliefs: Comparison of five measures. J Appl Psychol 79(3):364–369. https://doi.org/10.1037/0021-9010.79.3.364

Li W, Chiang FK (2019) Preservice teachers’ perceptions of STEAM education and attitudes toward STEAM disciplines and careers in China. In: Sengupta P, Shanahan MC, Kim B, (eds), Critical, transdisciplinary and embodied approaches in STEM education. Springer. https://doi.org/10.1007/978-3-030-29489-2_5

Liu M, Zwart R, Bronkhorst L, Wubbels T (2022) Chinese student teachers’ beliefs and the role of teaching experiences in the development of their beliefs. Teach Teach Educ 109:103525. https://doi.org/10.1016/j.tate.2021.103525

Liu S, Xu X, Stronge J (2018) The influences of teachers’ perceptions of using student achievement data in evaluation and their self-efficacy on job satisfaction: evidence from China. Asia Pac Educ Rev 19:493–509. https://doi.org/10.1007/s12564-018-9552-7

Long T, Zhao G, Yang X, Zhao R, Chen Q (2021) Bridging the belief-action gap in a teachers’ professional learning community on teaching of thinking. Professional Dev Educ 47(5):729–744. https://doi.org/10.1080/19415257.2019.1647872

Lu L, Wang F, Ma Y, Clarke A, Collins J (2019) Exploring Chinese teachers’ commitment to being a cooperating teacher in a university-government-school initiative for rural practicum placements. In: Liu WC, Goh CCM (eds), Teachers’ perceptions, experience and learning, pp. 33–54. Routledge, London

Lyu S, Niu S, Yuan J, Zhan Z (2024) Developing professional capital through technology-enabled university-school-enterprise collaboration: an innovative model for C-STEAM preservice teacher education in the Greater Bay area. Asia Pacific J Innov Entrepreneurship. https://doi.org/10.1108/APJIE-01-2024-0014

Marco N, Palatnik A (2024) From “learning to variate” to “variate for learning”: Teachers learning through collaborative, iterative context-based mathematical problem posing. J Math Behav 73:101119. https://doi.org/10.1016/j.jmathb.2023.101119

Merriam SB (1998) Qualitative research and case study applications in education. Jossey-Bass Publishers, Hoboken, New Jersey

Morris DB, Usher EL (2011) Developing teaching self-efficacy in research institutions: A study of award-winning professors. Contemp Educ Psychol 36(3):232–245. https://doi.org/10.1016/j.cedpsych.2010.10.005

Morris DB, Usher EL, Chen JA (2017) Reconceptualizing the sources of teaching self-efficacy: A critical review of emerging literature. Educ Psychol Rev 29(4):795–833. https://doi.org/10.1007/s10648-016-9378-y

Matusovich HM, Streveler RA, Miller RL (2010) Why do students choose engineering? A qualitative, longitudinal investigation of students’ motivational values. J Eng Educ 99(4):289–303. https://doi.org/10.1002/j.2168-9830.2010.tb01064.x

Näykki P, Kontturi H, Seppänen V, Impiö N, Järvelä S (2021) Teachers as learners–a qualitative exploration of pre-service and in-service teachers’ continuous learning community OpenDigi. J Educ Teach 47(4):495–512. https://doi.org/10.1080/02607476.2021.1904777

OECD (2018) Teaching and learning international survey (TALIS) 2018 conceptual framework. OECD, Paris

Örnek T, Soylu Y (2021) A model design to be used in teaching problem posing to develop problem-posing skills. Think Skills Creativity 41:100905. https://doi.org/10.1016/j.tsc.2021.100905

Ord K, Nuttall J (2016) Bodies of knowledge: The concept of embodiment as an alternative to theory/practice debates in the preparation of teachers. Teach Teach Educ 60:355–362. https://doi.org/10.1016/j.tate.2016.05.019

Ozkan G, Umdu Topsakal U (2021) Investigating the effectiveness of STEAM education on students’ conceptual understanding of force and energy topics. Res Sci Technol Educ 39(4):441–460. https://doi.org/10.1080/02635143.2020.1769586

Park W, Cho H (2022) The interaction of history and STEM learning goals in teacher-developed curriculum materials: opportunities and challenges for STEAM education. Asia Pacific Educ Rev. https://doi.org/10.1007/s12564-022-09741-0

Perignat E, Katz-Buonincontro J (2019) STEAM in practice and research: An integrative literature review. Think Skills Creativity 31:31–43. https://doi.org/10.1016/j.tsc.2018.10.002

Philipp RA (2007) Mathematics teachers’ beliefs and affect. In: Lester Jr FK, (ed), Second handbook of research on mathematics teaching and learning, pp. 257–315. Information Age, Charlotte, NC

Quigley CF, Herro D, Jamil FM (2017) Developing a conceptual model of STEAM teaching practices. Sch Sci Math 117(1–2):1–12. https://doi.org/10.1111/ssm.12201

Ro S, Xiao S, Zhou Z (2022) Starting up STEAM in China: A case study of technology entrepreneurship for STEAM education in China. In: Ray P, Shaw R (eds), Technology entrepreneurship and sustainable development, pp. 115–136. Springer. https://doi.org/10.1007/978-981-19-2053-0_6

Roth KJ, Bintz J, Wickler NI, Hvidsten C, Taylor J, Beardsley PM, Wilson CD (2017) Design principles for effective video-based professional development. Int J STEM Educ 4:31. https://doi.org/10.1186/s40594-017-0091-2

Ryu M, Mentzer N, Knobloch N (2019) Preservice teachers’ experiences of STEM integration: Challenges and implications for integrated STEM teacher preparation. Int J Technol Des Educ, 29:493–512. https://doi.org/10.1007/s10798-018-9440-9

Sancar R, Atal D, Deryakulu D (2021) A new framework for teachers’ professional development. Teach Teach Educ 101:103305. https://doi.org/10.1016/j.tate.2021.103305

Sanz-Camarero R, Ortiz-Revilla J, Greca IM (2023) The place of the arts within integrated education. Arts Educ Policy Rev. https://doi.org/10.1080/10632913.2023.2260917

Shirani F, Henwood K (2011) Continuity and change in a qualitative longitudinal study of fatherhood: relevance without responsibility. Int J Soc Res Methodol 14(1):17–29. https://doi.org/10.1080/13645571003690876

Smith CE, Paré JN (2016) Exploring Klein bottles through pottery: A STEAM investigation. Math Teach 110(3):208–214. https://doi.org/10.5951/mathteacher.110.3.0208

Soprano K, Yang L (2013) Inquiring into my science teaching through action research: A case study on one pre-service teacher’s inquiry-based science teaching and self-efficacy. Int J Sci Math Educ 11(6):1351–1368. https://doi.org/10.1007/s10763-012-9380-x

Stake RE (1995) The art of case study research. Sage Publication, Thousand Oaks, California

Stohlmann M, Moore T, Roehrig G (2012) Considerations for teaching integrated STEM education. J Pre Coll Eng Educ Res 2(1):28–34. https://doi.org/10.5703/1288284314653

Strauss AL, Corbin JM (1990) Basics of qualitative research. Sage Publications, Thousand Oaks, California

Taimalu M, Luik P (2019) The impact of beliefs and knowledge on the integration of technology among teacher educators: A path analysis. Teach Teach Educ 79:101–110. https://doi.org/10.1016/j.tate.2018.12.012

Takahashi A, McDougal T (2016) Collaborative lesson research: Maximizing the impact of lesson study. ZDM Math Educ 48:513–526. https://doi.org/10.1007/s11858-015-0752-x

Thompson AG (1992) Teachers’ beliefs and conceptions: A synthesis of the research. In: Grouws DA (ed), Handbook of research on mathematics teaching and learning, pp. 127–146. Macmillan, New York

Tschannen-Moran M, Woolfolk Hoy A, Hoy WK (1998) Teacher efficacy: Its meaning and measure. Rev Educ Res 68:202–248. https://doi.org/10.3102/00346543068002202

Unfried A, Rachmatullah A, Alexander A, Wiebe E (2022) An alternative to STEBI-A: validation of the T-STEM science scale. Int J STEM Educ 9:24. https://doi.org/10.1186/s40594-022-00339-x

Uştu H, Saito T, Mentiş Taş A (2021) Integration of art into STEM education at primary schools: an action research study with primary school teachers. Syst Pract Action Res. https://doi.org/10.1007/s11213-021-09570-z

Vaziri H, Bradburn NM (2021) Flourishing effects of integrating the arts and humanities in STEM Education: A review of past studies and an agenda for future research. In: Tay L, Pawelski JO (eds), The Oxford handbook of the positive humanities, pp. 71–82. Oxford University Press, Oxford

Wang K, Wang X, Li Y, Rugh MS (2018) A framework for integrating the history of mathematics into teaching in Shanghai. Educ Stud Math 98:135–155. https://doi.org/10.1007/s10649-018-9811-x

Wang Z, Jiang H, Jin W, Jiang J, Liu J, Guan J, Liu Y, Bin E (2024) Examining the antecedents of novice stem teachers’ job satisfaction: The roles of personality traits, perceived social support, and work engagement. Behav Sci 14(3):214. https://doi.org/10.3390/bs14030214

Wenger E, McDermott R, Snyder WM (2002) Cultivating communities of practice. Harvard Business School Press, Boston, MA

Wong JT, Bui NN, Fields DT, Hughes BS (2022) A learning experience design approach to online professional development for teaching science through the arts: Evaluation of teacher content knowledge, self-efficacy and STEAM perceptions. J Sci Teacher Educ. https://doi.org/10.1080/1046560X.2022.2112552

Wyatt M (2015) Using qualitative research methods to assess the degree of fit between teachers’ reported self-efficacy beliefs and their practical knowledge during teacher education. Aust J Teach Educ 40(1):117–145

Xenofontos C, Andrews P (2020) The discursive construction of mathematics teacher self-efficacy. Educ Stud Math 105(2):261–283. https://doi.org/10.1007/s10649-020-09990-z

Yin R (2003) Case study research: Design and methods. Sage Publications, Thousand Oaks, California

Zakariya YF (2020) Effects of school climate and teacher self-efficacy on job satisfaction of mostly STEM teachers: a structural multigroup invariance approach. Int J STEM Educ 7:10. https://doi.org/10.1186/s40594-020-00209-4

Zee M, de Jong PF, Koomen HM (2017) From externalizing student behavior to student-specific teacher self-efficacy: The role of teacher-perceived conflict and closeness in the student–teacher relationship. Contemp Educ Psychol 51:37–50. https://doi.org/10.1016/j.cedpsych.2017.06.009

Zee M, Koomen HM (2016) Teacher self-efficacy and its effects on classroom processes, student academic adjustment, and teacher well-being: A synthesis of 40 years of research. Rev Educ Res 86(4):981–1015. https://doi.org/10.3102/0034654315626801

Zhan Z, Li Y, Mei H, Lyu S (2023) Key competencies acquired from STEM education: gender-differentiated parental expectations. Humanit Soc Sci Commun 10:464. https://doi.org/10.1057/s41599-023-01946-x

Zhan Z, Niu S (2023) Subject integration and theme evolution of STEM education in K-12 and higher education research. Humanit Soc Sci Commun 10:781. https://doi.org/10.1057/s41599-023-02303-8

Zhong B, Liu X, Li X (2024) Effects of reverse engineering pedagogy on students’ learning performance in STEM education: The bridge-design project as an example. Heliyon 10(2):e24278. https://doi.org/10.1016/j.heliyon.2024.e24278

Zhong B, Liu X, Zhan Z, Ke Q, Wang F (2022) What should a Chinese top-level design in STEM Education look like? Humanit Soc Sci Commun 9:261. https://doi.org/10.1057/s41599-022-01279-1

Zhou B (2021) Cultivation of Ed. M. to bring up “famous subject teachers”: practical exploration and policy suggestions. Teach Educ Res 33(5):19–26. https://doi.org/10.13445/j.cnki.t.e.r.2021.05.003

Zhou X, Shu L, Xu Z, Padrón Y (2023) The effect of professional development on in-service STEM teachers’ self-efficacy: a meta-analysis of experimental studies. Int J STEM Educ 10:37. https://doi.org/10.1186/s40594-023-00422-x

Acknowledgements

This research is funded by the 2021 National Natural Science Foundation of China (Grant No. 62177042), the 2024 Zhejiang Provincial Natural Science Foundation of China (Grant No. Y24F020039), and the 2024 Zhejiang Educational Science Planning Project (Grant No. 2024SCG247).

Author information

Xuesong Zhai

Present address: School of Education, City University of Macau, Macau, China

Authors and Affiliations

College of Education, Zhejiang University, Hangzhou, China

Haozhe Jiang & Xuesong Zhai

School of Engineering and Technology, CML‑NET & CREATE Research Centres, Central Queensland University, North Rockhampton, QLD, Australia

Ritesh Chugh

Hangzhou International Urbanology Research Center & Zhejiang Urban Governance Studies Center, Hangzhou, China

Department of Teacher Education, Nicholls State University, Thibodaux, LA, USA

School of Mathematical Sciences, East China Normal University, Shanghai, China

Xiaoqin Wang

College of Teacher Education, Faculty of Education, East China Normal University, Shanghai, China

Contributions

Conceptualization - Haozhe Jiang; methodology - Haozhe Jiang; software - Xuesong Zhai; formal analysis - Haozhe Jiang & Ke Wang; investigation - Haozhe Jiang; resources - Haozhe Jiang, Xuesong Zhai & Xiaoqin Wang; data curation - Haozhe Jiang & Ke Wang; writing—original draft preparation - Haozhe Jiang & Ritesh Chugh; writing—review and editing - Ritesh Chugh & Ke Wang; visualization - Haozhe Jiang, Ke Wang & Xiaoqin Wang; supervision - Xuesong Zhai & Xiaoqin Wang; project administration - Xuesong Zhai & Xiaoqin Wang; and funding acquisition - Xuesong Zhai & Xiaoqin Wang. All authors have read and agreed to the published version of the manuscript.

Corresponding author

Correspondence to Xuesong Zhai .

Ethics declarations

Competing interests

The authors declare no competing interests.

Ethical approval

This research was approved by the Committee for Human Research of East China Normal University (Number: HR 347-2022). The procedures used in this study adhere to the tenets of the Declaration of Helsinki.

Informed consent

Written informed consent was obtained from all participants in this study before data collection.

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/ .

About this article

Cite this article

Jiang, H., Chugh, R., Zhai, X. et al. Longitudinal analysis of teacher self-efficacy evolution during a STEAM professional development program: a qualitative case study. Humanit Soc Sci Commun 11, 1162 (2024). https://doi.org/10.1057/s41599-024-03655-5

Received : 27 April 2024

Accepted : 12 August 2024

Published : 08 September 2024

DOI : https://doi.org/10.1057/s41599-024-03655-5

Promoting higher education students’ self-regulated learning through learning analytics: A qualitative study

  • Open access
  • Published: 07 September 2024

  • Riina Kleimola   ORCID: orcid.org/0000-0003-2091-2798 1 ,
  • Laura Hirsto   ORCID: orcid.org/0000-0002-8963-3036 2 &
  • Heli Ruokamo   ORCID: orcid.org/0000-0002-8679-781X 1  

Learning analytics provides a novel means to support the development and growth of students into self-regulated learners, but little is known about student perspectives on its utilization. To address this gap, the present study proposed the following research question: what are the perceptions of higher education students on the utilization of a learning analytics dashboard to promote self-regulated learning? More specifically, this can be expressed via the following threefold sub-question: how do higher education students perceive the use of a learning analytics dashboard and its development as promoting the (1) forethought, (2) performance, and (3) reflection phase processes of self-regulated learning? Data for the study were collected from students ( N  = 16) through semi-structured interviews and analyzed using a qualitative content analysis. Results indicated that the students perceived the use of the learning analytics dashboard as an opportunity for versatile learning support, providing them with a means to control and observe their studies and learning, while facilitating various performance phase processes. Insights from the analytics data could also be used in targeting the students’ development areas as well as in reflecting on their studies and learning, both individually and jointly with their educators, thus contributing to the activities of forethought and reflection phases. However, in order for the learning analytics dashboard to serve students more profoundly across myriad studies, its further development was deemed necessary. The findings of this investigation emphasize the need to integrate the use and development of learning analytics into versatile learning processes and mechanisms of comprehensive support and guidance.

1 Introduction

Promoting students to become autonomous, self-regulated learners is a fundamental goal of education (Lodge et al., 2019; Puustinen & Pulkkinen, 2001). Its importance is particularly highlighted in higher education (HE) contexts, which strive to prepare their students for highly demanding and autonomous expert tasks (Virtanen, 2019). To perform successfully in diverse educational and professional settings, students need to take an active, self-initiated role in managing their learning processes, thereby assuming primary responsibility for their educational pursuits. Self-regulated learning (SRL) invites students to actively monitor, control, and regulate their cognition, motivation, and behavior in relation to their learning goals and contextual conditions (Pintrich, 2000). In an effort to create a favorable foundation for the development of SRL, many HE institutions have begun to explore and exploit the potential of emerging educational technologies, such as learning analytics (LA).

Despite the growing interest in adopting LA for educational purposes (Van Leeuwen et al., 2022), little is known about students’ perspectives on its utilization (Jivet et al., 2020; Wise et al., 2016). Additionally, there is only limited evidence on using LA to support SRL (Heikkinen et al., 2022; Jivet et al., 2018; Matcha et al., 2020; Viberg et al., 2020). Thus, more research is clearly needed to better understand how students themselves view the potential of analytics applications from the perspective of SRL. Involving students in the development of LA is particularly important, as they represent the primary stakeholders targeted to benefit from its utilization (Dollinger & Lodge, 2018; West et al., 2020). LA should not only be developed for users but also with them in order to adapt its potential to their needs and expectations (Dollinger & Lodge, 2018; Klein et al., 2019).

LA is thought to provide a promising means to enhance student SRL by harnessing the massive amount of data stored in educational systems and facilitating appropriate means of support (Lodge et al., 2019 ). It is generally defined as “the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs” (Conole et al., 2011 , para. 4). The reporting of such data is typically conducted through learning analytics dashboards (LADs) that aggregate diverse types of indicators about learners and learning processes in a visualized form (Corrin & De Barba, 2014 ; Park & Jo, 2015 ; Schwendimann et al., 2017 ). Recently, there has been a rapid movement into LADs that present analytics data directly to students themselves (Schwendimann et al., 2017 ; Teasley, 2017 ; Van Leeuwen et al., 2022 ). Such analytics applications generally aim to provide students with insights into their study progress as well as support for optimizing learning outcomes (Molenaar et al., 2019 ; Sclater et al., 2016 ; Susnjak et al., 2022 ).

The purpose of this qualitative study is to examine how HE students perceive the use and development of an LAD to promote the different phases and processes of SRL. Instead of taking a course-level approach, this study addresses a less-examined study path perspective that covers the entirety of studies, from the start of an HE degree to its completion. A specific emphasis is placed on such an LAD that students could use both independently across studies and together with their tutor teachers as a component of educational guidance. As analytics applications are largely still under development (Sclater et al., 2016 ), and mainly in the exploratory phase (Schwendimann et al., 2017 ; Susnjak et al., 2022 ), it is essential to gain an understanding of how students perceive the use of these applications as a form of learning support. Preparing students to become efficient self-regulated learners is increasingly—and simultaneously—a matter of helping them develop into efficient users of analytics data.

2 Theoretical framework

2.1 Enhancing SRL in HE

SRL, which has been the subject of wide research interest over the last two decades (Panadero, 2017 ), is referred to as “self-generated thoughts, feelings, and actions that are planned and cyclically adapted to the attainment of personal goals” (Zimmerman, 1999 , p. 14). Self-regulated students are proactive in their endeavors to learn, and they engage in diverse, personally initiated metacognitive, motivational, and behavioral processes to achieve their goals (Zimmerman, 1999 ). They master their learning through covert, cognitive means but also through behavioral, social, and environmental approaches that are reciprocally interdependent and interrelated (Zimmerman, 1999 , 2015 ), thus emphasizing the sociocognitive views on SRL (Bandura, 1986 ).

When describing and modelling SRL, researchers have widely agreed on its cyclical nature and its organization into several distinct phases and processes (Panadero, 2017 ; Puustinen & Pulkkinen, 2001 ). In the well-established model by Zimmerman and Moylan ( 2009 ), SRL occurs in the cyclic phases of forethought, performance, and self-reflection that take place before, during, and after students’ efforts to learn. In the forethought phase, students prepare themselves for learning and approach the learning tasks through the processes of planning and goal setting, and the activation of self-motivation beliefs, such as self-efficacy perceptions, outcome expectations, and personal interests. Next, in the performance phase, they carry out the actual learning tasks and make use of self-control processes and strategies, such as self-instruction, time management, help-seeking, and interest enhancement. Moreover, they keep records of their performance and monitor their learning, while promoting the achievement of desired goals. In the final self-reflection phase, students participate in the processes of evaluating their learning and reflecting on the perceived causes of their successes and failures, which typically results in different types of cognitive and affective self-reactions as responses to such activity. This phase also forms the basis for the approaches to be adjusted for and applied in the subsequent forethought phase, thereby completing the SRL cycle. The model suggests that the processes in each phase influence the following ones in a cyclical and interactive manner and provide feedback for subsequent learning efforts (Zimmerman & Moylan, 2009 ; Zimmerman, 2011 ). Participation in these processes allows students to become self-aware, competent, and decisive in their learning approaches (Kramarski & Michalsky, 2009 ).

Although several other prevalent SRL models with specific emphases also exist (e.g., Pintrich, 2000 ; Winne & Hadwin, 1998 ; for a review, see Panadero, 2017 ), the one presented above provides a comprehensive yet straightforward framework for identifying and examining the key phases and processes related to SRL (Panadero & Alonso-Tapia, 2014 ). Developing thorough insights into the student SRL is especially needed in an HE context, where the increase in digitized educational settings and tools requires students to manage their learning in a way that is autonomous and self-initiated. When pursuing an HE degree, students are expected to engage in the cyclical phases and processes of SRL as a continuous effort throughout their studies. Involvement in SRL is needed not only to successfully perform a single study module, course, or task but also to actively promote the entirety of studies throughout semesters and academic years. It therefore plays a central role in the successful completion of HE studies.

From the study path perspective, the forethought phase requires HE students to be active in the directing and planning of their studies and learning—that is, setting achievable goals, making detailed plans, finding personal interests, and trusting in their abilities to complete the degree. The performance phase, in turn, invites students to participate in the control and observation of their studies and learning. While completing their studies, they must regularly track study performance, visualize relevant study information, create functional study environments, maintain motivation and interest, and seek and receive productive guidance. The reflection phase, on the other hand, involves students in evaluating and reflecting on their studies and learning—that is, analyzing their learning achievements and processing resulting responses. These activities typically occur as overlapping, cyclic, and connected processes and as a continuum across studies. Additionally, the phases may appear simultaneously, as students strive to learn and receive feedback from different processes (Pintrich, 2000 ). The processes may also emerge in more than one phase (Panadero & Alonso-Tapia, 2014 ), and boundaries between the phases are not always that precise.

SRL is shown to benefit HE students in various ways. Research has evidenced, for instance, that online students who use their time efficiently, are aware of their learning behavior, think critically, and show efforts to learn despite challenges are likely to achieve academic success when studying in online settings (Broadbent & Poon, 2015 ). SRL is also shown to contribute to many non-academic outcomes in HE blended environments (for a review, see Anthonysamy et al., 2020 ). Despite this importance, research (e.g., Azevedo et al., 2004 ; Barnard-Brak et al., 2010 ) has indicated that students differ in their ways to self-regulate, and not all are competent self-regulated learners by default. As such, many students would require and benefit from support to develop their SRL (Moos, 2018 ; Wong et al., 2019 ).

Supporting student SRL is generally considered the responsibility of the teaching staff (Callan et al., 2022; Kramarski, 2018). It can also be a specific task given to tutor teachers assigned to each student or to a group of students for particular academic years. Sometimes referred to as advisors, they are often teachers of study programs who aim to help students in decision-making, study planning, and career reflection (De Laet et al., 2020), while offering them guidance and support for the better management of learning. In recent years, efforts have also been made to promote student SRL with educational technologies such as LA (e.g., Marzouk et al., 2016; Wise et al., 2016). LA is used to deliver insights for students themselves to better self-regulate their learning (e.g., Jivet et al., 2021; Molenaar et al., 2019), and also to facilitate the interaction between students and guidance personnel (e.g., Charleer et al., 2018). It is generally thought to promote the development of future competences needed by students in education and working life (Kleimola & Leppisaari, 2022), and to offer novel insights into their motivational drivers (Kleimola et al., 2023).

2.2 LA as a potential tool to promote SRL

Much of the recent development in the field of LA has focused on the design and implementation of LADs. In general, their purpose is to support sensemaking and encourage students and teachers to make informed decisions about learning and teaching processes (Jivet et al., 2020 ; Verbert et al., 2020 ). Schwendimann and colleagues ( 2017 ) refer to an LAD as a “display that aggregates different indicators about learner(s), learning process(es) and/or learning context(s) into one or multiple visualizations” (p. 37). Such indicators may provide information, for instance, about student actions and use of learning contents on a learning platform, or the results of one’s learning performance, such as grades (Schwendimann et al., 2017 ). Data can also be extracted from educational institutions’ student information systems to provide students with snapshots of their study progress and access to learning support (Elouazizi, 2014 ). While visualizations enable intuitive and quick interpretations of educational data (Papamitsiou & Economides, 2015 ), they additionally require careful preparation, as not all users may necessarily interpret them uniformly (Aguilar, 2018 ).

LADs can target various stakeholders, and recently there has been a growing interest in their development for students’ personal use (Van Leeuwen et al., 2022 ). Such displays, also known as student-facing dashboards, are thought to increase students’ knowledge of themselves and to assist them in achieving educational goals (Eickholt et al., 2022 ). They are also believed to promote student autonomy by encouraging students to take control of their learning and by supporting their intrinsic motivation to succeed (Bodily & Verbert, 2017 ). However, simply making analytics applications available to students does not guarantee that they will be used productively in terms of learning (Wise, 2014 ; Wise et al., 2016 ). Moreover, they may not necessarily cover or address the relevant aspects of learning (Clow, 2013 ). Thus, to promote the widespread acceptance and adoption of LADs, it is crucial to consider students’ perspectives on their use as a means of learning support (Divjak et al., 2023 ; Schumacher & Ifenthaler, 2018 ). If students’ needs are not adequately examined and met, such analytics applications may fail to encourage or even hinder the process of SRL (Schumacher & Ifenthaler, 2018 ).

Although previous research on students’ perceptions of LA to enhance their SRL appears to be limited, some studies have addressed such perspectives. Schumacher and Ifenthaler ( 2018 ) found that HE students appreciated LADs that help them plan and initiate their learning activities with supporting elements such as reminders, to-do lists, motivational prompts, learning objectives, and adaptive recommendations, thus promoting the forethought phase of SRL. The students in their study also expected such analytics applications to support the performance phase by providing analyses of their current situation and progress towards goals, materials to meet their individual learning needs, and opportunities for learning exploration and social interaction. To promote the self-reflection phase, the students anticipated LADs to allow for self-assessment, real-time feedback, and future recommendations but were divided as to whether they should receive comparative information about their own or their peers’ performance (Schumacher & Ifenthaler, 2018 ). Additionally, the students desired analytics applications to be holistic and advanced, as well as adaptable to individual needs (Schumacher & Ifenthaler, 2018 ).

Somewhat similar notions were made by Divjak and colleagues ( 2023 ), who discovered that students welcomed LADs that promote short-term planning and organization of learning but were wary of making comparisons or competing with peers, as they might demotivate learners. Correspondingly, De Barba et al. ( 2022 ) noted that students perceived goal setting and monitoring of progress from a multiple-goals approach as key features in LADs, but they were hesitant to view peer comparisons, as they could promote unproductive competition between students and challenge data privacy. In a similar vein, Rets et al. ( 2021 ) reported that students favored LADs that provide them with study recommendations but did not favor peer comparison unless additional information was included. Roberts et al. ( 2017 ), in turn, stressed that LADs should be customizable by students and offer them some level of control to support their SRL. Silvola et al. ( 2023 ) found that students perceived LADs as supportive for their study planning and monitoring at a study path level but also associated some challenges with them in terms of SRL. Further, Bennett ( 2018 ) found that students’ responses to receiving analytics data varied and were highly individual. There were different views, for instance, on the potential of analytics to motivate students: although it seemed to inspire most students, not all students felt the same way (Bennett, 2018 ; see also Corrin & De Barba, 2014 ; Schumacher & Ifenthaler, 2018 ). Moreover, LADs were reported to evoke varying affective responses in students (Bennett, 2018 ; Lim et al., 2021 ).

To promote student SRL, it is imperative that LADs comprehensively address and support all phases of SRL (Schumacher & Ifenthaler, 2018 ). However, a systematic literature review conducted by Jivet et al. ( 2017 ) indicated that students were often offered only limited support for goal setting and planning, and comprehensive self-monitoring, as very few of the LADs included in their study enabled the management of self-set learning goals or the tracking of study progress over time. According to Jivet et al. ( 2017 ), this might indicate that most LADs were mainly harnessed to support the reflection and self-evaluation phase of SRL, as the other phases were mostly ignored. Somewhat contradictory results were obtained by Viberg et al. ( 2020 ), whose literature review revealed that most studies aiming to measure or support SRL with LA were primarily focused on the forethought and performance phases and less on the reflection phase. Heikkinen et al. ( 2022 ) discovered that not many of the studies combining analytics-based interventions and SRL processes covered all phases of SRL.

It appears that further development is still required for LADs to better promote student SRL as a whole. Similarly, there is a demand for their tight integration into pedagogical practices and learning processes to encourage their productive use (Wise, 2014; Wise et al., 2016). One such strategy is to use these analytics applications as a part of guidance activity and as a joint tool for both students and guidance personnel. In the study by Charleer et al. (2018), the LAD was shown to trigger conversations and to facilitate dialogue between students and study advisors, improve the personalization of guidance, and provide insights into factual data for further interpretation and reflection. However, offering students access to an LAD only during the guidance meeting may not be sufficient to meet their requirements for the entire duration of their studies. For instance, Charleer and colleagues (2018) found that the students were also interested in using the LAD independently, outside of the guidance context. Also, it seems that encouraging students to actively advance their studies with such analytics applications necessitates a student-centered approach and holistic development through research. According to Rets et al. (2021), there is a particular call for qualitative insights, as many previous LAD studies that included students have primarily used quantitative approaches (e.g., Beheshitha et al., 2016; Divjak et al., 2023; Kim et al., 2016).

2.3 Research questions

The purpose of this qualitative study is to examine how HE students perceive the utilization of an LAD in SRL. A specific emphasis was placed on its utilization as part of the forethought, performance, and reflection phase processes, considered central to student SRL. The main research question (RQ) and the threefold sub-question are as follows:

RQ: What are the perceptions of HE students on the utilization of an LAD to promote SRL?

How do HE students perceive the use of an LAD and its development as promoting the (1) forethought, (2) performance, and (3) reflection phase processes of SRL?

3 Method

3.1 Context

The study was conducted in a university of applied sciences (UAS) in Finland that had launched an initial version of an LAD to be piloted together with its students and tutor teachers as a part of the guidance process. The LAD was descriptive in nature and consisted of commonly available analytics data and simple analytics indicators showing an individual student’s study progress and success in a study path. As is typical for descriptive analytics, it offered insights to better understand the past and present (Costas-Jauregui et al., 2021) while informing future action (Van Leeuwen et al., 2022). The data were extracted from the UAS’ student information system and presented using Microsoft Power BI tools. No predictive or comparative information was included. The main display of the LAD consisted of three data visualizations and an information bar (see Fig. 1, a–d), all presented originally in Finnish. Each visualization could also be expanded into a single display for more accurate viewing.

Fig. 1 An example of the main display of the piloted LAD with data visualizations (a–c) and an information bar (d)

First, the LAD included a data visualization that illustrated a student’s study progress and success per semester using a line chart (Fig.  1 , a). It displayed the scales for total number of credit points (left) and grade point averages (right) for courses completed on a semester timeline. Data points on the chart displayed an individual student’s study performance with respect to these indicators in each semester and were connected to each other with a line. Pointing to one of these data points also opened a data box that indicated the student name and information about courses (course name, scope, grade, assessment date) from which the credit points and grade point averages were obtained.

Second, the LAD contained another type of line chart that indicated a student’s individual study progress over time in more detail (Fig. 1, b). The chart displayed a timeline with three-month periods and illustrated a scale for the accumulated credit points. Data points on the chart indicated the accumulated number of credit points obtained from the courses and appeared in blue if the student had passed the course(s) and in red if the student had failed the course(s) at that time. As with the line chart above it, the data points in this chart also provided more detailed information about the courses behind the credit points and were connected with a line.

Third, the LAD offered information related to a student’s study success through a radar chart (Fig. 1, c). The chart represented the courses taken by the student and displayed a scale for the grades received from them. The lowest grade was placed in the center of the chart and the highest one on its outer circle. The grades in between were scaled on the chart accordingly, and the courses performed with a similar grade were displayed close to each other. Data points on the chart represented the grades obtained from numerically evaluated courses and were connected with a line. Each data point also had a data box with the course name and the grade obtained.

Fourth, the LAD included an information bar (Fig.  1 , d) that displayed the student number and the student name (removed from the figure), the total number of accumulated credit points, the grade point average for passed courses, and the amount of credit points obtained from practical training.
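The information-bar indicators described above are simple aggregates over a student’s course records. As a rough, hypothetical sketch (this is not the UAS implementation; the record fields and the credit-weighted grade average are assumptions), they might be computed as follows:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Course:
    name: str
    credits: float                 # ECTS credit points (course scope)
    grade: Optional[int]           # numeric grade; None for pass/fail courses
    passed: bool
    practical_training: bool = False

def dashboard_indicators(courses: list[Course]) -> dict:
    """Aggregate the summary figures shown in the LAD's information bar."""
    passed = [c for c in courses if c.passed]
    graded = [c for c in passed if c.grade is not None]
    total_credits = sum(c.credits for c in passed)
    # Assumption: the grade point average is weighted by course scope (credits)
    gpa = (sum(c.grade * c.credits for c in graded)
           / sum(c.credits for c in graded)) if graded else None
    training = sum(c.credits for c in passed if c.practical_training)
    return {"total_credits": total_credits,
            "grade_point_average": gpa,
            "practical_training_credits": training}
```

The per-semester and per-three-month charts in Fig. 1 (a–b) would apply the same aggregation per time bucket rather than over the whole record.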

The LAD was piloted in authentic guidance meetings in which a tutor teacher and a student discussed topical issues related to the completion of studies. Such meetings were a part of the UAS’ standard guidance discussions, typically held 1–2 times during the academic year, or more often if needed. In the studied meetings, the students and tutor teachers collectively reviewed the LAD to support the discussion. Only the tutor teachers were able to access the LAD, as it was still under development and in the pilot phase. However, the students could examine its use as presented by the tutor teacher. In addition to the LAD, the meeting focused on reviewing the student’s personal study plan, which contained information about their studies to be completed and could be viewed through the student information system. Most of the meetings were organized online, and their duration varied according to an individual student’s needs. A researcher (first author) attended the meetings as an observer.

3.2 Participants and procedures

Participants were HE students (N = 16) pursuing a bachelor’s degree at the Finnish UAS, ranging from 21 to 49 years of age (mean = 30.38, median = 29.5); 11 (68.75%) were female, and 5 (31.25%) were male. Their HE studies had commenced between 2016 and 2020 and comprised different academic fields, including business administration, culture, engineering, humanities and education, and social services and health care. Depending on the degree, the study scope ranged from 210 to 240 ECTS credit points, which takes approximately three and a half to four years to complete. However, students could also proceed at a faster or slower pace under certain conditions. The students were selected to represent different study fields and study stages, and to have studied for more than one academic year. Informed consent to participate in the study was obtained from all students, and their participation was voluntary. The research design was approved by the respective UAS.

Data for this qualitative study were collected through semi-structured, individual student interviews conducted in April–September 2022. To address certain topics in each interview, an interview guide was used. The interview questions incorporated into the guide were tested in two student test interviews to simulate a real interview situation and to ensure intelligibility, as also suggested by Chenail (2011). Findings indicated that the questions were largely usable, functional, and understandable, but some had to be slightly refined to ensure their conciseness and to improve the clarity and familiarity of expressions vis-à-vis the target group. Also, the order of questions was partly reshaped to support the flow of discussion.

In the interviews, the students were asked to provide information about their demographic and educational backgrounds as well as their overall opinions of educational practices and the use of LA. In particular, they were invited to share their views on the use of the piloted LAD and its development as promoting different phases and processes of SRL. Students’ perceptions were generally based on the assumption that they could use the LAD both independently during their studies and collectively with their tutor teachers as a component of the guidance process.

Interviews were conducted immediately or shortly after the guidance meeting. Interview duration ranged from 42 to 70 min. The graphical presentation of the LAD was shown to the students to provide stimuli and evoke discussion, as suggested by Kwasnicka et al. (2015). The interviews were conducted by the same researcher (first author) who observed the guidance meetings. They were primarily held online, and only one was organized face-to-face. All interviews were video recorded for subsequent analysis.

3.3 Data analysis

Interview recordings were transcribed verbatim, accumulating a total of 187 pages of textual material for analysis (Times New Roman, 12-point font, line spacing 1). A qualitative content analysis method was used to analyze the data (see Mayring, 2000; Schreier, 2014) to enhance an in-depth understanding of the research phenomenon and to inform practical actions (Krippendorff, 2019). The data were also approached both deductively and inductively (see Elo & Kyngäs, 2008; Merriam & Tisdell, 2016), and the analysis was supported using the ATLAS.ti program.

Analysis began with a thorough familiarization with the data in order to develop a general understanding of the students’ perspectives. First, the data were deductively coded using Zimmerman and Moylan’s ( 2009 ) SRL model as a theoretical guide for analysis and as applied to the study path perspective. All relevant units of analysis—such as paragraphs, sentences, or phrases that addressed the use of the LAD or its development in relation to the processes of SRL presented in the model—were initially identified from the data, and then sorted into meaningful units with specific codes. The focus was placed on instances in the data that were applicable and similar to the processes represented in the model, but the analysis was not limited to those that fully corresponded to them. The preliminary analysis involved several rounds of coding that ultimately led to the formation of main categories, grouped into the phases of SRL. The forethought phase consisted of processes that emphasized the planning and directing of studies and learning with the LAD. The performance phase, in turn, involved processes that addressed the control and observation of studies and learning through the LAD. Finally, the reflection phase included processes that focused on evaluating and reflecting on studies and learning with the LAD.

Second, the data were approached inductively by examining the use of the LAD and its development as distinct aspects within each phase and process of SRL (i.e., the main categories). The aim was, on the one hand, to identify how the use of the LAD was considered to serve the students in the phases and processes of SRL in its current form, and on the other hand, how it should be improved to better support them. The analysis not only focused on the characteristics of the LAD but also on the practices that surrounded its use and development. The units of analysis were first condensed from the data and then organized into subcategories for similar units. As suggested by Schreier ( 2014 ), the process was continued until a saturation point was reached—that is, no additional categories could be found. As a result, subcategories for all of the main categories were identified.

Following Schreier’s ( 2014 ) recommendation, the categories were named and described with specific data examples. Additionally, some guidelines were added to highlight differences between categories and to avoid overlap. Using parts of this categorization framework as a coding scheme, a portion of the data (120 text segments) was independently coded into the main categories by the first and second authors. The results were then compared, and all disagreements were resolved through negotiation until a shared consensus was reached. After minor changes were made to the coding scheme, the first author recoded all data. The number of students who had provided responses to each subcategory was counted and added to provide details on the study. For study integrity, the results are supported by data examples with the students’ aliases and the study fields they represented. The quotations were translated from Finnish to English.
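The double-coding step described above (120 text segments coded independently into the main categories by the first and second authors) lends itself to a simple agreement check. The authors report resolving disagreements through negotiation rather than computing a coefficient, so the following percent-agreement and Cohen’s kappa functions are only an illustrative sketch of how such a comparison could be quantified:

```python
from collections import Counter

def percent_agreement(codes_a, codes_b):
    """Share of segments assigned the same main category by both coders."""
    assert len(codes_a) == len(codes_b)
    matches = sum(a == b for a, b in zip(codes_a, codes_b))
    return matches / len(codes_a)

def cohens_kappa(codes_a, codes_b):
    """Chance-corrected inter-coder agreement (Cohen's kappa)."""
    n = len(codes_a)
    p_observed = percent_agreement(codes_a, codes_b)
    counts_a, counts_b = Counter(codes_a), Counter(codes_b)
    # Expected agreement if both coders assigned categories independently
    p_expected = sum(counts_a[k] * counts_b[k] for k in counts_a) / (n * n)
    return (p_observed - p_expected) / (1 - p_expected)
```

For example, two coders agreeing on 3 of 4 segments yield a percent agreement of 0.75, while kappa discounts the agreement expected by chance from the category distributions.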

The results are reported by first answering the threefold sub-question, that is, how do HE students perceive the use of an LAD and its development as promoting the (1) forethought, (2) performance, and (3) reflection phase processes of SRL. The subsequent results are then summarized to address the main RQ, that is, what are the main findings on HE students’ perceptions on the utilization of an LAD to promote SRL.

4 Results

4.1 LAD as a part of the forethought phase processes

The students perceived the use of the LAD and its development as related to the forethought phase processes of SRL through the categorization presented in Table  1 below.

Regarding the process of goal setting, almost all students (n = 15) emphasized that the use of the LAD promoted the targeting of goal-oriented study completion and competence development. Analytics indicators—such as grades, grade point averages, and accumulated credit points—adequately informed the students of areas they should aim for, further improve, or put more effort into. Only one student (n = 1) considered the analytics data too general for establishing goals. However, some students (n = 7) specifically mentioned their desire to set and enter individual goals in the LAD. The students were considered to have individual intentions, which should also be made visible in the LAD:

For example, someone might complete [their studies] in four years, someone might do [them] even faster, so maybe in a way that there is the possibility…to set…that, well, I want or my goal is to graduate in this time, and then it would kind of show in it. (Sophia, Humanities and Education student)

Moreover, some students (n = 6) wanted to obtain information on the degree program’s overall target times, study requirements, or pace recommendations through the LAD.

In relation to the process of study planning, the use of the LAD provided many students (n = 8) with grounds to plan and structure the promotion and completion of their studies, such as which courses and types of studies to choose, and what kind of study pace and schedule to follow. However, an even greater number of students (n = 12) hoped that the LAD could provide them with more sophisticated tools for planning. For instance, it could inform them about studies to be completed, analyze their study performance in detail, or make predictions for the future. Moreover, it should offer them opportunities to choose courses, make enrollments, set schedules, get reminders, and take notes. One example of such an advanced analytics application was described as follows: ‟It would be a bit like a conversational tool with the student as well, that you would first put…your studies in the program, so it would [then] remind you regularly that hey, do this” (James, Humanities and Education student).

When discussing the use of the LAD, most students ( n  = 12) emphasized the critical role of personal interests and preferences , which was found to not only guide studying and learning in general but to also drive and shape the utilization of the LAD. According to the students, using such an analytics application could particularly benefit those students who, for instance, prefer monitoring of study performance, perceive information in a visualized form, are interested in analytics or themselves, or find it relevant for their studies. Prior familiarization was also considered useful: ‟Of course, there are those who use this kind of thing more and those who use this kind of thing in daily life, so they could especially benefit from this, probably more than I do” (Olivia, Social Services and Health Care student). Even though the LAD was considered to offer pertinent insights for many types of learners, it might not be suitable for all. For instance, it could be challenging for some students to comprehend analytics data or to make effective use of them in their studies. In the development of the LAD, such personal aspects should be noted. Some students ( n  = 7) believed the LAD might better adapt to students’ individual needs if it allowed them to customize its features and displays or to use it voluntarily based on their personal interests and needs.

When describing the use of LAD, half of the students ( n  = 8) discussed its connections with self-efficacy . Making use of analytics data appeared to strengthen the students’ beliefs in their abilities to study and learn in a targeted manner, even if their own feelings suggested otherwise. As one of the students stated:

It’s nice to see that progress, that it has happened although it feels that it hasn’t. So, you can probably set goals based on [an idea] that you’re likely to progress, you could set [them] that you could graduate sometime. (Emma, Engineering student)

On the other hand, the use of the LAD also seemed to require students to have sufficient self-efficacy. It was perceived as vital especially when the analytics data showed unfavorable study performance, such as failed or incomplete courses, or gaps in the study performance with respect to peers. One student ( n  = 1) suggested that the LAD could include praise as evidence of and support for appropriate study performance. Such incentives may help improve the students’ self-confidence as learners. Apart from this, however, the students had no other recommendations for developing the use of the LAD to support self-efficacy.

4.2 LAD as a part of the performance phase processes

The students discussed the use of the LAD and its development in relation to the performance phase processes of SRL according to the categories described in Table  2 below.

The students ( n  = 16) widely agreed that using the LAD benefited them in the process of metacognitive monitoring. By indicating the progress and success of study performance, the LAD was thought to be well suited for observing the course of studies and the development of competences. Moreover, it helped the students to gain awareness of their individual strengths and weaknesses, as well as successes and failures, in a study path. Tracking individual study performance was also found to contribute to purposeful study completion, as the following data example demonstrates:

It’s important especially when there is a target time to graduate, so of course you must follow and stay on track in many ways as there are many such pitfalls to easily fall into, [and] as I’ve fallen quite many times, it’s good [to monitor]. (Sarah, Culture student)

Additionally, the insights gained from monitoring could be used in future job searches to provide information about acquired competences to potential employers. The successful promotion of studies was generally perceived to require regular monitoring by both students and their educators. However, one of the students considered it a particular responsibility of the students themselves, as the studies were completed at an HE level and were thus voluntary for them. To provide more in-depth insights, many students ( n  = 12) recommended the incorporation of a course-level monitoring opportunity in the LAD. More detailed information was needed, for instance, about course descriptions, assignments completed, and grades received. The rest of the students ( n  = 4), however, wanted to keep the course-level monitoring within the learning management system. One of them stated that it could also be a place through which the students could use the LAD. Some students ( n  = 6) emphasized the need to reconsider current assessment practices to enable better tracking of study performance. Specifically, assessments could be made in greater detail and grades given immediately after course completion. The variation in scales and time points of assessments between the courses and degree programs posed potential challenges for monitoring, thus prompting the need to unify educational practices at the organizational level.
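Indicators such as the grade point averages mentioned earlier are simple aggregations over course-level records. As a minimal, purely illustrative sketch (the piloted LAD's actual data model is not described in this study, so the record format here is an assumption):

```python
# Minimal sketch of a credit-weighted grade point average, one of the
# analytics indicators the students monitored. The (grade, credits)
# record format is an assumption, not the LAD's actual data model.

def weighted_gpa(courses: list[tuple[float, float]]) -> float:
    """courses: (grade, credits) pairs for completed courses."""
    total_credits = sum(credits for _, credits in courses)
    if total_credits == 0:
        return 0.0
    return sum(grade * credits for grade, credits in courses) / total_credits

# Three completed courses on a 1-5 grading scale
print(weighted_gpa([(4.0, 5), (3.0, 5), (5.0, 10)]))  # 4.25
```

The students' point about unifying assessment practices is visible even in this toy example: a weighted average is only meaningful if all courses use a comparable grading scale, which the interviews suggest was not always the case across degree programs.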

As an activity closely related to metacognitive monitoring, the process of imaging and visualizing was emphasized by the students as helping them to advance in their educational pursuits. Most students ( n  = 15) mentioned that using the LAD allowed them to easily image their study path and clarify their study situation. As one of them stated, ‟This is quite clear, this like, that you can see the overall situation with a quick glance” (Anna, Business Administration student). The visualizations were perceived as informative, tangible, and understandable. However, by attracting such focused attention, they were also thought to carry the risk that students might neglect other relevant aspects of studying and learning. Although the visualizations were generally considered clear, some students ( n  = 11) noted that they could be further improved to better organize the analytics data. For instance, the students suggested the attractive use of colors and the categorization of different types of courses. Visual symbols, in turn, may be particularly effective in course-level data. Technical aspects should also be carefully considered to avoid misleading visualizations.

Regarding the process of environmental structuring , the LAD appeared to be a welcome addition to the study toolkit and overall study environment. A few students ( n  = 4) considered it appropriate to utilize the LAD as a separate PowerBI application alongside other (Microsoft O365) study tools, but they also felt that it could be utilized through other systems if necessary. However, one student ( n  = 1) raised the need for overall system integrations, and some students ( n  = 8) expressed a specific wish to use the LAD as an integrated part of the student information system, which was thought to improve its accessibility. A few students ( n  = 6) also wanted to receive some additional analytics data related to the information stored in such a system. For instance, the students could be informed about their study progress or offered feedback on their overall performance in relation to the personal study plan. Other students ( n  = 10), in turn, did not see a need for this or did not mention it. It was generally emphasized that the LAD should remain sufficiently clear and simple, as too much information can make its use ineffective:

I think there is just enough information in this. Of course, if you would want to add something small, you could, but I don’t know how much, because I feel that when there is too much information, so it’s a bit like you can’t get as much out of it as you could get. (Olivia, Social Services and Health Care student)

Moreover, the analytics data must be kept private and protected. The students generally desired personal access to the LAD; if given such an opportunity, almost all ( n  = 15) believed they would utilize it in the future, and only one ( n  = 1) was unsure about this prospect. The analytics data were believed to be of particular use when studies were actively promoted. Hence they should be made available to the students from the start of their studies.

Regarding the process of interest and motivation enhancement , all students ( n  = 16) mentioned that using the LAD stimulated their interest or enhanced their motivation, although to varying degrees. For some students, a general tracking of studies was enough to encourage them to continue their pursuits, while others were particularly inspired by seeing either high or low study performance. The development of motivation and interest was generally thought to be hindered if the students perceived the analytics data as unfavorable or lacking essential information. As one of the students mentioned, ‟If your [chart] line was downward, and if there were only ones and zeros or something like that, it could in a way decrease the motivation” (Helen, Humanities and Education student). It appeared that enhancing interest and motivation was mainly dependent on the students’ own efforts to succeed in their course of study and thus to generate favorable analytics data. However, some students ( n  = 7) felt that it could be additionally enhanced by diversifying and improving the analytics tools in the LAD. For example, opportunities for more detailed analyses and future study planning or comparisons of study performance with that of peers might further increase these students’ motivation and interest in their studies. Even so, it was also considered possible that comparisons between students in particular might have the opposite, demotivating and discouraging effect.

All students ( n  = 16) mentioned that using the LAD facilitated the process of seeking and accessing help . It enabled the identification of potential support needs—for instance, if several courses were failed or left unfinished. Such occurrences were perceived as alarming signals for the students themselves to seek help and for the guidance personnel to provide targeted support. As one of the students emphasized, it was important that not only ‟a teacher [tutor] gets interested in looking at what the situation is but also that a student would understand to communicate regarding the promotion of studies and situations” (Emily, Social Services and Health Care student). Some students ( n  = 9) suggested that the students, tutor teachers, or both could receive automated alerts if concerns were to arise. On the other hand, the impact of such automated notifications on changing the course of study was considered somewhat questionable. Above all, the students ( n  = 16) preferred human contact and personal support by the guidance personnel, who would use a sensitive approach to address possibly delicate issues. It was considered important to build such support into existing practices so that the tutor teachers would not be overburdened. One of the students also stated that the automated alerts could be sufficient if they just worked effectively.
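The automated alerts some students proposed could amount to a simple threshold rule over course records. The sketch below is hypothetical and not a feature of the piloted LAD; the record fields, the status labels, and the threshold for "several" courses are all assumptions:

```python
# Hypothetical alert rule, not part of the piloted LAD.
# Record fields, status labels, and the threshold of 2 are assumptions.

from dataclasses import dataclass

@dataclass
class CourseRecord:
    name: str
    status: str  # "passed", "failed", or "unfinished"

def needs_support_alert(records: list[CourseRecord], threshold: int = 2) -> bool:
    """Flag a student for guidance when several courses are failed
    or left unfinished, as the interviewed students proposed."""
    concerns = sum(1 for r in records if r.status in ("failed", "unfinished"))
    return concerns >= threshold

records = [CourseRecord("Mathematics", "passed"),
           CourseRecord("Statistics", "failed"),
           CourseRecord("Ethics", "unfinished")]
print(needs_support_alert(records))  # True
```

Notably, the students' reservations suggest that such a rule should trigger human contact by the guidance personnel rather than an automated message alone.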

4.3 LAD as a part of the reflection phase and processes

The students addressed the use of the LAD and its development as a part of the reflection phase processes of SRL through categories outlined in Table  3 .

The students widely appreciated the support provided by the use of the LAD for the process of evaluation and reflection. The majority ( n  = 15) mentioned that it allowed them to individually reflect on the underlying aspects of their study performance, such as what kind of learners they are, what type of teaching or learning methods suit them, and what factors impact their learning. Similarly, the students ( n  = 16) valued the possibility of examining the analytics data together with the guidance personnel, such as tutor teachers, and commonly expressed a desire to revisit the LAD in future guidance meetings. It was thought to promote the interpretation of analytics data and to facilitate collective reflection on the reasons behind one’s study success or failure. However, this might require a certain orientation from the guidance personnel, as one student describes below:

I feel that it’s possible to address such themes that what may perhaps cause this. Of course, a lot depends on how amenable the teacher [tutor] is, like are we focusing on how the studies are going but in a way, not so much on what may cause it. (Sophia, Humanities and Education student)

Some students ( n  = 8) proposed incorporating familiarization with analytics insights into course implementations of the degree programs. Additionally, many students ( n  = 11) expressed a desire to examine the student group’s general progress in tutoring classes together with the tutor teacher and peers, particularly if the results were properly targeted and anonymized, and presented in a discreet manner. However, some students ( n  = 5) found this irrelevant. The students were generally wary of evaluating and comparing an individual student’s study performance in relation to the peer average through the LAD. While some students ( n  = 4) welcomed such an opportunity, others ( n  = 6) considered it unnecessary. A few students ( n  = 5) emphasized that such comparisons between students should be optional and visible if desired, and one student ( n  = 1) did not have a definite view about it. Rather than competing with others, the students stressed the importance of challenging themselves and evaluating study performance against their own goals or previous achievements.

According to the students ( n  = 16), the use of the LAD was associated with a wide range of affective reactions . Positive responses such as joy, relief, and satisfaction were considered to emerge if the analytics data displayed by the LAD was perceived as favorable and expected, and supportive of future learning. Similarly, negative responses such as anxiety, pressure, or stress were likely to occur if such data indicated poor performance, thus challenging the learning process. On the other hand, such self-reactions could also appear as neutral or indifferent, depending on the student and the situation. Individual responses were related not only to the current version of the LAD but also to its further development targets. Some students ( n  = 3) pointed out the importance of guidance and support, through which the affective reactions could be processed together with professionals. As one of the students underlined, it is important “that there is also that support for the studies, that it isn’t just like you have this chart, and it looks bad, that try to manage. Perhaps there is that support, support in a significant role as well” (Sophia, Humanities and Education student). It seemed critical that the students were not left alone with the LAD but rather were given assistance to deal with the various responses its use may elicit.

4.4 Summary of findings on LAD utilization to promote SRL among HE students

In summary, HE students’ perceptions on the utilization of an LAD to promote SRL phases and processes were largely congruent, but nonetheless partly varied. In particular, the students agreed on the support provided by the LAD during the performance phase and for the purpose of metacognitive monitoring. Such activity was thought to not only enable the students to observe their studies and learning, but to also create the basis for the emergence of all other processes, which were facilitated by the monitoring. That is, while the students familiarized themselves with the course of their studies via the analytics data, they could further apply these insights—for instance, to visualize study situations, enhance motivation, and identify possible support needs. Monitoring with the LAD was also perceived to partly support the students in the forethought and reflection phases and processes by giving them grounds to target their development areas as well as to reflect on their studies and learning individually and jointly with their tutor teachers. However, it was clear that less emphasis was placed on using the LAD for study planning, addressing individual interests, activating self-efficacy, and supporting environmental structuring, thus giving incentives for their further investigation and future improvement.

Although the LAD used in this study seemed to serve many functions as such, its holistic development was deemed necessary for more thorough SRL support. In particular, the students agreed on the need to improve such an analytics application to further strengthen the performance phase processes—particularly monitoring—by, for instance, developing it for the students’ independent use, and by integrating it with instructional and guidance practices provided by their educators. Moreover, the students commonly wished for more advanced analytics tools that could more directly contribute to the planning of studies and joint reflection of group-level analytics data. To better support the various processes of SRL, new features were generally welcomed into the LAD, although the students’ views and emphases on them also varied. Mixed perspectives were related, for instance, to the need to enrich data or compare students within the LAD. Thus, it seemed important to develop the LAD to conform to the preferences of its users. Along with improving the LAD, students also paid attention to the development of pedagogical practices and guidance processes that together could create appropriate conditions for the emergence of SRL.

5 Discussion

The purpose of this study was to gain insights into HE students’ perceptions on the utilization of an LAD to promote their SRL. The investigation extended the previous research by offering in-depth descriptions of the specific phases and processes of SRL associated with the use of an LAD and its development targets. By applying a study path perspective, it also provided novel insights into how to promote students to become self-regulated learners and effective users of analytics data as an integral part of their studies in HE.

The students’ perspectives on the use of LAD and its development were initially explored as a part of the forethought phase processes of SRL, with a particular focus on the planning and directing of studies and learning. In line with previous research (e.g., Divjak et al., 2023 ; Schumacher & Ifenthaler, 2018 ; Silvola et al., 2023 ), the students in this study appreciated an analytics application that helped them prepare for their future learning endeavors—that is, the initial phase of the SRL cycle (see Zimmerman & Moylan, 2009 ). Using the LAD specifically allowed the students to recognize their development areas and offered a basis to organize their future coursework. However, improvements to allow students to set individual goals and make plans directly within the LAD, as well as to increase awareness of general degree goals, were also desired. These seem to be pertinent avenues for development, as goals may inspire the students not only to invest greater efforts in learning but also to track their achievements against these goals (Wise, 2014 ; Wise et al., 2016 ). As students typically enter education from individual starting points, it is important to allow them to set personal targets and routes for their learning (Wise, 2014 ; Wise et al., 2016 ).

The results of this study indicate that the use of LADs is primarily driven and shaped by students’ personal interests and preferences, which commonly play a crucial role in the development of SRL (see Zimmerman & Moylan, 2009 ; Panadero & Alonso-Tapia, 2014 ). It might particularly benefit those students for whom analytics-related activities are characteristic and of interest, and who consider them personally meaningful for their studies. It has been argued that if students consider that analytics applications serve their learning, they are also willing to use them (Schumacher & Ifenthaler, 2018 ; Wise et al., 2016 ). On the other hand, it has also been stated that not all students are necessarily able to maximize the possible benefits on their own and might need support in understanding the purpose of such applications (Wise, 2014 ) and in finding personal relevance for their use. The findings of this study suggest that a more individual fit of LADs could be promoted by allowing students to customize their functionalities and displays. Comparable results have also been obtained from other studies (e.g., Bennett, 2018 ; Rets et al., 2021 ; Roberts et al., 2017 ; Schumacher & Ifenthaler, 2018 ), thus highlighting the need to develop customized LADs that better meet the needs of diverse students and that empower them to control their analytics data. More attention may also be needed to promote the use and development of LADs to support self-efficacy, as it still appeared to be an unrecognized potential for many students in this study. According to Rets et al. ( 2021 ), using LADs for such a purpose might particularly benefit online learners and part-time students, who often face various requirements and thus may forget the efforts they have put into learning and neglect to give themselves enough credit. By facilitating students’ self-confidence, it could also promote the necessary changes in study behavior, at least for those students with low self-efficacy (Rets et al., 2021 ).

Second, the students’ views on the use of the LAD and its development were investigated in terms of the performance phase processes of SRL, with an emphasis on the control and observation of studies and learning. In line with the results of other studies (De Barba et al., 2022 ; Rets et al., 2021 ; Schumacher & Ifenthaler, 2018 ; Silvola et al., 2023 ), the students preferred using the LAD to monitor their study performance—they wanted to follow their progress and success over time and keep themselves and their educators up to date. According to Jivet et al. ( 2017 ), such functionality directly promotes the performance phase of SRL. Moreover, it seemed to serve as a basis for other activities under SRL, all of which were heavily dependent and built on the monitoring. The results of this study, however, imply that monitoring opportunities should be further expanded to provide even more detailed insights. Moreover, they indicate the need to develop and refine pedagogical practices at the organizational level in order to better serve student monitoring. As monitoring plays a crucial role in SRL (Zimmerman & Moylan, 2009 ), it is essential to examine how it is related to other SRL processes and how it can be effectively promoted with analytics applications (Viberg et al., 2020 ).

In this study, the students used the LAD not only to monitor but also to image and visualize their learning. In accordance with the views of Papamitsiou and Economides ( 2015 ), the visualizations transformed the analytics data into an easily interpretable visual form. The visualizations were not considered to generate information overload, although such a concern has sometimes been associated with the use of LADs (e.g., Susnjak et al., 2022 ). However, the students widely preferred even more descriptive and nuanced illustrations to clarify and structure the analytics data. At the same time, care must be taken to ensure that the visualizations do not divert too much attention from other relevant aspects of learning, as was also found important in prior research (e.g., Charleer et al., 2018 ; Wise, 2014 ). It seems critical that an LAD inform but not overwhelm its users (Susnjak et al., 2022 ). As argued by Klein et al. ( 2019 ), confusing visualizations may not only generate mistrust but also lead to their complete nonuse.

Although the LAD piloted in the study was considered to be a relatively functional application, it could be even more accessible and usable if it were incorporated into the student information system and enriched with the data from it. Even then, however, the LAD should remain simple to use and its data privacy ensured. It has been argued that more information is not always better (Aguilar, 2018 ), and the analytics indicators must be carefully considered to truly optimize learning (Clow, 2013 ). While developing their SRL, students would particularly benefit from a well-structured environment with fewer distractions and more facilitators for learning (Panadero & Alonso-Tapia, 2014 ). The smooth promotion of studies also seems to require personal access to the analytics data. Similar to the learners in Charleer and colleagues’ ( 2018 ) study, the students in this study desired to take advantage of the LAD autonomously, beyond the guidance context. It was believed that the LAD would be used especially when students were actively promoting their studies. This is a somewhat expected finding, given the significant role of study performance indicators in the LAD. However, this also raises the question of whether such an analytics application would be used mainly by those students who progress diligently but ignored by those who advance only a little or not at all. Ideally, the LAD would serve students in different situations and at various stages of studies.

Using the LAD offered the students a promising means to enhance motivation and interest in their studies through the monitoring of analytics data. However, not all students were inspired in the same manner or by the same analytics data displayed by the LAD. Although the LAD was seen as inspiring and interesting in many ways, it also had the potential to demotivate or even discourage. This finding corroborates the results of other studies reporting mixed results on the power of LADs to motivate students (e.g., Bennett, 2018 ; Corrin & de Barba, 2014 ; Schumacher & Ifenthaler, 2018 ). As such, it would be essential that the analytics applications consider and address students with different performance levels and motivational factors (Jivet et al., 2017 ). Based on the results of this study, diversifying the tools included in the LAD might also be necessary. On the other hand, the enhancement of motivation was also found to be the responsibility of the students themselves—that is, if the students wish the analytics application to display favorable analytics data and thus motivate them, they must first invest concomitant effort in their studies.

The use of the LAD provided a convenient way to intervene if the students’ study performance did not meet expectations. With the LAD, both the students and their tutor teachers could detect signs of possible support needs and address them with guidance. In the future, such needs could also be reported through automated alerts. Overall, however, the students in this study preferred human contact and personal support over automated interventions, contrary to the findings obtained by Roberts and colleagues ( 2017 ). Being identified to their educators did not seem to be a particular concern for them, although it has been found to worry students in other contexts (e.g., Roberts et al., 2017 ). Rather, the students felt they would benefit more from personal support that was specifically targeted to them and sensitive in its approach. The students generally demanded delicate, ethical consideration when acting upon analytics data and in the provision of support, which was also found to be important in prior research (e.g., Kleimola & Leppisaari, 2022 ). Additionally, Wise and colleagues ( 2016 ) underlined the need to foster student agency and to prevent students from becoming overly reliant on analytics-based interventions: if all of the students’ mistakes are pointed out to them, they may no longer learn to recognize mistakes on their own. Therefore, to support SRL, it is essential to know when to intervene and when to let students solve challenges independently (Kramarski & Michalsky, 2009 ).

Lastly, the students’ perceptions on the use and development of the LAD were examined from the perspective of the reflection phase processes of SRL, with particular attention given to evaluation and reflection on studies and learning. The use of the LAD provided the students with a basis to individually reflect on the potential causes behind their study performance, for better or worse. Moreover, they could address such issues together with guidance personnel and thus make better sense of the analytics data. Corresponding to the results of Charleer et al.’s ( 2018 ) study, collective reflection on analytics data provided the students with new insights and supported their understanding. Engaging in such reflective practices offered the students the opportunity to complete the SRL cycle and draw the necessary conclusions regarding their performance for subsequent actions (see Zimmerman & Moylan, 2009 ). In the future, analytics-based reflection could also be implemented in joint tutoring classes and courses included in the degree programs. This would likely promote the integration of LADs into the activity flow of educational environments, as recommended by Wise and colleagues ( 2016 ). In sum, using LADs should be a regular part of pedagogical practices and learning processes (Wise et al., 2016 ).

When evaluating and reflecting on their studies and learning, the students preferred to focus on themselves and their own development as learners. Similar to earlier findings (e.g., Divjak et al., 2023 ; Rets et al., 2021 ; Roberts et al., 2017 ; Schumacher & Ifenthaler, 2018 ), the students felt differently about the need to develop LADs to compare their study performance with that of other students. Although this function could help some of the students to position themselves in relation to their peers, others thought it should be optional or completely avoided. In agreement with the findings of Divjak et al. ( 2023 ), it seemed that the students wanted to avoid mutual competition through comparisons; however, such comparison might not be harmful for everyone or in every case. Consequently, care is required when considering the kind of features in the LAD that offer real value to students in a particular context (Divjak et al., 2023 ). Rather than limiting the point of reference only to peers, it might be useful to also offer students other targets for comparative activity, such as individual students’ previous progress or goals set for the activity (Wise, 2014 ; Wise et al., 2016 ; see also Bandura, 1986 ). In addition, it is important that students not be left alone to face and cope with the various reactions that may be elicited by such evaluation and reflection with analytics data (Kleimola & Leppisaari, 2022 ). As the results of this study and those of others (e.g., Bennett, 2018 ; Lim et al., 2021 ) generally indicate, affective responses evoked by LADs may vary and are not always exclusively positive. Providing a safe environment for students to reflect on successes and failures and to process the resulting responses might not only encourage necessary changes in future studies but also promote the use of an LAD as a learning support.

In summary, the results of this study imply that making an effective use of an analytics application—even with a limited amount of analytics data and functionality available—may facilitate the growth of students into self-regulated learners. That is, even if the LAD principally addresses some particular phase or process of SRL, it can act as a catalyst to encourage students in the development of SRL on a wider scale. This finding also emphasizes the interdependent and interactive nature of SRL (see Zimmerman, 2011 ; Zimmerman & Moylan, 2009 ) that similarly seems to characterize the use of an LAD. However, the potential of LADs to promote SRL may be lost unless students themselves are (pro)active in initiating and engaging with such activity or receive appropriate pedagogical support for it. There appears to be a specific need for guidance that is sensitive to the students’ affective reactions and would help students learn and develop with analytics data. Providing the students with adequate support is particularly critical if their studies have not progressed favorably or as planned. It seems important that the LAD would not only target those students who are already self-regulated learners but, with appropriate support and guidance, would also serve those students who are gradually growing in that direction.

5.1 Limitations and further research

This study has some limitations. First, it involved a relatively small number of HE students examined in a pilot setting. Although the sample was sufficient to provide in-depth insights and the saturation point was reached, further research could employ quantitative approaches and more diverse groups of students to improve the generalizability of the results to a larger student population. Addressing the perspectives of guidance personnel, specifically tutor teachers, could also provide additional insights into the use and development of LADs to promote SRL.

Second, the LAD piloted and investigated in this study was not yet widely in use or accessible to the students. Moreover, it was examined for a relatively brief time, so the students' perceptions were shaped not only by their experiences but also by their expectations of its potential. Future research on students and tutor teachers with more extensive user experience could build an even more profound picture of the possibilities and limitations of the LAD from a study path perspective. Such investigation might also benefit from trace data collected from the students' and tutor teachers' interactions with the LAD. It would be valuable to examine how students and tutor teachers make use of the LAD in the long term and how it is integrated into learning activities and pedagogical practices.

Third, because the study focused on one HE institution and the analytics application used in this specific context, the transferability of the results may be limited. However, the results offer many important and applicable perspectives to consider in various educational environments where LADs are implemented and aimed at supporting students across their studies.

6 Conclusions

The results of this study offer useful insights for the creation of LADs that are closely tied to theoretical aspects of learning and that meet the particular needs of their users. In particular, the study deepens the understanding of how such analytics applications should be connected to the entirety of studies: that is, what kinds of learning processes and pedagogical support are needed alongside them to best serve students in their learning. Consequently, it encourages a comprehensive consideration and promotion of pedagogy, educational technology, and related practices in HE. The role of LA in supporting learning and guidance appears significant, so investments must be made in its appropriate use and development. In particular, the voice of the students must be listened to, as this promotes their commitment to the joint development process and fosters the productive use of analytics applications in learning. At its best, LA becomes an integral part of HE settings, one that helps students complete their studies and contributes to their development into self-regulated learners.

Data availability

Not applicable.

Abbreviations

  • HE: Higher education
  • LA: Learning analytics
  • LAD: Learning analytics dashboard
  • RQ: Research question
  • SRL: Self-regulated learning

Aguilar, S. J. (2018). Examining the relationship between comparative and self-focused academic data visualizations in at-risk college students’ academic motivation. Journal of Research on Technology in Education , 50 (1), 84–103. https://doi.org/10.1080/15391523.2017.1401498

Anthonysamy, L., Koo, A-C., & Hew, S-H. (2020). Self-regulated learning strategies and non-academic outcomes in higher education blended learning environments: A one decade review. Education and Information Technologies , 25 (5), 3677–3704. https://doi.org/10.1007/s10639-020-10134-2

Azevedo, R., Guthrie, J. T., & Seibert, D. (2004). The role of self-regulated learning in fostering students’ conceptual understanding of complex systems with hypermedia. Journal of Educational Computing Research , 30 (1–2), 87–111. https://doi.org/10.2190/DVWX-GM1T-6THQ-5WC7

Bandura, A. (1986). Social foundations of thought and action: A social cognitive theory . Prentice-Hall.

Barnard-Brak, L., Paton, V. O., & Lan, W. Y. (2010). Profiles in self-regulated learning in the online learning environment. The International Review of Research in Open and Distributed Learning , 11 (1), 61–80. https://doi.org/10.19173/irrodl.v11i1.769

Beheshitha, S. S., Hatala, M., Gašević, D., & Joksimović, S. (2016). The role of achievement goal orientations when studying effect of learning analytics visualizations. Proceedings of the 6th International Conference on Learning Analytics and Knowledge (pp. 54–63). Association for Computing Machinery. https://doi.org/10.1145/2883851.2883904

Bennett, E. (2018). Students’ learning responses to receiving dashboard data: Research report . Huddersfield Centre for Research in Education and Society, University of Huddersfield.

Bodily, R., & Verbert, K. (2017). Review of research on student-facing learning analytics dashboards and educational recommender systems. IEEE Transactions on Learning Technologies , 10 (4), 405–418. https://doi.org/10.1109/TLT.2017.2740172

Broadbent, J., & Poon, W. L. (2015). Self-regulated learning strategies & academic achievement in online higher education learning environments: A systematic review. The Internet and Higher Education , 27 , 1–13. https://doi.org/10.1016/j.iheduc.2015.04.007

Callan, G., Longhurst, D., Shim, S., & Ariotti, A. (2022). Identifying and predicting teachers’ use of practices that support SRL. Psychology in the Schools , 59 (11), 2327–2344. https://doi.org/10.1002/pits.22712

Charleer, S., Moere, A. V., Klerkx, J., Verbert, K., & De Laet, T. (2018). Learning analytics dashboards to support adviser-student dialogue. IEEE Transactions on Learning Technologies , 11 (3), 389–399. https://doi.org/10.1109/TLT.2017.2720670

Chenail, R. J. (2011). Interviewing the investigator: Strategies for addressing instrumentation and researcher bias concerns in qualitative research. The Qualitative Report , 16 (1), 255–262. https://doi.org/10.46743/2160-3715/2011.1051

Clow, D. (2013). An overview of learning analytics. Teaching in Higher Education , 18 (6), 683–695. https://doi.org/10.1080/13562517.2013.827653

Conole, G., Gašević, D., Long, P., & Siemens, G. (2011). Message from the LAK 2011 general & program chairs. Proceedings of the 1st International Conference on Learning Analytics and Knowledge . Association for Computing Machinery. https://doi.org/10.1145/2090116

Corrin, L., & De Barba, P. (2014). Exploring students’ interpretation of feedback delivered through learning analytics dashboards. In B. Hegarty, J. McDonald, & S-K. Loke (Eds.), ASCILITE 2014 conference proceedings—Rhetoric and reality: Critical perspectives on educational technology (pp. 629–633). Australasian Society for Computers in Learning in Tertiary Education (ASCILITE). https://www.ascilite.org/conferences/dunedin2014/files/concisepapers/223-Corrin.pdf

Costas-Jauregui, V., Oyelere, S. S., Caussin-Torrez, B., Barros-Gavilanes, G., Agbo, F. J., Toivonen, T., Motz, R., & Tenesaca, J. B. (2021). Descriptive analytics dashboard for an inclusive learning environment. 2021 IEEE Frontiers in Education Conference (FIE) (pp. 1–9). IEEE. https://doi.org/10.1109/FIE49875.2021.9637388

De Barba, P., Oliveira, E. A., & Hu, X. (2022). Same graph, different data: A usability study of a student-facing dashboard based on self-regulated learning theory. In S. Wilson, N. Arthars, D. Wardak, P. Yeoman, E. Kalman, & D. Y. T. Liu (Eds.), ASCILITE 2022 conference proceedings: Reconnecting relationships through technology (Article e22168). Australasian Society for Computers in Learning in Tertiary Education (ASCILITE). https://doi.org/10.14742/apubs.2022.168

De Laet, T., Millecamp, M., Ortiz-Rojas, M., Jimenez, A., Maya, R., & Verbert, K. (2020). Adoption and impact of a learning analytics dashboard supporting the advisor: Student dialogue in a higher education institute in Latin America. British Journal of Educational Technology , 51 (4), 1002–1018. https://doi.org/10.1111/bjet.12962

Divjak, B., Svetec, B., & Horvat, D. (2023). Learning analytics dashboards: What do students actually ask for? Proceedings of the 13th International Learning Analytics and Knowledge Conference (pp. 44–56). Association for Computing Machinery. https://doi.org/10.1145/3576050.3576141

Dollinger, M., & Lodge, J. M. (2018). Co-creation strategies for learning analytics. Proceedings of the 8th International Conference on Learning Analytics and Knowledge (pp. 97–101). Association for Computing Machinery. https://doi.org/10.1145/3170358.3170372

Eickholt, J., Weible, J. L., & Teasley, S. D. (2022). Student-facing learning analytics dashboard: Profiles of student use. IEEE Frontiers in Education Conference (FIE) (pp. 1–9). IEEE. https://doi.org/10.1109/FIE56618.2022.9962531

Elo, S., & Kyngäs, H. (2008). The qualitative content analysis process. Journal of Advanced Nursing , 62 (1), 107–115. https://doi.org/10.1111/j.1365-2648.2007.04569.x

Elouazizi, N. (2014). Critical factors in data governance for learning analytics. Journal of Learning Analytics , 1 (3), 211–222. https://doi.org/10.18608/jla.2014.13.25

Heikkinen, S., Saqr, M., Malmberg, J., & Tedre, M. (2022). Supporting self-regulated learning with learning analytics interventions: A systematic literature review. Education and Information Technologies , 28 (3), 3059–3088. https://doi.org/10.1007/s10639-022-11281-4

Jivet, I., Scheffel, M., Drachsler, H., & Specht, M. (2017). Awareness is not enough: Pitfalls of learning analytics dashboards in the educational practice. In É. Lavoué, H. Drachsler, K. Verbert, J. Broisin, & M. Pérez-Sanagustín (Eds.), Lecture notes in computer science: Vol. 10474. Data driven approaches in digital education (pp. 82–96). Springer. https://doi.org/10.1007/978-3-319-66610-5_7

Jivet, I., Scheffel, M., Specht, M., & Drachsler, H. (2018). License to evaluate: Preparing learning analytics dashboards for educational practice. Proceedings of the 8th International Conference on Learning Analytics and Knowledge (pp. 31–40). Association for Computing Machinery. https://doi.org/10.1145/3170358.3170421

Jivet, I., Scheffel, M., Schmitz, M., Robbers, S., Specht, M., & Drachsler, H. (2020). From students with love: An empirical study on learner goals, self-regulated learning and sense-making of learning analytics in higher education. The Internet and Higher Education , 47 , 100758. https://doi.org/10.1016/j.iheduc.2020.100758

Jivet, I., Wong, J., Scheffel, M., Valle Torre, M., Specht, M., & Drachsler, H. (2021). Quantum of choice: How learners’ feedback monitoring decisions, goals and self-regulated learning skills are related. Proceedings of 11th International Conference on Learning Analytics and Knowledge (pp. 416–427). Association for Computing Machinery. https://doi.org/10.1145/3448139.3448179

Kim, J., Jo, I-H., & Park, Y. (2016). Effects of learning analytics dashboard: Analyzing the relations among dashboard utilization, satisfaction, and learning achievement. Asia Pacific Education Review , 17 (1), 13–24. https://doi.org/10.1007/s12564-015-9403-8

Kleimola, R., & Leppisaari, I. (2022). Learning analytics to develop future competences in higher education: A case study. International Journal of Educational Technology in Higher Education , 19 (1), 17. https://doi.org/10.1186/s41239-022-00318-w

Kleimola, R., López-Pernas, S., Väisänen, S., Saqr, M., Sointu, E., & Hirsto, L. (2023). Learning analytics to explore the motivational profiles of non-traditional practical nurse students: A mixed-methods approach. Empirical Research in Vocational Education and Training , 15 (1), 11. https://doi.org/10.1186/s40461-023-00150-0

Klein, C., Lester, J., Rangwala, H., & Johri, A. (2019). Technological barriers and incentives to learning analytics adoption in higher education: Insights from users. Journal of Computing in Higher Education , 31 (3), 604–625. https://doi.org/10.1007/s12528-019-09210-5

Kramarski, B. (2018). Teachers as agents in promoting students’ SRL and performance: Applications for teachers’ dual-role training program. In D. H. Schunk, & J. A. Greene (Eds.), Handbook of self-regulation of learning and performance (2nd ed., pp. 223–239). Routledge/Taylor & Francis Group. https://doi.org/10.4324/9781315697048-15

Kramarski, B., & Michalsky, T. (2009). Investigating preservice teachers’ professional growth in self-regulated learning environments. Journal of Educational Psychology , 101 (1), 161–175. https://doi.org/10.1037/a0013101

Krippendorff, K. (2019). Content analysis (4th ed.). SAGE Publications. https://doi.org/10.4135/9781071878781

Kwasnicka, D., Dombrowski, S. U., White, M., & Sniehotta, F. F. (2015). Data-prompted interviews: Using individual ecological data to stimulate narratives and explore meanings. Health Psychology , 34 (12), 1191–1194. https://doi.org/10.1037/hea0000234

Lim, L-A., Dawson, S., Gašević, D., Joksimović, S., Pardo, A., Fudge, A., & Gentili, S. (2021). Students’ perceptions of, and emotional responses to, personalised learning analytics-based feedback: An exploratory study of four courses. Assessment & Evaluation in Higher Education , 46 (3), 339–359. https://doi.org/10.1080/02602938.2020.1782831

Lodge, J. M., Panadero, E., Broadbent, J., & De Barba, P. G. (2019). Supporting self-regulated learning with learning analytics. In J. M. Lodge, J. Horvath, & L. Corrin (Eds.), Learning analytics in the classroom: Translating learning analytics research for teachers (pp. 45–55). Routledge. https://doi.org/10.4324/9781351113038-4

Marzouk, Z., Rakovic, M., Liaqat, A., Vytasek, J., Samadi, D., Stewart-Alonso, J., Ram, I., Woloshen, S., Winne, P. H., & Nesbit, J. C. (2016). What if learning analytics were based on learning science? Australasian Journal of Educational Technology , 32 (6), 1–18. https://doi.org/10.14742/ajet.3058

Matcha, W., Uzir, N. A., Gašević, D., & Pardo, A. (2020). A systematic review of empirical studies on learning analytics dashboards: A self-regulated learning perspective. IEEE Transactions on Learning Technologies , 13 (2), 226–245. https://doi.org/10.1109/TLT.2019.2916802

Mayring, P. (2000). Qualitative content analysis. Forum Qualitative Sozialforschung / Forum: Qualitative Social Research , 1 (2). https://doi.org/10.17169/fqs-1.2.1089

Merriam, S. B., & Tisdell, E. J. (2016). Qualitative research: A guide to design and implementation (4th ed.). Jossey-Bass.

Molenaar, I., Horvers, A., & Baker, R. S. (2019). Towards hybrid human-system regulation: Understanding children's SRL support needs in blended classrooms. Proceedings of the 9th International Conference on Learning Analytics and Knowledge (pp. 471–480). Association for Computing Machinery. https://doi.org/10.1145/3303772.3303780

Moos, D. C. (2018). Emerging classroom technology: Using self-regulation principles as a guide for effective implementation. In D. H. Schunk, & J. A. Greene (Eds.), Handbook of self-regulation of learning and performance (2nd ed., pp. 243–253). Routledge. https://doi.org/10.4324/9781315697048-16

Panadero, E. (2017). A review of self-regulated learning: Six models and four directions for research. Frontiers in Psychology , 8 , Article 422. https://doi.org/10.3389/fpsyg.2017.00422

Panadero, E., & Alonso-Tapia, J. (2014). How do students self-regulate? Review of Zimmerman’s cyclical model of self-regulated learning. Anales De Psicología , 30 (2), 450–462. https://doi.org/10.6018/analesps.30.2.167221

Papamitsiou, Z., & Economides, A. A. (2015). Temporal learning analytics visualizations for increasing awareness during assessment. RUSC Universities and Knowledge Society Journal , 12 (3), 129–147. https://doi.org/10.7238/rusc.v12i3.2519

Park, Y., & Jo, I. (2015). Development of the learning analytics dashboard to support students’ learning performance. Journal of Universal Computer Science , 21 (1), 110–133. https://doi.org/10.3217/jucs-021-01-0110

Pintrich, P. R. (2000). The role of goal orientation in self-regulated learning. In M. Boekaerts, P. R. Pintrich, & M. Zeidner (Eds.), Handbook of self-regulation (pp. 451–502). Academic Press. https://doi.org/10.1016/B978-012109890-2/50043-3

Puustinen, M., & Pulkkinen, L. (2001). Models of self-regulated learning: A review. Scandinavian Journal of Educational Research , 45 (3), 269–286. https://doi.org/10.1080/00313830120074206

Rets, I., Herodotou, C., Bayer, V., Hlosta, M., & Rienties, B. (2021). Exploring critical factors of the perceived usefulness of a learning analytics dashboard for distance university students. International Journal of Educational Technology in Higher Education , 18 (1). https://doi.org/10.1186/s41239-021-00284-9

Roberts, L. D., Howell, J. A., & Seaman, K. (2017). Give me a customizable dashboard: Personalized learning analytics dashboards in higher education. Technology, Knowledge and Learning , 22 (3), 317–333. https://doi.org/10.1007/s10758-017-9316-1

Schreier, M. (2014). Qualitative content analysis. In U. Flick (Ed.), The SAGE handbook of qualitative data analysis (pp. 170–183). SAGE. https://doi.org/10.4135/9781446282243

Schumacher, C., & Ifenthaler, D. (2018). Features students really expect from learning analytics. Computers in Human Behavior , 78 , 397–407. https://doi.org/10.1016/j.chb.2017.06.030

Schwendimann, B. A., Rodríguez-Triana, M. J., Vozniuk, A., Prieto, L. P., Boroujeni, M. S., Holzer, A., Gillet, D., & Dillenbourg, P. (2017). Perceiving learning at a glance: A systematic literature review of learning dashboard research. IEEE Transactions on Learning Technologies , 10 (1), 30–41. https://doi.org/10.1109/TLT.2016.2599522

Sclater, N., Peasgood, A., & Mullan, J. (2016). Learning analytics in higher education—A review of UK and international practice: Full report . Jisc. https://www.jisc.ac.uk/sites/default/files/learning-analytics-in-he-v2_0.pdf

Silvola, A., Sjöblom, A., Näykki, P., Gedrimiene, E., & Muukkonen, H. (2023). Learning analytics for academic paths: Student evaluations of two dashboards for study planning and monitoring. Frontline Learning Research , 11 (2), 78–98. https://doi.org/10.14786/flr.v11i2.1277

Susnjak, T., Ramaswami, G. S., & Mathrani, A. (2022). Learning analytics dashboard: A tool for providing actionable insights to learners. International Journal of Educational Technology in Higher Education , 19 (1). https://doi.org/10.1186/s41239-021-00313-7

Teasley, S. D. (2017). Student facing dashboards: One size fits all? Technology, Knowledge and Learning , 22 (3), 377–384. https://doi.org/10.1007/s10758-017-9314-3

Van Leeuwen, A., Teasley, S., & Wise, A. (2022). Teacher and student facing analytics. In C. Lang, G. Siemens, A. Wise, D. Gašević, & A. Merceron (Eds.), Handbook of learning analytics (2nd ed., pp. 130–140). Society for Learning Analytics Research. https://doi.org/10.18608/hla22.013

Verbert, K., Ochoa, X., De Croon, R., Dourado, R. A., & De Laet, T. (2020). Learning analytics dashboards: The past, the present and the future. Proceedings of the 10th International Conference on Learning Analytics and Knowledge (pp. 35–40). Association for Computing Machinery. https://doi.org/10.1145/3375462.3375504

Viberg, O., Khalil, M., & Baars, M. (2020). Self-regulated learning and learning analytics in online learning environments: A review of empirical research. Proceedings of the 10th International Conference on Learning Analytics and Knowledge (pp. 524–533). Association for Computing Machinery. https://doi.org/10.1145/3375462.3375483

Virtanen, P. (2019). Self-regulated learning in higher education: Basic dimensions, individual differences, and relationship with academic achievement (Helsinki Studies in Education, 1798–8322) [Doctoral dissertation, University of Helsinki]. University of Helsinki Open Repository. https://urn.fi/URN:ISBN:978-951-51-5681-5

West, D., Luzeckyj, A., Toohey, D., Vanderlelie, J., & Searle, B. (2020). Do academics and university administrators really know better? The ethics of positioning student perspectives in learning analytics. Australasian Journal of Educational Technology , 36 (2), 60–70. https://doi.org/10.14742/ajet.4653

Winne, P. H., & Hadwin, A. F. (1998). Studying as self-regulated learning. In D. Hacker, J. Dunlosky, & A. Graesser (Eds.), Metacognition in educational theory and practice (pp. 277–304). Erlbaum.

Wise, A. F. (2014). Designing pedagogical interventions to support student use of learning analytics. Proceedings of the 4th International Conference on Learning Analytics and Knowledge (pp. 203–211). Association for Computing Machinery. https://doi.org/10.1145/2567574.2567588

Wise, A. F., Vytasek, J. M., Hausknecht, S., & Zhao, Y. (2016). Developing learning analytics design knowledge in the middle space: The student tuning model and align design framework for learning analytics use. Online Learning , 20 (2), 155–182. https://doi.org/10.24059/olj.v20i2.783

Wong, J., Baars, M., Davis, D., Van Der Zee, T., Houben, G-J., & Paas, F. (2019). Supporting self-regulated learning in online learning environments and MOOCs: A systematic review. International Journal of Human–Computer Interaction , 35 (4–5), 356–373. https://doi.org/10.1080/10447318.2018.1543084

Zimmerman, B. J. (1999). Attaining self-regulation: A social cognitive perspective. In M. Boekaerts, P. R. Pintrich, & M. Zeidner (Eds.), Handbook of self-regulation (pp. 13–39). Academic Press. https://doi.org/10.1016/B978-012109890-2/50031-7

Zimmerman, B. J. (2011). Motivational sources and outcomes of self-regulated learning and performance. In B. J. Zimmerman, & D. H. Schunk (Eds.), Handbook of self-regulation of learning and performance (pp. 49–64). Routledge/Taylor & Francis Group.

Zimmerman, B. J. (2015). Self-regulated learning: Theories, measures, and outcomes. In J. D. Wright (Ed.), International Encyclopedia of the Social & Behavioral Sciences (2nd ed., pp. 541–546). Elsevier. https://doi.org/10.1016/B978-0-08-097086-8.26060-1

Zimmerman, B. J., & Moylan, A. R. (2009). Self-regulation: Where metacognition and motivation intersect. In D. J. Hacker, J. Dunlosky, & A. C. Graesser (Eds.), Handbook of metacognition in education (pp. 299–315). Routledge/Taylor & Francis Group.

Acknowledgements

Language process: During the preparation of the manuscript, the Quillbot Paraphraser tool was used to improve language clarity in some parts of the text (e.g., word choice). The manuscript was also proofread by a professional. After using this tool and service, the authors reviewed and revised the text as necessary, taking full responsibility for the content of this manuscript.

The authors also thank the communications and information technology specialists of the UAS under study for their support in editing Fig. 1 for publication.

This research was partly funded by Business Finland through the European Regional Development Fund (ERDF) project “Utilization of learning analytics in the various educational levels for supporting self-regulated learning (OAHOT)” (Grant no. 5145/31/2019). The article was completed with grants from the Finnish Cultural Foundation’s Central Ostrobothnia Regional Fund (Grant no. 25221232) and The Emil Aaltonen Foundation (Grant no. 230078), which were awarded to the first author.

Open Access funding provided by University of Lapland.

Author information

Authors and affiliations

Faculty of Education, University of Lapland, Rovaniemi, Finland

Riina Kleimola & Heli Ruokamo

School of Applied Educational Science and Teacher Education, University of Eastern Finland, Joensuu, Finland

Laura Hirsto

Contributions

Conceptualization: RK, LH. Data collection: RK. Formal analysis: RK, LH. Writing—original draft: RK. Writing—review and editing: RK, LH, HR. Supervision: LH, HR. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Riina Kleimola .

Ethics declarations

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .

About this article

Kleimola, R., Hirsto, L. & Ruokamo, H. Promoting higher education students’ self-regulated learning through learning analytics: A qualitative study. Educ Inf Technol (2024). https://doi.org/10.1007/s10639-024-12978-4

Received : 14 February 2024

Accepted : 09 August 2024

Published : 07 September 2024

DOI : https://doi.org/10.1007/s10639-024-12978-4

  • Higher education student
  • Qualitative study
  • Find a journal
  • Publish with us
  • Track your research

IMAGES

  1. Qualitative Data Analysis stock illustration. Illustration of

    data analysis of qualitative research

  2. Qualitative Data Analysis: Step-by-Step Guide (Manual vs. Automatic

    data analysis of qualitative research

  3. FREE 10+ Qualitative Data Analysis Samples in PDF

    data analysis of qualitative research

  4. Qualitative Infographic

    data analysis of qualitative research

  5. Qualitative Data Analysis Methods And Techniques

    data analysis of qualitative research

  6. Your Guide to Qualitative and Quantitative Data Analysis Methods

    data analysis of qualitative research

VIDEO

  1. DATA ANALYSIS

  2. Data Analysis in Research

  3. Eng 518 lecture 26

  4. Definition of Qualitative research

  5. Qualitative Data Analysis Procedures in Linguistics

  6. AI Enhanced Qualitative Data Analysis Tools #shortsvideo

COMMENTS

  1. Qualitative Data Analysis: What is it, Methods + Examples

    Qualitative data analysis is a systematic process of examining non-numerical data to extract meaning, patterns, and insights. It focuses on the qualitative aspects of data, such as text, images, audio, and videos, and seeks to understand human experiences, perceptions, and behaviors.

  2. Qualitative Data Analysis: Step-by-Step Guide (Manual vs ...

    Learn how to conduct qualitative data analysis using manual or automatic methods. Compare the steps, techniques, challenges and benefits of each approach, and see how software can help automate the process.

  3. Learning to Do Qualitative Data Analysis: A Starting Point

    Learning to Do Qualitative Data Analysis: A Starting Point

  4. Qualitative Data Analysis Methods: Top 6 + Examples

    Learn the basics of six common qualitative data analysis methods, such as content analysis, narrative analysis, discourse analysis and more. See how to apply them to your research project and get extra resources to dive deeper.

  5. What Is Qualitative Research?

    Learn what qualitative research is, how it differs from quantitative research, and what methods and approaches are used to collect and analyze non-numerical data. Find out the advantages and disadvantages of qualitative research and see examples of qualitative data analysis.

  6. Data Analysis in Qualitative Research

    Data Analysis in Qualitative Research

  7. Qualitative Research: Data Collection, Analysis, and Management

    Qualitative Research: Data Collection, Analysis, and ...

  8. Data Analysis in Qualitative Research: A Brief Guide to Using Nvivo

    Data Analysis in Qualitative Research: A Brief Guide to ...

  9. PDF The SAGE Handbook of Qualitative Data Analysis

    Qualitative data analysis is the central step in qualitative research that involves various methods and approaches to reduce, expand or interpret data. This handbook maps the field of qualitative data analysis by discussing its extension, diversity, challenges and trends.

  10. From Data Management to Actionable Findings: A Five-Phase Process of

    This five-phase analysis process, or aspects of this process, can be applied as part of the analytic strategy across several qualitative methodological traditions and can also be engaged as a method of qualitative data analysis in mixed methods research, and in thematic analysis, discourse analysis, policy analysis, and content analysis.

  11. Qualitative Research Design and Data Analysis: Deductive and Inductive

    Learn how to use deductive and inductive analysis strategies to organize, make meaning, and explain qualitative data. Deductive analysis applies theory or predetermined codes to the data, while inductive analysis lets the data guide the codes and themes.

  12. How to use and assess qualitative research methods

    How to use and assess qualitative research methods - PMC

  13. Qualitative Data Analysis

    Learn how to analyze qualitative data, such as open-ended questions, planning documents, and narrative data, using methods like content analysis and software like Atlas.ti. This chapter also explains the difference between quantitative and qualitative data, and the value of qualitative research for planners.

  14. (PDF) Analysing data in qualitative research

    A key feature of qualitative data analysis is. the application of inductive reasoning, which. generates ideas from the collected data. e. researcher uses speci c observations (data) to. develop ...

  15. Qualitative Data Analysis

    Learn how to transcribe, organize, code, and analyze qualitative data with this comprehensive guide. Explore different qualitative research methods, approaches, and software tools for data analysis.

  16. (PDF) Qualitative Data Analysis and Interpretation: Systematic Search

    Qualitative data analysis in one of the most important steps in the qualitative research process (Leech & Onwuegbuzie, 2007) because it assists researchers to make sense of their qualitative data.

  17. PDF 12 Qualitative Data, Analysis, and Design

    This chapter introduces qualitative research in education, its basic principles, types of data, and methods of analysis. It does not address the query about the collected qualitative data describing the essential abstract ideas, which is a philosophical question beyond the scope of this chapter.

  18. Data Analysis for Qualitative Research: 6 Step Guide

    Learn how to perform data analysis for qualitative research using Microsoft Excel and Word, and various online tools. Follow the steps to collect, transcribe, code, and interpret non-numeric data for your research question.

  19. PDF A Step-by-Step Guide to Qualitative Data Analysis

    Learn how to organize, code, and analyze qualitative data from interviews for research purposes. This guide covers topics such as data displays, themes, reliability, validity, and explanations.

  20. Qualitative Data Analysis

    Learn about different methods and steps of qualitative data analysis, such as content analysis, narrative analysis, discourse analysis, framework analysis and grounded theory. Find examples of coding, identifying themes, patterns and relationships, and summarizing data.

  21. Observational Methods and Qualitative Data Analysis

    The course emphasizes the iterative nature of qualitative research, where data collection and analysis occur simultaneously, allowing for constant comparison and refinement of codes. Throughout the course, students will compare the purpose, focus, data gathering, and analysis methods of ethnographic inquiry and case study.

  22. PDF Qualitative Data Analysis

    Learn about the distinctive features, methods, and ethics of qualitative data analysis, with examples from research on youth conflict and homelessness. Explore different types of qualitative analysis, such as ethnography, grounded theory, and computer-assisted analysis.

  23. Data analysis in qualitative research

  24. Qualitative Research in Healthcare: Data Analysis - PMC

  25. Balancing Qualitative and Quantitative Research Methods: Insights and

    The growth in qualitative research is a well-noted and welcomed fact within the social sciences; however, there is a regrettable lack of tools available for the analysis of qualitative material.

  26. Qualitative Research Methods & Data Analysis

    Qualitative Research Methods and Data Analysis presents strategies for analyzing and making sense of qualitative data. Both descriptive and interpretive qualitative studies will be discussed, as will more defined qualitative approaches such as grounded theory, narrative analysis, and case studies.

  27. Longitudinal analysis of teacher self-efficacy evolution ...

    Considering theoretical underpinnings in qualitative research is common when interpreting data (Strauss and Corbin 1990). First, a list based on our conceptual framework of teacher self-efficacy ...

  28. Qualitative family research: Innovative, flexible, theoretical, reflexive

    Qualitative research is increasingly part of the methodological repertoire of scholars who study families. In this article, we examine contemporary trends, tensions, and possibilities for the interdisciplinary enterprise of qualitative research on and about families. We situate our collaborative approach as critical family scholars who pursue social justice work. We then examine four trends ...

  29. Promoting higher education students' self-regulated learning through

    Data for the study were collected from students (N = 16) through semi-structured interviews and analyzed using a qualitative content analysis. Results indicated that the students perceived the use of the learning analytics dashboard as an opportunity for versatile learning support, providing them with a means to control and observe their ...