Art of Presentations

[Guide] How to Present Qualitative Research Findings in PowerPoint?

By: Shrot Katewa

As a researcher, there is little point in doing the research if we are unable to share the findings with our audience effectively! Using PowerPoint is one of the best ways to present research outcomes. But how does one present qualitative research findings using PowerPoint?

In order to present qualitative research findings using PowerPoint, you need to create a robust structure for your presentation, make it engaging and visually appealing, present the patterns along with explanations for them, and highlight the conclusions of your research findings.

In this article, we will help you understand the structure of your presentation. Plus, we’ll share some handy tips that will make your qualitative research presentation really effective!

How to Create a Structure for your Qualitative Research Presentation?

Creating the right structure for your presentation is key to ensuring that it is correctly understood by your audience.

A clear structure not only makes it easier for you to create the presentation, it also makes it simple for the audience to understand what will be covered when you present it to them.

Furthermore, having a robust structure is a great way to ensure that you don’t miss out on any of the points while working on creating the presentation.

But, what structure should one follow?

Creating a good structure can be tricky for some. Thus, I’m sharing what has worked well for me during my previous research projects.

NOTE – Although the following structure is highly effective for most research findings presentations, it has been generalized to serve a wide range of research projects. You may want to add points that are specific to the nature of your research project at your discretion.

Here’s my recommended structure to create your Research Findings presentation –

1. Objective of the Research

A great way to start your presentation is to highlight the objective of your research project.

It is important to remember that merely sharing the objective may sometimes not be enough. A short backstory along with the purpose of your research project can pack a powerful punch! It not only validates the reasoning behind your project but also subtly establishes trust with your audience.

However, do make sure that you’re not reading the backstory from the slide. Let it flow naturally when you are delivering the presentation. Keep the presentation as minimalistic as possible.

2. Key Parameters Considered for Measurement

Once you’ve established the objective, the next thing you may want to do is share the key parameters considered for the success of your project.

Every research project, including qualitative research, needs to have a few key parameters to measure against the objective of the research.

For example – If the goal of your project is to gather the sentiments of a certain group of people for a particular product, you may need to measure their feelings. Are they happy or unhappy using the product? How do they perceive the branding of the product? Is it affordable?

Make sure that you list down all such key parameters that were considered while conducting the qualitative research.

In general, laying these out before sharing the outcome can help your audience think from your perspective and look at the findings through the correct lens.

3. Research Methodology Adopted

The next thing that you may want to include in your presentation is the methodology that you adopted for conducting the research.

By knowing your approach, the audience can be better prepared for the outcome of your project. Ensure that you provide sound reasoning for the chosen methodology.

This section of your presentation can also showcase some pictures of the research being conducted. If you have captured a video, include that. Doing this provides further validation of your project.

4. Research Outcomes (Presenting Descriptive Analysis)

This is the section that will constitute the bulk of your presentation.

Use the slides in this section to describe the observations, and the resulting outcomes on each of the key parameters that were considered for the research project.

It is usually a good idea to dedicate at least one slide to each parameter. Make sure that you present data wherever possible, but ensure that the data presented can be easily comprehended.

Provide key learnings from the data, highlight any outliers, and offer possible reasoning for them. Try not to go too in-depth with the stats, as this can overwhelm the audience. Remember, a presentation is most helpful when it is used to provide key highlights of the research!

Apart from using the data, make sure that you also include a few quotes from the participants.

5. Summary and Learnings from the Research

Once you’ve taken the audience through the core part of your research findings, it is a good practice to summarize the key learnings from each section of your project.

Make sure you touch upon some of the key learnings covered in the research outcomes section of your presentation.

Furthermore, include any additional observations and key points that you may have had which were previously not covered.

The summary slide also often acts as “Key Takeaways” from the research for your audience. Thus, make sure that you maintain brevity and highlight only the points that you want your audience to remember even after the presentation.

6. Inclusions and Exclusions (if any)

While this can be an optional section for some researchers, dedicating a section to inclusions and exclusions in your presentation can be a great value add! It helps your audience understand the key factors that were excluded (or included) on purpose!

Moreover, it creates a sense of thoroughness in the minds of your audience.

7. Conclusion of the Research

The purpose of the conclusion slide of your research findings presentation is to revisit the objective, and present a conclusion.

A conclusion may simply validate or nullify the objective. It may sometimes do neither. Nevertheless, having a conclusion slide makes your presentation come full circle. It creates a sense of completion in the minds of your audience.

8. Questions

Finally, since your audience did not spend as much time as you did on the research project, people are bound to have a few questions.

Thus, the last part of your presentation structure should be dedicated to allowing your audience to ask questions.

Tips for Effectively Presenting Qualitative Research Findings using PowerPoint

For a presentation to be effective, it is important that it is not only well structured, but also well designed and well delivered!

While we have already covered the structure, let me share some tips that can help you create and deliver the presentation effectively.

Tip 1 – Use Visuals

Using visuals in your presentation is a great way to keep the presentations engaging!

Visual aids not only make the presentation less boring, they also help your audience retain the information better!

So, use images and videos of the actual research wherever possible. If these do not suffice or do not give a professional feel, there are a number of resources online from where you can source royalty-free images.

My recommendations for high-quality royalty-free images would be Unsplash or Pexels. Both are really good. The only downside is that they often do not provide the perfect image for a given slide. That said, they can get the job done at least half the time.

If you are unable to find the perfect free image, I recommend checking out Dreamstime . They have a huge library of images and are much cheaper than most of the other image banks. I personally use Dreamstime for my presentation projects!

Tip 2 – Tell a Story (Don’t Show Just Data!)

I cannot stress enough how important it is to give your presentation a human touch. Delivering a presentation in the form of a story does just that! Furthermore, storytelling is also a great tool for visualization.

Data can be hard-hitting, whereas a touching story can tickle the emotions of your audience on various levels!

One of the best ways to present a story with your research project is to start with the backstory of the objective. We’ve already talked about this in the earlier part of this article.

Start with why this research project is so important. Follow a story arc that provides an exciting experience of the beginning, the middle, and a progression towards a climax; much like the plot of a soap opera.

Tip 3 – Include Quotes of the Participants

Including quotes of the participants in your research findings presentation not only provides evidence but also demonstrates authenticity!

Quotes function as a platform to include the voice of the target group and provide a peek into the mindset of the target audience.

When using quotes, keep these things in mind –

1. Use Quotes in their Unedited Form

When using quotes in your presentation, make sure that you use them in their raw unedited form.

Edit quotes only where needed to aid comprehension and, occasionally, coherence.

Furthermore, when editing quotes, use brackets to insert clarifying words. The standard convention is square brackets for clarifying words and parentheses for adding a missing explanation.

2. How to Decide which Quotes to Consider?

It is important to know which quotes to include in your presentation. I use the following 3 criteria when selecting the quote –

  • Relevance – consider quotes that are relevant and help convey the point you want to establish.
  • Length – an ideal quote should be no more than one or two sentences long.
  • Expression – choose quotes that are well-expressed and striking in nature.

3. Preserve Identity of the Participant

It is important to preserve and protect the identity of the participant. This can be done by maintaining confidentiality and anonymity.

Thus, refrain from using the name of the participant. An alternative could be using codes, using pseudonyms (made up names) or simply using other general non-identifiable parameters.

Do note: when using pseudonyms, remember to highlight this in the presentation.

If, however, you do need to use the name of the respondent, make sure that the participant is okay with it and you have adequate permissions to use their name.

Tip 4 – Make your Presentation Visually Appealing and Engaging

It is quite obvious for most of us that we need to create a visually appealing presentation. But, making it pleasing to the eye can be a bit challenging.

Fortunately, we wrote a detailed blog post with tips on how to make your presentation attractive. It provides you with easy and effective tips that you can use even as a beginner! Make sure you check that article.

7 EASY tips that ALWAYS make your PPT presentation attractive (even for beginners)

In addition to the tips mentioned in the article, let me share a few things that you can do which are specific to research outcome presentations.

4.1 Use a Simple Color Scheme

Using the right colors is key to making a presentation look good.

One of the most common mistakes people make is using too many colors in their presentation!

My recommendation would be to go with a monochromatic color scheme in PowerPoint .

4.2 Make the Data Tables Simple and Visually Appealing

When making a presentation on research outcomes, you are bound to present some data.

But when data is not presented properly, it can quickly make your presentation look displeasing!

Neat-looking tables can transform the way your presentation looks. So don’t just dump the data from Excel into your PowerPoint presentation. Spend a few minutes fixing it!

4.3 Use Graphs and Charts (wherever necessary)

When presenting data, my recommendation would be that graphs and charts should be your first preference.

Graphs and charts make it easier to read the data, take less time for the audience to comprehend, and help identify trends.

However, make sure that the correct chart type is used when representing the data. The last thing that you want is to poorly represent a key piece of information.
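To make this concrete, here is a minimal sketch in plain Python. The sentiment counts below are hypothetical, invented for the example, but they are the kind of series you would feed to a bar chart, and the quick text preview shows how a chart compresses the same information at a glance.

```python
# Hypothetical counts of participant sentiment toward a product --
# exactly the kind of series a bar chart on a slide would plot.
sentiment_counts = {"Happy": 14, "Neutral": 6, "Unhappy": 4}

total = sum(sentiment_counts.values())

# Text preview of the chart: one '#' per participant, with each
# group's share of the total alongside.
lines = []
for label, count in sentiment_counts.items():
    share = count / total
    lines.append(f"{label:8s} {'#' * count} {share:.0%}")

print("\n".join(lines))
```

Reading "Happy 58%" off a bar is far quicker for an audience than scanning the raw tallies, which is exactly why a chart is usually the better first choice.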

4.4 Use Icons instead of Bullet Points

Consider the following example –

[Example slide: information presented with icons rather than bullet points]

This slide could have been created just as easily with bullet points. However, using icons and presenting the information in a different format makes the slide pleasing to the eye.

Thus, always try to use icons wherever possible instead of bullet points.

Tip 5 – Include the Outliers

As research project managers, we often tend to focus on the trends extracted from a data set.

While it is important to identify patterns in the data and provide an adequate explanation for the pattern, it is equally important sometimes to highlight the outliers prominently.

It is easy to forget that there may be hidden learnings even in the outliers. At times, the data trend may be re-iterating the common wisdom. However, upon analyzing the outlier data points, you may get insight into how a few participants are doing things successfully despite not following the common knowledge.

That said, not every outlier will reveal hidden information. So, do verify what to include and what to exclude.

Tip 6 – Take Inspiration from other Presentations

I admit, making any presentation can be a tough ask, let alone a presentation showcasing qualitative research findings. This is especially hard when we don’t have the necessary presentation-design skills.

One quick way to overcome this challenge is to take inspiration from other similar presentations that we have liked.

There is no shame in being inspired by others. If you don’t have any handy references, you can always Google a few examples.

One trick that almost always works for me is using Pinterest .

But don’t just search directly for a research presentation; you will have little to no success with it. The key is to look for specific examples for inspiration. For example, search for title slide examples, or image layout examples in presentations.

Tip 7 – Ask Others to Critique your Presentation

The last tip that I would want to provide is to make sure that you share the presentation with supportive colleagues or mentors to attain feedback.

This step can be critical to iron out the chinks in the armor. As the research project manager, it is common to get a bit too involved with the project, which can lead you to miss things.

A good way to overcome this challenge is to get a fresh perspective on your project and the presentation once it has been prepared.

Taking critical feedback before your final presentation can also prepare you to handle tough questions in an adept manner.

Final Thoughts

It is quite important to get it right when working on a presentation that showcases the findings of our research project. After all, we don’t want to put all the hard work into the project only to fail to deliver the outcome appropriately.

I hope you will find the aforementioned tips and structure useful, and if you do, make sure that you bookmark this page and spread the word. Wishing you all the very best for your project!


Buttoning up research: How to present and visualize qualitative data


There is no doubt that data visualization is an important part of the qualitative research process. Whether you're preparing a presentation or writing up a report, effective visualizations can help make your findings clear and understandable for your audience. 

In this blog post, we'll discuss some tips for creating effective visualizations of qualitative data. 

First, let's take a closer look at what exactly qualitative data is.

What is qualitative data?

Qualitative data is information gathered through observation, questionnaires, and interviews. It's often subjective, meaning that the researcher has to interpret it to draw meaningful conclusions from it. 

The difference between qualitative data and quantitative data

When researchers use the terms qualitative and quantitative, they're referring to two different types of data. Qualitative data is subjective and descriptive, while quantitative data is objective and numerical.

Qualitative data is often used in research involving psychology or sociology, where a researcher may be trying to identify patterns or concepts related to people's behavior or attitudes. Quantitative data, by contrast, is common in research involving economics or finance, where the focus is on numerical values such as price points or profit margins.

Before we delve into how best to present and visualize qualitative data, it's important to highlight how to gather this data in the first place.

How best to gather qualitative data

In order to create an effective visualization of qualitative data, ensure that the right kind of information has been gathered. 

Here are six ways to gather the most accurate qualitative data:

  • Define your research question: What data are you setting out to collect? A qualitative research question is a clear statement about a condition to be improved, a project’s area of concern, a troubling question that exists, or a difficulty to be eliminated. It not only defines who the participants will be but also guides the data collection methods needed to elicit the most detailed responses.
  • Determine the best data collection method(s): The data collected should be appropriate for answering the research question. Common qualitative data collection methods include interviews, focus groups, observations, and document analysis. Consider the strengths and weaknesses of each option before deciding which is best suited to the research question.
  • Develop a cohesive interview guide: An interview guide allows researchers to ask more specific questions and encourages thoughtful responses from participants. Design questions so that they center on the topic of discussion and elicit meaningful insight into the issue at hand. Avoid leading or biased questions that could influence participants’ answers, and be aware of cultural nuances that may affect them.
  • Stay neutral – let participants share their stories: The goal is to obtain useful information, not to influence the participant’s answers. Allowing participants to express themselves freely will yield more honest and detailed responses. Maintain a neutral tone throughout interviews and avoid judgment or opinions while participants share their stories.
  • Work with at least one additional team member when conducting qualitative research: Participants should always feel comfortable providing feedback, so it can help to have an extra team member present during the interview, particularly one familiar with the topic being discussed. This keeps the atmosphere respectful and encourages participants to speak openly and honestly.
  • Analyze your findings: Once all of the data has been collected, analyze it to draw meaningful conclusions. Use tools such as qualitative coding or content analysis to identify patterns or themes in the data, then compare them with prior research or other data sources. This will help you draw more accurate and useful insights from the results.
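As a small illustration of that last step, here is a minimal Python sketch of frequency counting over coded excerpts. The participants, codes, and counts are invented for the example; real qualitative coding is far richer, but tallying how often each code appears is often the first quantifiable view of the data.

```python
from collections import Counter

# Hypothetical coded excerpts: each interview snippet has been
# assigned one or more codes during qualitative coding.
coded_excerpts = [
    {"participant": "P1", "codes": ["price", "trust"]},
    {"participant": "P2", "codes": ["usability"]},
    {"participant": "P3", "codes": ["price", "usability"]},
    {"participant": "P4", "codes": ["trust", "price"]},
]

# Tally how often each code appears across all excerpts.
code_counts = Counter(
    code for excerpt in coded_excerpts for code in excerpt["codes"]
)

# Sort by frequency so the dominant themes surface first.
for code, count in code_counts.most_common():
    print(f"{code}: {count}")
```

A tally like this does not replace interpretation, but it quickly shows which themes dominate the data and which are outliers worth a closer look.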

By following these steps, you will be well-prepared to collect and analyze qualitative data for your research project. Next, let's focus on how best to present the qualitative data that you have gathered and analyzed.


How to visually present qualitative data

When it comes to how to present qualitative data visually, the goal is to make research findings clear and easy to understand. To do this, use visuals that are both attractive and informative. 

Presenting qualitative data visually helps to bring the user’s attention to specific items and draw them into a more in-depth analysis. Visuals provide an efficient way to communicate complex information, making it easier for the audience to comprehend. 

Additionally, visuals can help engage an audience by making a presentation more interesting and interactive.

Here are some tips for creating effective visuals from qualitative data:

  • Choose the right type of visualization: Consider which type of visual would best convey the story being told through the research. For example, bar charts or line graphs might be appropriate for tracking changes over time, while pie charts or word clouds could help show patterns in categorical data.
  • Include contextual information: In addition to showing the actual numbers, it's helpful to include relevant contextual information for the audience. This can include details such as the sample size, any anomalies that occurred during data collection, or other environmental factors.
  • Make it easy to understand: Always keep visuals simple and avoid adding too much detail or complexity. This will help ensure that viewers can quickly grasp the main points without getting overwhelmed by all of the information.
  • Use color strategically: Color can draw attention to certain elements of your visual and make it easier for viewers to find its most important parts. Just be sure not to use too many different colors, as this could create confusion instead of clarity.
  • Use charts or whiteboards: Charts or whiteboards can help explain the data in more detail and engage viewers in a discussion. This type of visual tool can also be used to create storyboards that illustrate the data over time, helping to bring your research to life.

Visualizing qualitative data in Notably

Notably helps researchers visualize their data with a flexible canvas, charts, and evidence-based insights. As an all-in-one research platform, Notably enables researchers to collect, analyze, and present qualitative data effectively.

Notably provides an intuitive interface for analyzing data from a variety of sources, including interviews, surveys, desk research, and more. Its powerful analytics engine then helps you quickly identify insights and trends in your data. Finally, the platform makes it easy to create beautiful visuals that will help you communicate research findings with confidence.

Research Frameworks in Analysis

The canvas in Analysis is a multi-dimensional workspace to play with your data spatially to find likeness and tension. Here, you may use a grounded theory approach to drag and drop notes into themes or patterns that emerge in your research. Canvas tools such as shapes, lines, and images allow researchers to build out frameworks such as journey maps, empathy maps, 2x2s, etc. to help synthesize their data.

Going one step further, you may begin to apply various lenses to this data-driven canvas. For example, recoloring by sentiment shows where pain points may be distributed across your customer journey. Or, recoloring by participant may reveal whether one of your participants is creating a bias towards a particular theme.

powerpoint presentation of qualitative data

Exploring Qualitative Data through a Quantitative Lens

Once you have begun your analysis, you may visualize your qualitative data in a quantitative way through charts. You may choose between a pie chart and a stacked bar chart. From there, you can segment your data to break down the bars in your bar chart, and the slices in your pie chart, one step further.

To segment your data, you can choose between ‘Tag group’, ‘Tag’, ‘Theme’, and ‘Participant'. Each group shows up as its own bar in the bar chart or slice in the pie chart. For example, try grouping data as ‘Participant’ to see the volume of notes assigned to each person. Or, group by ‘Tag group’ to see which of your tag groups have the most notes.

How you’ve grouped or segmented your charts affects the options available for coloring them. Charts use colors that are a mix of sentiment, tag, theme, and default colors. Consider color as a way of assigning another layer of meaning to your data. For example, choose red for tags or themes that are areas of friction or pain points, and blue for tags that represent opportunities.
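Outside of any particular tool, the grouping logic behind this kind of segmentation is easy to sketch in a few lines of Python. The notes, participants, and tags below are hypothetical, but each group's count is what becomes one bar in a bar chart or one slice in a pie chart.

```python
from collections import defaultdict

# Hypothetical research notes, each assigned to a participant and a tag.
notes = [
    {"participant": "P1", "tag": "pain point"},
    {"participant": "P1", "tag": "opportunity"},
    {"participant": "P2", "tag": "pain point"},
    {"participant": "P3", "tag": "pain point"},
    {"participant": "P3", "tag": "pain point"},
]

def volume_by(notes, key):
    """Count notes per group; each group becomes one bar or pie slice."""
    counts = defaultdict(int)
    for note in notes:
        counts[note[key]] += 1
    return dict(counts)

by_participant = volume_by(notes, "participant")
by_tag = volume_by(notes, "tag")
```

Grouping by participant shows the volume of notes assigned to each person (and can flag a participant dominating the data), while grouping by tag shows which themes carry the most evidence.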

AI Powered Insights and Cover Images

One of the most powerful features in Analysis is the ability to generate insights with AI. Insights combine information, inspiration, and intuition to help bridge the gap between knowledge and wisdom. Even before you have any tags or themes, you may generate an AI Insight from your entire data set. You'll be able to choose one of our AI Insight templates, inspired by trusted design thinking frameworks, to stimulate generative and divergent thinking. With just the click of a button, you'll get an insight that captures the essence and story of your research.

You may experiment with a combination of tags, themes, and different templates, or create your own custom AI template. These insights are all evidence-based and centered on the needs of real people. You may package these insights up to present your research by embedding videos and quotes and using AI to generate a unique cover image.

You can sign up to run an end-to-end research project for free and receive tips on how to make the most of your data. Want to chat about how Notably can help your team do better, faster research? Book some time for a 1:1 demo with your whole team.


Qualitative Research Resources: Presenting Qualitative Research

Created by health science librarians.


Presenting Qualitative Research, with a focus on posters


Example posters

  • The Meaning of Work for People with MS: a Qualitative Study A good example with quotes
  • Fostering Empathy through Design Thinking Among Fourth Graders in Trinidad and Tobago Includes quotes, photos, diagrams, and other artifacts from qualitative study
  • Examining the Use and Perception of Harm of JUULs by College Students: A Qualitative Study Another interesting example to consider
  • NLM Informationist Supplement Grant: Daring to Dive into Documentation to Determine Impact. An example from the Carolina Digital Repository discussed in a class. Allegri, F., Hayes, B., & Renner, B. (2017). NLM Informationist Supplement Grant: Daring to Dive into Documentation to Determine Impact. https://doi.org/10.17615/bk34-p037
  • Qualitative Posters in F1000 Research Archive (filtered on "qualitative" in title) Sample qualitative posters
  • Qualitative Posters in F1000 Research Archive (filtered on "qualitative" in keywords) Sample qualitative posters

Michelle A. Krieger Blog (example: posts recounting an APA convention poster experience with qualitative posters):

  • Qualitative Data and Research Posters I
  • Qualitative Data and Research Posters II

"Oldies but goodies":

  • How to Visualize Qualitative Data: Ann K. Emery, September 25, 2014. Data Visualization / Chart Choosing, Color-Coding by Category, Diagrams, Icons, Photographs, Qualitative, Text, Timelines, Word Clouds. Getting a little older, and a commercial site, but with some good ideas to get you thinking.
  • Russell, C. K., Gregory, D. M., & Gates, M. F. (1996). Aesthetics and Substance in Qualitative Research Posters. Qualitative Health Research, 6(4), 542–552. Older article with much good information; the poster materials section is less applicable. Link is for UNC-Chapel Hill affiliated users.

Additional resources

  • CDC Coffee Break: Considerations for Presenting Qualitative Data (Mark D. Rivera, March 13, 2018) PDF download of slide presentation. Display formats section begins on slide 10.
  • Print Book (Davis Library): Miles, M. B., Huberman, A. M., & Saldaña, J. (2014). Qualitative data analysis: A methods sourcebook (3rd ed.). From Paul Mihas, Assistant Director of Education and Qualitative Research at the Odum Institute for Research in Social Science at UNC: Qualitative Data Analysis: A Methods Sourcebook (4th ed.) by Miles, Huberman, and Saldaña has a section on Displaying the Data (and a chapter on Designing Matrix, Network, and Graphic Displays) that can help students consider numerous options for visually synthesizing data and findings. Many of the suggestions can be applied to designing posters (April 15, 2021).
  • Last Updated: May 14, 2024 12:50 PM
  • URL: https://guides.lib.unc.edu/qual

SlideTeam

Top 10 Qualitative Research Report Templates with Samples and Examples

“Research is to see what everybody else has seen, and to think what nobody else has thought,” said Hungarian biochemist and Nobel laureate Albert Szent-Gyorgyi, who discovered Vitamin C. This fabulous statement on research as a human endeavor reminds us that execution matters, of course, but the solid pillar of research that backs it is invaluable as well.

Here’s an example to illustrate this in action.

Have you ever wondered what makes Oprah Winfrey a successful businesswoman? It's her research abilities. Oprah might not have been as successful as a news anchor and television show host if she hadn't done her exploratory research on key topics and public figures. Additionally, without the research and development that went into the internet, there was no way that you could be reading this post right now. Research is an essential tool for understanding the intricacies of many topics and advancing knowledge.

Businesses in the modern world are increasingly based on research. Within research, too, the qualitative world of non-numerical observations, data, and impactful insights is what business owners are most interested in. This is not to say that numbers or empirical research are not important; they are, of course, among the building blocks of business.

In this blog, however, we focus on qualitative research PPT templates that help you move forward, get on the profitable highway, and make the best decisions for your business.

These presentation templates are 100% customizable and editable. Use them to leave a lasting impact on your audience and build recall for your business value offering.

Top 10 Qualitative Research Report Templates

The goal of qualitative research methods is to monitor market trends and attitudes through surveys, analyses, historical research, and open-ended interviews. These methods help interpret and comprehend human behavior using data. With qualitative market research services, you can access the appropriate data to support your decisions.

After finishing the research portion of your assignment effectively, you'll need a captivating way to present your findings to your audience. Here, SlideTeam's qualitative research report templates come in handy. Our top ten qualitative research templates will help you effectively communicate your message. Let’s start a tour of this universe.

Template 1: Qualitative Research Proposal Template PowerPoint Presentation Slides

For the reader to understand your research proposal, you must have well-structured PPT slides. Don't worry, SlideTeam has you covered. Our pre-made research proposal presentation slides have no learning curve, which means any user can rapidly create a powerful, professional research proposal presentation. Download these PowerPoint slides and present your strategy in a way that will convince your reviewers to accept it.

Download Now!

Template 2: Qualitative Research PowerPoint PPT Template Bundles

You may have observed that some brands have taken the place of generic words for comparable products in our language. Even though we know that Band-Aid is a brand, we ask for a Band-Aid whenever we need a plastic bandage. The power of branding is quite astounding, and this is the benefit our next PPT template bundles will provide for your business. Potential customers will find it simpler to recognize your brand and correctly associate it with a particular good or service because of our platform-independent PowerPoint slides. Download now!

Template 3: Qualitative Research Interviewing Presentation Deck

Do you find it hard to handle challenging conversations at work? You can conduct effective interviews using this PowerPoint presentation. Our presentation on qualitative research interviewing aims to "give voice" to the subjects. It provides details on interviews, information, research, participants, and study methodologies. Download this PowerPoint presentation if you need to communicate effectively through a quick visual medium.

Template 4: Thematic Analysis Qualitative Research PPT PowerPoint Presentation Outline Rules CPB

Thematic analysis is a technique used in qualitative research to uncover hidden patterns and other theme-based inferences. Any research project can employ our thematic analysis qualitative research PPT. Use all the features of this adaptable deck to convey information well. With the proper icons and symbols, this presentation also works as an instructional tool, and it opens on any platform. Download now!

Template 5: Comparative Analysis of Qualitative Research Methods

Conducting a successful comparative analysis is essential if you or your company wants to ensure an efficient decision-making process. With the help of our comparative analysis of qualitative research techniques, you can make choices that work for both your company and your clients. Focus group interviews, cognitive mapping, critical incident technique, verbal protocol, data collection, data analysis, research scope, and objectives are covered in this extensive series of slides. Download today to carry out efficient business operations.

Template 6: Five Types of Qualitative Research Designs

Your business can achieve significant results with the help of our five qualitative research design types. It qualifies as a full-fledged presentation, incorporating layers of case studies, phenomenology, historical studies, and action research. Download this presentation template to apply an objective, open-ended technique and to carefully consider probable sources of error.

Template 7: Key Phases of the Qualitative Research Process

Any attempt at qualitative research, no matter how small, must follow the prescribed procedures. This pre-made PPT template combines the key stages of the qualitative research method. The set of slides covers data analysis, research approach, research design, research aim, problem description, research questions, philosophical assumptions, data collection, and result interpretation. Get it now.

Template 8: Thematic Analysis of Qualitative Research Data

Thematic analysis is performed on raw data acquired through focus groups, interviews, surveys, etc. Our slides on thematic analysis of qualitative research data cover every critical step, including how to uncover codes, identify themes in the data, finalize topics, explore each theme, and analyze documents. This completely editable PowerPoint presentation is available for instant download.

Template 9: SWOT Analysis of Qualitative Research Approach

Use this PowerPoint set to determine the strengths, weaknesses, opportunities, and threats facing your company. Each slide comes with a unique tool that can be used to strengthen your areas of weakness, grasp opportunities, and reduce risks. Use this template to collect statistics, add your own information, and then start considering how you might improve.

Download now!

Template 10: Qualitative Research through Graph Showing Revenue Growth

A picture truly is worth a thousand words, even when it comes to summarizing your research findings. Researchers face an unavoidable challenge when presenting qualitative study data; to address it, SlideTeam has created a user-responsive Graph Showing Revenue Growth template. This graph slide can help you make informed decisions and encourage your company's growth.

Template 11: Qualitative Research Data Collection Approaches and Implications

Like blood moving through the circulatory system, data moves through an organization; businesses cannot run without it. The first step in making better decisions is gathering data. This presentation template includes all the elements necessary to create a successful business plan, from data collection to the analysis methods best suited to comprehending concepts, opinions, or experiences. Get it now.

Template 12: Qualitative Research Analysis of Comments with Magnify Glass

The first step in performing a qualitative analysis of your data is gathering all the comments and feedback you want to examine. Our templates help you document those comments. These slides are fully editable and include a visual accessibility function, and the sections are well organized and formatted. Download it now.

PS: For more information on qualitative and quantitative data analysis, and to determine which type of market research is best for your company, check out this blog.

FAQs on Qualitative Research 

Writing a Qualitative Research Report

A qualitative report is a summary of an experience, activity, event, or observation. It requires extensive detail and is typically divided into several sections: the title, table of contents, and abstract form the beginning; the meat of the report comprises an introduction, background information on the issue, the researcher's role, theoretical viewpoint, literature review, methodology, ethical considerations, findings, data analysis, limitations, discussion, conclusions, and implications; the final sections are the references and any appendices.

How do you Report Data in Qualitative Research?

A qualitative research report is frequently built around themes. Be aware that it can be difficult to express qualitative findings as thoroughly as they deserve. It is customary to use direct quotes from sources such as interviews to support a viewpoint. To develop a precise description or explanation of the primary theme being studied, it is also crucial to clarify concepts and connect them. You also need to describe the study design, including how the subjects were chosen, and work through the remaining steps to document how the researcher verified the research's findings.

What is an Example of a Report of Qualitative Data?

Qualitative data are categorical by nature, and reports that use them make complex information easier to present. The semi-structured interview is one of the best illustrations of a qualitative data collection technique: it elicits open-ended responses from informants while allowing researchers to ask questions based on a set of predetermined themes. Since semi-structured interviews enable both inductive and deductive evaluative reasoning, they are crucial tools for qualitative research.

How do you write an Introduction for a Qualitative Report?

A qualitative report must have a strong introduction. In this section, the researcher states the aims and objectives of the systematic study and addresses the problem it seeks to solve. It is imperative to state whether the research's goals were met. The researcher also goes into further depth about the research problem, discusses the need for a methodical enquiry, and defines any technical words or phrases used.

Related posts:

  • [Updated 2023] Report Writing Format with Sample Report Templates
  • Top 10 Academic Report and Document Templates
  • 10 Best PowerPoint Templates for Non-Profit Organizations
  • Top 10 Proposal Executive Summary Templates With Samples And Examples


Enago Academy

How to Use Creative Data Visualization Techniques for Easy Comprehension of Qualitative Research


“A picture is worth a thousand words!” This oft-used adage holds true even when reporting your research data. Research studies with overwhelming data can be difficult for some readers to comprehend, or can even be time-consuming. While presenting quantitative research data is made easier by graphs, pie charts, etc., researchers face an undeniable challenge when presenting qualitative research data. In this article, we elaborate on effectively presenting qualitative research using data visualization techniques.


What is Data Visualization?

Data visualization is the process of converting textual information into graphical and illustrative representations. It is imperative to think beyond numbers to get a holistic and comprehensive understanding of research data. Hence, this technique is adopted to help presenters communicate relevant research data in a way that’s easy for the viewer to interpret and draw conclusions.

What Is the Importance of Data Visualization in Qualitative Research?

Data is broadly divided into qualitative data and quantitative data according to the form in which it is collected and expressed. Quantitative data expresses the size or quantity of data as countable integers. Qualitative data, by contrast, cannot be expressed in numerical values; it refers to data described in non-numeric form relating to subjects, places, things, events, activities, or concepts.

What Are the Advantages of Good Data Visualization Techniques?

Excellent data visualization techniques have several benefits:

  • Human eyes are drawn to patterns and colors. Moreover, in this age of Big Data, visualization is an asset for quickly and easily comprehending the large amounts of data generated in a research study.
  • Enables viewers to recognize emerging trends and accelerates their response time on the basis of what is seen and assimilated.
  • Makes it easier to identify correlated parameters.
  • Allows the presenter to narrate a story while helping the viewer understand the data and draw conclusions from it.
  • Because humans process visual images better than text, data visualization techniques help viewers remember the content longer.

Different Types of Data Visualization Techniques in Qualitative Research

Here are several data visualization techniques for presenting qualitative data for better comprehension of research data.

1. Word Clouds


  • A word cloud is a data visualization technique for visualizing one-word descriptions.
  • It is a single image composed of multiple words associated with a particular text or subject.
  • The size of each word indicates its importance or frequency in the data.
  • Wordle and Tagxedo are two widely used tools for creating word clouds.
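Since a word cloud simply scales each word by how often it appears, the underlying computation can be sketched in a few lines. This is a hedged illustration: the stopword list and the sample survey responses below are made-up examples, not from any real study.

```python
import re
from collections import Counter

# Minimal sketch of the counting that drives a word cloud: each word is
# later drawn at a size proportional to its frequency. The stopword list
# and the sample survey responses are hypothetical examples.
STOPWORDS = {"the", "a", "an", "is", "it", "and", "of", "to", "was", "but"}

def word_frequencies(responses):
    """Count content words across a list of free-text responses."""
    counts = Counter()
    for text in responses:
        words = re.findall(r"[a-z']+", text.lower())
        counts.update(w for w in words if w not in STOPWORDS)
    return counts

responses = [
    "The staff was helpful and the service was fast.",
    "Helpful staff, but the wait was long.",
    "Fast service and a helpful team.",
]
freqs = word_frequencies(responses)
print(freqs.most_common(3))  # the biggest words in the cloud
```

A rendering library would then map these counts to font sizes; for instance, the Python `wordcloud` package accepts such a frequency dictionary via `WordCloud.generate_from_frequencies`.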

2. Graphic Timelines


  • Graphic timelines present regular text-based timelines with pictorial illustrations, diagrams, photos, and other images.
  • A graphic timeline visually displays a series of events in chronological order on a timescale.
  • Furthermore, showcasing timelines graphically makes it easier to understand the critical milestones in a study.

3. Icons Beside Descriptions


  • Rather than writing long descriptive paragraphs, placing relevant icons beside brief, concise points enables quick and easy comprehension.

4. Heat Map


  • A heat map displays differences in data through color variations.
  • The intensity and frequency of the data are conveyed by these color codes.
  • However, a clear legend must accompany the heat map so that it can be interpreted correctly.
  • Additionally, heat maps help identify trends in the data.
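Underneath any heat map is just a matrix of values normalized onto a color scale. As a hedged sketch (the themes, groups, and counts below are invented for illustration), the data preparation might look like this; a plotting library such as matplotlib (`imshow`) or seaborn (`heatmap`) would then map the normalized values to colors:

```python
# Hedged sketch of the data behind a qualitative heat map (all counts are
# invented for illustration): rows are themes, columns are participant
# groups, and each cell is how often the theme was coded in that group.
themes = ["cost", "usability", "trust"]
groups = ["Group A", "Group B", "Group C"]

coded_counts = {
    "cost":      {"Group A": 8, "Group B": 2, "Group C": 5},
    "usability": {"Group A": 1, "Group B": 9, "Group C": 4},
    "trust":     {"Group A": 3, "Group B": 3, "Group C": 7},
}

def to_matrix(counts, rows, cols):
    """Flatten the nested dict into a row-major matrix."""
    return [[counts[r][c] for c in cols] for r in rows]

def normalize(matrix):
    """Scale every cell to the 0..1 range so a colormap can assign colors."""
    flat = [v for row in matrix for v in row]
    lo, hi = min(flat), max(flat)
    return [[(v - lo) / (hi - lo) for v in row] for row in matrix]

matrix = to_matrix(coded_counts, themes, groups)
norm = normalize(matrix)
# A plotting call such as plt.imshow(norm) would render the colors; the
# legend (colorbar) must map the colors back to the original counts.
```

Normalizing before plotting keeps the color scale comparable across cells; the legend then restores the link between color intensity and the raw counts.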

5. Mind Map


  • A mind map helps explain concepts and ideas linked to a central idea.
  • It allows visual structuring of ideas without overwhelming the viewer with large amounts of text.
  • Mind maps can also be used to present graphical abstracts.

Do’s and Don’ts of Data Visualization Techniques


It is not easy to visualize qualitative data in a way that viewers can recognize and comprehend at a glance. However, well-visualized qualitative data is very useful for clearly conveying key points to readers and listeners in presentations.

Are you struggling with ways to display your qualitative data? Which data visualization techniques have you used before? Let us know about your experience in the comments section below!


Would it be ideal or suggested to use these techniques to display qualitative data in a thesis perhaps?

Using data visualization techniques in a qualitative research thesis can help convey your findings in a more engaging and comprehensible manner. Here’s a brief overview of how to incorporate data visualization in such a thesis:

Select Relevant Visualizations: Identify the types of data you have (e.g., textual, audio, visual) and the appropriate visualization techniques that can represent your qualitative data effectively. Common options include word clouds, charts, graphs, timelines, and thematic maps.

Data Preparation: Ensure your qualitative data is well-organized and coded appropriately. This might involve using qualitative analysis software like NVivo or Atlas.ti to tag and categorize data.

Create Visualizations: Generate visualizations that illustrate key themes, patterns, or trends within your qualitative data. For example: word clouds can highlight frequently occurring terms or concepts; bar charts or histograms can show the distribution of specific themes or categories; timeline visualizations can display chronological trends; concept maps can illustrate the relationships between different concepts or ideas.

Integrate Visualizations into Your Thesis: Incorporate these visualizations within your thesis to complement your narrative. Place them strategically to support your arguments or findings. Include clear and concise captions and labels for each visualization, providing context and explaining their significance.

Interpretation: In the text of your thesis, interpret the visualizations. Explain what patterns or insights they reveal about your qualitative data. Offer meaningful insights and connections between the visuals and your research questions or hypotheses.

Maintain Consistency: Maintain a consistent style and formatting for your visualizations throughout the thesis. This ensures clarity and professionalism.

Ethical Considerations: If your qualitative research involves sensitive or personal data, consider ethical guidelines and privacy concerns when presenting visualizations. Anonymize or protect sensitive information as needed.

Review and Refinement: Before finalizing your thesis, review the visualizations for accuracy and clarity. Seek feedback from peers or advisors to ensure they effectively convey your qualitative findings.

Appendices: If you have a large number of visualizations or detailed data, consider placing some in appendices. This keeps the main body of your thesis uncluttered while providing interested readers with supplementary information.

Cite Sources: If you use specific software or tools to create your visualizations, acknowledge and cite them appropriately in your thesis.
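As an illustration of the "Data Preparation" step above, coded segments exported from a qualitative analysis tool such as NVivo or Atlas.ti can be regrouped into theme-level counts that feed charts like the bar charts mentioned earlier. This is only a sketch; all codes and quotes below are hypothetical:

```python
from collections import defaultdict

# Hedged sketch (hypothetical codes and quotes) of the data preparation
# step: coded segments are reorganized by top-level theme, producing the
# counts that a bar chart or heat map would display.
coded_segments = [
    ("barrier:cost", "The subscription was too expensive for us."),
    ("barrier:time", "We never had time to learn the tool."),
    ("barrier:cost", "Budget cuts forced us to drop the licence."),
    ("benefit:speed", "Reports that took days now take minutes."),
]

def theme_counts(segments):
    """Group codes under their top-level theme and count segments."""
    counts = defaultdict(int)
    for code, _quote in segments:
        theme = code.split(":", 1)[0]  # e.g. "barrier:cost" -> "barrier"
        counts[theme] += 1
    return dict(counts)

print(theme_counts(coded_segments))  # {'barrier': 3, 'benefit': 1}
```

Keeping the quote alongside each code also makes it easy to pull direct quotations into the thesis text to support each visualized theme.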

Hope you find this helpful. Happy Learning!



Qualitative Data Collection Methods

Lionel R Amarakoon

Related Papers

Research Methods for Business & Management

Kevin D O'Gorman

As one of our primary methodologies in the Methods Map (see chapter 4), qualitative techniques can yield valuable, revelatory, and rich data. They can be used on their own, or in conjunction with other research tools depending on the nature of the research project. For example, interviews can be used to explain and interpret the results of quantitative research, or conversely, to provide exploratory data that are later developed by quantitative research. MacIntosh & Bonnet (2007, p. 321) note with humour that “[q]ualitative research is sometimes styled as the poor cousin of ‘real science’…” This position can represent an added challenge to researchers. This chapter discusses some common approaches to qualitative research methods (see the ‘Techniques’ section of the Methods Map) and the issues that must be considered with their application in order for them not to be viewed as somehow inferior to ‘real science’.


The Qualitative Report

Nancy Leech

Introduction In many disciplines and fields representing the social and behavioral sciences, the quantitative research paradigm, which has its roots in (logical) positivism, marked the first methodological wave (circa the 19th century), inasmuch as it was characterized by a comprehensive and formal set of assumptions and principles surrounding epistemology (e.g., independence of knower and known, objectivism, real causes determining outcomes reliably and validly, time- and context-free generalizations), ontology (e.g., single reality), axiology (e.g., value-free), methodology (e.g., deductive logic, testing or confirming hypotheses/theory), and rhetoric (e.g., rhetorical neutrality, formal writing style, impersonal passive voice, technical terminology). The years 1900 to 1950 marked what could be termed as the second methodological wave, in which many researchers who rejected (logical) positivism embraced the qualitative research paradigm (1). Denzin and Lincoln (2005a) refer to thi...

Kevin Meethan

• Data analysis–the examination of research data.• Data collection–the systematic process of collecting data.• Deduction–arriving at logical conclusions through the application of rational processes eg theory testing. Quantitative research tends to be deductive.• Documentary research–the use of texts or documents as source materials (eg historical reports, newspapers, diaries).

Sapthami KKM

Seda Khadimally

In the process of designing a solid research study, it is imperative that researchers be aware of the data collection methods within which they are to conduct their study. Sound data collection cannot be performed without the choice of a particular research design best suited for the study, as well as the research questions of the study that need to be answered. In order to conduct a sound research, researchers need to ask right research questions that need to correspond to the problem of their study. Data collection is driven by the design based on which researchers prefer to conduct their study. According to Creswell (2013), this process comprises a “series of interrelated activities aimed at gathering good information to answer emerging research questions” (p. 146). There is an array of data collection methods and whatever research design researchers choose to work with, they should keep in mind that they cannot gather their data in solitude. This means that their study participants’ role plays a crucial part during data collection. There are also other factors pivotal to collecting data. Creswell (2013) described the entire process as a circle of activities which, when researchers are engaged in, need to be considered through multiple phases such as “locating the site/individual, gaining access and making rapport, purposefully sampling, collecting data, recording information, resolving field issues, and storing data” (p. 146). Researchers should be informed prior to gathering their data that data collection extends beyond conducting interviews with their participants or observing the site, individuals, or groups of people. They should particularly be cognizant of the fact that some of the data collection methods they choose overlap with each other while some considerably differ from one another. 
With this in mind, the purpose of this paper is to delineate similarities and differences among methods of data collection employed in three different research designs: ethnographic studies, phenomenological studies, and narrative histories. Issues that would lend themselves to these three study types will be addressed, and the challenges that researchers encounter when collecting their data using each type of design will also be discussed.

Catherine N . Mwai

Nassima SACI


  • Open access
  • Published: 14 May 2024

Evaluation of the feasibility of a midwifery educator continuous professional development (CPD) programme in Kenya and Nigeria: a mixed methods study

  • Duncan N. Shikuku 1 , 2 ,
  • Hauwa Mohammed 3 ,
  • Lydia Mwanzia 4 ,
  • Alice Norah Ladur 2 ,
  • Peter Nandikove 5 ,
  • Alphonce Uyara 6 ,
  • Catherine Waigwe 7 ,
  • Lucy Nyaga 1 ,
  • Issak Bashir 8 ,
  • Eunice Ndirangu 9 ,
  • Carol Bedwell 2 ,
  • Sarah Bar-Zeev 10 &
  • Charles Ameh 2 , 11 , 12  

BMC Medical Education, volume 24, Article number: 534 (2024)


Midwifery education is under-invested in developing countries, with limited opportunities for midwifery educators to improve or maintain their core professional competencies. To improve the quality of midwifery education and educators' capacity to update their competencies, a blended midwifery educator-specific continuous professional development (CPD) programme was designed with key stakeholders. This study evaluated the feasibility of this programme in Kenya and Nigeria.

This was a mixed methods intervention study using a concurrent nested design. 120 randomly selected midwifery educators from 81 pre-service training institutions were recruited. Educators completed four self-directed online learning (SDL) modules and the three-day practical training of the blended CPD programme on teaching methods (theory and clinical skills), assessments, effective feedback and digital innovations in teaching and learning. Pre- and post-training knowledge (using multiple choice questions in SDL), confidence (on a 0–4 Likert scale) and practical skills in preparing a teaching plan and microteaching (against a checklist) were measured. Differences in knowledge, confidence and skills were analysed. Participants' reactions to the programme (relevance and satisfaction assessed on a 0–4 Likert scale, what they liked and challenges) were collected. Key informant interviews with nursing and midwifery councils and institutions' managers were conducted. Thematic framework analysis was conducted for qualitative data.

116 (96.7%) and 108 (90%) educators completed the SDL and practical components respectively. Mean knowledge scores in SDL modules improved from 52.4% (± 10.4) to 80.4% (± 8.1), preparing teaching plan median scores improved from 63.6% (IQR 45.5) to 81.8% (IQR 27.3), and confidence in applying selected pedagogy skills improved from 2.7 to 3.7, p  < 0.001. Participants rated the SDL and practical components of the programme high for relevance and satisfaction (median, 4 out of 4 for both). After training, 51.4% and 57.9% of the participants scored 75% or higher in preparing teaching plans and microteaching assessments. Country, training institution type or educator characteristics had no significant associations with overall competence in preparing teaching plans and microteaching ( p  > 0.05). Qualitatively, educators found the programme educative, flexible, convenient, motivating, and interactive for learning. Internet connectivity, computer technology, costs and time constraints were potential challenges to completing the programme.

The programme was feasible and effective in improving educators’ knowledge and skills for effective teaching and learning. For successful roll-out, a policy framework for a mandatory midwifery educator-specific CPD programme is needed.


Introduction

Quality midwifery education underpins the provision of quality midwifery care and is vital for the health and well-being of women, infants, and families [ 1 ]. The recent State of the World’s Midwifery (SoWMy) report (2021) indicates that urgent investments are needed in midwifery, especially quality midwifery education, to improve health outcomes for women and neonates. Despite evidence to support midwifery, midwifery education and training is grossly underfunded in low- and middle-income countries (LMICs), with variation in the quality, content and duration of training between and within countries [ 2 ]. Barriers to achieving quality education include inadequate content, a lack of learning and teaching materials, insufficient and poorly trained educators, weak regulation, and midwifery educators having no connection with clinical practice or opportunities to update their knowledge and skills competencies [ 3 , 4 ].

The WHO, UNFPA, UNICEF and the International Confederation of Midwives’ (ICM) seven-step action plan to strengthen quality midwifery education, and ICM’s four pillars for midwives to achieve their potential, emphasize strengthening midwifery faculty to teach students as a key priority [ 4 , 5 ]. Consequently, ICM recommends that (i) at least 50% of the midwifery education curriculum should be practice-based with opportunities for clinical experience, (ii) midwifery faculty should use fair, valid and reliable formative and summative assessment methods to measure student performance and progress in learning and (iii) midwifery programmes have sufficient and up-to-date teaching and learning resources and technical support for virtual/distance learning to meet programme needs [ 6 ]. To achieve this, WHO’s Midwifery Educator Core Competencies and ICM’s Global Standards for Midwifery Education provide core competencies that midwifery educators must possess for effective practice [ 6 , 7 ]. The WHO’s global midwifery educator survey in 2018–2019 reported that fewer than half of the educators (46%) were trained or accredited as educators [ 5 ]. Educators are important determinants of quality graduates from midwifery programmes [ 7 ]. However, the survey identified that none of the educators felt confident in all of WHO’s midwifery educator core competencies [ 5 ]. Further evidence shows that many midwifery educators are more confident with theoretical classroom teaching than clinical teaching despite advances in teaching methods, and have low confidence in facilitating online/virtual teaching and learning [ 4 , 8 , 9 ]. To remain competent, design and deliver competency-based curricula and strengthen midwifery practice, ICM and WHO emphasize that midwifery faculty should engage in ongoing professional development as midwifery practitioners, teachers/lecturers and leaders [ 6 , 10 , 11 ]. However, in many settings there is inadequate provision of, or access to, faculty development opportunities [ 12 ].

Continuous professional development (CPD)

Continuous professional development has been defined as the means by which members of the profession maintain, improve and broaden their knowledge, expertise, and competence, and develop the personal and professional qualities required throughout their professional lives [ 13 ]. This can be achieved through multiple formal educational pathways based on the ICM Global Standards for Midwifery Education whilst incorporating the ICM Essential Competencies for Basic Midwifery Practice [ 6 , 14 ]. CPD activities may be formal, involving structured learning that often follows set curricula usually approved by independent accreditation services, or informal, usually involving self-directed learning. Participating in accredited CPD programmes is beneficial to the profession. A country requiring regular CPD for renewal of licensure ensures an up-to-date, relevant nursing and midwifery workforce [ 15 ] and increases the legitimacy of CPD [ 16 ]. Structured learning (direct or distance), mandatory training, attending workshops and conferences, accredited college/university courses and trainings, research and peer review activities are all opportunities for CPD [ 17 ]. Importantly, these CPD programmes are essential for safe, competent and effective practice, which is central to the universal health coverage (UHC) and maternal and newborn health SDG agendas, particularly in developing countries [ 18 , 19 ].

Whilst regulatory bodies and employers in many countries have requirements for midwives to complete CPD programmes and activities, these programmes and supporting activities are found to be ineffective if CPD is irrelevant to the practitioners’ practice setting, attended only because of monetary or non-monetary benefits, geared towards improving a skill for which there is no demonstrated need, or taken only to meet regulatory requirements rather than to close a competency gap [ 20 ]. In most LMICs, midwifery licensure is permanent, without obligation to demonstrate ongoing education or competence [ 15 ]. Consequently, CPD processes are not in place, and where in place, not fully utilised. A systematic review of CPD status in WHO Regional Office for Africa member states reported that nurses and midwives are required to attend formalised programmes delivered face-to-face or online, but only 16 out of 46 (34.7%) member states had mandatory CPD programmes [ 15 ]. This underscores the need for designing regulator-approved midwifery educator CPD programmes to improve the quality of midwifery education in LMICs.

Modes and approaches for delivery of CPD

Face-to-face contact is a common mode of delivery of CPD although mHealth is an emerging platform that increases access, particularly to nurses and midwives in rural areas [ 12 , 21 ]. Emerging platforms and organisations such as World Continuing Education Alliance (WCEA) offer mHealth learning opportunities in LMICs for skilled health personnel to access CPD resources that can improve health care provider knowledge and skills and potentially positively impact healthcare outcomes [ 22 ]. Although there is evidence of capacity building initiatives and CPD for midwifery educators in LMICs [ 23 ], these have been largely delivered as part of long duration (2-year) fellowship programmes and led by international organisations. In addition, these programmes have largely focused on curriculum design, leadership, management, research, project management and programme evaluation skills in health professions education with little on teaching and learning approaches and assessment for educators [ 24 , 25 , 26 ]. Successful CPD initiatives should be (i) accredited by the national regulatory bodies (Nursing and Midwifery Councils); (ii) multifaceted and provide different types of formal and informal learning opportunities and support; (iii) combine theory and clinical practice to develop the knowledge, skills and attitudes and (iv) must be adapted to fit the local context in which participants work and teach to ensure local ownership and sustainability of the initiatives [ 16 ].

Short competency-based blended trainings for educators improve their competence and confidence in delivering quality midwifery teaching. However, systems for regular updates to sustain the competencies are lacking [ 27 , 28 ]. Evidence on the effectiveness of the available CPD initiatives is limited. Even where these initiatives have been evaluated, this has largely focused on the outcomes of the programmes, with little attention to the feasibility and sustainability of such programmes in low-resourced settings [ 24 , 25 , 29 ]. As part of global investments to improve the quality of midwifery education and training, the Liverpool School of Tropical Medicine (LSTM), in collaboration with the UNFPA Headquarters Global Midwifery Programme and Kenyan midwifery educators, developed a blended midwifery educator CPD programme (described in detail in the methods section). The modules in this programme are aligned to the WHO’s midwifery educators’ core competencies [ 7 ] and the ICM essential competencies for midwifery practice [ 14 ]. The programme is also aligned to the national regulatory requirements of the Nursing and Midwifery Councils in LMICs such as Kenya and Nigeria, and to relevant national policy [ 30 , 31 , 32 ]. The programme aimed to sustain and improve educators’ competencies in the delivery of their teaching, assessments, mentoring and feedback to students. To promote uptake, there is a need to test the relevance and practicability of the CPD programme. Feasibility studies are used to determine whether an intervention is appropriate for further testing, relevant and sustainable, answering the question – can it work [ 33 ]?
The key focus of these studies is the acceptability of the intervention; the resources and ability to manage and implement the intervention (availability, requirements, sustainability); practicality; adaptation; integration into the system; limited efficacy testing of the intervention in controlled settings; and preliminary evaluation of participant responses to the intervention [ 33 , 34 , 35 ].

This study evaluated the feasibility of the LSTM/UNFPA midwifery educator CPD programme using Kirkpatrick’s model for evaluating training programmes [ 36 ]. This model is an effective tool with four levels for evaluating training programmes. Level 1 (participants’ reaction to the programme experience) helps to understand how satisfying, engaging and relevant participants find the experience. Level 2 (learning) measures the changes in knowledge, skills and confidence after training. Level 3 (behaviour) measures the degree to which participants apply what they learned during training when they are back on the job; this can be assessed immediately and several months after the training. This level is critical as it can also reveal where participants might need help to transfer learning from the training to practice afterwards. Level 4 (results) measures the degree to which targeted outcomes occur because of the training. In this study, participants’ reaction to the programme – satisfaction and relevance of the programme to meeting their needs (level 1) – and change in knowledge, confidence and skills after the CPD programme (level 2) were assessed. User perspectives and barriers to implementing the CPD programme were also explored.

Study design

This was a mixed methods intervention study using a concurrent nested (embedded) design, conducted in Kenya and Nigeria in May and June 2023. It was designed to evaluate the feasibility of the midwifery educator CPD programme. The goal was to obtain different but complementary data to better understand the CPD programme, with the data collected from the same participants or similar target populations [ 37 ].

The quantitative component of the evaluation used quasi-experimental pre-post and post-test-only designs to evaluate the effectiveness of the blended CPD programme intervention among midwifery educators from mid-level training colleges and universities in the two countries. Pre- and post-evaluation of knowledge (online self-directed component) and skills (developing a teaching plan during the face-to-face component) was performed. Post-intervention evaluation of programme satisfaction, relevance of the CPD programme and microteaching sessions for educators was conducted.

The qualitative component of the evaluation included open-ended written responses from the midwifery educators and master trainers describing what worked well (enablers), the challenges/barriers experienced in the blended programme and key recommendations for improvement. In addition, key informant interviews with the key stakeholders (nursing and midwifery councils and the national heads of training institutions) were conducted. Data on challenges anticipated in the scale-up of the programme and measures to promote sustainability, access and uptake of the programme were collected from both educators and key stakeholders.

A mixed methods design was used for its strengths in (i) collecting the two types of data (quantitative and qualitative) simultaneously during a single data collection phase, (ii) providing the study with the advantages of both quantitative and qualitative data and (iii) helping to gain perspectives and contextual experiences from the different types of data and from different levels (educators, master trainers, heads of training institutions and nursing and midwifery councils) within the study [ 38 , 39 ].

The study was conducted in Kenya and Nigeria. Kenya has over 121 mid-level training colleges and universities offering nursing and midwifery training while Nigeria has about 300. Due to the vastness of Nigeria, representative government-owned nursing and midwifery training institutions were randomly selected from each of the six geo-political zones in the country and the Federal Capital Territory. Mid-level training colleges offer integrated nursing and midwifery training at diploma level while universities offer integrated nursing and midwifery training at bachelor/master degree level in the two countries (three universities in Kenya offer midwifery training at bachelor level). All nurse-midwives and midwives trained at both levels are expected to possess ICM competencies to care for the woman and newborn. Midwifery educators in Kenya and Nigeria are required to have at least advanced diploma qualifications, although years of clinical experience are not specified.

It is a mandatory requirement of the Nursing and Midwifery Councils in both countries for nurse-midwives and midwifery educators to demonstrate evidence of CPD for renewal of their practising license [ 40 , 41 ]. A minimum of 20 CPD points (equivalent to 20 credit hours) is recommended annually for Kenya and 60 credit hours every three years for Nigeria. However, no midwifery educator-specific CPD programme incorporating both face-to-face and online modes of delivery is available for Kenya and Nigeria, or indeed for many countries in the region. Nursing and midwifery educators are registered and licensed to practice nursing and midwifery, while those from other disciplines who teach in the midwifery programme are qualified in the content they teach.

Study sites

In Kenya, two mid-level colleges (Nairobi and Kakamega Kenya Medical Training Colleges (KMTCs)) and two universities (Nairobi and Moi Universities), selected based on the geographical distribution of the training institutions, were identified as CPD Centres of Excellence (COEs)/hubs. In Nigeria, two midwifery schools (Centre of Excellence for Midwifery and Medical Education, College of Nursing and Midwifery, Illorin, Kwara State and Centre of Excellence for Midwifery and Medical Education, School of Nursing Gwagwalada, Abuja, FCT) were identified. These centres were equipped with teaching and emergency obstetric and newborn care (EmONC) training equipment for the practical components of the CPD programme. The centres were selected based on the availability of spacious training labs/classes specific for skills training and storage of equipment, and the presence of an EmONC master trainer among the educators in the institution. They were designated as host centres for the capacity strengthening of educators in EmONC and teaching skills.

Intervention

Nursing and midwifery educators accessed and completed 20 h of free, self-directed online modules on the WCEA portal and face-to-face practical sessions in the CPD centres of excellence.

The design of the midwifery educator CPD programme

The design of the CPD modules was informed by the existing gap for professional development for midwifery educators in Kenya and other LMICs and the need for regular updates in knowledge and skills competencies in delivery of teaching [ 9 , 15 , 23 , 28 ]. Liverpool School of Tropical Medicine led the overall design of the nursing and midwifery educator CPD programme (see Fig.  1 for summarised steps taken in the design of the blended programme).

This was a two-part blended programme with a 20-hour self-directed online learning component (accessible through the WCEA platform at no cost) and a 3-day face-to-face component, designed to cover theoretical and practical skills components respectively. The 20-hour self-directed online component had four 5-hour modules on reflective practice, teaching/learning theories and methods, student assessments, and effective feedback and mentoring. These modules had pretest and post-test questions and were interactive, with short videos, short quizzes within modules, links for further directed reading and resources to promote active learning. This online component is also available on the WCEA platform as a resource for other nursing and midwifery educators across the globe ( https://wcea.education/2022/05/05/midwifery-educator-cpd-programme/ ).

Practical aspects of competency-based teaching pedagogy, clinical teaching skills including selected EmONC skills, giving effective feedback, applying digital innovations in teaching and learning for educators and critical thinking and appraisal were delivered through a 3-day residential face-to-face component in designated CPD centres of excellence. Specific skills included: planning and preparing teaching sessions (lesson plans), teaching practical skills methodologies (lecture, simulation, scenario and role plays), selected EmONC skills, managing teaching and learning sessions, assessing students, providing effective feedback and mentoring and use of online applications such as Mentimeter and Kahoot in formative classroom assessment of learning. Selected EmONC skills delivered were shoulder dystocia, breech delivery, assisted vaginal delivery (vacuum assisted birth), managing hypovolemic shock and pre-eclampsia/eclampsia and newborn resuscitation. These were designed to reinforce the competencies of educators in using contemporary teaching pedagogies. The goal was to combine theory and practical aspects of effective teaching as well as provide high quality, evidence-based learning environment and support for students in midwifery education [ 4 ]. These modules integrated the ICM essential competencies for midwifery practice to provide a high quality, evidence-based learning environment for midwifery students. The pre and post tests form part of the CPD programme as a standard assessment of the educators.

As part of the design, this programme was piloted among 60 midwifery educators and regulators from 16 countries across Africa at the UNFPA funded Alliance to Improve Midwifery Education (AIME) Africa regional workshop in Nairobi in November 2022. They accessed and completed the self-directed online modules on the WCEA platform, participated in selected practical sessions, self-evaluated the programme and provided useful feedback for strengthening the modules.

The Nursing and Midwifery Councils of Kenya and Nigeria host the online CPD courses from individual or organisational entities on the WCEA portal. In addition, the Nursing Council of Kenya provides opportunities for self-reporting of various CPD events including accredited online CPD activities/programmes, skill development workshops, attending conferences and seminars, in-service short courses, and practice-based research projects (as learner, principal investigator, principal author, or co-author) among others. In Nigeria, a certificate of attendance for the Mandatory Continuing Professional Development Programme (MCPDP) is required as evidence of CPD during license renewal. However, accredited CPD programmes specific to midwifery educators are not available in either country or in the Africa region [ 15 , 42 ].

Fig. 1 Midwifery educator CPD programme design stages

Participants and sample size

Bowen and colleagues suggest that many feasibility studies are designed to test an intervention in a limited way and such tests may be conducted in a convenience sample, with intermediate rather than final outcomes, with shorter follow-up periods, or with limited statistical power [ 34 ].

A convenience random sample across the two countries was used. Sample size calculations were based on the formula for a 95% confidence interval for a proportion: \(p \pm 1.96\sqrt{\frac{p(1-p)}{n}}\), where the margin of error (d) is the second term in the equation. For calculation of the detectable percentage change in competence, Stata’s power paired-proportions function was used.

To achieve the desired low margin of error of 5% with a proportion of 90% competence after training, a sample of 120 participants was required. Using the same sample to assess competence before and after training, so that the improvement in the percentage competent can be derived, and assuming 2.5% are assessed as competent prior to training but not after training (regress), the study had 90% power to detect a 12% improvement in competence after the training.
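As a quick sanity check, the margin of error implied by these numbers can be reproduced with a short script (a sketch only; the proportion of 0.9 and n = 120 are taken from the text above):

```python
from math import sqrt

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Half-width of the normal-approximation 95% CI for a proportion."""
    return z * sqrt(p * (1 - p) / n)

# A proportion of 0.9 with n = 120 educators gives a margin of error of ~5%
d = margin_of_error(0.9, 120)
print(round(d, 3))  # → 0.054
```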

A random sample of 120 educators (60 each from Kenya and Nigeria; 30 each from mid-level training colleges and universities) was invited via email to participate in the two components of the CPD programme (Fig. 2). Importantly, only participants who completed the self-directed online modules were eligible to progress to the face-to-face practical component.

Fig. 2 Flow of participants in the CPD programme (SDL = self-directed online learning; F2F = face-to-face practical)

For qualitative interviews, eight key informant interviews were planned with a representative each from the Nursing and Midwifery Councils, mid-level training institutions’ management, university and midwifery associations in both countries. Interviews obtained data related to challenges anticipated in the scale up of the programme and measures to promote sustainability, access and uptake of the programme.

Participant recruitment

Only nursing and midwifery educators registered and licensed by the Nursing and Midwifery Councils were eligible and participated. This was because they can access the WCEA website with the self-directed online programme via the Nursing and Midwifery Councils’ websites, only accessible to registered and licensed nurses and midwives.

The recruitment process was facilitated through the central college management headquarters (for mid-level training colleges’ educators) and Nursing and Midwifery Councils (for university participants). Training institutions’ heads of nursing and midwifery departments were requested to share the contact details of all educators teaching midwifery modules, particularly the antepartum, intrapartum, postpartum and newborn care modules in the two countries. A list of 166 midwifery educators from 81 universities and mid-level training colleges was obtained through the Heads of the Department in the institutions.

The research lead, with the assistance of the co-investigator from Nigeria, then randomly sampled 120 educators based on institution type and region for representativeness across the countries. Following the selection of participants, the two investigators shared the detailed electronic participant study information sheet and consent form with the potential participants one week before the start of the self-directed online modules. Clear guidance and emphasis on the conduct of the two-part programme, including completing the mandatory four self-directed online modules, was provided. Due to the large number of eligible participants, the recruitment and consenting process was closed after reaching the first 30 consenting participants per institution type and region, with 1–2 educators per institution randomly recruited. This allowed as many institutions as possible to be represented across each country. Participants received a study information sheet and an auto-generated copy of the completed electronic consent form in their emails. Other opportunities for participating in the two-part programme were provided as appropriate for those who missed out. Only those who completed the four online modules were invited to the practical component. A WhatsApp community group for the recruited participants was formed for clarifications about the study and troubleshooting challenges with online access and completion of the modules before and during the programme.

Self-directed online component

Upon consenting, the contact details of the educators from each level were shared with the WCEA programme director for generation of a unique identification code to access the self-directed online modules on the WCEA portal. Educators completed their baseline characteristics (demographic and academic) on the online platform just before the modules. Each self-directed online module was estimated to take five hours to complete. Only after completing a module was a participant allowed to progress to the next module. The modules were available for participants to complete at their own time/schedule. An auto-generated certificate of completion with the participant’s post-completion score was awarded as evidence of completing a module. Participants completed a set of 20 similar pretest and posttest multiple choice questions in each module as a knowledge check. A dedicated staff member from WCEA actively provided technical support for educators to register, access and complete the online modules. At the end of each module, participants completed a self-evaluation on a 5-point Likert scale for satisfaction (0 = very unsatisfied, 1 = unsatisfied, 2 = neutral, 3 = satisfied and 4 = very satisfied) and relevance of the modules (0 = very irrelevant, 1 = irrelevant, 2 = neutral, 3 = relevant and 4 = very relevant). This provided participants’ reactions to the different components of the modules and whether they met the individual educator’s development needs. In addition, participants responded to open-ended questions at the end of the modules on what they liked about the modules, challenges encountered in completing them and suggestions for improvement. A maximum period of two weeks was given for educators to complete the modules before progressing to the practical component.

Practical component

The practical component was delivered by a pool of 18 master trainers who received a 1-day orientation from the research lead before the training. The master trainers were a blend of experienced midwifery and obstetrics faculty in teaching and clinical practice actively engaged in facilitating EmONC trainings selected from Kenya and Nigeria. Four of these master trainers from Kenya participated in the delivery of both sets of trainings in Kenya and Nigeria.

Only educator participants who completed the self-directed online modules and were certified were invited to participate in the 3-day residential practical component. Two separate classes (mid-level and university-level educators) were trained per country by the same group of eight master trainers. The sessions were delivered through short interactive lectures; small group and plenary discussions; skills demonstrations/simulations and scenario teaching in small breakout groups; and role plays and debrief sessions. Sessions on digital innovations in teaching and learning were live practical sessions, with every participant using their own laptop. Nursing and Midwifery Council representatives and training institutions’ managers were invited to participate in both components of the programme.

Participant costs for participating in the two-part CPD programme were fully sponsored by the study. These were internet data for completing the self-directed online component and residential costs – transport, accommodation, and meals during the practical component.

Data collection

Self-directed online knowledge pretest and post-test results and self-rated measures of satisfaction and relevance of the modules, including what participants liked about the modules, challenges encountered in accessing and completing them and suggestions for improvement, were extracted from the WCEA platform into Microsoft Excel.

On day 1 of the practical component, participants developed a teaching plan using their personal computers. On the last day (day 3), participants prepared a teaching plan and PowerPoint presentation for the microteaching sessions. No teaching plan template from the trainers was provided to the participants before the training; however, they used formats from their institutions if available. A standard teaching plan template was provided at the end of the training.

The master trainers and participants were divided into groups for the microteaching sessions, which formed part of the formative assessment. Each participant delivered a PowerPoint presentation on a topic of interest (covered in the teaching plan) to a small group of 13–15 participants. This was followed by a structured session of constructive feedback that started with a self-reflection and assessment, followed by supportive and constructive feedback from the audience participants and faculty/master trainers identifying areas of effective practice and opportunities for further development. Each microteaching session lasted 10–15 min. Each microteaching presentation and teaching plan was evaluated against a pre-determined electronic checklist by two designated faculty members independently during/immediately after the microteaching session. The checklist was adapted from LSTM’s microteaching assessment of the United Kingdom’s Higher Education Academy (HEA)’s Leading in Global Health Teaching (LIGHT) programme. The evaluation included preparing a teaching plan, managing a teaching and learning session using multiple interactive activities, designing and conducting formative assessments for learning using digital/online platforms, and giving effective feedback and critical appraisal. The master trainers received an orientation training on the scoring checklist from the lead researcher/corresponding author.

Self-rated confidence in different teaching pedagogy skills was evaluated before (on day 1) and after (day 3) the training on a 5-point Likert scale (0 = not at all confident, 1 = slightly confident, 2 = somewhat confident, 3 = quite confident and 4 = very confident). An evaluation of satisfaction with, and relevance of, the practical component on a 5-point Likert scale was completed by the participants on an online form on day 3, after the microteaching sessions. This form also had a similar qualitative survey with open-ended questions on what they liked about the practical component, challenges encountered in completing it and suggestions for improvement.

Using a semi-structured interview guide, six qualitative key informant interviews, each lasting about 30–45 min, were conducted by the lead researcher with the Nursing and Midwifery Councils’ focal persons and training institutions’ managers. The interviews were audio recorded in English and anonymized, and the recordings were deleted after transcription. These interviews aimed to capture perspectives on the programme design, anticipated barriers/enablers of the CPD programme and strategies for promoting uptake of the CPD programme. Six interviews were considered adequate due to their information power (the more information the sample holds that is relevant for the study, the fewer participants are needed) [ 43 ] and upon reaching data saturation, considered the cornerstone of rigor in qualitative research [ 44 , 45 ].

Assessment of outcomes

Participants’ reaction to the programme (satisfaction and relevance) (Kirkpatrick level 1) was tested using the self-rated 5-point Likert scales. Change in knowledge, confidence and skills (Kirkpatrick level 2) was tested as follows: knowledge through 20 pretest and post-test multiple choice questions per module in the self-directed online modules; confidence in applying different pedagogy skills through the self-rated 5-point Likert scale; and teaching skills through the observed microteaching sessions using a checklist.
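To make the level 2 comparisons concrete, the pre/post analyses described above can be sketched as follows. This is an illustrative sketch only: the scores are invented, and the choice of a paired t-test for knowledge scores and a Wilcoxon signed-rank test for ordinal Likert confidence ratings is an assumption consistent with the paired pre/post design, not the authors’ exact code.

```python
from scipy import stats

# Hypothetical pre/post knowledge scores (%) for the same 8 educators
pre_knowledge = [50, 55, 48, 60, 52, 58, 45, 62]
post_knowledge = [78, 85, 74, 88, 80, 83, 72, 90]

# Paired t-test for approximately normal knowledge scores
t_stat, p_val = stats.ttest_rel(post_knowledge, pre_knowledge)

# Wilcoxon signed-rank test for ordinal 0-4 Likert confidence ratings
pre_conf = [2, 3, 2, 3, 2, 3, 2, 3]
post_conf = [4, 4, 3, 4, 4, 4, 3, 4]
w_stat, w_p = stats.wilcoxon(post_conf, pre_conf)

print(p_val < 0.05, w_p < 0.05)
```

With real data, the normality check (e.g. Kolmogorov-Smirnov) would guide whether the parametric or rank-based test is appropriate for each outcome.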

Reliability and validity of the data collection tools

The internal consistency (a measure of the reliability, generalizability or reproducibility of a test) of the Likert scales/tools assessing the relevance of the online and practical modules and the satisfaction of educators with the two blended modules was tested using Cronbach’s alpha. The Cronbach’s alpha statistics for the four Likert scales/tools ranged from 0.835 to 0.928, indicating good to excellent reliability [ 46 ]. Validity (the accuracy of a measure) of the Likert scales was tested using the Pearson correlation coefficient. Obtained correlation values were compared to the critical values and p-values reported at 95% confidence intervals. All the scales were valid, with obtained Pearson correlation coefficients greater than the critical value of 0.1946 ( p  < 0.001) [ 46 ]. The semi-structured interview guides for the qualitative interviews with the training institutions’ managers and midwifery councils (regulators) were developed and reviewed by expert study team members with experience in qualitative research.
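For readers who wish to see how the reliability statistic above is derived from item-level Likert responses, Cronbach’s alpha can be computed directly: it compares the sum of the individual item variances with the variance of the respondents’ total scores. A minimal Python sketch follows; the scores in the example are hypothetical, not the study data, and the study itself used SPSS.

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha for a set of Likert items.

    items: list of k lists, each holding one item's scores across
    the same n respondents (0-4 Likert values here).
    alpha = k/(k-1) * (1 - sum(item variances) / variance(totals)).
    """
    k = len(items)
    item_vars = sum(pvariance(col) for col in items)
    # Total score per respondent across all k items.
    totals = [sum(scores) for scores in zip(*items)]
    total_var = pvariance(totals)
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical 0-4 ratings: 3 items, 6 respondents.
example = [[3, 4, 4, 2, 3, 4],
           [3, 3, 4, 2, 3, 4],
           [2, 4, 3, 2, 3, 4]]
alpha = cronbach_alpha(example)
```

Values of 0.8 and above are conventionally read as good and 0.9 and above as excellent, which is how the reported 0.835–0.928 range is interpreted.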

Data management and analysis

Data from the online/electronic tools were extracted into Microsoft Excel and exported to SPSS version 28 for cleaning and analysis. Normality of data was tested using the Kolmogorov-Smirnov test, suitable for samples above 50. Proportions of educator characteristics in the two countries were calculated, and differences between the educator characteristics in the two countries were tested using chi-square tests (or Fisher’s exact test for cells with counts of less than 5).
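To illustrate the chi-square test of independence used for these between-country comparisons, the statistic for a 2×2 table can be computed by hand from observed and expected counts; for one degree of freedom the p-value reduces to a complementary error function. This is an illustrative sketch with hypothetical counts; the study analyses were run in SPSS.

```python
from math import erfc, sqrt

def chi2_2x2(table):
    """Pearson chi-square test of independence for a 2x2 table.

    table: [[a, b], [c, d]] observed counts (e.g. country x
    characteristic). Returns (statistic, p_value) with df = 1.
    """
    (a, b), (c, d) = table
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    expected = [
        [row1 * col1 / n, row1 * col2 / n],
        [row2 * col1 / n, row2 * col2 / n],
    ]
    stat = sum(
        (obs - exp) ** 2 / exp
        for o_row, e_row in zip(table, expected)
        for obs, exp in zip(o_row, e_row)
    )
    # For 1 degree of freedom the chi-square survival
    # function is erfc(sqrt(x / 2)).
    p = erfc(sqrt(stat / 2))
    return stat, p
```

When any expected cell count falls below 5, Fisher’s exact test is preferred, as noted above.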

For self-rated relevance of CPD programme components and satisfaction with the programme on the 0–4 Likert scales, descriptive statistics were calculated (median scores and proportions). Results are presented as bar graphs and tables. Cronbach alpha and Pearson correlation coefficients were used to test the reliability and validity of the test items respectively.

Change in knowledge in the online modules, and in confidence in pedagogy skills and preparing teaching plans, was assessed by comparing pre-training and post-training scores. Descriptive statistics are reported based on normality of data. Differences in the scores were analysed using the Wilcoxon signed-rank test, a non-parametric equivalent of the paired t-test. Differences in educators’ microteaching scores by country and institution type were tested with the Mann-Whitney U test. The level of competence demonstrated in the teaching plan and microteaching skill was defined as the percentage of the desired characteristics present in the teaching plan and microteaching session, with the competence threshold set at 75% and above. The proportion of participants that achieved the desired level of competence in their teaching plan and microteaching skill was calculated. Binary logistic regression models were used to assess the strengths of associations between individual educator and institutional characteristics (age, gender, qualifications, length of time as educator, training institution and country) and the dichotomised competence outcome (achieving competence in teaching plan and microteaching skills). P-values less than 0.05 at 95% confidence intervals were considered statistically significant.
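The Wilcoxon signed-rank test at the core of the pre/post comparisons works by ranking the absolute paired differences and summing the ranks of the positive ones. The minimal Python sketch below uses the large-sample normal approximation, drops zero differences and omits the tie correction, so it is illustrative only and not a substitute for the SPSS procedure used in the study.

```python
from math import erfc, sqrt

def wilcoxon_signed_rank(pre, post):
    """Wilcoxon signed-rank test via the normal approximation.

    Simplified sketch: zero differences are dropped and no tie
    correction is applied, so p-values on heavily tied Likert
    data are approximate. Returns (W_plus, two_sided_p).
    """
    diffs = [b - a for a, b in zip(pre, post) if b != a]
    n = len(diffs)
    # Rank absolute differences, averaging ranks over ties.
    order = sorted(range(n), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and abs(diffs[order[j + 1]]) == abs(diffs[order[i]]):
            j += 1
        avg = (i + j) / 2 + 1  # average of ranks i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    w_plus = sum(r for d, r in zip(diffs, ranks) if d > 0)
    mean = n * (n + 1) / 4
    sd = sqrt(n * (n + 1) * (2 * n + 1) / 24)
    z = (w_plus - mean) / sd
    p = erfc(abs(z) / sqrt(2))  # two-sided normal tail
    return w_plus, p
```

With uniformly improved post-training scores, W+ takes its maximum value n(n+1)/2 and the two-sided p-value becomes very small, mirroring the significant improvements reported below.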

Preparation for qualitative data analysis involved a rigorous process of transcribing the recorded key informant interviews. In addition, online free-text responses by midwifery educators on what worked well, challenges encountered and recommendations were extracted in Microsoft Excel format and exported to Microsoft Word for data reduction (coding) and theme development. Qualitative data were analysed using the thematic analysis framework of Braun and Clarke (2006), as it provides clear steps to follow, is flexible, uses a very structured process and enables transparency and team working [ 47 ]. Due to the small number of transcripts, computer-assisted coding in Microsoft Word using the margin and comments tools was used. Braun and Clarke’s six steps of thematic analysis were followed: (i) familiarising oneself with the data through transcription and reading transcripts, looking for recurring issues/inconsistencies and identifying possible categories and sub-categories of data; (ii) generating initial codes – both deductive (using topic guides/research questions) and inductive (recurrent views, phrases and patterns from the data) coding was conducted for transparency; (iii) searching for themes by collating initial codes into potential sub-themes/themes; (iv) reviewing themes by generating a thematic map (code book) of the analysis; (v) defining and naming themes (ongoing analysis to refine the specifics of each sub-theme/theme, and the overall story the analysis tells); and (vi) writing findings/producing a report. Confidentiality was maintained by using pseudonyms for participant identification in the study.
Trustworthiness was achieved by (i) respondent validation/check during the interviews for accurate data interpretation; (ii) using a criterion for thematic analysis; (iii) returning to the data repeatedly to check for accuracy in interpretation; (iv) quality checks and discussions with the study team with expertise in mixed methods research [ 39 , 47 ].

Integration of findings used the parallel-databases variant and is synthesised in the discussion section. In this common approach, two parallel strands of data are collected and analysed independently and are only brought together during interpretation. The two sets of independent results are then synthesised or compared during the discussion [ 39 ].

Quantitative findings

Midwifery educators’ characteristics

A total of 116 (96.7%) and 108 (90.0%) educators from 81 institutions in the two countries completed the self-directed online learning and the practical component respectively. There were no significant differences between countries in educators’ qualifications, when they last taught a midwifery class or whether they had attended any CPD training in the year preceding the study ( p  > 0.05). Overall, only 28.7% of the educators had undertaken midwifery-related CPD training in the year preceding the study. Midwifery educator characteristics are outlined below (Table  1 ).

Change in knowledge

This was assessed in each of the four self-directed online modules. Ranked scores based on the Wilcoxon signed-rank test showed significant improvements in educators’ knowledge in all four online modules completed ( p  < 0.001). The highest mean score improvement was observed in the students’ assessment module, from 48.1% (SD ± 15.1) to 85.2% (SD ± 15.7), a 37.1% improvement. Improvements in knowledge in the other modules were as follows: reflective practice (27.6%), mentoring and giving effective feedback (27.4%) and teaching methods (19.2%). The overall knowledge score for all modules improved from 52.4% (SD ± 10.4) to 80.4% (SD ± 8.1), p  < 0.001 (Table  2 ).

Relevance of self-directed online modules

The internal consistency of each of the four modules was tested with Cronbach’s alpha. The overall Cronbach’s alpha for the four items was 0.837, a good and acceptable level of reliability. All the four modules assessed were valid with calculated Pearson correlation coefficient values greater than the critical value of 0.1946 ( p  < 0.001) at 95% confidence interval.

On a scale of 0–4, educators from the two countries rated the online modules as very relevant, with a median score of 4 out of 4 (IQR 0) for each of the four modules: reflective practice, teaching methods, students’ assessments, and mentoring and giving effective feedback. There were no ratings of 0, 1 or 2 for any of the modules (Fig.  3 ).

Fig. 3 Educators’ ratings of the relevance of self-directed online modules

Satisfaction with the self-directed online modules

The internal consistency of each of the eight items was tested with Cronbach’s alpha. The overall Cronbach’s alpha for the eight items was 0.928, an excellent level of reliability. All the eight items assessed were valid with their obtained Pearson correlation coefficient values greater than the critical value of 0.1946 ( p  < 0.001) at 95% confidence interval.

Each of the eight satisfaction items had a median score of 4 out of 4 (IQR 0). Over 80% of the educators were very satisfied with the online modules’ content, finding it informative and presented in a logical format. The modules also helped them learn something new and update their knowledge, and the materials were useful and valuable for their practice. Over 70% were very satisfied that the modules helped them refresh their knowledge and skills, with the links and activities embedded in the modules useful additions to their learning. None of the educators were dissatisfied (rated 0 or 1) with the online modules (Table  3 ).

Change in confidence in different pedagogy skills

The internal consistency of each of the eight items assessed was tested with Cronbach’s alpha using the baseline data. The overall Cronbach’s alpha for the eight items was 0.893, a good level of reliability. All the eight items assessed were valid with their obtained Pearson correlation coefficient values greater than the critical value of 0.1946 ( p  < 0.001) at 95% confidence interval.

Changes in confidence before and after the training were compared using the Wilcoxon signed-rank test, a non-parametric equivalent of the paired t-test used when data are not normally distributed. The mean self-rated confidence of educators on a scale of 0–4 across all eight skills improved significantly after the training, from 2.73 (SD ± 0.68) to 3.74 (SD ± 0.34) ( p  < 0.001). Before the training, mean confidence was highest in facilitating a lecture (3.23, SD ± 0.8) and lowest in using digital innovations (Mentimeter) in formative assessment of teaching/learning (1.75, SD ± 1.15). These improved significantly after the training, to 3.84 (SD ± 0.41) for facilitating a lecture and 3.50 (SD ± 0.63) for using digital innovations (Mentimeter) in formative assessment of teaching/learning, p  < 0.001. The mean confidence of educators was largely average before the training and significantly improved after the training in the six remaining skills ( p  < 0.001): designing learning outcomes using measurable Bloom’s taxonomy verbs, preparing a teaching plan, identifying relevant resources to enhance learning, facilitating scenario teaching, facilitating a practical simulation/demonstration and giving effective feedback for learning (Table  4 ).

Preparing a teaching plan and microteaching skills

The overall median score in preparing a teaching plan was 63.6% (IQR 45.5) before the training and improved significantly to 81.8% (IQR 27.3) after the training, p  < 0.001. The median scores differed significantly by country before and after the training. Before the training, Kenyan educators had higher median scores (72.7%, IQR 27.3) than their Nigerian counterparts (54.5%, IQR 36.4), p  < 0.001. After the training, Kenyan educators again had significantly higher median scores (81.2%, IQR 18.2) than their Nigerian counterparts (72.7%, IQR 18.2), p  = 0.024. However, there were no significant differences in the median scores between the training institutions before and after the training, p  > 0.05. For microteaching, the overall median score was 76.5% (IQR 29.4). There were no significant differences between countries or training institutions in the microteaching scores, p  > 0.05. Kenyan educators (82.4%, IQR 29.4) had slightly higher scores than Nigerian educators (76.5%, IQR 29.4), p  = 0.78, and mid-level educators (79.4%, IQR 29.4) slightly higher scores than university educators (76.5%, IQR 28.7), p  = 0.515 (Table  5 ).

The inter-rater reliability/agreement of the eight pairs of assessors in both countries was assessed using Cohen’s kappa statistic. The kappa statistics for the eight pairs ranged between 0.806 and 0.917, p  < 0.001, showing near-perfect agreement between the pairs of assessors.
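Cohen’s kappa is the observed agreement between two raters corrected for the agreement expected by chance from their marginal label frequencies. The following minimal sketch (with hypothetical ratings, not the assessors’ data) shows the computation.

```python
from collections import Counter

def cohen_kappa(rater1, rater2):
    """Cohen's kappa for two raters' categorical labels.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed
    agreement rate and p_e the chance agreement implied by each
    rater's marginal label frequencies.
    """
    n = len(rater1)
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
    c1, c2 = Counter(rater1), Counter(rater2)
    p_e = sum(c1[label] * c2[label] for label in c1) / (n * n)
    return (p_o - p_e) / (1 - p_e)
```

By the common Landis and Koch interpretation, values above 0.8 denote almost perfect agreement, consistent with the 0.806–0.917 range reported above.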

Association between independent educator and institutional characteristics and the microteaching skill scores

Categorised skills scores (≥ 75% mean score as competent) showed that 55 (51.4%) and 62 (57.9%) of the educators scored 75% or higher in the teaching plan preparation and microteaching skill assessments respectively. Logistic regression analysis showed that educator’s country, age, gender, qualifications, training institution type and length as educator were not significantly associated with the overall categorised teaching plan or microteaching scores ( p  > 0.05).

Relevance of the practical component

The internal consistency of each of the six skills items was tested with Cronbach’s alpha. The overall Cronbach’s alpha for the six items was 0.866, a good level of reliability. All the six skills items assessed were valid with their obtained Pearson correlation coefficient values greater than the critical value of 0.1946 ( p  < 0.001) at 95% confidence interval.

On a self-rating Likert scale of 0–4, the median score for each of the six skills assessed and trained was 4 out of a maximum of 4, indicating that the educators found the different pedagogy skills very relevant after the training. Over 80% of the educators rated the sessions on teaching plan (85.2%), scenario teaching (87.0%), simulation/demonstration teaching (82.4%) and giving effective feedback (85.2%) as very relevant. Over three-quarters (77.8%) of the educators rated the sessions on lecture teaching and use of digital innovations (Mentimeter) in assessment as very relevant (Fig.  4 ).

Fig. 4 Relevance of the practical components

Satisfaction with the practical component

The internal consistency of each of the six skills items was tested with Cronbach’s alpha. The overall Cronbach’s alpha for the six items was 0.835, a good level of reliability. All the six skills items assessed were valid with their obtained Pearson correlation coefficient values greater than the critical value of 0.1946 ( p  < 0.001) at 95% confidence interval.

On a self-rating Likert scale of 0–4, the median score for each of the six skills assessed was 4 out of a maximum of 4, indicating that educators were very satisfied with the practical skills sessions. Over 70% of the educators were very satisfied with the sessions on giving effective feedback (79.6%), lecture teaching (75.9%), and scenario and simulation teaching (73.1% each). Two-thirds of the educators (67.6%) were very satisfied with the session on digital innovations (use of Mentimeter) for formative assessment in teaching and learning. All educators were satisfied with the session on preparing a teaching plan, with the majority (63.0%) very satisfied and the remaining 37.0% satisfied. None of the educators were dissatisfied with the practical component of the training (Fig.  5 ).

Fig. 5 Satisfaction with practical skills

Qualitative findings

What educators liked about the self-directed online modules

Educators from both levels and countries had similar views on the online component. These are broadly summarised under the sub-themes: (i) educative and relevant for practice, (ii) flexible and convenient learning and (iii) motivating, interesting and interactive.

Educative and relevant for practice

Educators reported that the online modules were educative and informative, improved their knowledge in teaching, assessments, reflective practice and giving effective feedback to students to promote learning, and increased their self-confidence and critical thinking skills. Educators also found the modules valuable and relevant for their professional growth and practice.

“The modules were well organized, they were relevant to my practice and met my expectations” university midwifery educator, Kenya. “The materials are very rich with current information to guide. Very informative & valuable to my professional growth” university midwifery educator, Nigeria.

Flexible and convenient learning

Educators reported that they could access and complete the online modules at times that were flexible and convenient for them. This flexibility encouraged them to complete the informative modules at home or in the office without disruption to their schedules.

“(The modules) gave me ample time to read at my own pace and time without any hurry to understand the content well. They were well organised. Also, flexibility of learning and the access to materials was excellent” university midwifery educator, Kenya. “It is flexible and convenient. It empowers the learner to take ownership of the learning process. Learning is personalized” mid-level training college midwifery educator, Nigeria.

Motivating, interesting and interactive

Educators reported that the online modules were well structured, motivating and interesting, and had components that promoted interaction for learning. For example, the pretests, the various quizzes within the modules, the posttest questions and the short extra reading segments all promoted interaction and learning.

“The intermittent assessment questions. It helped maintain my focus” university midwifery educator, Nigeria . “Very interactive. They were very informative and extra reading assignments complemented the content” university midwifery educator, Kenya .

Challenges encountered with the self-directed online learning modules

Four sub-themes emerged that summarised the challenges experienced by midwifery educators in the two countries in accessing and completing the self-directed online modules: (i) network/internet connectivity, (ii) technology challenges, (iii) electricity power supply and power outages and (iv) time constraints.

Network/internet connectivity

Network and internet connectivity difficulties and fluctuations were the most commonly reported challenge in completing the self-directed online modules by educators from both countries. These affected access, progress, downloading of the extra resources embedded within the modules and completion of the integrated evaluations within the modules.

“Accessing the modules, problem with submitting forms and exams, had network problem” mid-level training college midwifery educator, Nigeria . “I kept going offline and I would have to restart every time. They were too internet dependent” university midwifery educator, Kenya.

Technology challenges

Technology challenges were both observed and reported among educators from both countries. These ranged from poor access to emails due to forgotten email addresses, usernames or passwords, to difficulty accessing and navigating the online modules, completing the matching questions that required dragging items, completing the evaluations and downloading certificates after completing the modules.

“I am not very good with ICT, so I had issues using my laptop” mid-level training college midwifery educator, Nigeria. “Accessibility was difficult. I had to restart the process a number of times. The modules would sometimes take you back more than 20 slides which delayed the completion rate” university midwifery educator, Kenya.

Electricity power supply interruptions and fluctuations

Power interruptions, fluctuations and outages, especially in Nigeria, were cited as a challenge to completing the online modules. These delayed completion, as electric power was critical to accessing and completing the modules on either the WCEA app on mobile phones or computers.

“The modules should not start from beginning whenever there is interrupted power supply” MLC midwifery educator, Nigeria. “Network failure due to interrupted power supply” university midwifery educator, Nigeria.

Time constraints

Although educators commended the flexibility with which they could complete the online modules, the time needed to complete them was also cited as a challenge in both countries.

“It requires a lot of time, this is a challenge because I am also involved with other activities at the place of work which require my attention” university midwifery educator, Kenya.

What educators liked about the practical component

Educators’ written feedback on what they liked about the practical component of the CPD programme was categorised into four sub-themes: new knowledge and relevant for practice; improved knowledge, skills and confidence to teach; enhanced participatory and active learning; and individualised support in learning.

New knowledge and relevant for practice

The practical component provided new learning, particularly on the use of digital platforms (Mentimeter and Kahoot) for formative assessment during classroom teaching. In their integrated teaching using both online and face-to-face delivery, use of technology (Mentimeter and Kahoot) in classroom assessment was not a common practice, as most educators had not heard about the available online platforms. They found Mentimeter (and Kahoot) to be interesting resources for formative classroom assessments to facilitate teaching and learning. The techniques of giving effective feedback using the sandwich and ‘stop, start, continue’ methods were seen as promoting interaction between the educator and the learner for effective learning. Educators also acknowledged new knowledge and skills updates on EmONC relevant to their practice.

“Giving feedback, innovation of the online formative assessment, the teaching plan. I wish we would adapt them for daily application rather than the traditional teacher centered one.” Mid-level training college educator, Kenya . “(I liked) Everything, especially the technological innovations for assessment” Mid-level training college educator, Nigeria .

Improved knowledge, skills and confidence to teach

Educators reported that the practical sessions were interactive and engaging, with a good combination of theory and practice that facilitated learning. Participating in the practical component enabled them to update and improve their knowledge, skills and confidence in planning and delivering theoretical and practical teaching using multiple methods. Similar improvements were reported in preparing and conducting students’ assessments and giving effective feedback to promote learning. On use of technology in formative assessments, the interactive practical sessions boosted the confidence of educators in using the Mentimeter (and Kahoot) online platforms during classroom teaching.

“It helped build my confidence, had hands on practice on clinical skills and teaching skills, learnt about outdated practices and current evidence based clinical and teaching skills.” Mid-level training college educator, Nigeria . “They were very interesting especially the scenarios and skills. I was able to enhance my practical skills and technology in evaluating learning.” University midwifery educator, Kenya .

Enhanced participatory and active learning

The practical component complemented the self-directed online learning for educators. They commended and benefitted from the hands-on opportunities to engage actively through return demonstrations during the practical programme. This component also enabled them to brainstorm and contribute actively during the sessions. They highlighted that the practical component enhanced and reinforced learning through active participation in demonstrations, questions, group discussions and plenary sessions.

“This face-to-face module provided me with the opportunity to brainstorm with other educators, facilitators and resource persons. This will enhance my teaching skills.” Mid-level training college midwifery educator, Nigeria . “Interaction with facilitators who could clarify points that I had earlier not understood, interaction with other participants and was also able to learn from them.” University midwifery educator, Kenya .

Individualised support in learning

Educators received individualised peer support and learning during the practical component. They had opportunities within the small breakout groups for peer learning and one-to-one support from the facilitators to update and learn new knowledge and skills.

“A chance to get immediate feedback was availed by the presenters.” University midwifery educator, Kenya . “Facilitators were well informed and gave learners opportunity for return demonstration and support.” Mid-level training college midwifery educator, Kenya .

Challenges encountered with the practical component

Key challenges reported by the mixed group of educators and master trainers across the two countries included inadequate time, computer technology challenges and poor internet connectivity for the practical components.

Inadequate time

Although small breakout sessions were used to give each educator an opportunity to practise the skills, time was commonly reported as inadequate for skills demonstrations and return demonstrations by all educators. This was especially the case for areas where educators had knowledge gaps and for newly introduced skills, which required adequate time for teaching and repeat demonstrations to achieve mastery. Master trainers made similar observations, noting that some educators had never encountered or practised some of the basic EmONC skills demonstrated and thus required a longer duration for familiarisation and practice.

“Time was short hence not enough to return demo” Mid-level training college midwifery educator, Kenya . “Some of the things were new and required more time for demonstration and practice.” Mid-level training college midwifery educator, Nigeria .

Computer technology challenges and poor internet connectivity for practical components

Some educators encountered technical difficulties in using computers during the practical component, in some cases compounded by poor network/internet connectivity. This delayed completion of practical components requiring the use of computers, including pretests, preparing teaching plans and presentations, post-tests and classroom demonstrations using digital innovations in teaching and learning. However, the trainers provided assistance as appropriate to those who needed technical support.

“(There were) technical challenges with use of computers for few participants.” Master trainer, Nigeria . “Slow internet can hinder smooth flow of sessions.” Master trainer, Kenya .

Key areas for additional support

For quality education and training, master trainers generally recommended that all educators be trained and regularly supported in the basic EmONC course to strengthen their competencies for effective teaching of EmONC skills. Further support in computer use, including basic navigation of windows/programmes, formatting in Microsoft Office Word and PowerPoint, literature searching and referencing, was identified as another critical area to strengthen.

Perspectives from training institutions managers and midwifery regulators

Measures to ensure midwifery educators take specific CPD courses designed to improve their teaching competencies

Key informant interviews with the pre-service training institutions’ managers and the nursing and midwifery councils from the two countries revealed the key strategies outlined below for ensuring access to and completion of the blended CPD programme specific to educators’ teaching competencies.

Awareness creation, integrating programme into policy and performance appraisal

Online CPD was highlighted as a new concept in Nigeria. Due to this novelty, the country had been reluctant to accredit many online CPD programmes for in-service and pre-service nursing and midwifery personnel. However, the regulatory Nursing and Midwifery Council of Nigeria had established monitoring mechanisms to evaluate uptake of online CPD to ensure it meets the definition of CPD, although this remains work in progress.

“For the online, it’s actually a relatively new concept, in fact because of monitoring and evaluation, we have struggled with accepting online CPDs… So, we’re struggling on how to develop a guideline for online CPDs. So, we’re now starting with the WCEA. So far, only the WCEA has that approval to provide CPD…We said let’s look at how this works out before we can extend it to other providers.” Nursing and Midwifery Council, Nigeria .

Both countries emphasized the need to create awareness of the CPD programme for midwifery educators and to establish a policy framework for CPD. Regulators emphasized the need to make the CPD programme mandatory for all midwifery educators through a policy directive, suggesting that the blended CPD programme should form a mandatory, specified proportion of the content addressing their specific competencies. The training institutions further recommended that the programme form part of educators’ regular performance appraisals, and that active monitoring systems be put in place to ensure compliance with participation and completion so that educators acquire specific, relevant competencies in pedagogy.

“…Ensure that educators take the particular modules before license renewal. Tie modules that are related to midwifery education to the educators and make them mandatory. Yes, we make it as a matter of policy that you should be taking these courses over and over again.” Nursing and Midwifery Council, Nigeria .

It was strongly suggested that attaching incentives as motivators would attract educators to complete the CPD programme. These incentives included certification, recognition for participation in curriculum reviews, national examination setting, facilitating national examinations, promotion in service and eligibility as trainers of trainers for colleagues.

“You attach a course, one training, you cannot guarantee that these courses will be taken. So we find a way to attach something to it. You must have evidence that you attended these programs. So once you attach something like that, they will all flock because there is an incentive to it. Because we say, as an educator, before you go after every examination to examine students, you must have taken these courses.” Nursing and Midwifery Council, Nigeria .

Internet connectivity

Training institutions’ managers suggested investments in internet connectivity for training institutions to support educators in accessing and completing the self-directed online programme. Internet connectivity was also highlighted as a critical challenge for the online component by the educators in both countries.

“The issues of internet connectivity and I think we need to be proactive about it so that we have a way to constantly bring it to the forefront especially in our policies. But connectivity would be a major area to look at as people are using their money.” Mid-level training college manager, Kenya .

Anticipated challenges in the scale-up of the CPD programme

Key challenges anticipated in the roll-out and scale-up of the blended CPD programme included inadequate educator skills in information and communication technology for the practical component (such as preparing PowerPoint presentations and completing tasks on a computer), and the costs of attending the practical component (including participants’ residential costs, investments in proctoring technology to ensure academic integrity, and monitoring and evaluation tools for educators’ compliance). It was also emphasized that, given the low remuneration of educators, additional out-of-pocket costs to undertake the CPD could limit the intended faculty development initiatives. Other challenges included maintaining the quality and academic integrity of the programme, potential bias in selecting educators for future CPD programmes based on pre-existing relationships, and ensuring an adequate pool of in-country trainers of trainers with midwifery competencies to deliver the practical component of the CPD programme.

There were strong suggestions that personal commitment by educators was required for personal and professional development. There were observations that educators sometimes completed the professional development programmes purely for relicensing and not necessarily for professional development. Regulators and institutional managers emphasized that educators need to understand the value of continuous professional development and create time to participate in the targeted CPD programmes to improve their competencies.

“We do advise our nurses, or we continue to inform them that taking these courses shouldn’t be tied to license renewal. It shouldn’t be tied to licence expiration or renewal of licences. You should continue to take these courses to develop yourself and not waiting until your licence expired before you take the courses. Yes, we actually try as much as possible to dissociate the renewal of licences with these courses.” Nursing and Midwifery Council, Nigeria .

Key results

Our study evaluated the feasibility of what the authors believe to be the first blended programme with online and face-to-face learning available in Africa, as a tool to reach midwifery educators in both urban and rural low-resource areas. In addition, our study responds to an important call by WHO, UNFPA, UNICEF and ICM for effective midwifery educators who have formal preparation for teaching and engage in ongoing development as midwifery practitioners, teachers/lecturers and leaders [ 6 , 7 ]. Consequently, our intervention is part of investments in improving and strengthening the capacity of midwifery educators for a quality, competent midwifery workforce, as recommended by multiple global reports [ 4 , 5 , 11 ] and other publications [ 12 , 15 , 23 , 42 ]. Our study findings showed that the midwifery educators were very satisfied with the blended CPD programme. Educators rated the programme as highly relevant, educative, flexible, interesting and interactive, and reported that it improved their knowledge, confidence and practical skills in their professional competencies for practice. Use of digital technology in teaching and students’ assessment was found to be an effective and innovative approach to facilitating teaching and learning. Key challenges experienced by educators included deficiencies in computer technology use, internet/network connectivity for the online components, time constraints to complete the blended programme, and isolated electric power outages and fluctuations which affected completion of the self-directed online components. Costs for participating in and completing the programme, motivation, investments in information and communication technology, quality assurance and academic integrity were highlighted by institutional managers and training regulators as critical components for the scale-up of the programme. Establishment of a policy framework for educators to complete mandatory, specific and relevant CPD was recommended for a successful roll-out in the countries.

Interpretation of our findings

Our study findings demonstrated that educators found the theoretical and practical content educative, informative and relevant to their practice. Recent evidence showed that midwifery educators had no or limited connection with clinical practice or opportunities for updating their knowledge or skills [ 15 , 42 ]. This underscores the value and importance of regular educator-specific CPD opportunities to improve professional competencies. The programme provided these educators with a flexible educational model that allows them to continue working while developing their professional practice.

The use of a blended programme was beneficial as educators’ needs were met. It provided opportunities for educators to reflect, think critically, internalise and complement what was learned in the self-directed online component during the practical phase. This approach has been considered a means to adequately prepare midwifery faculty and improve national midwifery programmes in low-resource and remote settings [ 48 , 49 ]. Use of self-directed online platforms has emerged as a key strategy to improve access to CPD with flexibility and convenience, as educators take responsibility for their own learning. Evidence suggests that the flexibility of net-based learning offers midwifery educators a new and effective educational opportunity that they previously did not have [ 50 , 51 ]. Practical-based learning is important in pre-service education settings where the capacity of midwifery educators needs to be strengthened [ 52 , 53 ]. However, without continuous regular training, midwives’ competence deteriorates, and this in turn threatens the quality of pre-service midwifery education [ 52 , 54 ]. Implementation of this flexible blended educational model allows educators to continue working while developing their professional practice.

The quality of educators is an important factor affecting the quality of graduates from midwifery programmes to provide quality maternal and newborn health services [ 7 ]. Evidence suggests that many midwifery educators are more confident with theoretical classroom teaching than clinical practice teaching and that they also struggle to maintain their own midwifery clinical skills [ 4 , 5 ]. Our findings showed that the programme was effective: educators improved their knowledge, confidence and skills in teaching, students’ assessment, effective feedback, reflective practice, mentoring and use of digital innovations in teaching and assessments. Our findings are similar to those of other related models of capacity building for midwifery educators in other developing countries [ 24 , 50 , 53 , 55 , 56 , 57 ]. It is expected that educators will apply the learning in planning for teaching, delivering interactive and stimulating teaching, monitoring learning through formative and summative assessments, and mentoring their students into competent midwives. This is a pathway to accelerating the achievement of the maternal and newborn health SDGs, universal health coverage, ending preventable maternal mortality and the Every Newborn Action Plan targets.

The value of CPD for educators’ knowledge, confidence and skills has been demonstrated, with opportunities for improvement. Specific CPD targeted at relevant professional competencies benefits the profession, the quality of graduates for maternal and newborn health care, and global targets. However, further investments in strengthening educators’ capacity in EmONC skills and in information and communication technology for effective teaching and learning are essential. Related challenges with individual technical capacity, technological deficiencies and infrastructure to support the technological advancement have been reported in other studies that have used a blended learning approach [ 58 ]. Resource constraints, financial and infrastructural (e.g. computers), as well as internet access are key challenges to participation in CPD activities, especially self-directed learning [ 16 ]. Designing self-directed modules that can be accessed and completed offline will increase access, especially in settings with poor electric power and network coverage.

Strengths and limitations

This study assessed the feasibility of a blended midwifery educator CPD programme in low-resource settings. It was conducted in a multi-country and multi-site context, which provided opportunities for learning across the two countries, two levels of training institutions and specific in-country experiences [ 20 ]. The study served to improve awareness of the availability of the CPD programme so that (1) regulators can ensure that midwifery educators take this as part of the mandatory CPD required for relicensing and (2) training institutions can plan to support their educators to access/participate in the practical components of the programme after the study. It is a mandatory requirement of the Nursing and Midwifery Councils of Kenya and Nigeria for nurses/midwives and midwifery educators to demonstrate evidence of CPD for renewal of their practising licence [ 40 , 41 ]. The use of a mixed methods research design with multiple evaluations was relevant to address the aims and objectives of the study and ensure methodological rigour, depth and scientific validity, as recommended for good practice in designing pilot studies [ 37 , 38 ]. This also enhanced triangulation of findings and enabled the capturing of broad perspectives important in strengthening sustainable implementation of the blended CPD programme [ 39 ]. Preliminary findings were disseminated to participant stakeholders from Kenya and Nigeria at the knowledge management and learning event in Nairobi. This approach enhanced the credibility and trustworthiness of the final findings reported. We believe our study findings from different participants using multiple data collection methods are robust, transparent and trustworthy for generalization to other contexts [ 38 ]. The self-directed learning component of the blended CPD programme is hosted on the WCEA platform, which is accessible to healthcare professionals in over 60 countries in Africa, Asia and the Middle East and is accredited for continuous professional development [ 59 ].
Although our sample size was small, it was sufficient, geographically representative of training institutions across the countries and acceptable for feasibility studies [ 34 ].

A cost analysis of implementing the blended midwifery educator CPD programme is relevant and key to the uptake, scale-up and sustainability of the programme, but it was not conducted due to limited funding. Different CPD programme funding models exist. In Nigeria, educators are required to meet the costs of accessing and completing the CPD programme components, while in Kenya the cost of accessing the online component is minimal (internet access costs only) and the face-to-face component has to be funded. Future studies should explore the cost of implementing the programme and, together with stakeholders, options for sustainable funding models.

Implications

Our findings show demand for the CPD programme. Regular continuous professional development could help to bridge the gap between theory and practice and improve the quality of teaching by midwifery educators. A blended CPD programme is effective in improving the teaching and clinical skills of midwifery educators and increasing their confidence in effective teaching. However, midwifery educators require motivation and close support (individual capacity, time, technological infrastructure and policy) if the blended CPD approach is to be mandatory and successfully implemented in resource-limited settings. In addition, regular quality assurance, including review of content and monitoring and evaluation of uptake of the CPD programme, should be undertaken to ensure that updated and relevant content is available.

For quality CPD programmes, hands-on teaching is more effective than didactic classroom teaching and should be used when feasible to transfer clinical skills. Distance education models (self-directed learning) combined with short residential training and mentoring should be embraced to strengthen the capacity of midwifery educators, and CPD programmes must consider the local context in which participants work and teach [ 16 , 23 ]. Evidence has shown that knowledge and clinical skills are retained for up to 12 months after training [ 54 ]. Taking the CPD programme annually will potentially maintain or improve midwifery educators’ knowledge, skills and practice for quality teaching and learning, leading to a competent midwifery workforce.

For quality midwifery education and practice, educators need contact with clinical practice to strengthen classroom teaching [ 6 , 7 ]. This will promote and enable students to acquire the skills, knowledge, and behaviours essential to become autonomous midwifery practitioners. Therefore, demonstrating relevant practical clinical CPD should be included in midwifery educator CPD policy. In addition, a business case by the CPD hubs on the sustainability of the face-to-face practical components in the centres is necessary. Stakeholder engagement on cost and sustainability are required as key policy components for the scale-up of the blended midwifery educator CPD programme for impact.

The blended CPD programme was relevant, acceptable and feasible to implement. Midwifery educators reacted positively to its content: they were very satisfied with how the modules met their needs and rated the content as relevant to their practice. The programme also improved their knowledge, confidence and skills in teaching, students’ assessment, providing effective feedback for learning, and using digital/technological innovations for effective teaching and learning. Investments in information and communication technology, quality assurance and academic integrity were highlighted as critical components for the scale-up of the programme. For successful and mandatory implementation of this specific midwifery educator CPD programme to enhance practice, a policy framework from midwifery training regulators is required in each country.

Data availability

The datasets generated and/or analysed during the current study are not publicly available due to the confidentiality of the data but are available from the corresponding author on request.

Renfrew MJ, McFadden A, Bastos MH, Campbell J, Channon AA, Cheung NF, et al. Midwifery and quality care: findings from a new evidence-informed framework for maternal and newborn care. Lancet. 2014;384(9948):1129–45.


World Health Organization, United Nations Population Fund, International Confederation of Midwives. The State of the World’s Midwifery 2021: Building a health workforce to meet the needs of women, newborns and adolescents everywhere 2021. https://www.unfpa.org/publications/sowmy-2021 .

Filby A, McConville F, Portela A. What prevents quality midwifery care? A systematic mapping of barriers in low and middle income countries from the provider perspective. PLoS ONE. 2016;11(5):e0153391.

WHO. Strengthening quality midwifery education for Universal Health Coverage 2030: Framework for action 2019. https://www.who.int/publications/i/item/9789241515849 .

United Nations Population Fund, International Confederation of Midwives, World Health Organization. The State of the World’s Midwifery 2021: Building a health workforce to meet the needs of women, newborns and adolescents everywhere 2021. https://www.unfpa.org/publications/sowmy-2021 .

International Confederation of Midwives. ICM Global Standards for Midwifery Education. (Revised 2021) 2021. https://www.internationalmidwives.org/assets/files/education-files/2021/10/global-standards-for-midwifery-education_2021_en-1.pdf .

WHO. Midwifery educator core competencies: building capacities of midwifery educators 2014. https://www.who.int/hrh/nursing_midwifery/14116_Midwifery_educator_web.pdf .

Gavine A, MacGillivray S, McConville F, Gandhi M, Renfrew MJ. Pre-service and in-service education and training for maternal and newborn care providers in low-and middle-income countries: an evidence review and gap analysis. Midwifery. 2019;78:104–13.

Shikuku DN, Tallam E, Wako I, Mualuko A, Waweru L, Nyaga L, et al. Evaluation of capacity to deliver Emergency Obstetrics and Newborn Care updated midwifery and reproductive health training curricula in Kenya: before and after study. Int J Afr Nurs Sci. 2022;17:100439.


International Confederation of Midwives. Global Standards for Midwifery Regulation. 2011. https://www.internationalmidwives.org/assets/files/regulation-files/2018/04/global-standards-for-midwifery-regulation-eng.pdf .

World Health Organization. Global strategic directions for nursing and midwifery 2021–2025. Geneva: World Health Organization. 2021. https://iris.who.int/bitstream/handle/10665/344562/9789240033863-eng.pdf?sequence=1 .

Smith RM, Gray JE, Homer CSE. Common content, delivery modes and outcome measures for faculty development programs in nursing and midwifery: a scoping review. Nurse Educ Pract. 2023:103648.

Nursing and Midwifery Board of Australia. Registration standard: Continuing professional development. 2016. Accessed 3rd January 2022. https://www.nursingmidwiferyboard.gov.au/Registration-Standards/Continuing-professional-development.aspx .

International Confederation of Midwives. Essential competencies for midwifery practice: 2019 Update. 2019.

Baloyi OB, Jarvis MA. Continuing Professional Development status in the World Health Organisation, Afro-region member states. Int J Afr Nurs Sci. 2020;13:100258.

Mack HG, Golnik KC, Murray N, Filipe HP. Models for implementing continuing professional development programs in low-resource countries. MedEdPublish. 2017;6(1).

Lucas A. Continuous professional development, friend or foe? Br J Midwifery. 2012;20(8):576–81.

Ingwu JA, Efekalam J, Nwaneri A, Ohaeri B, Israel C, Chikeme P, et al. Perception towards mandatory continuing professional development programme among nurses working at University of Nigeria Teaching Hospital, Enugu-Nigeria. Int J Afr Nurs Sci. 2019;11:100169.

Hasumi T, Jacobsen KH. Healthcare service problems reported in a national survey of South Africans. Int J Qual Health Care. 2014;26(4):482–9.

Giri K, Frankel N, Tulenko K, Puckett A, Bailey R, Ross H. Keeping up to date: continuing professional development for health workers in developing countries. IntraHealth Int. 2012.

Botha A, Booi V, editors. mHealth implementation in South Africa. 2016 IST-Africa Week Conference; 2016: IEEE.

World Continuing Education Alliance (WCEA). World Continuing Education Alliance: About us. 2022. Accessed 3rd January 2022. https://lmic.wcea.education/about-us/ .

West F, Homer C, Dawson A. Building midwifery educator capacity in teaching in low and lower-middle income countries. A review of the literature. Midwifery. 2016;33:12–23.

van Wyk JM, Wolvaardt JE, Nyoni CN. Evaluating the outcomes of a faculty capacity development programme on nurse educators in sub-Saharan Africa. Afr J Health Professions Educ. 2020;12(4):201–5.

Frantz JM, Bezuidenhout J, Burch VC, Mthembu S, Rowe M, Tan C, et al. The impact of a faculty development programme for health professions educators in sub-Saharan Africa: an archival study. BMC Med Educ. 2015;15(1):1–8.

Fullerton JT, Johnson PG, Thompson JB, Vivio D. Quality considerations in midwifery pre-service education: exemplars from Africa. Midwifery. 2011;27(3):308–15.

Shikuku DN, Tallam E, Wako I, Mualuko A, Waweru L, Nyaga L, et al. Evaluation of capacity to deliver Emergency Obstetrics and Newborn Care updated midwifery and reproductive health training curricula in Kenya: before and after study. 2022.

Shikuku DN, Jebet J, Nandikove P, Tallam E, Ogoti E, Nyaga L, et al. Improving midwifery educators’ capacity to teach emergency obstetrics and newborn care in Kenya universities: a pre-post study. BMC Med Educ. 2022;22(1):1–10.

Akiode A, Fetters T, Daroda R, Okeke B, Oji E. An evaluation of a national intervention to improve the postabortion care content of midwifery education in Nigeria. Int J Gynecol Obstet. 2010;110(2):186–90.

Nursing Council of Kenya. Continuing Professional Development guidelines. Nursing Council of Kenya; 2021.

Nursing and Midwifery Council of Nigeria. Promoting & Maintaining Excellence in Nursing Education and Practice: Functions. 2022. https://www.nmcn.gov.ng/function.html .

Ministry of Health. Kenya Health Policy 2014–2030: Towards attaining the highest standard of health 2014. http://publications.universalhealth2030.org/uploads/kenya_health_policy_2014_to_2030.pdf .

Orsmond GI, Cohn ES. The distinctive features of a feasibility study: objectives and guiding questions. OTJR: Occupation Participation Health. 2015;35(3):169–77.

Bowen DJ, Kreuter M, Spring B, Cofta-Woerpel L, Linnan L, Weiner D, et al. How we design feasibility studies. Am J Prev Med. 2009;36(5):452–7.

Arain M, Campbell MJ, Cooper CL, Lancaster GA. What is a pilot or feasibility study? A review of current practice and editorial policy. BMC Med Res Methodol. 2010;10(1):1–7.

Kirkpatrick DL. Implementing the four levels: a practical guide for effective evaluation of training programs. ReadHowYouWant.com; 2009.

Warfa A-RM. Mixed-methods design in biology education research: Approach and uses. CBE—Life Sci Educ. 2016;15(4):rm5.

Creswell JW, Creswell JD. Research design: qualitative, quantitative, and mixed methods approaches. Sage; 2017.

Creswell JW, Clark VLP. Designing and conducting mixed methods research. Third ed: Sage; 2018.

NCK Online CPD Portal: Continuous Professional Development [Internet]. 2021. https://osp.nckenya.com/cpd? .

Nursing and Midwifery Council of Nigeria. Promoting & maintaining Excellence in nursing education and practice: Renewal of License. 2022. Available from. https://www.nmcn.gov.ng/renewal.html .

Warren N, Gresh A, Mkhonta NR, Kazembe A, Engelbrecht S, Feraud J, et al. Pre-service midwifery education in sub-Saharan Africa: a scoping review. Nurse Educ Pract. 2023:103678.

Malterud K, Siersma VD, Guassora AD. Sample size in qualitative interview studies: guided by information power. Qual Health Res. 2016;26(13):1753–60.

Muellmann S, Brand T, Jürgens D, Gansefort D, Zeeb H. How many key informants are enough? Analysing the validity of the community readiness assessment. BMC Res Notes. 2021;14(1):1–6.

Hennink M, Kaiser BN. Sample sizes for saturation in qualitative research: a systematic review of empirical tests. Soc Sci Med. 2022;292:114523.

Shumway JM, Harden RM. AMEE Guide 25: the assessment of learning outcomes for the competent and reflective physician. Med Teach. 2003;25(6):569–84.

Braun V, Clarke V. Using thematic analysis in psychology. Qualitative Res Psychol. 2006;3(2):77.

Erlandsson K, Doraiswamy S, Wallin L, Bogren M. Capacity building of midwifery faculty to implement a 3-years midwifery diploma curriculum in Bangladesh: A process evaluation of a mentorship programme. Nurse Educ Pract. 2018;29:212–8.

Erlandsson K, Byrskog U, Osman F, Pedersen C, Hatakka M, Klingberg-Allvin M. Evaluating a model for the capacity building of midwifery educators in Bangladesh through a blended, web-based master’s programme. Global Health Action. 2019;12(1):1652022.

Hatakka M, Osman F, Erlandsson K, Byrskog U, Egal J, Klingberg-Allvin M. Change-makers in midwifery care: exploring the differences between expectations and outcomes—A qualitative study of a midwifery net-based education programme in the Somali region. Midwifery. 2019;69:135–42.

Erlandsson K, Osman F, Hatakka M, Egal JA, Byrskog U, Pedersen C, et al. Evaluation of an online master’s programme in Somaliland. A phenomenographic study on the experience of professional and personal development among midwifery faculty. Nurse Educ Pract. 2017;25:96–103.

Bogren M, Rosengren J, Erlandsson K, Berg M. Build professional competence and equip with strategies to empower midwifery students–an interview study evaluating a simulation-based learning course for midwifery educators in Bangladesh. Nurse Educ Pract. 2019;35:27–31.

Msosa A, Msiska M, Mapulanga P, Mtambo J, Mwalabu G. Simulation-based education in classroom and clinical settings in sub-Saharan Africa: a systematic review. Higher Education, Skills and Work-Based Learning; 2023.

Ameh CA, White S, Dickinson F, Mdegela M, Madaj B, van den Broek N. Retention of knowledge and skills after emergency Obstetric Care training: a multi-country longitudinal study. PLoS ONE. 2018;13(10):e0203606.

Evans C, Razia R, Cook E. Building nurse education capacity in India: insights from a faculty development programme in Andhra Pradesh. BMC Nurs. 2013;12(1):1–8.

Koto-Shimada K, Yanagisawa S, Boonyanurak P, Fujita N. Building the capacity of nursing professionals in Cambodia: insights from a bridging programme for faculty development. Int J Nurs Pract. 2016;22:22–30.

Kitema GF, Laidlaw A, O’Carroll V, Sagahutu JB, Blaikie A. The status and outcomes of interprofessional health education in sub-Saharan Africa: a systematic review. J Interprof Care. 2023:1–23.

Ladur AN, Kumah EA, Egere U, Mgawadere F, Murray C, Ravit M, et al. A blended learning approach for capacity strengthening to improve the quality of integrated HIV, TB, and Malaria services during antenatal and postnatal care in LMICs: a feasibility study. medRxiv. 2023:2023.05.04.23289508.

World Continuing Education Alliance (WCEA). Improving Health Outcomes: WCEA delivering sustainable solutions for CPD & lifelong learning. 2023. Accessed 26th December 2023. https://wcea.education/ .


Acknowledgements

The study was made possible through the financial support of the Johnson and Johnson Foundation for the three-year “Design, implementation and evaluation of Nursing/Midwifery CPD Educator Programme in Kenya” (2021 – 2023) and the Alliance to Improve Midwifery Education through UNFPA Headquarters. Special acknowledgement to nursing and midwifery educators from mid-level training colleges and universities in Kenya and Nigeria, Ministries of Health, Nursing Council of Kenya, Nursing and Midwifery Council of Nigeria, KMTC headquarters management who participated in the study. Also, we specially appreciate the World Continuing Education Alliance for the dedicated support with the online modules and expert trainers who participated in the delivery of the face-to-face training component: Aisha Hassan, Dr. Mojisola Ojibara, Dr. Eniola Risikat Kadir, Aminat Titi Kadir, Benson Milimo, Esther Ounza, Marthar Opisa, Millicent Kabiru, Sylvia Kimutai, Dr. Joyce Jebet, Dr. Steve Karangau, Dr. Moses Lagat and Dr. Evans Ogoti. Gratitude to Boslam Adacha and Roselynne Githinji for their dedicated support with data preparation for analysis and Dr. Sarah White for her statistical analysis expert guidance and support. Thank you also to Geeta Lal at UNFPA Headquarters. Lastly, the authors would like to acknowledge the special technical and logistical support provided by the LSTM – Kenya team (Onesmus Maina, Martin Eyinda, David Ndakalu, Diana Bitta, Esther Wekesa and Evans Koitaba) and LSTM Nigeria team (Dr. Michael Adeyemi and Deborah Charles) during the trainings.

The study was funded by the Johnson and Johnson Foundation as part of the three-year “Design, implementation and evaluation of Nursing/Midwifery CPD Educator Programme in Kenya” and the Alliance to Improve Midwifery Education through UNFPA. The Johnson and Johnson Foundation was not involved in the research: study design; collection, analysis and interpretation of data; writing of the report; or the decision to submit the article for publication.

Author information

Authors and affiliations.

Liverpool School of Tropical Medicine (Kenya), P.O. Box 24672-00100, Nairobi, Kenya

Duncan N. Shikuku & Lucy Nyaga

Liverpool School of Tropical Medicine (UK), Liverpool, L3 5QA, UK

Duncan N. Shikuku, Alice Norah Ladur, Carol Bedwell & Charles Ameh

Liverpool School of Tropical Medicine (Nigeria), Utako District, P.O Box 7745, Abuja, Nigeria

Hauwa Mohammed

Moi University, P.O. Box 4606-30100, Eldoret, Kenya

Lydia Mwanzia

Masinde Muliro University of Science and Technology, P.O. Box 190-50100, Kakamega, Kenya

Peter Nandikove

Maseno University, P.O. Box 3275-40100, Kisumu, Kenya

Alphonce Uyara

Kenya Medical Training College, P.O Box 30195-00100, Nairobi, Kenya

Catherine Waigwe

Department of Family Health, Ministry of Health (Kenya), P.O. Box 30016-00100, Nairobi, Kenya

Issak Bashir

Aga Khan University of East Africa, P.O Box 39340-00623, Nairobi, Kenya

Eunice Ndirangu

Burnet Institute, 85 Commercial Road Prahran Victoria, Melbourne, Australia

Sarah Bar-Zeev

University of Nairobi, P. O. Box 19676-00100, Nairobi, Kenya

Charles Ameh

Diponegoro University, JI. Prof Sudarto No 13, Temalang, Kec, Tembalang, Kota, Semarang, Jawa Tengah, 50275, Indonesia


Contributions

DNS, SBZ and CA conceived the idea and designed the study protocol; DNS designed the online data collection tools/checklists/assessments, performed data extraction, cleaning, analysis and interpretation of the results, drafted the primary manuscript, reviewed and prepared it for publication; DNS, HM, LM, PN and AU conducted the training intervention, collected data and reviewed the drafts and final manuscript; AL participated in the design of the study, qualitative data analysis, interpretation of findings and reviewed draft manuscripts; CW, LN, IB, EN, CB and SBZ participated in the design of the study procedures and substantively reviewed the drafts and final manuscript. CA reviewed study procedures, data collection tools, provided oversight in investigation, analysis, interpretation and substantively reviewed the manuscript drafts. SBZ and CA obtained funding for the study. All the authors read and approved the final manuscript.

Corresponding author

Correspondence to Duncan N. Shikuku .

Ethics declarations

Ethics approval and consent to participate.

Ethics review and approvals were obtained from the Liverpool School of Tropical Medicine’s Research Ethics Committee (LSTM REC No. 23 − 004) and in-country ethical approvals from Kenya (MTRH/MU – IREC FAN 0004383; NACOSTI License No: NACOSTI/P/23/25498) and Nigeria (NHREC Approval Number NHREC/01/01/2007- 31/03/2023). Participation in the study was strictly voluntary and did not form part of the educators’ performance appraisals; educators who consented but did not take part were not disadvantaged. Informed electronic and written consent was obtained from all participants. Unique participant codes were used for identification, and all data collection tools/forms and datasets were de-identified with no participant identifying information. All interviews were conducted at the offices of the respective stakeholders, maintaining privacy during the data collection process.

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article

Cite this article.

Shikuku, D.N., Mohammed, H., Mwanzia, L. et al. Evaluation of the feasibility of a midwifery educator continuous professional development (CPD) programme in Kenya and Nigeria: a mixed methods study. BMC Med Educ 24 , 534 (2024). https://doi.org/10.1186/s12909-024-05524-w


Received : 24 January 2024

Accepted : 06 May 2024

Published : 14 May 2024

DOI : https://doi.org/10.1186/s12909-024-05524-w


  • Continuous professional development
  • Feasibility

BMC Medical Education

ISSN: 1472-6920


Presentation Transcript

Qualitative Data Management: Establishing Trustworthiness

We enhance the reliability or rigor of our data analysis by: • Comparing our categories to pre-existing frameworks. • Having an additional person redo the analysis. • Comparing notes from more than one source. • Using more than one type of qualitative data in our analysis (observation, interviews, document analysis). • Supplementing the qualitative analysis with information from another quantitative source (for example, a survey). • Keeping a record (audit) of how you established data categories and identified themes. • Establishing a feedback loop so that participants can verify whether or not the analysis accurately reflects their views (member checking). • In qualitative research, this process is called “trustworthiness.”

Padgett (1998) defines trustworthiness as follows: • “A trustworthy study is one that is carried out fairly and ethically and whose findings represent as closely as possible the experiences of the respondents” (p. 92). • Lack of trustworthiness comes from: a) reactivity, b) researcher biases, and c) respondent biases, such as withholding information or the halo effect.

Qualitative Data Analysis • Looks for common themes and patterns. • Sample quotations are used. • Categories of responses are identified, and the number of responses that fall within each category is counted. • No statistical analysis is required – but demographic information may be expressed in percentages or placed in tables.
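The counting step above — assign each response to a category, then express the category counts as percentages — can be sketched in a few lines of Python. The category labels and coded responses below are invented for illustration, not data from any actual study:

```python
from collections import Counter

# Hypothetical coded responses: each answer has already been assigned
# to one category during analysis.
coded_responses = [
    "didn't like gangs", "gangs are bad/stupid", "gangs are bad/stupid",
    "gangs are dangerous/scary", "gangs are fun", "didn't like gangs",
]

def category_percentages(codes):
    """Count responses per category and express each count as a percentage."""
    counts = Counter(codes)
    total = len(codes)
    return {category: round(100 * n / total, 1) for category, n in counts.items()}

print(category_percentages(coded_responses))
```

In a real study the hard work is the coding itself; once each response carries a category label, the tally is mechanical.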

Example of using quotes (from gang study in Sin City) One fourth of the youths surveyed said that they didn’t like gangs or didn’t want to join a gang. Almost 40% said that being in a gang was bad or stupid. One youth described being in a gang as a “dead end choice.” Nine percent felt that being in a gang was dangerous or scary. One respondent said being in a gang was scary because “they always talk about killing or beating up other people.” Only 6% of the respondents thought being in a gang was fun or was necessary for protection. However, one youth said that being in a gang made him feel “safe and good, cool to be around.”

In the previous passage, the categories included: • Didn’t like gangs • Gangs are bad or stupid • Being in a gang is fun

One approach to writing narratives can also include: • The writer’s own thoughts, values, and beliefs. • An interpretation of the research participant’s behavior or thoughts.

For example, consider this quotation from Fadiman, A. (1997). The spirit catches you and you fall down. New York: Farrar, Straus & Giroux: While Foua was telling me about the dozens of tasks that constituted her “easy” work in Laos, I was thinking that when she said she was stupid, what she really meant was that none of her former skills were transferable to the U.S. – none, that is, except for being an excellent mother to her nine surviving children. It then occurred to me that this last skill had been officially contradicted by the American government, which had legally declared her a child abuser.

One approach to writing a narrative is “thick description” – creation of a picture of observed events, people involved, rules associated with certain activities, and social context or environment. Thick description can also incorporate the researcher’s perspectives.

Clients articulate their belief that the welfare system is not designed to help them succeed or care for their families…. Often it feels as if the information they received from workers is blatantly wrong. In one focus group, participants talked assuredly of the misinformation they had received…. As one woman said, “[The policy] is a lie. This [is] what happens in the welfare system”…. Such a lack of trust raises serious questions about whether or not clients will heed front-line staff. From Sandfort, Kalil, & Gottschalk (1999). The mirror has two faces. Journal of Poverty, 3(3), 71-91.

A narrative or text-based summary should include: • Identification of common themes in responses. • Patterns of behavior. • Cultural or other symbols found in the setting or described by respondents. • Identification/description of cultural norms. • Common words or phrases used by many respondents, with sample quotations. • Minority responses, with sample quotations.

According to Jorgensen (1989), as cited in Seidel (1998): Analysis is a breaking up, separating, or disassembling of research materials into pieces, parts, elements, or units. With facts broken down into manageable pieces, the researcher sorts and sifts them, searching for types, classes, sequences, processes, patterns, or wholes. The aim of this process is to assemble or reconstruct the data in a meaningful or comprehensible fashion (p. 107).

Steps in Data Analysis • Choose a unit of analysis (word, phrase, sentence, paragraph, entire transcript). • Identify possible categories to classify these data elements. • Classification can take place using one or both of the following methods: - Categories are developed for each of the questions to which respondents gave answers. - Categories are developed for the responses as a whole. These categories are also called themes.

The process for doing this is often called the constant comparative method: • You make comparisons among data elements. • You may need three or four attempts to analyze and interpret the data. • Your categories or themes may change as you examine additional transcripts. • You can also use a process called “negative case analysis”: are there responses that don’t fit the other categories? How does this change your analysis?
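One pass of the constant comparative loop — compare each data element against the current themes, and set aside anything that fits nowhere as a negative case — can be sketched as follows. The `fits` matcher and the example elements are invented for illustration; in real analysis the comparison is the analyst’s judgement, not a string match:

```python
def sort_elements(elements, categories, fits):
    """One comparison pass: append each element to the first theme it
    matches; elements matching no theme are flagged as negative cases
    for re-examination (they may seed a new category on the next pass)."""
    negative_cases = []
    for element in elements:
        for theme in categories:
            if fits(element, theme):
                categories[theme].append(element)
                break
        else:
            # No existing theme fits: a candidate negative case.
            negative_cases.append(element)
    return categories, negative_cases

# Toy matcher: an element "fits" a theme if the theme word occurs in it.
themes, negatives = sort_elements(
    ["being in a gang is scary", "gangs are fun", "school is boring"],
    {"scary": [], "fun": []},
    lambda element, theme: theme in element.lower(),
)
print(themes, negatives)
```

Repeating the pass after revising the themes mirrors the “three or four attempts” described above.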

Additional Steps • Make sure that you have been able to classify all your data. • You may develop alternative categories and then choose the one that best indicates a pattern or theme. • You will probably need to count all the possible data elements in order to ensure that you’ve been able to classify all of them.

Data Management Techniques • Cut and paste transcripts onto note cards and then put all elements that fit one category into a pile. This allows you to reshuffle the cards. • Use a magic marker or crayon to color-code your categories. • Simply use scissors to cut and paste data elements. • “Cut and paste” on the computer – some qualitative software packages allow you to do this, or allow you to search for similar words and phrases. • Develop a numerical code and place these codes within a copy of the transcript.
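The electronic “cut and paste” in the last two bullets amounts to grouping numerically coded excerpts into one pile per code. A minimal sketch, with a hypothetical codebook and excerpts:

```python
from collections import defaultdict

# Hypothetical margin codes applied to transcript excerpts.
CODEBOOK = {1: "administrative burden", 2: "poor communication"}

coded_segments = [
    (1, "I didn't like the paperwork."),
    (2, "The worker explained nothing."),
    (1, "The forms took all afternoon."),
]

def paste_into_piles(segments, codebook):
    """Group excerpts into one 'pile' (list) per coding category."""
    piles = defaultdict(list)
    for code, excerpt in segments:
        piles[codebook[code]].append(excerpt)
    return dict(piles)

print(paste_into_piles(coded_segments, CODEBOOK))
```

Comparing the total number of excerpts against the sum of the pile sizes is an easy way to verify that every data element has been classified, as the previous slide recommends.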

The first stage of coding is “open coding”: identifying what you see in a transcript.

It Is Important to Keep Notes • Rationale for each coding category (could be on the transcript). • Notes on your reasoning for classification. • Keep track of themes in the data. • Try to tie categories together: are there cause-and-effect relationships in the data? • You can keep analytic memos. • You can keep memos on your own reactions to the data. [Note – all of these elements can be incorporated into the write-up.]

For example, if we were to use the following interview guide, we would transcribe all responses underneath each question in a word-processing program. (Sample interview guide) 1. Can you describe how you first became aware of your deafness? Respondent #1 Respondent #2 Respondent #3 2. How do you see yourself today, in terms of your deafness? Respondent #1 Respondent #2 Respondent #3 From Janesick, V. (1998). “Stretching” exercises for qualitative researchers. Thousand Oaks, CA: Sage, p. 75.


