Presentation of Methodology

Princeton Correspondents on Undergraduate Research

How to Make a Successful Research Presentation

Turning a research paper into a visual presentation is difficult; there are pitfalls, and navigating the path to a brief, informative presentation takes time and practice. As a TA for GEO/WRI 201: Methods in Data Analysis & Scientific Writing this past fall, I saw how this process works from an instructor’s standpoint. I’ve presented my own research before, but helping others present theirs taught me a bit more about the process. Here are some tips I learned that may help you with your next research presentation:

More is more

In general, your presentation will always benefit from more practice, more feedback, and more revision. By practicing in front of friends, you can get comfortable with presenting your work while receiving feedback. It is hard to know how to revise your presentation if you never practice. If you are presenting to a general audience, getting feedback from someone outside of your discipline is crucial. Terms and ideas that seem intuitive to you may be completely foreign to someone else, and your well-crafted presentation could fall flat.

Less is more

Limit the scope of your presentation, the number of slides, and the text on each slide. In my experience, text works well for organizing slides, orienting the audience to key terms, and annotating important figures, not for explaining complex ideas. Having fewer slides is usually better as well. In general, about one slide per minute of presentation is an appropriate budget. Having too many slides is usually a sign that your topic is too broad.


Limit the scope of your presentation

Don’t present your paper. Presentations are usually around 10 minutes long. You will not have time to explain all of the research you did in a semester (or a year!) in such a short span of time. Instead, focus on the highlight(s). Identify a single compelling research question which your work addressed, and craft a succinct but complete narrative around it.

Craft a compelling research narrative

After identifying the focused research question, walk your audience through your research as if it were a story. Presentations with strong narrative arcs are clear, captivating, and compelling.

  • Introduction (exposition and rising action)

Orient the audience and draw them in by demonstrating the relevance and importance of your research story with strong global motive. Provide them with the necessary vocabulary and background knowledge to understand the plot of your story. Introduce the key studies (characters) relevant in your story and build tension and conflict with scholarly and data motive. By the end of your introduction, your audience should clearly understand your research question and be dying to know how you resolve the tension built through motive.


  • Methods (rising action)

The methods section should transition smoothly and logically from the introduction. Beware of presenting your methods as a boring, arc-killing list of ‘this is what I did.’ Focus on the details that set your story apart from the stories other people have already told. Keep the audience interested by clearly motivating your decisions based on your original research question or the tension built in your introduction.

  • Results (climax)

Less is usually more here. Only present results which are clearly related to the focused research question you are presenting. Make sure you explain the results clearly so that your audience understands what your research found. This is the peak of tension in your narrative arc, so don’t undercut it by quickly clicking through to your discussion.

  • Discussion (falling action)

By now your audience should be dying for a satisfying resolution. Here is where you contextualize your results and begin resolving the tension between your findings and past research. Be thorough. If you have too many conflicts left unresolved, or you don’t have enough time to present all of the resolutions, you probably need to further narrow the scope of your presentation.

  • Conclusion (denouement)

Return to your initial research question and motive, resolving any final conflicts and tying up loose ends. Leave the audience with a clear resolution of your focused research question, and use unresolved tension to set up potential sequels (i.e. further research).

Use your medium to enhance the narrative

Visual presentations should be dominated by clear, intentional graphics. Subtle animation in key moments (usually during the results or discussion) can add drama to the narrative arc and make conflict resolutions more satisfying. You are narrating a story written in images, videos, cartoons, and graphs. While your paper is mostly text, with graphics to highlight crucial points, your slides should be the opposite. Adapting to the new medium may require you to create or acquire far more graphics than you included in your paper, but it is necessary to create an engaging presentation.

The most important thing you can do for your presentation is to practice and revise. Bother your friends, your roommates, TAs, anybody who will sit down and listen to your work. Beyond that, think about presentations you have found compelling and try to incorporate some of those elements into your own. Remember you want your work to be comprehensible; you aren’t creating experts in 10 minutes. Above all, try to stay passionate about what you did and why. You put the time in, so show your audience that it’s worth it.

For more insight into research presentations, check out these past PCUR posts written by Emma and Ellie.

— Alec Getraer, Natural Sciences Correspondent


Guide to Research Methods

About the guide

This guide will

  • Introduce you to a range of research methods
  • Help you think about the value and limitations of different research methods
  • Identify when to use alternative research methods

You should use the guide

  • After or while you establish your research questions (see the Guide to Research Questions)
  • When you are completing your Research Design Framework
  • When you are thinking about who you want to talk to and why (see the Guide to Sampling)

You should print or read this guide

These slides are set up so that they can be printed back to back (two/four sided) to give:

  • A shorthand overview of when to use each method
  • A summary of each method, what it’s good for and its limitations (linking to other slides in this pack)

Choosing research methods

When you need to think about which method is best in theory and in practice

Choosing Research Methods

Providing a rationale for the methods you choose to use and how you employ them.

  • What are your research goals? If you are looking to influence experts or policy makers, quantitative approaches will add weight to your findings. If you are looking to understand problems, inform innovation or develop a prototype, look at qualitative methods or user research
  • What are your research questions? If they begin with ‘explore’ or ‘what’, look at qualitative methods (talking). If they begin with ‘identify’ or ‘why’, look at quantitative methods (see the Guide to Research Questions)
  • What research traditions exist? You may choose to follow or challenge them. Think about whether you want your research to be noted for its quality and robustness or creative approach and unique insights
  • What are your/your team’s skills? You may not be an expert in the most appropriate method, so consider asking other team members for help or commissioning the research externally
  • Who are your research participants? Think about your relationship to participants (especially if you are doing qualitative research) and how they will respond to you and the method. Consider whether they are often consulted or surveyed, and whether it would be helpful or unhelpful to stay within their comfort zone

Using online tools

When you need to decide which tools to use for research

What to think about when choosing a tool to conduct research

  • What’s the cost to research quality? Most tools are ‘freemium’: you can use a basic version for free, but they are designed to push you towards paying in order to do good research. Consider privacy settings, data access, storage and value for money. Free survey tools may offer no option to filter participants (for example, routing based on a yes/no answer), a ten-question limit and no custom branding. Mapping/visualisation outputs are often published online, and open source tools aren’t always user friendly
  • Start with user needs, understand the context and think about everyone. Consider what technology they have, how they will access the tool and what they need to do this. Do they have internet, data, time?
  • Be creative: Online tools may not be designed for research, but Google Forms, Trello, Workflowy and Slack are all valuable collaboration tools. Twitter and Facebook polls may increase participation in research. However, think about what they are missing, what they can’t do and pilot your analysis approach first
  • See what’s out there: This online sheet of Applied Social Research Guides and Resources includes a list of online tools for research and evaluation to test. Those widely used for your research method or sector are likely to be the best starting point. Some tools allow you to do research (see Tags for Twitter data capture), analyse it or present it in new ways (see Raw Graphs for data visualisation)

Contents: Methods summary

  • Structured Interviews: When you want to gain a broad range of perspectives about specific questions
  • Semi-Structured Interviews: When you want to gain in-depth insights about broad questions
  • Unstructured Interviews: When you want to gain in-depth insights about complex research topics
  • Telephone Interviews: A tool for when you want to interview people quickly and easily
  • Guerilla Interviews: When you want to carry out user research or explore general perspectives quickly
  • Contextual Interviews: When you want to understand actions and particular experiences in-depth and in context
  • Focus Groups: When you want to understand shared experiences and different perspectives
  • Participant Observation: When you want to ‘learn by doing’ or observe social interactions and behaviour
  • Ethnography: When you want to experience social practices, interactions and behaviour with minimal influence
  • Surveys: When you want to generate numerical data about the scale of people’s opinions and feelings
  • Mixed Methods: When one method cannot fully answer your main research question
  • User Research: When you want to learn about the behaviours and motivations of your target audience
  • Service Design Research: When you want to design a service to meet people’s needs
  • Content Analysis: When you want to understand public discourse through secondary or online data
  • Workshops: When you want to engage stakeholders in research, generate ideas or co-design solutions
  • Usability Tests: When you want to test prototypes or learn about problems with an existing service

Find out more

How to do good…

  • Applied social research: A curated online sheet of Applied Social Research Guides and Resources
  • Surveys: Guides to creating questions here and here, building on existing data/questions, and an analysis guide
  • Interviews: A nice overview here which includes how to structure an interview
  • User research: GDS for intro guides and the DisAmbiguity blog
  • Service design: This is Service Design Doing has great tools and formats for workshops

Inspiration for emerging research methods and creative formats for research

  • Ethnography and mixed methods presented well: Ikea At Home Report
  • User mapping techniques as a social research method NPC Report
  • User Research to understand domestic abuse experiences and the potential for technology Tech Vs Abuse
  • Using Twitter data for social research Demos
  • Data visualisation as a tool for research communication - Nesta data visualisation and Women’s Aid Map
  • Data journalism and data storytelling - Guardian reading the riots
  • An online game to shift perspective on a social problem - Financial Times Uber Story
  • Content analysis to map trends - Nesta analysed creative skills in job adverts
  • Issue mapping online - networks of websites and people on Twitter - Warwick University Issue Mapping

Structured Interviews

When you want to gain a broad range of perspectives about specific questions

Also consider: semi-structured interviews

A conversation with a set structure (a script of fixed questions) and a specific purpose. It can be a method for undertaking a survey and is sometimes called a ‘directed’ interview.

  • Asking standardised questions across many participants makes data easier to analyse and compare
  • Giving participants a clear guide about what you want to learn from them
  • Topics that would be too complex to capture in a questionnaire tick box/short response
  • Respondents with limited time, who want to consider responses in advance or do not want to write
  • The quality of the interview is less dependent on the interviewer and their rapport with the interviewee

Limitations (and how to avoid or what to consider instead)

  • The structure prevents participants from bringing in other ideas (consider semi-structured interviews)
  • Whilst quicker to conduct and analyse than semi-structured interviews, they are still resource intensive and only possible to do with limited numbers of people (consider questionnaires online - see surveys)

Semi-Structured Interviews

When you want to gain in-depth insights about broad questions

Also consider: participant observation, user research, focus groups

A conversation with a structure (a set of open questions) and a clear purpose. Also called directed interviews.

  • Exploring a range of perspectives on research questions, engaging experts and getting buy-in to research
  • Gaining in-depth insights about how people feel or interpret complex issues
  • Topics which are sensitive, difficult to express in writing or to articulate views about in a survey
  • Allowing participants to respond in their words, framing what they see as important

Limitations

  • Quality can depend on the interviewer’s skills, and questions can put people on the spot (consider setting topics in advance)
  • The set-up affects the quality of engagement and discussion (consider location, your relationship with the interviewee and whether you should do a face-to-face or Telephone/Online interview)
  • Time consuming to do, analyse and compare (consider Structured Interviews or Focus groups)
  • Can lack validity as evidence (consider Surveys)
  • Explores what people say, think and remember, not what they actually do (consider Participant Observation, contextual interviews or User Research) or shared perspectives (consider Focus groups)
  • Easy to provide too much structure and prevent open exploration of a topic (see unstructured interviews)

Unstructured Interviews

When you want to gain in-depth insights about complex research topics

Also consider: contextual interviews

A loosely structured open conversation guided by research topics (also called non-directed interviews)

  • Very exploratory research and broad research questions
  • Letting the participant guide the interview according to their priorities and views
  • In-depth and broad discussion about a person's expertise, experiences and opinions
Limitations

  • Participants can feel like they are not saying the ‘right’ thing (explain the technique and rationale well)
  • Whilst useful for expert interviews, an unstructured approach can give the impression that the interviewer is unprepared, lacks knowledge or that the research purpose is unclear (consider semi-structured interviews)
  • Interviews are longer, resource intensive and only smaller numbers are possible (consider focus groups)
  • Generates in-depth insights that are difficult to analyse and compare
  • A lack of structure can encourage participants to focus in-depth on one thing they are positive about or know very well (consider using desk research to inform the interview topics)

Guerilla Interviews

When you want to carry out user research or explore general perspectives quickly and easily

An ‘impromptu’ approach to interviewing, often talking to real people on the street or at a key site

  • Gaining immediate responses to a tool or design and insights into a problem
  • Informal method means participants can be more relaxed and open
  • Speaking to a lot of people, simply, quickly and cheaply about one key question
  • User research and user experience of interacting with digital products
Limitations

  • Speaking to people for convenience (users are available in a single place and time) introduces sample bias (but you can add more targeting and profiling of participants, see the Guide to Sampling)
  • The lack of formal structure can mean that you miss important questions or insights
  • Findings are often unreliable and not generalisable because they rely on a single type of user
  • Difficult to understand complexity or gain contextual insights

Telephone / online interviews

A tool for when you want to interview people quickly and easily

Telephone or Online interviews

A tool to conduct an interview (it is not a method in itself) which is not in person (face to face)

  • Conducting interviews without the costs of travel and meeting time (often shorter)
  • Expert and stakeholder interviews, when you already know the participant well or they are short of time
  • Taking notes and looking up information whilst interviewing is less disruptive than in person, easy to record
  • Sending informed consent information and interview questions in advance
Limitations

  • Can be difficult to undertake an engaging interview (hard to build rapport on the phone)
  • Often need to be shorter and put alongside other meetings

What method are you using?

  • Structured interviews: When you want to gain a broad range of perspectives about specific questions
  • Semi-structured interviews: When you want to gain in-depth insights about broad questions
  • Unstructured interviews: When you want to gain in-depth insights about complex research topics

Further guides to Interviews: A nice overview here, including how to structure an interview

Contextual Interview

When you want to understand actions and particular experiences in-depth and in context

Also consider: ethnography

Interviews conducted with people in a situational context relevant to the research question; also known as contextual inquiry.

  • Understanding what happens, experiences and emotions whilst interacting with a tool, service or event.
  • Easier for research participants to show rather than explain; participants are active and engaged
  • Uncovers what happens, what people do and how they behave in the moment, rather than how they remember this and give meaning to it later
  • Open and flexible method giving depth of insights about a tool or specific interaction
Limitations

  • Time and resource intensive for the researcher
  • Each context is unique, making it difficult to generalise from or to answer broader research questions about experiences (consider semi-structured interviews)
  • The researcher influences the interactions and events (consider ethnography or participant observation)

Focus Groups

When you want to understand shared experiences and different perspectives

An organised discussion with a group of participants, led by a facilitator around a few key topics

  • Gaining several perspectives about the same topic quickly
  • Research contexts and topics where familiarity between participants can generate discussion about similar experiences (or different ones) which may not arise in a one to one interview
  • When attitudes, feelings and beliefs are more likely to be revealed in social gatherings and interactions
  • Including tasks and creative methods to elicit views (e.g. shared ranking of importance of statements)
Limitations

  • Difficult to identify the individual view from the group view (consider semi-structured interviews)
  • Group dynamics will affect the conversation focus and participation levels of different members
  • The role of the moderator is very significant. Good levels of group leadership and interpersonal skill are required to moderate a group successfully.
  • The group set-up is an ‘artificial’ social setting and discussion (consider Participant Observation)

Participant observation

When you want to ‘learn by doing’ and observe social interactions and behaviour

Participant observation/ shadowing

The researcher immerses themselves in the lives of participants as an ‘observer’ of their behaviours, practices and interactions. A type of ethnography. The people being observed know about the research.

  • Understanding everyday behaviours, interactions and practice in the context that they occur
  • Gaining an intuitive understanding of what happens in practice and what this means for those involved
  • Allowing research participants to show you what they do, when they can’t describe and remember this well
  • Establishing topics for further investigation through more structured or focused research methods
Limitations

  • If explicit (shadowing, for example) the research situation is still ‘artificial’
  • Your audience may not respect it, and it can be difficult to generalise from (consider mixed methods)
  • The quality of the data is dependent on the researchers’ skills and relationships with participants

Ethnography

When you want to experience social practices, interactions and behaviour with minimal influence on what happens

The systematic study of a group of people or cultures to understand behaviours and interactions. The researcher becomes an ‘insider’. It is a way of presenting research findings, as well as a method, which can include participant observation, document analysis and visual methods.

  • When you need to be an ‘insider’ to fully access the research context (such as organisational cultures)
  • Presenting how everyday behaviours, interactions and practice occur in context
  • Gaining an in-depth knowledge of your research context, participants and social relationships
  • When little is known about a research context or topic
Limitations

  • If covert (at a conference or workplace, for example) it has implications for informed consent
  • If explicit (shadowing for example) the researcher’s presence can affect the interactions and findings

Example use case: Ikea At Home research study to understand how people feel about their home

Surveys

When you want to generate numerical data about the scale of people’s opinions and feelings

Also consider: mixed methods

A process of systematically collecting information from a large number of different people. Responses are summarised as statistics (online surveys automate this analysis for you); a minimal illustrative sketch of this kind of summary appears at the end of this section.

  • Targeting specific types of research participant and providing data about their views
  • If designed well, they can be quick, simple and non intrusive for research participants
  • Findings can have more credibility than other methods because of their breadth
  • Describing, measuring and understanding (a basic questionnaire)
  • Statistical analysis, modelling cause and effect (large scale survey designed to represent the population)
Limitations

  • Can raise more questions about what happens and why, and lack depth of insight (consider mixed methods)
  • Hard to design well, and require a lot of time upfront plus data skills to analyse the results
  • Low completion rates, and people can feel ‘over surveyed’ (consider incentives)
  • Assumes people will be honest and sufficiently aware of the research context to provide credible answers.

Further information: A great guide to creating questions here and here, build on existing data/questions here
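As a minimal, hypothetical sketch of the kind of summary a survey tool automates, the Python snippet below tallies invented responses to a single Likert-style question using pandas; the question wording, answer set and data are made up purely for illustration.

```python
import pandas as pd

# Hypothetical responses to one Likert-style question (invented for illustration)
responses = pd.Series(
    ["Agree", "Strongly agree", "Neutral", "Agree", "Disagree",
     "Agree", "Strongly agree", "Neutral", "Agree", "Agree"],
    name="Q1: The service met my needs",
)

# Frequency counts and percentages - the basic summary most survey tools produce
summary = pd.DataFrame({
    "count": responses.value_counts(),
    "percent": responses.value_counts(normalize=True).mul(100).round(1),
})
print(summary)
```

With real survey data the same idea scales to a whole response table, but the principle is unchanged: each question is reduced to counts and percentages before any deeper statistical analysis.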

Mixed Methods

When one research method cannot fully answer your main research question

Combining different methods to answer your research questions; this can be a mix of quantitative or qualitative methods, or both. It may mean working with different types of data or research designs, or being part of a research team (covering different research disciplines).

  • Overcoming the limitation of relying on a single research method or approach
  • Triangulating findings (i.e. using an additional method) can give them more validity
  • Accessing different types of research participants
  • A more holistic understanding about how, why and the extent to which something happens
  • Answering different types of research questions about frequency and perceptions
  • Giving findings more validity and influence because of the range of data and insights
Limitations

  • Requires a broader range of skills and more time to deliver, analyse and report on
  • Research design must have strong sequencing (when each method is used and analysed, and why) to make the most of a mixed methods approach - not always possible in a tight timescale or short research project

User Research

When you want to learn about people’s needs, behaviours and motivations for using a service

Also consider: service design, semi-structured interviews, usability testing

A research approach employed to understand users and their needs, motivations and behaviours, primarily to inform service design.

  • User-centered design processes which look to ensure services meet the needs of their audience
  • Gaining specific insights into how a person interacts with a digital tool or service
  • Exploring general needs, behaviours and motivations for a specific target group using a range of services
Limitations

  • Focus on a tool or service can prevent wider analysis, relevance and applicability
  • Research can lack credibility due to small numbers, set-up and documentation (often a highly specific focus)
  • Can overlook those who do not use a service, for a whole range of reasons

What method?

  • User research involves any method which looks at who users are, the problems they face, what they are trying to do and how they use existing services. This can create user personas, user journeys and user experience maps. It largely includes qualitative research methods.

Service Design Research

When you want to design a service to meet people’s needs (including planning, organising, infrastructure, communication and components)

A research approach employed in planning and organising the people, infrastructure, communication and material components of a service, in order to improve its quality and interaction.

  • Gaining a holistic picture of all components (infrastructure, people, organisations, culture) affecting how a person interacts with a service
  • Service design often begins with user research, but research participants include all those involved in delivering (not just using) a service, such as employees and stakeholders in an organisation; it also looks at the context and system which affect how a service works and its effectiveness

Content analysis

When you want to understand public discourse through secondary or online data

A systematic process of classifying and interpreting documents, text or images to analyse key discourses (their meaning) or to quantify patterns (such as word frequencies). This can be done manually or it can be automated (a minimal sketch of automated word counting follows this section).

  • Exploring the focus of messages, text or imagery and change over time
  • Secondary data sources, such as archives, online social media data (such as Tweets) and news articles
  • Gaining qualitative or quantitative insights about key messages

Limitations

  • Focuses only on public and documented interpretations of events and experiences
  • Documents are not exhaustive, and not all are accessible (or available online/freely)
  • Qualitative coding is time intensive to classify manually and reliant on researcher interpretation
  • Automated coding for key words can miss nuances, and it can be difficult to produce meaningful findings
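To illustrate the automated, quantitative end of content analysis mentioned above, here is a minimal Python sketch that counts word frequencies across a few documents. The sample texts and the tiny stopword list are invented for illustration; a real study would use a proper corpus and much fuller preprocessing.

```python
import re
from collections import Counter

# Hypothetical documents (e.g. tweets or news snippets gathered as secondary data)
documents = [
    "Funding for local services was cut again this year",
    "Volunteers stepped in after services were cut",
    "New funding announced for community services",
]

# A tiny illustrative stopword list; real analyses use much fuller lists
stopwords = {"for", "was", "this", "in", "after", "were", "new", "again"}

counts = Counter()
for doc in documents:
    tokens = re.findall(r"[a-z']+", doc.lower())  # simple tokenisation
    counts.update(t for t in tokens if t not in stopwords)

# The most frequent terms give a first, crude view of the discourse
print(counts.most_common(5))
```

This kind of frequency count is only a starting point; the limitations above still apply, since automated counting cannot interpret meaning or context on its own.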

Workshops

When you want to engage stakeholders in research, generate ideas or co-design solutions


A tool to undertake research. It is an interactive session, often taking a full day, in which research participants or stakeholders work intensively on an issue or question. The process can combine elements of qualitative research, brainstorming or problem solving.

  • Engaging stakeholders - building empathy with and understanding of research findings
  • Understanding problems or prototyping solutions, linked to user research and service design approaches
  • Participatory research, allowing participants to shape agendas and outcomes
  • Creative, collaborative and engaging activities to build rapport and understanding with participants
  • Participatory design, enabling participants to co-design solutions which work for them
Limitations

  • Highly dependent on the right people attending and on facilitation skills
  • Can be a lot of time and effort to coordinate a workshop effectively and analyse findings
  • The immersive and collaborative environment makes it difficult to document effectively
  • Collaborative solutions may duplicate existing problems or solutions

Usability Tests

When you want to test prototypes or learn about problems with an existing service

A user research method where you watch participants try to complete specific tasks using your service. Moderated testing involves interaction with the research participant, asking them to explain what they are doing, thinking and feeling. Unmoderated testing is completed alone by the participant.

  • Identify any usability issues with a digital service - for example, problems with the language or layout
  • Seeing if users understand what they need to do in order to complete designated tasks
  • Generating ideas to improve a prototype or an existing digital service
  • Assessing user experience

Limitations

  • Focus is not on ‘natural’ use (consider contextual interviews, participant observation, ethnography)
  • Data is about a specific design and interaction with a tool at that moment
  • Findings cannot be generalised or applied more broadly to understand users and behaviours


Education Standards

Radford University

Learning Domain: Social Work

Standard: Basic Research Methodology

  • Lesson 1: Introduction to Research
  • Lesson 2: Getting Started with Your Research Project
  • Lesson 3: Critical Information Literacy
  • Lesson 4: Paradigm, Theory, and Causality
  • Lesson 5: Research Questions
  • Lesson 6: Ethics
  • Lesson 7: Measurement in Quantitative Research
  • Lesson 8: Sampling in Quantitative Research
  • Lesson 9: Quantitative Research Designs
  • Lesson 10: Sampling in Qualitative Research
  • Lesson 11: Qualitative Measurement & Rigor
  • Lesson 12: Qualitative Design & Data Gathering

PowerPoint Slides: SOWK 621.01: Research I: Basic Research Methodology

The twelve lessons for SOWK 621.01: Research I: Basic Research Methodology as previously taught by Dr. Matthew DeCarlo at Radford University. Dr. DeCarlo and his team developed a complete package of materials that includes a textbook, ancillary materials, and a student workbook as part of a VIVA Open Course Grant.

The PowerPoint slides associated with the twelve lessons of the course, SOWK 621.01: Research I: Basic Research Methodology, as previously taught by Dr. Matthew DeCarlo at Radford University. 



SciSpace Resources

Here's What You Need to Understand About Research Methodology

Deeptanshu D


Research methodology involves a systematic and well-structured approach to conducting scholarly or scientific inquiries. Knowing the significance of research methodology and its different components is crucial as it serves as the basis for any study.

Typically, your research topic will start as a broad idea you want to investigate more thoroughly. Once you’ve identified a research problem and created research questions, you must choose the appropriate methodology and frameworks to address those questions effectively.

What is the definition of a research methodology?

Research methodology is the process or the way you intend to execute your study. The methodology section of a research paper outlines how you plan to conduct your study. It covers various steps such as data collection, statistical analysis, observation of participants, and other procedures involved in the research process.

The methods section should describe the process that will convert your idea into a study. Additionally, the outcomes of your process must provide valid and reliable results in line with the aims and objectives of your research. This rule of thumb holds regardless of whether your paper leans qualitative or quantitative.

Studying research methods used in related studies can provide helpful insights and direction for your own research. Now easily discover papers related to your topic on SciSpace and utilize our AI research assistant, Copilot, to quickly review the methodologies applied in different papers.

Analyze and understand research methodologies faster with SciSpace Copilot

The need for a good research methodology

While deciding on your approach to your research, the reasons and factors you weighed in choosing a particular problem and formulating a research topic need to be validated and explained. A research methodology helps you do exactly that. Moreover, a good research methodology lets you build your argument to validate your research work performed through various data collection methods, analytical methods, and other essential points.

Just imagine it as a strategy documented to provide an overview of what you intend to do.

While writing up or performing the research itself, you may drift towards something of little importance. In such a case, a research methodology helps you get back to your outlined plan of work.

A research methodology helps in keeping you accountable for your work. Additionally, it can help you evaluate whether your work is in sync with your original aims and objectives or not. Besides, a good research methodology enables you to navigate your research process smoothly and swiftly while providing effective planning to achieve your desired results.

What is the basic structure of a research methodology?

You should usually include the following aspects when deciding on the basic structure of your research methodology:

1. Your research procedure

Explain what research methods you’re going to use. Whether you intend to proceed with a quantitative approach, a qualitative approach, or a composite of both, you need to state that explicitly. The choice among the three depends on your research’s aim, objectives, and scope.

2. Provide the rationality behind your chosen approach

Based on logic and reason, let your readers know why you have chosen said research methodologies. Additionally, you have to build strong arguments supporting why your chosen research method is the best way to achieve the desired outcome.

3. Explain your mechanism

The mechanism encompasses the research methods or instruments you will use to develop your research methodology. It usually refers to your data collection methods. You can use interviews, surveys, physical questionnaires, etc., among the many available mechanisms, as research methodology instruments. The data collection method is determined by the type of research and whether the data is quantitative (numerical data) or qualitative (perceptions, morale, etc.). Moreover, you need to put logical reasoning behind choosing a particular instrument.

4. Significance of outcomes

The results will be available once you have finished experimenting. However, you should also explain how you plan to use the data to interpret the findings. This section also aids in understanding the problem from within, breaking it down into pieces, and viewing the research problem from various perspectives.

5. Reader’s advice

Anything that you feel must be explained to spread more awareness among readers and focus groups must be included and described in detail. You should not just specify your research methodology on the assumption that a reader is already aware of the topic.

All the relevant information that explains and simplifies your research paper must be included in the methodology section. If you are conducting your research in a non-traditional manner, give a logical justification and list its benefits.

6. Explain your sample space

Include information about the sample and sample space in the methodology section. The term "sample" refers to a smaller set of data that a researcher selects or chooses from a larger group of people or focus groups using a predetermined selection method. Let your readers know how you are going to distinguish between relevant and non-relevant samples. How you arrived at those exact numbers to back your research methodology, i.e. the sample size for each instrument, must be discussed thoroughly.

For example, if you are going to conduct a survey or interview, explain by what procedure you will select the interviewees (or the sample size in the case of surveys), and how exactly the interview or survey will be conducted.

7. Challenges and limitations

This part, which is frequently assumed to be unnecessary, is actually very important. The challenges and limitations that your chosen strategy inherently possesses must be specified, whatever type of research you are conducting.

The importance of a good research methodology

You must have observed that all research papers, dissertations, or theses carry a chapter entirely dedicated to research methodology. This section helps maintain your credibility as a careful interpreter of results rather than a manipulator.

A good research methodology always explains the procedure, data collection methods and techniques, aim, and scope of the research. In a research study, it leads to a well-organized, rationality-based approach, while a paper lacking it is often seen as messy or disorganized.

You should pay special attention to justifying your chosen research methodology. This becomes extremely important if you select an unconventional or unusual method of execution.

Curating and developing a strong, effective research methodology can assist you in addressing a variety of situations, such as:

  • When someone tries to duplicate or expand upon your research after a few years.
  • If a contradiction or conflict of facts occurs at a later time, it gives you the security you need to deal with these contradictions while still being able to defend your approach.
  • Gaining a tactical approach in getting your research completed in time. Just ensure you are using the right approach while drafting your research methodology, and it can help you achieve your desired outcomes. Additionally, it provides a better explanation and understanding of the research question itself.
  • Documenting the results so that the final outcome of the research stays as you intended it to be while starting.

Instruments you could use while writing a good research methodology

As a researcher, you must choose the tools or data collection methods that best fit the needs of your research. This decision has to be made wisely.

There exist many research tools that you can use to carry out your research process. These can be classified as:

a. Interviews (One-on-One or a Group)

An interview aimed to get your desired research outcomes can be undertaken in many different ways. For example, you can design your interview as structured, semi-structured, or unstructured. What sets them apart is the degree of formality in the questions. On the other hand, in a group interview, your aim should be to collect more opinions and group perceptions from the focus groups on a certain topic rather than looking out for some formal answers.

b. Surveys

In surveys, you are in better control if you specifically draft the questions you seek the response for. For example, you may choose to include free-style questions that can be answered descriptively, or you may provide a multiple-choice type response for questions. Besides, you can also opt for both, deciding what suits your research process and purpose better.

c. Sample Groups

Similar to the group interviews, here you can select a group of individuals and assign them a topic to discuss or let them freely express their opinions on it. You can simultaneously note down the answers and later draft them appropriately, deciding on the relevance of every response.

d. Observations

If your research domain is the humanities or sociology, observations are the best-proven method for drawing up your research methodology. Of course, you can study the spontaneous responses of participants to a situation, or conduct the observation in a more structured manner. A structured observation means putting the participants in a situation at a previously decided time and then studying their responses.

Of all the tools described above, it is you who should wisely choose the instruments and decide what’s the best fit for your research. You should not restrict yourself to a single method; a combination of a few instruments may be appropriate when drafting a good research methodology.

Types of research methodology

A research methodology exists in various forms. Depending upon their approach, whether centered around words, numbers, or both, methodologies are distinguished as qualitative, quantitative, or an amalgamation of both.

1. Qualitative research methodology

When a research methodology primarily focuses on words and textual data, then it is generally referred to as qualitative research methodology. This type is usually preferred among researchers when the aim and scope of the research are mainly theoretical and explanatory.

The instruments used are observations, interviews, and sample groups. You can use this methodology if you are trying to study human behavior or response in some situations. Generally, qualitative research methodology is widely used in sociology, psychology, and other related domains.

2. Quantitative research methodology

If your research is majorly centered on data, figures, and stats, then analyzing these numerical data is often referred to as quantitative research methodology. You can use quantitative research methodology if your research requires you to validate or justify the obtained results.

In quantitative methods, surveys, tests, experiments, and evaluations of existing databases can be advantageously used as instruments. If your research involves testing a hypothesis, use this methodology.
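As a minimal illustration of hypothesis testing with quantitative data, the Python sketch below compares two invented groups of scores using an independent-samples t-test from SciPy. The data and the 0.05 threshold are assumptions for demonstration only, not results from any actual study.

```python
from scipy import stats

# Hypothetical scores for two groups (e.g. control vs. intervention)
control = [72, 68, 75, 70, 69, 74, 71, 73]
intervention = [78, 80, 74, 79, 82, 77, 81, 76]

# Independent-samples t-test: is the difference in group means likely due to chance?
t_stat, p_value = stats.ttest_ind(control, intervention)

print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:  # conventional significance threshold
    print("Reject the null hypothesis: the groups differ")
else:
    print("No statistically significant difference detected")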

3. Amalgam methodology

As the name suggests, the amalgam methodology uses both quantitative and qualitative approaches. This methodology is used when a part of the research requires you to verify the facts and figures, whereas the other part demands you to discover the theoretical and explanatory nature of the research question.

The instruments for the amalgam methodology require you to conduct interviews and surveys, including tests and experiments. The outcome of this methodology can be insightful and valuable as it provides precise test results in line with theoretical explanations and reasoning.

The amalgam method makes your work both factual and rational at the same time.

Final words: How to decide which is the best research methodology?

If you have stayed attentive to the aims and scope of your research, you should already have an idea of which research methodology suits your work best.

Before deciding which research methodology answers your research question, you must invest significant time in reading and doing your homework. Consulting references that yield relevant results should be your first approach to establishing a research methodology.

Moreover, you should never refrain from exploring other options. Before setting your work in stone, weigh all the available options so you can explain why the research methodology you finally choose is more appropriate than the alternatives.

You should always go for a quantitative research methodology if your research requires gathering large amounts of data, figures, and statistics. This research methodology will provide you with results if your research paper involves the validation of some hypothesis.

Whereas, if you are looking for more explanations, reasons, opinions, and public perceptions around a theory, you should use qualitative research methodology. The choice of an appropriate research methodology ultimately depends on what you want to achieve through your research.

Frequently Asked Questions (FAQs) about Research Methodology

1. How to write a research methodology?

You can provide a separate section for research methodology in which you specify the methods and instruments used during the research, discuss how the results were analysed, provide relevant background information, and convey the research limitations.

2. What are the types of research methodology?

There are generally four types of research methodology:

  • Observation
  • Experimental
  • Derivational

3. What is the true meaning of research methodology?

The set of techniques or procedures followed to discover and analyze the information gathered to validate or justify a research outcome is generally called Research Methodology.

4. Where does the importance of research methodology lie?

Your research methodology directly reflects the validity of your research outcomes and how well-informed your research work is. Moreover, it can help future researchers cite or refer to your research if they plan to use a similar research methodology.


How To Write The Methodology Chapter

A plain-language explainer – with practical examples.


Overview: The Methodology Chapter

  • The purpose of the methodology chapter
  • Why you need to craft this chapter (really) well
  • How to write and structure the chapter
  • Methodology chapter example
  • Essential takeaways

What (exactly) is the methodology chapter?

The methodology chapter is where you outline the philosophical foundations of your research and detail the specific methodological choices you’ve made. In other words, the purpose of this chapter is to explain exactly how you designed your study and, just as importantly, why you made those choices.

Your methodology chapter should comprehensively describe and justify all the methodological decisions involved in your study. For instance, the research approach you took (qualitative, quantitative, or mixed methods), your sampling strategy (who you collected data from), how you gathered your data, and how you analysed it. If that sounds a bit daunting, don’t worry – we’ll walk you through all these methodological aspects in this post.


Why is the methodology chapter important?

The methodology chapter plays two important roles in your dissertation or thesis:

Firstly, it demonstrates your understanding of research theory, which is what earns you marks. A flawed research design or methodology would mean flawed results. So, this chapter is vital as it allows you to show the marker that you know what you’re doing and that your results are credible.

Secondly, the methodology chapter is what helps to make your study replicable. In other words, it allows other researchers to undertake your study using the same methodological approach, and compare their findings to yours. This is very important within academic research, as each study builds on previous studies.

The methodology chapter is also important in that it allows you to identify and discuss any methodological issues or problems you encountered (i.e., research limitations), and to explain how you mitigated the impacts of these.

Now, it’s important to understand that every research project has limitations, so acknowledge these openly and highlight your study’s value despite them. Doing so demonstrates your understanding of research design, which will earn you marks.


How to write up the methodology chapter

Before you start writing, it’s always a good idea to draw up a rough outline to guide your writing. Don’t just start writing without knowing what you’ll discuss where. If you do, you’ll likely end up with a disjointed, ill-flowing narrative. You’ll then waste a lot of time rewriting, trying to stitch all the pieces together. Do yourself a favour and start with the end in mind.

Section 1 – Introduction

As with all chapters in your dissertation or thesis, the methodology chapter should have a brief introduction. In this section, you should remind your readers what the focus of your study is, especially the research aims. As we’ve discussed many times on the blog, your methodology needs to align with your research aims, objectives and research questions. Therefore, it’s useful to frontload this component to remind the reader (and yourself!) what you’re trying to achieve.

The intro provides a roadmap to your methodology chapter

Section 2 – The Methodology

The next section of your chapter is where you’ll present the actual methodology. In this section, you need to detail and justify the key methodological choices you’ve made in a logical, intuitive fashion. Importantly, this is the heart of your methodology chapter, so you need to get specific – don’t hold back on the details here. This is not one of those “less is more” situations.

Let’s take a look at the most common components you’ll likely need to cover.

Methodological Choice #1 – Research Philosophy

Research philosophy refers to the underlying beliefs (i.e., the worldview) regarding how data about a phenomenon should be gathered, analysed and used. The research philosophy will serve as the core of your study and underpin all of the other research design choices, so it’s critically important that you understand which philosophy you’ll adopt and why you made that choice. If you’re not clear on this, take the time to get clarity before you make any further methodological choices.

While several research philosophies exist, two commonly adopted ones are positivism and interpretivism. These two sit roughly on opposite sides of the research philosophy spectrum.

Positivism states that the researcher can observe reality objectively and that there is only one reality, which exists independently of the observer. As a consequence, it is quite commonly the underlying research philosophy in quantitative studies and is oftentimes the assumed philosophy in the physical sciences.

Contrasted with this, interpretivism, which is often the underlying research philosophy in qualitative studies, assumes that the researcher performs a role in observing the world around them and that reality is unique to each observer. In other words, reality is observed subjectively.

These are just two philosophies (there are many more), but they demonstrate significantly different approaches to research and have a significant impact on all the methodological choices. Therefore, it’s vital that you clearly outline and justify your research philosophy at the beginning of your methodology chapter, as it sets the scene for everything that follows.


Methodological Choice #2 – Research Type

The next thing you would typically discuss in your methodology section is the research type. The starting point for this is to indicate whether the research you conducted is inductive or deductive.

Inductive research takes a bottom-up approach, where the researcher begins with specific observations or data and then draws general conclusions or theories from those observations. Therefore these studies tend to be exploratory in terms of approach.

Conversely, deductive research takes a top-down approach, where the researcher starts with a theory or hypothesis and then tests it using specific observations or data. Therefore these studies tend to be confirmatory in approach.

Related to this, you’ll need to indicate whether your study adopts a qualitative, quantitative or mixed approach. As we’ve mentioned, there’s a strong link between this choice and your research philosophy, so make sure that your choices are tightly aligned. When you write this section up, remember to clearly justify your choices, as they form the foundation of your study.

Methodological Choice #3 – Research Strategy

Next, you’ll need to discuss your research strategy (also referred to as a research design). This methodological choice refers to the broader strategy in terms of how you’ll conduct your research, based on the aims of your study.

Several research strategies exist, including experimental, case studies, ethnography, grounded theory, action research, and phenomenology. Let’s take a look at two of these, experimental and ethnographic, to see how they contrast.

Experimental research makes use of the scientific method, where one group is the control group (in which no variables are manipulated) and another is the experimental group (in which a specific variable is manipulated). This type of research is undertaken under strict conditions in a controlled, artificial environment (e.g., a laboratory). By having firm control over the environment, experimental research typically allows the researcher to establish causation between variables. Therefore, it can be a good choice if you have research aims that involve identifying causal relationships.

Ethnographic research, on the other hand, involves observing and capturing the experiences and perceptions of participants in their natural environment (for example, at home or in the office). In other words, in an uncontrolled environment. Naturally, this means that this research strategy would be far less suitable if your research aims involve identifying causation, but it would be very valuable if you’re looking to explore and examine a group culture, for example.

Methodological Choice #4 – Time Horizon

The next thing you’ll need to detail in your methodology chapter is the time horizon. There are two options here: cross-sectional and longitudinal. In other words, whether the data for your study were all collected at one point in time (cross-sectional) or at multiple points in time (longitudinal).

The choice you make here depends again on your research aims, objectives and research questions. If, for example, you aim to assess how a specific group of people’s perspectives regarding a topic change over time, you’d likely adopt a longitudinal time horizon.

Another important factor to consider is simply whether you have the time necessary to adopt a longitudinal approach (which could involve collecting data over multiple months or even years). Oftentimes, the time pressures of your degree program will force your hand into adopting a cross-sectional time horizon, so keep this in mind.

Methodological Choice #5 – Sampling Strategy

Next, you’ll need to discuss your sampling strategy. There are two main categories of sampling: probability and non-probability sampling.

Probability sampling involves a random (and therefore representative) selection of participants from a population, whereas non-probability sampling entails selecting participants in a non-random (and therefore non-representative) manner. For example, selecting participants based on ease of access (this is called a convenience sample).
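The distinction is easy to see in a small, hypothetical Python sketch: a probability sample draws participants at random from the whole sampling frame, while a convenience sample simply takes whoever is easiest to reach. The population list and sample size below are invented for illustration.

```python
import random

# Hypothetical sampling frame: 500 service users identified only by an ID
population = [f"user_{i}" for i in range(500)]

random.seed(42)  # fixed seed so the example is reproducible

# Probability sampling: every member has an equal chance of being selected
probability_sample = random.sample(population, k=20)

# Non-probability (convenience) sampling: e.g. the first 20 people who respond
convenience_sample = population[:20]

print(probability_sample[:5])
print(convenience_sample[:5])
```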

The right sampling approach depends largely on what you’re trying to achieve in your study. Specifically, whether you are trying to develop findings that are generalisable to a population or not. Practicalities and resource constraints also play a large role here, as it can oftentimes be challenging to gain access to a truly random sample. In the video below, we explore some of the most common sampling strategies: https://www.youtube.com/watch?v=fSmedyVv-Us (Sampling Methods 101: Probability & Non-Probability Sampling Explained Simply)

Methodological Choice #6 – Data Collection Method

Next up, you’ll need to explain how you’ll go about collecting the necessary data for your study. Your data collection method (or methods) will depend on the type of data that you plan to collect – in other words, qualitative or quantitative data.

Typically, quantitative research relies on surveys, data generated by lab equipment, analytics software or existing datasets. Qualitative research, on the other hand, often makes use of collection methods such as interviews, focus groups, participant observations, and ethnography.

So, as you can see, there is a tight link between this section and the design choices you outlined in earlier sections. Strong alignment between these sections, as well as with your research aims and questions, is therefore very important.

Methodological Choice #7 – Data Analysis Methods/Techniques

The final major methodological choice that you need to address is that of analysis techniques. In other words, how you’ll go about analysing your data once you’ve collected it. Here it’s important to be very specific about your analysis methods and/or techniques – don’t leave any room for interpretation. Also, as with all choices in this chapter, you need to justify each choice you make.
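For instance, being specific means naming the exact technique rather than writing “statistical tests were performed”. Below is a minimal, hypothetical sketch of one such named technique; the data are simulated purely for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Simulated post-test scores for two hypothetical groups
group_a = rng.normal(loc=72, scale=8, size=40)
group_b = rng.normal(loc=68, scale=8, size=40)

# Specific, named technique: a two-sided independent-samples t-test
# without assuming equal variances (Welch's t-test)
t_stat, p_value = stats.ttest_ind(group_a, group_b, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```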

Section 3 – The Methodological Limitations

With the key methodological choices outlined and justified, the next step is to discuss the limitations of your design. No research methodology is perfect – there will always be trade-offs between the “ideal” methodology and what’s practical and viable, given your constraints. Therefore, this section of your methodology chapter is where you’ll discuss the trade-offs you had to make, and why these were justified given the context.

Methodological limitations can vary greatly from study to study, ranging from common issues such as time and budget constraints to issues of sample or selection bias. For example, you may find that you didn’t manage to draw in enough respondents to achieve the desired sample size (and therefore, statistically significant results), or your sample may be skewed heavily towards a certain demographic, thereby negatively impacting representativeness.

In this section, it’s important to be critical of the shortcomings of your study. There’s no use trying to hide them (your marker will be aware of them regardless). By being critical, you’ll demonstrate to your marker that you have a strong understanding of research theory, so don’t be shy here. At the same time, don’t beat your study to death. State the limitations, why these were justified, how you mitigated their impact as far as possible, and how your study still provides value despite these limitations.

Section 4 – Concluding Summary

Finally, it’s time to wrap up the methodology chapter with a brief concluding summary. In this section, you’ll want to concisely summarise what you’ve presented in the chapter. Here, it can be a good idea to use a figure to summarise the key decisions, especially if your university recommends using a specific model (for example, Saunders’ Research Onion).


Wrapping Up

Also, remember the golden rule of the methodology chapter – justify every choice! Make sure that you clearly explain the “why” for every “what”, and reference credible methodology textbooks or academic sources to back up your justifications.


Ten simple rules for effective presentation slides

Kristen M. Naegle

Affiliation: Biomedical Engineering and the Center for Public Health Genomics, University of Virginia, Charlottesville, Virginia, United States of America

Published: December 2, 2021

Citation: Naegle KM (2021) Ten simple rules for effective presentation slides. PLoS Comput Biol 17(12): e1009554. https://doi.org/10.1371/journal.pcbi.1009554

Copyright: © 2021 Kristen M. Naegle. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Funding: The author received no specific funding for this work.

Competing interests: The author has declared no competing interests exist.

Introduction

The “presentation slide” is the building block of all academic presentations, whether they are journal clubs, thesis committee meetings, short conference talks, or hour-long seminars. A slide is a single page projected on a screen, usually built on the premise of a title, body, and figures or tables and includes both what is shown and what is spoken about that slide. Multiple slides are strung together to tell the larger story of the presentation. While there have been excellent 10 simple rules on giving entire presentations [ 1 , 2 ], there was an absence in the fine details of how to design a slide for optimal effect—such as the design elements that allow slides to convey meaningful information, to keep the audience engaged and informed, and to deliver the information intended and in the time frame allowed. As all research presentations seek to teach, effective slide design borrows from the same principles as effective teaching, including the consideration of cognitive processing your audience is relying on to organize, process, and retain information. This is written for anyone who needs to prepare slides from any length scale and for most purposes of conveying research to broad audiences. The rules are broken into 3 primary areas. Rules 1 to 5 are about optimizing the scope of each slide. Rules 6 to 8 are about principles around designing elements of the slide. Rules 9 to 10 are about preparing for your presentation, with the slides as the central focus of that preparation.

Rule 1: Include only one idea per slide

Each slide should have one central objective to deliver—the main idea or question [ 3 – 5 ]. Often, this means breaking complex ideas down into manageable pieces (see Fig 1 , where “background” information has been split into 2 key concepts). In another example, if you are presenting a complex computational approach in a large flow diagram, introduce it in smaller units, building it up until you finish with the entire diagram. The progressive buildup of complex information means that audiences are prepared to understand the whole picture, once you have dedicated time to each of the parts. You can accomplish the buildup of components in several ways—for example, using presentation software to cover/uncover information. Personally, I choose to create separate slides for each piece of information content I introduce—where the final slide has the entire diagram, and I use cropping or a cover on duplicated slides that come before to hide what I’m not yet ready to include. I use this method in order to ensure that each slide in my deck truly presents one specific idea (the new content) and the amount of the new information on that slide can be described in 1 minute (Rule 2), but it comes with the trade-off—a change to the format of one of the slides in the series often means changes to all slides.

Fig 1. Top left: A background slide that describes the background material on a project from my lab. The slide was created using a PowerPoint Design Template, which had to be modified to increase default text sizes for this figure (i.e., the default text sizes are even worse than shown here). Bottom row: The 2 new slides that break up the content into 2 explicit ideas about the background, using a central graphic. In the first slide, the graphic is an explicit example of the SH2 domain of PI3-kinase interacting with a phosphorylation site (Y754) on the PDGFR to describe the important details of what an SH2 domain and phosphotyrosine ligand are and how they interact. I use that same graphic in the second slide to generalize all binding events and include redundant text to drive home the central message (a lot of possible interactions might occur in the human proteome, more than we can currently measure). Top right highlights which rules were used to move from the original slide to the new slide. Specific changes as highlighted by Rule 7 include increasing contrast by changing the background color, increasing font size, changing to sans serif fonts, and removing all capital text and underlining (using bold to draw attention). PDGFR, platelet-derived growth factor receptor. https://doi.org/10.1371/journal.pcbi.1009554.g001

Rule 2: Spend only 1 minute per slide

When you present your slide in the talk, it should take 1 minute or less to discuss. This rule is really helpful for planning purposes—a 20-minute presentation should have somewhere around 20 slides. Also, frequently giving your audience new information to feast on helps keep them engaged. During practice, if you find yourself spending more than a minute on a slide, there’s too much for that one slide—it’s time to break up the content into multiple slides or even remove information that is not wholly central to the story you are trying to tell. Reduce, reduce, reduce, until you get to a single message, clearly described, which takes less than 1 minute to present.

Rule 3: Make use of your heading

When each slide conveys only one message, use the heading of that slide to write exactly the message you are trying to deliver. Instead of titling the slide “Results,” try “CTNND1 is central to metastasis” or “False-positive rates are highly sample specific.” Use this landmark signpost to ensure that all the content on that slide is related exactly to the heading and only the heading. Think of the slide heading as the introductory or concluding sentence of a paragraph and the slide content the rest of the paragraph that supports the main point of the paragraph. An audience member should be able to follow along with you in the “paragraph” and come to the same conclusion sentence as your header at the end of the slide.

Rule 4: Include only essential points

While you are speaking, audience members’ eyes and minds will be wandering over your slide. If you have a comment, detail, or figure on a slide, have a plan to explicitly identify and talk about it. If you don’t think it’s important enough to spend time on, then don’t have it on your slide. This is especially important when faculty are present. I often tell students that thesis committee members are like cats: If you put a shiny bauble in front of them, they’ll go after it. Be sure to only put the shiny baubles on slides that you want them to focus on. Putting together a thesis meeting for only faculty is really an exercise in herding cats (if you have cats, you know this is no easy feat). Clear and concise slide design will go a long way in helping you corral those easily distracted faculty members.

Rule 5: Give credit, where credit is due

An exception to Rule 4 is to include proper citations or references to work on your slide. When adding citations, names of other researchers, or other types of credit, use a consistent style and method for adding this information to your slides. Your audience will then be able to easily partition this information from the other content. A common mistake people make is to think “I’ll add that reference later,” but I highly recommend you put the proper reference on the slide at the time you make it, before you forget where it came from. Finally, in certain kinds of presentations, credits can make it clear who did the work. For the faculty members heading labs, it is an effective way to connect your audience with the personnel in the lab who did the work, which is a great career booster for that person. For graduate students, it is an effective way to delineate your contribution to the work, especially in meetings where the goal is to establish your credentials for meeting the rigors of a PhD checkpoint.

Rule 6: Use graphics effectively

As a rule, you should almost never have slides that only contain text. Build your slides around good visualizations. It is a visual presentation after all, and as they say, a picture is worth a thousand words. However, on the flip side, don’t muddy the point of the slide by putting too many complex graphics on a single slide. A multipanel figure that you might include in a manuscript should often be broken into 1 panel per slide (see Rule 1 ). One way to ensure that you use the graphics effectively is to make a point to introduce the figure and its elements to the audience verbally, especially for data figures. For example, you might say the following: “This graph here shows the measured false-positive rate for an experiment and each point is a replicate of the experiment, the graph demonstrates …” If you have put too much on one slide to present in 1 minute (see Rule 2 ), then the complexity or number of the visualizations is too much for just one slide.

Rule 7: Design to avoid cognitive overload

The type of slide elements, the number of them, and how you present them all impact the ability for the audience to intake, organize, and remember the content. For example, a frequent mistake in slide design is to include full sentences, but reading and verbal processing use the same cognitive channels—therefore, an audience member can either read the slide, listen to you, or do some part of both (each poorly), as a result of cognitive overload [ 4 ]. The visual channel is separate, allowing images/videos to be processed with auditory information without cognitive overload [ 6 ] (Rule 6). As presentations are an exercise in listening, and not reading, do what you can to optimize the ability of the audience to listen. Use words sparingly as “guide posts” to you and the audience about major points of the slide. In fact, you can add short text fragments, redundant with the verbal component of the presentation, which has been shown to improve retention [ 7 ] (see Fig 1 for an example of redundant text that avoids cognitive overload). Be careful in the selection of a slide template to minimize accidentally adding elements that the audience must process, but are unimportant. David JP Phillips argues (and effectively demonstrates in his TEDx talk [ 5 ]) that the human brain can easily interpret 6 elements and more than that requires a 500% increase in human cognition load—so keep the total number of elements on the slide to 6 or less. Finally, in addition to the use of short text, white space, and the effective use of graphics/images, you can improve ease of cognitive processing further by considering color choices and font type and size. Here are a few suggestions for improving the experience for your audience, highlighting the importance of these elements for some specific groups:

  • Use high contrast colors and simple backgrounds with low to no color—for persons with dyslexia or visual impairment.
  • Use sans serif fonts and large font sizes (including figure legends), avoid italics, underlining (use bold font instead for emphasis), and all capital letters—for persons with dyslexia or visual impairment [ 8 ].
  • Use color combinations and palettes that can be understood by those with different forms of color blindness [ 9 ]. There are excellent tools available to identify colors to use and ways to simulate your presentation or figures as they might be seen by a person with color blindness (easily found by a web search); a short plotting sketch follows this list.
  • In this increasingly virtual world of presentation tools, consider practicing your talk with a closed captioning system to capture your words. Use this to identify how to improve your speaking pace, volume, and enunciation to improve understanding by all members of your audience, but especially those with a hearing impairment.
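As a rough illustration of a few of these points, the sketch below combines a color-blind-friendly palette with large sans serif labels. It assumes matplotlib with its bundled "tableau-colorblind10" style sheet, and the bar heights are invented purely for illustration.

```python
import matplotlib.pyplot as plt
import numpy as np

# A color-blind-friendly style sheet shipped with matplotlib
plt.style.use("tableau-colorblind10")

x = np.arange(5)
fig, ax = plt.subplots(figsize=(6, 3))
ax.bar(x - 0.2, [3, 5, 2, 6, 4], width=0.4, label="Control")
ax.bar(x + 0.2, [4, 6, 3, 7, 5], width=0.4, label="Treatment")

# Large sans serif labels and a plain background keep cognitive load low
ax.set_xticks(x)
ax.set_xlabel("Condition", fontsize=16)
ax.set_ylabel("Response", fontsize=16)
ax.legend(fontsize=14, frameon=False)
fig.tight_layout()
fig.savefig("slide_figure.png", dpi=200)
```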

Rule 8: Design the slide so that a distracted person gets the main takeaway

It is very difficult to stay focused on a presentation, especially if it is long or if it is part of a longer series of talks at a conference. Audience members may get distracted by an important email, or they may start dreaming of lunch. So, it’s important to look at your slide and ask “If they heard nothing I said, will they understand the key concept of this slide?” The other rules are set up to help with this, including clarity of the single point of the slide (Rule 1), titling it with a major conclusion (Rule 3), and the use of figures (Rule 6) and short text redundant to your verbal description (Rule 7). However, with each slide, step back and ask whether its main conclusion is conveyed, even if someone didn’t hear your accompanying dialog. Importantly, ask if the information on the slide is at the right level of abstraction. For example, do you have too many details about the experiment, which hides the conclusion of the experiment (i.e., breaking Rule 1)? If you are worried about not having enough details, keep a slide at the end of your slide deck (after your conclusions and acknowledgments) with the more detailed information that you can refer to during a question and answer period.

Rule 9: Iteratively improve slide design through practice

Well-designed slides that follow the first 8 rules are intended to help you deliver the message you intend and in the amount of time you intend to deliver it in. The best way to ensure that you nailed slide design for your presentation is to practice, typically a lot. The most important aspects of practicing a new presentation, with an eye toward slide design, are the following 2 key points: (1) practice to ensure that you hit, each time through, the most important points (for example, the text guide posts you left yourself and the title of the slide); and (2) practice to ensure that as you conclude the end of one slide, it leads directly to the next slide. Slide transitions, what you say as you end one slide and begin the next, are important to keeping the flow of the “story.” Practice is when I discover that the order of my presentation is poor or that I left myself too few guideposts to remember what was coming next. Additionally, during practice, the most frequent things I have to improve relate to Rule 2 (the slide takes too long to present, usually because I broke Rule 1, and I’m delivering too much information for one slide), Rule 4 (I have a nonessential detail on the slide), and Rule 5 (I forgot to give a key reference). The very best type of practice is in front of an audience (for example, your lab or peers), where, with fresh perspectives, they can help you identify places for improving slide content, design, and connections across the entirety of your talk.

Rule 10: Design to mitigate the impact of technical disasters

The real presentation almost never goes as we planned in our heads or during our practice. Maybe the speaker before you went over time and now you need to adjust. Maybe the computer the organizer is having you use won’t show your video. Maybe your internet is poor on the day you are giving a virtual presentation at a conference. Technical problems are routinely part of the practice of sharing your work through presentations. Hence, you can design your slides to limit the impact certain kinds of technical disasters create and also prepare alternate approaches. Here are just a few examples of the preparation you can do that will take you a long way toward avoiding a complete fiasco:

  • Save your presentation as a PDF—if the version of Keynote or PowerPoint on a host computer causes issues, you still have a functional copy that has a higher guarantee of compatibility.
  • In using videos, create a backup slide with screen shots of key results. For example, if I have a video of cell migration, I’ll be sure to have a copy of the start and end of the video, in case the video doesn’t play. Even if the video worked, you can pause on this backup slide and take the time to highlight the key results in words if someone could not see or understand the video.
  • Avoid animations, such as figures or text that flash/fly-in/etc. Surveys suggest that no one likes movement in presentations [ 3 , 4 ]. There is likely a cognitive underpinning to the almost universal distaste of pointless animations that relates to the idea proposed by Kosslyn and colleagues that animations are salient perceptual units that capture direct attention [ 4 ]. Although perceptual salience can be used to draw attention to and improve retention of specific points, if you use this approach for unnecessary/unimportant things (like animation of your bullet point text, fly-ins of figures, etc.), then you will distract your audience from the important content. Finally, animations cause additional processing burdens for people with visual impairments [ 10 ] and create opportunities for technical disasters if the software on the host system is not compatible with your planned animation.

Conclusions

These rules are just a start in creating more engaging presentations that increase audience retention of your material. However, there are wonderful resources on continuing on the journey of becoming an amazing public speaker, which includes understanding the psychology and neuroscience behind human perception and learning. For example, as highlighted in Rule 7, David JP Phillips has a wonderful TEDx talk on the subject [ 5 ], and “PowerPoint presentation flaws and failures: A psychological analysis,” by Kosslyn and colleagues is deeply detailed about a number of aspects of human cognition and presentation style [ 4 ]. There are many books on the topic, including the popular “Presentation Zen” by Garr Reynolds [ 11 ]. Finally, although briefly touched on here, the visualization of data is an entire topic of its own that is worth perfecting for both written and oral presentations of work, with fantastic resources like Edward Tufte’s “The Visual Display of Quantitative Information” [ 12 ] or the article “Visualization of Biomedical Data” by O’Donoghue and colleagues [ 13 ].

Acknowledgments

I would like to thank the countless presenters, colleagues, students, and mentors from whom I have learned a great deal about effective presentations. Also, a thank you to the wonderful resources published by organizations on how to increase inclusivity. A special thanks to Dr. Jason Papin and Dr. Michael Guertin for early feedback on this editorial.

References

  • 3. Vanderbilt University Center for Teaching. Making Better PowerPoint Presentations. n.d. Available from: https://cft.vanderbilt.edu/guides-sub-pages/making-better-powerpoint-presentations/#baddeley
  • 8. Creating a dyslexia friendly workplace: Dyslexia friendly style guide. n.d. Available from: https://www.bdadyslexia.org.uk/advice/employers/creating-a-dyslexia-friendly-workplace/dyslexia-friendly-style-guide
  • 9. Cravit R. How to Use Color Blind Friendly Palettes to Make Your Charts Accessible. 2019. Available from: https://venngage.com/blog/color-blind-friendly-palette/
  • 10. Making your conference presentation more accessible to blind and partially sighted people. n.d. Available from: https://vocaleyes.co.uk/services/resources/guidelines-for-making-your-conference-presentation-more-accessible-to-blind-and-partially-sighted-people/
  • 11. Reynolds G. Presentation Zen: Simple Ideas on Presentation Design and Delivery. 2nd ed. New Riders Pub; 2011.
  • 12. Tufte ER. The Visual Display of Quantitative Information. 2nd ed. Graphics Press; 2001.


Organizing Your Social Sciences Research Paper – 6. The Methodology (USC Libraries Research Guides)

The methods section describes actions taken to investigate a research problem and the rationale for the application of specific procedures or techniques used to identify, select, process, and analyze information applied to understanding the problem, thereby allowing the reader to critically evaluate a study’s overall validity and reliability. The methodology section of a research paper answers two main questions: How was the data collected or generated? And, how was it analyzed? The writing should be direct and precise and always written in the past tense.

Kallet, Richard H. "How to Write the Methods Section of a Research Paper." Respiratory Care 49 (October 2004): 1229-1232.

Importance of a Good Methodology Section

You must explain how you obtained and analyzed your results for the following reasons:

  • Readers need to know how the data was obtained because the method you chose affects the results and, by extension, how you interpreted their significance in the discussion section of your paper.
  • Methodology is crucial for any branch of scholarship because an unreliable method produces unreliable results and, as a consequence, undermines the value of your analysis of the findings.
  • In most cases, there are a variety of different methods you can choose to investigate a research problem. The methodology section of your paper should clearly articulate the reasons why you have chosen a particular procedure or technique.
  • The reader wants to know that the data was collected or generated in a way that is consistent with accepted practice in the field of study. For example, if you are using a multiple choice questionnaire, readers need to know that it offered your respondents a reasonable range of answers to choose from.
  • The method must be appropriate to fulfilling the overall aims of the study. For example, you need to ensure that you have a large enough sample size to be able to generalize and make recommendations based upon the findings.
  • The methodology should discuss the problems that were anticipated and the steps you took to prevent them from occurring. For any problems that do arise, you must describe the ways in which they were minimized or why these problems do not impact in any meaningful way your interpretation of the findings.
  • In the social and behavioral sciences, it is important to always provide sufficient information to allow other researchers to adopt or replicate your methodology. This information is particularly important when a new method has been developed or an innovative use of an existing method is utilized.

Bem, Daryl J. Writing the Empirical Journal Article. Psychology Writing Center. University of Washington; Denscombe, Martyn. The Good Research Guide: For Small-Scale Social Research Projects . 5th edition. Buckingham, UK: Open University Press, 2014; Lunenburg, Frederick C. Writing a Successful Thesis or Dissertation: Tips and Strategies for Students in the Social and Behavioral Sciences . Thousand Oaks, CA: Corwin Press, 2008.

Structure and Writing Style

I.  Groups of Research Methods

There are two main groups of research methods in the social sciences:

  • The empirical-analytical group approaches the study of social sciences in a similar manner to how researchers study the natural sciences. This type of research focuses on objective knowledge, research questions that can be answered yes or no, and operational definitions of variables to be measured. The empirical-analytical group employs deductive reasoning that uses existing theory as a foundation for formulating hypotheses that need to be tested. This approach is focused on explanation.
  • The interpretative group of methods is focused on understanding phenomena in a comprehensive, holistic way. Interpretive methods focus on analytically disclosing the meaning-making practices of human subjects [the why, how, or by what means people do what they do], while showing how those practices are arranged so that they can be used to generate observable outcomes. Interpretive methods allow you to recognize your connection to the phenomena under investigation. However, the interpretative group requires careful examination of variables because it focuses more on subjective knowledge.

II.  Content

The introduction to your methodology section should begin by restating the research problem and the underlying assumptions of your study. This is followed by situating the methods you used to gather, analyze, and process information within the overall “tradition” of your field of study and within the particular research design you have chosen to study the problem. If the method you choose lies outside of the tradition of your field [i.e., your review of the literature demonstrates that the method is not commonly used], provide a justification for how your choice of methods specifically addresses the research problem in ways that have not been utilized in prior studies.

The remainder of your methodology section should describe the following:

  • Decisions made in selecting the data you have analyzed or, in the case of qualitative research, the subjects and research setting you have examined,
  • Tools and methods used to identify and collect information, and how you identified relevant variables,
  • The ways in which you processed the data and the procedures you used to analyze that data, and
  • The specific research tools or strategies that you utilized to study the underlying hypothesis and research questions.

In addition, an effectively written methodology section should:

  • Introduce the overall methodological approach for investigating your research problem . Is your study qualitative or quantitative or a combination of both (mixed method)? Are you going to take a special approach, such as action research, or a more neutral stance?
  • Indicate how the approach fits the overall research design . Your methods for gathering data should have a clear connection to your research problem. In other words, make sure that your methods will actually address the problem. One of the most common deficiencies found in research papers is that the proposed methodology is not suitable to achieving the stated objective of your paper.
  • Describe the specific methods of data collection you are going to use , such as, surveys, interviews, questionnaires, observation, archival research. If you are analyzing existing data, such as a data set or archival documents, describe how it was originally created or gathered and by whom. Also be sure to explain how older data is still relevant to investigating the current research problem.
  • Explain how you intend to analyze your results . Will you use statistical analysis? Will you use specific theoretical perspectives to help you analyze a text or explain observed behaviors? Describe how you plan to obtain an accurate assessment of relationships, patterns, trends, distributions, and possible contradictions found in the data.
  • Provide background and a rationale for methodologies that are unfamiliar for your readers . Very often in the social sciences, research problems and the methods for investigating them require more explanation/rationale than widely accepted rules governing the natural and physical sciences. Be clear and concise in your explanation.
  • Provide a justification for subject selection and sampling procedure . For instance, if you propose to conduct interviews, how do you intend to select the sample population? If you are analyzing texts, which texts have you chosen, and why? If you are using statistics, why is this set of data being used? If other data sources exist, explain why the data you chose is most appropriate to addressing the research problem.
  • Provide a justification for case study selection . A common method of analyzing research problems in the social sciences is to analyze specific cases. These can be a person, place, event, phenomenon, or other type of subject of analysis that are either examined as a singular topic of in-depth investigation or multiple topics of investigation studied for the purpose of comparing or contrasting findings. In either method, you should explain why a case or cases were chosen and how they specifically relate to the research problem.
  • Describe potential limitations . Are there any practical limitations that could affect your data collection? How will you attempt to control for potential confounding variables and errors? If your methodology may lead to problems you can anticipate, state this openly and show why pursuing this methodology outweighs the risk of these problems cropping up.

NOTE: Once you have written all of the elements of the methods section, subsequent revisions should focus on how to present those elements as clearly and as logically as possible. The description of how you prepared to study the research problem, how you gathered the data, and the protocol for analyzing the data should be organized chronologically. For clarity, when a large amount of detail must be presented, information should be presented in sub-sections according to topic. If necessary, consider using appendices for raw data.

ANOTHER NOTE: If you are conducting a qualitative analysis of a research problem, the methodology section generally requires a more elaborate description of the methods used as well as an explanation of the processes applied to gathering and analyzing data than is generally required for studies using quantitative methods. Because you are the primary instrument for generating the data [e.g., through interviews or observations], the process for collecting that data has a significantly greater impact on producing the findings. Therefore, qualitative research requires a more detailed description of the methods used.

YET ANOTHER NOTE: If your study involves interviews, observations, or other qualitative techniques involving human subjects, you may be required to obtain approval from the university's Office for the Protection of Research Subjects before beginning your research. This is not a common procedure for most undergraduate level student research assignments. However, if your professor states you need approval, you must include a statement in your methods section that you received official endorsement and adequate informed consent from the office and that there was a clear assessment and minimization of risks to participants and to the university. This statement informs the reader that your study was conducted in an ethical and responsible manner. In some cases, the approval notice is included as an appendix to your paper.

III.  Problems to Avoid

Irrelevant Detail

The methodology section of your paper should be thorough but concise. Do not provide any background information that does not directly help the reader understand why a particular method was chosen, how the data was gathered or obtained, and how the data was analyzed in relation to the research problem [note: analyzed, not interpreted! Save how you interpreted the findings for the discussion section]. With this in mind, the page length of your methods section will generally be less than any other section of your paper except the conclusion.

Unnecessary Explanation of Basic Procedures

Remember that you are not writing a how-to guide about a particular method. You should make the assumption that readers possess a basic understanding of how to investigate the research problem on their own and, therefore, you do not have to go into great detail about specific methodological procedures. The focus should be on how you applied a method, not on the mechanics of doing a method. An exception to this rule is if you select an unconventional methodological approach; if this is the case, be sure to explain why this approach was chosen and how it enhances the overall process of discovery.

Problem Blindness

It is almost a given that you will encounter problems when collecting or generating your data, or that gaps will exist in existing data or archival materials. Do not ignore these problems or pretend they did not occur. Often, documenting how you overcame obstacles can form an interesting part of the methodology. It demonstrates to the reader that you can provide a cogent rationale for the decisions you made to minimize the impact of any problems that arose.

Literature Review

Just as the literature review section of your paper provides an overview of sources you have examined while researching a particular topic, the methodology section should cite any sources that informed your choice and application of a particular method [i.e., the choice of a survey should include any citations to the works you used to help construct the survey].

It’s More than Sources of Information!

A description of a research study's method should not be confused with a description of the sources of information. Such a list of sources is useful in and of itself, especially if it is accompanied by an explanation about the selection and use of the sources. The description of the project's methodology complements a list of sources in that it sets forth the organization and interpretation of information emanating from those sources.

Azevedo, L.F. et al. "How to Write a Scientific Paper: Writing the Methods Section." Revista Portuguesa de Pneumologia 17 (2011): 232-238; Blair Lorrie. “Choosing a Methodology.” In Writing a Graduate Thesis or Dissertation , Teaching Writing Series. (Rotterdam: Sense Publishers 2016), pp. 49-72; Butin, Dan W. The Education Dissertation A Guide for Practitioner Scholars . Thousand Oaks, CA: Corwin, 2010; Carter, Susan. Structuring Your Research Thesis . New York: Palgrave Macmillan, 2012; Kallet, Richard H. “How to Write the Methods Section of a Research Paper.” Respiratory Care 49 (October 2004):1229-1232; Lunenburg, Frederick C. Writing a Successful Thesis or Dissertation: Tips and Strategies for Students in the Social and Behavioral Sciences . Thousand Oaks, CA: Corwin Press, 2008. Methods Section. The Writer’s Handbook. Writing Center. University of Wisconsin, Madison; Rudestam, Kjell Erik and Rae R. Newton. “The Method Chapter: Describing Your Research Plan.” In Surviving Your Dissertation: A Comprehensive Guide to Content and Process . (Thousand Oaks, Sage Publications, 2015), pp. 87-115; What is Interpretive Research. Institute of Public and International Affairs, University of Utah; Writing the Experimental Report: Methods, Results, and Discussion. The Writing Lab and The OWL. Purdue University; Methods and Materials. The Structure, Format, Content, and Style of a Journal-Style Scientific Paper. Department of Biology. Bates College.

Writing Tip

Statistical Designs and Tests? Do Not Fear Them!

Don't avoid using a quantitative approach to analyzing your research problem just because you fear the idea of applying statistical designs and tests. A qualitative approach, such as conducting interviews or content analysis of archival texts, can yield exciting new insights about a research problem, but it should not be undertaken simply because you have a disdain for running a simple regression. A well designed quantitative research study can often be accomplished in very clear and direct ways, whereas a similar study of a qualitative nature usually requires considerable time to analyze large volumes of data and a tremendous burden to create new paths for analysis where previously no path associated with your research problem had existed.
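To show how approachable a "simple regression" can be in practice, here is a minimal sketch using statsmodels; the variables and data are entirely simulated for illustration.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
# Simulated data: hours of study vs. exam score (hypothetical example)
hours = rng.uniform(0, 10, size=100)
score = 55 + 3.5 * hours + rng.normal(0, 5, size=100)

# Ordinary least squares regression of score on hours, with an intercept
X = sm.add_constant(hours)
model = sm.OLS(score, X).fit()
print(model.summary())
```

Reading the coefficient table in the summary output is often all that a basic quantitative write-up requires.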


Another Writing Tip

Knowing the Relationship Between Theories and Methods

There can be multiple meanings associated with the term "theories" and the term "methods" in social sciences research. A helpful way to delineate between them is to understand "theories" as representing different ways of characterizing the social world when you research it and "methods" as representing different ways of generating and analyzing data about that social world. Framed in this way, all empirical social sciences research involves theories and methods, whether they are stated explicitly or not. However, while theories and methods are often related, it is important that, as a researcher, you deliberately separate them in order to avoid your theories playing a disproportionate role in shaping what outcomes your chosen methods produce.

Introspectively engage in an ongoing dialectic between the application of theories and methods to help enable you to use the outcomes from your methods to interrogate and develop new theories, or ways of framing conceptually the research problem. This is how scholarship grows and branches out into new intellectual territory.

Reynolds, R. Larry. Ways of Knowing. Alternative Microeconomics . Part 1, Chapter 3. Boise State University; The Theory-Method Relationship. S-Cool Revision. United Kingdom.

Yet Another Writing Tip

Methods and the Methodology

Do not confuse the terms "methods" and "methodology." As Schneider notes, a method refers to the technical steps taken to do research. Descriptions of methods usually include defining and stating why you have chosen specific techniques to investigate a research problem, followed by an outline of the procedures you used to systematically select, gather, and process the data [remember to always save the interpretation of data for the discussion section of your paper].

The methodology refers to a discussion of the underlying reasoning why particular methods were used. This discussion includes describing the theoretical concepts that inform the choice of methods to be applied, placing the choice of methods within the more general nature of academic work, and reviewing its relevance to examining the research problem. The methodology section also includes a thorough review of the methods other scholars have used to study the topic.

Bryman, Alan. "Of Methods and Methodology." Qualitative Research in Organizations and Management: An International Journal 3 (2008): 159-168; Schneider, Florian. “What's in a Methodology: The Difference between Method, Methodology, and Theory…and How to Get the Balance Right?” PoliticsEastAsia.com. Chinese Department, University of Leiden, Netherlands.

Open access | Published: 07 September 2020

A tutorial on methodological studies: the what, when, how and why

  • Lawrence Mbuagbaw   ORCID: orcid.org/0000-0001-5855-5461 1 , 2 , 3 ,
  • Daeria O. Lawson 1 ,
  • Livia Puljak 4 ,
  • David B. Allison 5 &
  • Lehana Thabane 1 , 2 , 6 , 7 , 8  

BMC Medical Research Methodology volume  20 , Article number:  226 ( 2020 ) Cite this article


Methodological studies – studies that evaluate the design, analysis or reporting of other research-related reports – play an important role in health research. They help to highlight issues in the conduct of research with the aim of improving health research methodology, and ultimately reducing research waste.

We provide an overview of some of the key aspects of methodological studies such as what they are, and when, how and why they are done. We adopt a “frequently asked questions” format to facilitate reading this paper and provide multiple examples to help guide researchers interested in conducting methodological studies. Some of the topics addressed include: is it necessary to publish a study protocol? How to select relevant research reports and databases for a methodological study? What approaches to data extraction and statistical analysis should be considered when conducting a methodological study? What are potential threats to validity and is there a way to appraise the quality of methodological studies?

Appropriate reflection and application of basic principles of epidemiology and biostatistics are required in the design and analysis of methodological studies. This paper provides an introduction for further discussion about the conduct of methodological studies.


The field of meta-research (or research-on-research) has proliferated in recent years in response to issues with research quality and conduct [ 1 , 2 , 3 ]. As the name suggests, this field targets issues with research design, conduct, analysis and reporting. Various types of research reports are often examined as the unit of analysis in these studies (e.g. abstracts, full manuscripts, trial registry entries). Like many other novel fields of research, meta-research has seen a proliferation of use before the development of reporting guidance. For example, this was the case with randomized trials for which risk of bias tools and reporting guidelines were only developed much later – after many trials had been published and noted to have limitations [ 4 , 5 ]; and for systematic reviews as well [ 6 , 7 , 8 ]. However, in the absence of formal guidance, studies that report on research differ substantially in how they are named, conducted and reported [ 9 , 10 ]. This creates challenges in identifying, summarizing and comparing them. In this tutorial paper, we will use the term methodological study to refer to any study that reports on the design, conduct, analysis or reporting of primary or secondary research-related reports (such as trial registry entries and conference abstracts).

In the past 10 years, there has been an increase in the use of terms related to methodological studies (based on records retrieved with a keyword search [in the title and abstract] for “methodological review” and “meta-epidemiological study” in PubMed up to December 2019), suggesting that these studies may be appearing more frequently in the literature. See Fig.  1 .

Fig. 1 Trends in the number of studies that mention “methodological review” or “meta-epidemiological study” in PubMed.
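A count like the one plotted in Fig. 1 can be approximated with NCBI's E-utilities. The sketch below assumes Biopython's Entrez module is available, uses a placeholder email address, and only approximates the search string described above.

```python
from Bio import Entrez

# NCBI asks that an email address accompany E-utilities requests (placeholder)
Entrez.email = "your.name@example.org"

term = ('"methodological review"[Title/Abstract] OR '
        '"meta-epidemiological study"[Title/Abstract]')

counts = {}
for year in range(2010, 2020):
    handle = Entrez.esearch(
        db="pubmed",
        term=term,
        datetype="pdat",   # filter by publication date
        mindate=str(year),
        maxdate=str(year),
        retmax=0,          # we only need the total hit count
    )
    record = Entrez.read(handle)
    handle.close()
    counts[year] = int(record["Count"])

print(counts)  # yearly counts, ready to plot as in Fig. 1
```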

The methods used in many methodological studies have been borrowed from systematic and scoping reviews. This practice has influenced the direction of the field, with many methodological studies including searches of electronic databases, screening of records, duplicate data extraction and assessments of risk of bias in the included studies. However, the research questions posed in methodological studies do not always require the approaches listed above, and guidance is needed on when and how to apply these methods to a methodological study. Even though methodological studies can be conducted on qualitative or mixed methods research, this paper focuses on and draws examples exclusively from quantitative research.

The objectives of this paper are to provide some insights on how to conduct methodological studies so that there is greater consistency between the research questions posed, and the design, analysis and reporting of findings. We provide multiple examples to illustrate concepts and a proposed framework for categorizing methodological studies in quantitative research.

What is a methodological study?

Any study that describes or analyzes methods (design, conduct, analysis or reporting) in published (or unpublished) literature is a methodological study. Consequently, the scope of methodological studies is quite extensive and includes, but is not limited to, topics as diverse as: research question formulation [ 11 ]; adherence to reporting guidelines [ 12 , 13 , 14 ] and consistency in reporting [ 15 ]; approaches to study analysis [ 16 ]; investigating the credibility of analyses [ 17 ]; and studies that synthesize these methodological studies [ 18 ]. While the nomenclature of methodological studies is not uniform, the intents and purposes of these studies remain fairly consistent – to describe or analyze methods in primary or secondary studies. As such, methodological studies may also be classified as a subtype of observational studies.

Parallel to this are experimental studies that compare different methods. Even though they play an important role in informing optimal research methods, experimental methodological studies are beyond the scope of this paper. Examples of such studies include the randomized trials by Buscemi et al., comparing single data extraction to double data extraction [19], and Carrasco-Labra et al., comparing approaches to presenting findings in Grading of Recommendations, Assessment, Development and Evaluations (GRADE) summary of findings tables [20]. In these studies, the unit of analysis is the person or groups of individuals applying the methods. We also direct readers to the Studies Within a Trial (SWAT) and Studies Within a Review (SWAR) programme operated through the Hub for Trials Methodology Research as a potentially useful resource for further reading on these types of experimental studies [21]. Lastly, this paper is not meant to inform the conduct of research using computational simulation and mathematical modeling, for which some guidance already exists [22], or studies on the development of methods using consensus-based approaches.

When should we conduct a methodological study?

Methodological studies occupy a unique niche in health research that allows them to inform methodological advances. Methodological studies should also be conducted as precursors to reporting guideline development, as they provide an opportunity to understand current practices, and help to identify the need for guidance and gaps in methodological or reporting quality. For example, the development of the popular Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines was preceded by methodological studies identifying poor reporting practices [23, 24]. In these instances, after the reporting guidelines are published, methodological studies can also be used to monitor uptake of the guidelines.

These studies can also be conducted to inform the state of the art of design, analysis and reporting practices across different health research fields, with the aim of improving research practices, and preventing or reducing research waste. For example, Samaan et al. conducted a scoping review of adherence to different reporting guidelines in the health care literature [18]. Methodological studies can also be used to determine the factors associated with reporting practices. For example, Abbade et al. investigated journal characteristics associated with the use of the Participants, Intervention, Comparison, Outcome, Timeframe (PICOT) format in framing research questions in trials of venous ulcer disease [11].

How often are methodological studies conducted?

There is no clear answer to this question. Based on a search of PubMed, the use of related terms (“methodological review” and “meta-epidemiological study”) – and therefore, the number of methodological studies – is on the rise. However, many other terms are used to describe methodological studies. There are also many studies that explore design, conduct, analysis or reporting of research reports, but that do not use any specific terms to describe or label their study design in terms of “methodology”. This diversity in nomenclature makes a census of methodological studies elusive. Appropriate terminology and key words for methodological studies are needed to facilitate improved accessibility for end-users.

Why do we conduct methodological studies?

Methodological studies provide information on the design, conduct, analysis or reporting of primary and secondary research and can be used to appraise the quality, quantity, completeness, accuracy and consistency of health research. These issues can be explored in specific fields, journals, databases, geographical regions and time periods. For example, Areia et al. explored the quality of reporting of endoscopic diagnostic studies in gastroenterology [25]; Knol et al. investigated the reporting of p-values in baseline tables of randomized trials published in high impact journals [26]; Chen et al. described adherence to the Consolidated Standards of Reporting Trials (CONSORT) statement in Chinese journals [27]; and Hopewell et al. described the effect of editors’ implementation of CONSORT guidelines on the reporting of abstracts over time [28]. Methodological studies provide useful information to researchers, clinicians, editors, publishers and users of the health literature. As a result, these studies have been a cornerstone of important methodological developments over the past two decades and have informed the development of many health research guidelines, including the highly cited CONSORT statement [5].

Where can we find methodological studies?

Methodological studies can be found in most common biomedical bibliographic databases (e.g. Embase, MEDLINE, PubMed, Web of Science). However, the biggest caveat is that methodological studies are hard to identify in the literature due to the wide variety of names used and the lack of comprehensive databases dedicated to them. A handful can be found in the Cochrane Library as “Cochrane Methodology Reviews”, but these studies only cover methodological issues related to systematic reviews. Previous attempts to catalogue all empirical studies of methods used in reviews were abandoned 10 years ago [ 29 ]. In other databases, a variety of search terms may be applied with different levels of sensitivity and specificity.

Some frequently asked questions about methodological studies

In this section, we have outlined responses to questions that might help inform the conduct of methodological studies.

Q: How should I select research reports for my methodological study?

A: Selection of research reports for a methodological study depends on the research question and eligibility criteria. Once a clear research question is set and the nature of literature one desires to review is known, one can then begin the selection process. Selection may begin with a broad search, especially if the eligibility criteria are not apparent. For example, a methodological study of Cochrane Reviews of HIV would not require a complex search as all eligible studies can easily be retrieved from the Cochrane Library after checking a few boxes [ 30 ]. On the other hand, a methodological study of subgroup analyses in trials of gastrointestinal oncology would require a search to find such trials, and further screening to identify trials that conducted a subgroup analysis [ 31 ].

The strategies used for identifying participants in observational studies can apply here. One may use a systematic search to identify all eligible studies. If the number of eligible studies is unmanageable, a random sample of articles can be expected to provide comparable results if it is sufficiently large [ 32 ]. For example, Wilson et al. used a random sample of trials from the Cochrane Stroke Group’s Trial Register to investigate completeness of reporting [ 33 ]. It is possible that a simple random sample would lead to underrepresentation of units (i.e. research reports) that are smaller in number. This is relevant if the investigators wish to compare multiple groups but have too few units in one group. In this case a stratified sample would help to create equal groups. For example, in a methodological study comparing Cochrane and non-Cochrane reviews, Kahale et al. drew random samples from both groups [ 34 ]. Alternatively, systematic or purposeful sampling strategies can be used and we encourage researchers to justify their selected approaches based on the study objective.
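
To make the sampling options above concrete, the sketch below draws a simple random sample and a stratified sample from a hypothetical sampling frame using pandas; the column names and group sizes are invented for illustration and are not taken from the cited studies.

# Simple random vs. stratified sampling from a hypothetical sampling frame.
import pandas as pd

frame = pd.DataFrame({
    "pmid": range(1, 1001),
    "group": ["Cochrane"] * 150 + ["non-Cochrane"] * 850,  # unequal group sizes
})

# Simple random sample of 200 reports (may under-represent the smaller group).
simple = frame.sample(n=200, random_state=42)

# Stratified sample: 100 reports per group, giving equal groups for comparison.
stratified = (
    frame.groupby("group", group_keys=False)
         .apply(lambda g: g.sample(n=100, random_state=42))
)

print(simple["group"].value_counts())
print(stratified["group"].value_counts())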

Q: How many databases should I search?

A: The number of databases one should search would depend on the approach to sampling, which can include targeting the entire “population” of interest or a sample of that population. If you are interested in including the entire target population for your research question, or drawing a random or systematic sample from it, then a comprehensive and exhaustive search for relevant articles is required. In this case, we recommend using systematic approaches for searching electronic databases (i.e. at least 2 databases with a replicable and time stamped search strategy). The results of your search will constitute a sampling frame from which eligible studies can be drawn.

Alternatively, if your approach to sampling is purposeful, then we recommend targeting the database(s) or data sources (e.g. journals, registries) that include the information you need. For example, if you are conducting a methodological study of high impact journals in plastic surgery and they are all indexed in PubMed, you likely do not need to search any other databases. You may also have a comprehensive list of all journals of interest and can approach your search using the journal names in your database search (or by accessing the journal archives directly from the journal’s website). Even though one could also search journals’ web pages directly, using a database such as PubMed has multiple advantages, such as the use of filters to narrow the search down to a certain time period or to study types of interest. Furthermore, individual journals’ web sites may have different search functionalities, which do not necessarily yield a consistent output.

Q: Should I publish a protocol for my methodological study?

A: A protocol is a description of intended research methods. Currently, only protocols for clinical trials require registration [35]. Protocols for systematic reviews are encouraged but no formal recommendation exists. The scientific community welcomes the publication of protocols because they help protect against selective outcome reporting and the use of post hoc methodologies to embellish results, and they help avoid duplication of efforts [36]. While the latter two risks exist in methodological research, the negative consequences may be substantially less than for clinical outcomes. In a sample of 31 methodological studies, 7 (22.6%) referenced a published protocol [9]. In the Cochrane Library, there were 15 protocols for methodological reviews as of 21 July 2020. This suggests that publishing protocols for methodological studies is not uncommon.

Authors can consider publishing their study protocol in a scholarly journal as a manuscript. Advantages of such publication include obtaining peer-review feedback about the planned study, and easy retrieval by searching databases such as PubMed. The disadvantages of trying to publish protocols include the delays associated with manuscript handling and peer review, as well as costs, as few journals publish study protocols and those that do mostly charge article-processing fees [37]. Authors who would like to make their protocol publicly available without publishing it in a scholarly journal could deposit their study protocol in a publicly available repository, such as the Open Science Framework ( https://osf.io/ ).

Q: How should I appraise the quality of a methodological study?

A: To date, there is no published tool for appraising the risk of bias in a methodological study, but in principle, a methodological study could be considered as a type of observational study. Therefore, during conduct or appraisal, care should be taken to avoid the biases common in observational studies [38]. These include selection bias, poor comparability of groups, and errors in the ascertainment of exposure or outcome. In other words, to generate a representative sample, a comprehensive and reproducible search may be necessary to build a sampling frame. Additionally, random sampling may be necessary to ensure that all the included research reports have the same probability of being selected, and the screening and selection processes should be transparent and reproducible. To ensure that the groups compared are similar in all characteristics, matching, random sampling or stratified sampling can be used. Statistical adjustments for between-group differences can also be applied at the analysis stage. Finally, duplicate data extraction can reduce errors in the assessment of exposures or outcomes.

Q: Should I justify a sample size?

A: In all instances where one is not using the target population (i.e. the group to which inferences from the research report are directed) [ 39 ], a sample size justification is good practice. The sample size justification may take the form of a description of what is expected to be achieved with the number of articles selected, or a formal sample size estimation that outlines the number of articles required to answer the research question with a certain precision and power. Sample size justifications in methodological studies are reasonable in the following instances:

Comparing two groups

Determining a proportion, mean or another quantifier

Determining factors associated with an outcome using regression-based analyses

For example, El Dib et al. computed a sample size requirement for a methodological study of diagnostic strategies in randomized trials, based on a confidence interval approach [ 40 ].
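
As an illustration of the confidence interval approach mentioned above, the following sketch estimates the number of research reports needed to estimate a proportion (e.g. the proportion of trials adhering to a reporting item) with a chosen margin of error; the anticipated proportion and precision are assumptions chosen for the example, not values from El Dib et al.

# Sample size to estimate a proportion within a margin of error d at a given
# confidence level, using the normal approximation: n = z^2 * p * (1 - p) / d^2.
from math import ceil
from scipy.stats import norm

def n_for_proportion(p, margin, confidence=0.95):
    """Number of reports needed to estimate proportion p within +/- margin."""
    z = norm.ppf(1 - (1 - confidence) / 2)
    return ceil(z ** 2 * p * (1 - p) / margin ** 2)

# Example: anticipate ~50% adherence and want a margin of +/- 5% at 95% confidence.
print(n_for_proportion(0.5, 0.05))  # prints 385

If the sampling frame is small relative to the required number of reports, a finite population correction or an exact binomial interval may be more appropriate.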

Q: What should I call my study?

A: Other terms which have been used to describe or label methodological studies include “methodological review”, “methodological survey”, “meta-epidemiological study”, “systematic review”, “systematic survey”, “meta-research”, “research-on-research” and many others. We recommend that the study nomenclature be clear, unambiguous, informative and allow for appropriate indexing. Methodological study nomenclature that should be avoided includes “systematic review” – as this will likely be confused with a systematic review of a clinical question. “Systematic survey” may also lead to confusion about whether the survey was systematic (i.e. using a preplanned methodology) or a survey using “systematic” sampling (i.e. a sampling approach using specific intervals to determine who is selected) [32]. Any of the above meanings of the word “systematic” may be true for methodological studies and could be potentially misleading. “Meta-epidemiological study” is ideal for indexing, but not very informative as it describes an entire field. The term “review” may point towards an appraisal or “review” of the design, conduct, analysis or reporting (or methodological components) of the targeted research reports, yet it has also been used to describe narrative reviews [41, 42]. The term “survey” is also in line with the approaches used in many methodological studies [9], and would be indicative of the sampling procedures of this study design. However, in the absence of guidelines on nomenclature, the term “methodological study” is broad enough to capture most of the scenarios of such studies.

Q: Should I account for clustering in my methodological study?

A: Data from methodological studies are often clustered. For example, articles from a specific source (e.g. the Cochrane Library) may share reporting standards. Articles within the same journal may be similar due to editorial practices and policies, reporting requirements and endorsement of guidelines. There is emerging evidence that these are real concerns that should be accounted for in analyses [43]. Some cluster variables are described in the section “What variables are relevant to methodological studies?”

A variety of modelling approaches can be used to account for correlated data, including the use of marginal, fixed or mixed effects regression models with appropriate computation of standard errors [44]. For example, Kosa et al. used generalized estimating equations to account for correlation of articles within journals [15]. Not accounting for clustering could lead to incorrect p-values, unduly narrow confidence intervals, and biased estimates [45].
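
A minimal sketch of one such model is shown below, using generalized estimating equations from statsmodels with journal as the cluster variable; the data file and column names are hypothetical, and the specification is an illustration rather than a reproduction of the cited analyses.

# Generalized estimating equations (GEE) accounting for clustering of articles
# within journals. The data file and column names are hypothetical.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("extracted_data.csv")  # one row per included article

model = smf.gee(
    "adherent ~ impact_factor + year",  # binary outcome and article-level covariates
    groups="journal",                   # cluster variable
    data=df,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
)
result = model.fit()
print(result.summary())  # standard errors reflect within-journal correlation

Mixed effects logistic regression with a random intercept for journal is a common alternative; the choice depends on whether population-averaged or cluster-specific estimates are of interest.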

Q: Should I extract data in duplicate?

A: Yes. Duplicate data extraction takes more time but results in fewer errors [19]. Data extraction errors in turn affect the effect estimate [46], and should therefore be mitigated. Duplicate data extraction should be considered in the absence of other approaches to minimize extraction errors. Much like systematic reviews, this area will likely see rapid advances in machine learning and natural language processing technologies to support researchers with screening and data extraction [47, 48]. However, experience plays an important role in the quality of extracted data, and inexperienced extractors should be paired with experienced extractors [46, 49].
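
One simple way to monitor the value of duplicate extraction is to quantify agreement between the two extractors before discrepancies are reconciled. The sketch below uses Cohen's kappa from scikit-learn on two hypothetical extraction vectors for a binary item; it is an illustrative add-on, not a required step.

# Agreement between two independent extractors on a binary item
# (e.g. "was allocation concealment reported?"); the vectors are hypothetical.
from sklearn.metrics import cohen_kappa_score

extractor_1 = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
extractor_2 = [1, 0, 1, 0, 0, 1, 0, 1, 1, 1]

kappa = cohen_kappa_score(extractor_1, extractor_2)
print(f"Cohen's kappa: {kappa:.2f}")  # remaining discrepancies are resolved by discussion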

Q: Should I assess the risk of bias of research reports included in my methodological study?

A: Risk of bias is most useful in determining the certainty that can be placed in the effect measure from a study. In methodological studies, risk of bias may not serve the purpose of determining the trustworthiness of results, as effect measures are often not the primary goal of methodological studies. Determining risk of bias in methodological studies is likely a practice borrowed from systematic review methodology, but whose intrinsic value is not obvious in methodological studies. When it is part of the research question, investigators often focus on one aspect of risk of bias. For example, Speich investigated how blinding was reported in surgical trials [50], and Abraha et al. investigated the application of intention-to-treat analyses in systematic reviews and trials [51].

Q: What variables are relevant to methodological studies?

A: There is empirical evidence that certain variables may inform the findings in a methodological study. We outline some of these and provide a brief overview below:

Country: Countries and regions differ in their research cultures, and the resources available to conduct research. Therefore, it is reasonable to believe that there may be differences in methodological features across countries. Methodological studies have reported loco-regional differences in reporting quality [ 52 , 53 ]. This may also be related to challenges non-English speakers face in publishing papers in English.

Authors’ expertise: The inclusion of authors with expertise in research methodology, biostatistics, and scientific writing is likely to influence the end-product. Oltean et al. found that among randomized trials in orthopaedic surgery, the use of analyses that accounted for clustering was more likely when specialists (e.g. statistician, epidemiologist or clinical trials methodologist) were included on the study team [ 54 ]. Fleming et al. found that including methodologists in the review team was associated with appropriate use of reporting guidelines [ 55 ].

Source of funding and conflicts of interest: Some studies have found that funded studies report better [56, 57], while others have not [53, 58]. The presence of funding would indicate the availability of resources deployed to ensure optimal design, conduct, analysis and reporting. However, the source of funding may introduce conflicts of interest and warrant assessment. For example, Kaiser et al. investigated the effect of industry funding on obesity or nutrition randomized trials and found that reporting quality was similar [59]. Thomas et al. looked at the reporting quality of long-term weight loss trials and found that industry funded studies were better [60]. Khan et al. examined the association between industry funding and “positive trials” (trials reporting a significant intervention effect) and found that industry funding was highly predictive of a positive trial [61]. This finding is similar to that of a recent Cochrane Methodology Review by Hansen et al. [62]

Journal characteristics: Certain journal characteristics may influence the study design, analysis or reporting. Characteristics such as journal endorsement of guidelines [63, 64] and Journal Impact Factor (JIF) have been shown to be associated with reporting [63, 65, 66, 67].

Study size (sample size/number of sites): Some studies have shown that reporting is better in larger studies [ 53 , 56 , 58 ].

Year of publication: It is reasonable to assume that design, conduct, analysis and reporting of research will change over time. Many studies have demonstrated improvements in reporting over time or after the publication of reporting guidelines [ 68 , 69 ].

Type of intervention: In a methodological study of reporting quality of weight loss intervention studies, Thabane et al. found that trials of pharmacologic interventions were reported better than trials of non-pharmacologic interventions [ 70 ].

Interactions between variables: Complex interactions between the previously listed variables are possible. High income countries with more resources may be more likely to conduct larger studies and incorporate a variety of experts. Authors in certain countries may prefer certain journals, and journal endorsement of guidelines and editorial policies may change over time.

Q: Should I focus only on high impact journals?

A: Investigators may choose to investigate only high impact journals because they are more likely to influence practice and policy, or because they assume that methodological standards would be higher. However, restricting the sample to high JIF journals may severely limit the scope of articles included and may skew the sample towards articles with positive findings. The generalizability and applicability of findings from a handful of journals must be examined carefully, especially since the JIF varies over time. Even among journals that are all “high impact”, variations exist in methodological standards.

Q: Can I conduct a methodological study of qualitative research?

A: Yes. Even though a lot of methodological research has been conducted in the quantitative research field, methodological studies of qualitative studies are feasible. Certain databases that catalogue qualitative research including the Cumulative Index to Nursing & Allied Health Literature (CINAHL) have defined subject headings that are specific to methodological research (e.g. “research methodology”). Alternatively, one could also conduct a qualitative methodological review; that is, use qualitative approaches to synthesize methodological issues in qualitative studies.

Q: What reporting guidelines should I use for my methodological study?

A: There is no guideline that covers the entire scope of methodological studies. One adaptation of the PRISMA guidelines has been published, which works well for studies that aim to use the entire target population of research reports [71]. However, it is not widely used (40 citations in 2 years as of 09 December 2019), and methodological studies that are designed as cross-sectional or before-after studies require a more fit-for-purpose guideline. A more encompassing reporting guideline for a broad range of methodological studies is currently under development [72]. In the absence of formal guidance, the general requirements for scientific reporting should be respected, and authors of methodological studies should focus on transparency and reproducibility.

Q: What are the potential threats to validity and how can I avoid them?

A: Methodological studies may be compromised by a lack of internal or external validity. The main threats to internal validity in methodological studies are selection bias and confounding bias. Investigators must ensure that the methods used to select articles do not make the included articles differ systematically from the set of articles to which they would like to make inferences. For example, attempting to make extrapolations to all journals after analyzing only high-impact journals would be misleading.

Many factors (confounders) may distort the association between the exposure and outcome if the included research reports differ with respect to these factors [73]. For example, when examining the association between source of funding and completeness of reporting, it may be necessary to account for whether the journal endorses the relevant reporting guidelines. Confounding bias can be addressed by restriction, matching and statistical adjustment [73]. Restriction appears to be the method of choice for many investigators, who choose to include only high impact journals or articles in a specific field. For example, Knol et al. examined the reporting of p-values in baseline tables of high impact journals [26]. Matching is also sometimes used. In a methodological study of non-randomized interventional studies of elective ventral hernia repair, Parker et al. matched prospective studies with retrospective studies and compared reporting standards [74]. Other methodological studies use statistical adjustment. For example, Zhang et al. used regression techniques to determine the factors associated with missing participant data in trials [16].
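
A minimal sketch of statistical adjustment at the analysis stage is shown below: a logistic regression of reporting completeness on funding source, adjusted for journal endorsement of a reporting guideline. The data file and column names are hypothetical, and the model is an illustration of the adjustment idea rather than the analysis used in the cited studies.

# Logistic regression adjusting the funding-reporting association for a confounder
# (journal endorsement of a reporting guideline). Column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("extracted_data.csv")  # one row per included trial report

result = smf.logit("complete_reporting ~ industry_funded + endorses_guideline", data=df).fit()
print(result.summary())
print(np.exp(result.params))  # adjusted odds ratios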

With regard to external validity, researchers interested in conducting methodological studies must consider how generalizable or applicable their findings are. This should tie in closely with the research question and should be explicit. For example, findings from methodological studies of trials published in high impact cardiology journals cannot be assumed to be applicable to trials in other fields. Investigators must also ensure that their sample truly represents the target population, either by a) conducting a comprehensive and exhaustive search, or b) using an appropriate, justified and randomly selected sample of research reports.

Even applicability to high impact journals may vary based on the investigators’ definition, and over time. For example, for high impact journals in the field of general medicine, Bouwmeester et al. included the Annals of Internal Medicine (AIM), BMJ, the Journal of the American Medical Association (JAMA), Lancet, the New England Journal of Medicine (NEJM), and PLoS Medicine (n = 6) [75]. In contrast, the high impact journals selected in the methodological study by Schiller et al. were BMJ, JAMA, Lancet, and NEJM (n = 4) [76]. Another methodological study by Kosa et al. included AIM, BMJ, JAMA, Lancet and NEJM (n = 5). In the methodological study by Thabut et al., journals with a JIF greater than 5 were considered to be high impact. Riado Minguez et al. used first quartile journals in the Journal Citation Reports (JCR) for a specific year to determine “high impact” [77]. Ultimately, the definition of high impact will be based on the number of journals the investigators are willing to include, the year of impact and the JIF cut-off [78]. We acknowledge that the term “generalizability” may apply differently for methodological studies, especially when in many instances it is possible to include the entire target population in the sample studied.

Finally, methodological studies are not exempt from information bias which may stem from discrepancies in the included research reports [ 79 ], errors in data extraction, or inappropriate interpretation of the information extracted. Likewise, publication bias may also be a concern in methodological studies, but such concepts have not yet been explored.

A proposed framework

In order to inform discussions about methodological studies and the development of guidance on what should be reported, we have outlined some key features of methodological studies that can be used to classify them. For each of the categories outlined below, we provide an example. In our experience, the choice of approach to completing a methodological study can be informed by asking the following four questions:

What is the aim?

Methodological studies that investigate bias

A methodological study may be focused on exploring sources of bias in primary or secondary studies (meta-bias), or how bias is analyzed. We have taken care to distinguish bias (i.e. systematic deviations from the truth irrespective of the source) from reporting quality or completeness (i.e. not adhering to a specific reporting guideline or norm). An example of where this distinction would be important is the case of a randomized trial with no blinding. This study (depending on the nature of the intervention) would be at risk of performance bias. However, if the authors report that their study was not blinded, they would have reported adequately. In fact, some methodological studies attempt to capture both “quality of conduct” and “quality of reporting”, such as Ritchie et al., who reported on the risk of bias in randomized trials of pharmacy practice interventions [80]. Babic et al. investigated how risk of bias was used to inform sensitivity analyses in Cochrane reviews [81]. Further, biases related to the choice of outcomes can also be explored. For example, Tan et al. investigated differences in treatment effect size based on the outcome reported [82].

Methodological studies that investigate quality (or completeness) of reporting

Methodological studies may report quality of reporting against a reporting checklist (i.e. adherence to guidelines) or against expected norms. For example, Croitoru et al. reported on the quality of reporting in systematic reviews published in dermatology journals based on their adherence to the PRISMA statement [83], and Khan et al. described the quality of reporting of harms in randomized controlled trials published in high impact cardiovascular journals based on the CONSORT extension for harms [84]. Other methodological studies investigate reporting of certain features of interest that may not be part of formally published checklists or guidelines. For example, Mbuagbaw et al. described how often the implications for research are elaborated using the Evidence, Participants, Intervention, Comparison, Outcome, Timeframe (EPICOT) format [30].

Methodological studies that investigate the consistency of reporting

Sometimes investigators may be interested in how consistent reports of the same research are, as it is expected that there should be consistency between: conference abstracts and published manuscripts; manuscript abstracts and manuscript main text; and trial registration and published manuscript. For example, Rosmarakis et al. investigated consistency between conference abstracts and full text manuscripts [ 85 ].

Methodological studies that investigate factors associated with reporting

In addition to identifying issues with reporting in primary and secondary studies, authors of methodological studies may be interested in determining the factors that are associated with certain reporting practices. Many methodological studies incorporate this, albeit as a secondary outcome. For example, Farrokhyar et al. investigated the factors associated with reporting quality in randomized trials of coronary artery bypass grafting surgery [ 53 ].

Methodological studies that investigate methods

Methodological studies may also be used to describe methods or compare methods, and the factors associated with methods. For example, Mueller et al. described the methods used for systematic reviews and meta-analyses of observational studies [86].

Methodological studies that summarize other methodological studies

Some methodological studies synthesize results from other methodological studies. For example, Li et al. conducted a scoping review of methodological reviews that investigated consistency between full text and abstracts in primary biomedical research [ 87 ].

Methodological studies that investigate nomenclature and terminology

Some methodological studies may investigate the use of names and terms in health research. For example, Martinic et al. investigated the definitions of systematic reviews used in overviews of systematic reviews (OSRs), meta-epidemiological studies and epidemiology textbooks [ 88 ].

Other types of methodological studies

In addition to the previously mentioned experimental methodological studies, there may exist other types of methodological studies not captured here.

What is the design?

Methodological studies that are descriptive

Most methodological studies are purely descriptive and report their findings as counts (percent) and means (standard deviation) or medians (interquartile range). For example, Mbuagbaw et al. described the reporting of research recommendations in Cochrane HIV systematic reviews [ 30 ]. Gohari et al. described the quality of reporting of randomized trials in diabetes in Iran [ 12 ].

Methodological studies that are analytical

Some methodological studies are analytical wherein “analytical studies identify and quantify associations, test hypotheses, identify causes and determine whether an association exists between variables, such as between an exposure and a disease.” [ 89 ] In the case of methodological studies all these investigations are possible. For example, Kosa et al. investigated the association between agreement in primary outcome from trial registry to published manuscript and study covariates. They found that larger and more recent studies were more likely to have agreement [ 15 ]. Tricco et al. compared the conclusion statements from Cochrane and non-Cochrane systematic reviews with a meta-analysis of the primary outcome and found that non-Cochrane reviews were more likely to report positive findings. These results are a test of the null hypothesis that the proportions of Cochrane and non-Cochrane reviews that report positive results are equal [ 90 ].
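
As a sketch of this kind of hypothesis test, the example below compares the proportion of reviews with positive conclusion statements between two groups using a two-proportion z-test from statsmodels; the counts are invented for illustration and are not the results reported by Tricco et al.

# Two-proportion z-test: do the two groups of reviews report positive conclusions
# at the same rate? The counts below are invented for illustration.
from statsmodels.stats.proportion import proportions_ztest

positive = [30, 55]   # reviews with positive conclusion statements per group
totals = [100, 100]   # total reviews sampled per group

z_stat, p_value = proportions_ztest(count=positive, nobs=totals)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")

A chi-squared test on the corresponding 2 x 2 table gives an equivalent result.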

What is the sampling strategy?

Methodological studies that include the target population

Methodological reviews with narrow research questions may be able to include the entire target population. For example, in the methodological study of Cochrane HIV systematic reviews, Mbuagbaw et al. included all of the available studies ( n  = 103) [ 30 ].

Methodological studies that include a sample of the target population

Many methodological studies use random samples of the target population [33, 91, 92]. Alternatively, purposeful sampling may be used, limiting the sample to a subset of research-related reports published within a certain time period, in journals with a certain ranking, or on a specific topic. Systematic sampling can also be used when random sampling is challenging to implement.
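
A brief sketch of systematic sampling is shown below: records are selected at a fixed interval from a randomly chosen starting point, assuming the sampling frame is listed in a stable order; the frame of PubMed identifiers here is hypothetical.

# Systematic sampling: select every k-th record from the sampling frame after a
# random start. The list of PubMed identifiers is hypothetical.
import random

frame = [f"PMID{i:05d}" for i in range(1, 1201)]  # 1200 retrieved records
target_n = 120

k = len(frame) // target_n       # sampling interval
start = random.randrange(k)      # random start between 0 and k - 1
sample = frame[start::k][:target_n]

print(len(sample), sample[:3])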

What is the unit of analysis?

Methodological studies with a research report as the unit of analysis

Many methodological studies use a research report (e.g. full manuscript of study, abstract portion of the study) as the unit of analysis, and inferences can be made at the study-level. However, both published and unpublished research-related reports can be studied. These may include articles, conference abstracts, registry entries etc.

Methodological studies with a design, analysis or reporting item as the unit of analysis

Some methodological studies report on items which may occur more than once per article. For example, Paquette et al. report on subgroup analyses in Cochrane reviews of atrial fibrillation in which 17 systematic reviews planned 56 subgroup analyses [ 93 ].

This framework is outlined in Fig. 2.

Figure 2. A proposed framework for methodological studies.

Conclusions

Methodological studies have examined different aspects of reporting such as quality, completeness, consistency and adherence to reporting guidelines. As such, many of the methodological study examples cited in this tutorial are related to reporting. However, as an evolving field, the scope of research questions that can be addressed by methodological studies is expected to increase.

In this paper we have outlined the scope and purpose of methodological studies, along with examples of instances in which various approaches have been used. In the absence of formal guidance on the design, conduct, analysis and reporting of methodological studies, we have provided some advice to help make methodological studies consistent. This advice is grounded in good contemporary scientific practice. Generally, the research question should tie in with the sampling approach and planned analysis. We have also highlighted the variables that may inform findings from methodological studies. Lastly, we have provided suggestions for ways in which authors can categorize their methodological studies to inform their design and analysis.

Availability of data and materials

Data sharing is not applicable to this article as no new data were created or analyzed in this study.

Abbreviations

CONSORT: Consolidated Standards of Reporting Trials

EPICOT: Evidence, Participants, Intervention, Comparison, Outcome, Timeframe

GRADE: Grading of Recommendations, Assessment, Development and Evaluations

PICOT: Participants, Intervention, Comparison, Outcome, Timeframe

PRISMA: Preferred Reporting Items for Systematic Reviews and Meta-Analyses

SWAR: Studies Within a Review

SWAT: Studies Within a Trial

Chalmers I, Glasziou P. Avoidable waste in the production and reporting of research evidence. Lancet. 2009;374(9683):86–9.

Chan AW, Song F, Vickers A, Jefferson T, Dickersin K, Gotzsche PC, Krumholz HM, Ghersi D, van der Worp HB. Increasing value and reducing waste: addressing inaccessible research. Lancet. 2014;383(9913):257–66.

Ioannidis JP, Greenland S, Hlatky MA, Khoury MJ, Macleod MR, Moher D, Schulz KF, Tibshirani R. Increasing value and reducing waste in research design, conduct, and analysis. Lancet. 2014;383(9912):166–75.

Higgins JP, Altman DG, Gotzsche PC, Juni P, Moher D, Oxman AD, Savovic J, Schulz KF, Weeks L, Sterne JA. The Cochrane Collaboration's tool for assessing risk of bias in randomised trials. BMJ. 2011;343:d5928.

Moher D, Schulz KF, Altman DG. The CONSORT statement: revised recommendations for improving the quality of reports of parallel-group randomised trials. Lancet. 2001;357.

Liberati A, Altman DG, Tetzlaff J, Mulrow C, Gotzsche PC, Ioannidis JP, Clarke M, Devereaux PJ, Kleijnen J, Moher D. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: explanation and elaboration. PLoS Med. 2009;6(7):e1000100.

Shea BJ, Hamel C, Wells GA, Bouter LM, Kristjansson E, Grimshaw J, Henry DA, Boers M. AMSTAR is a reliable and valid measurement tool to assess the methodological quality of systematic reviews. J Clin Epidemiol. 2009;62(10):1013–20.

Shea BJ, Reeves BC, Wells G, Thuku M, Hamel C, Moran J, Moher D, Tugwell P, Welch V, Kristjansson E, et al. AMSTAR 2: a critical appraisal tool for systematic reviews that include randomised or non-randomised studies of healthcare interventions, or both. Bmj. 2017;358:j4008.

Lawson DO, Leenus A, Mbuagbaw L. Mapping the nomenclature, methodology, and reporting of studies that review methods: a pilot methodological review. Pilot Feasibility Studies. 2020;6(1):13.

Puljak L, Makaric ZL, Buljan I, Pieper D. What is a meta-epidemiological study? Analysis of published literature indicated heterogeneous study designs and definitions. J Comp Eff Res. 2020.

Abbade LPF, Wang M, Sriganesh K, Jin Y, Mbuagbaw L, Thabane L. The framing of research questions using the PICOT format in randomized controlled trials of venous ulcer disease is suboptimal: a systematic survey. Wound Repair Regen. 2017;25(5):892–900.

Gohari F, Baradaran HR, Tabatabaee M, Anijidani S, Mohammadpour Touserkani F, Atlasi R, Razmgir M. Quality of reporting randomized controlled trials (RCTs) in diabetes in Iran; a systematic review. J Diabetes Metab Disord. 2015;15(1):36.

Wang M, Jin Y, Hu ZJ, Thabane A, Dennis B, Gajic-Veljanoski O, Paul J, Thabane L. The reporting quality of abstracts of stepped wedge randomized trials is suboptimal: a systematic survey of the literature. Contemp Clin Trials Commun. 2017;8:1–10.

Shanthanna H, Kaushal A, Mbuagbaw L, Couban R, Busse J, Thabane L. A cross-sectional study of the reporting quality of pilot or feasibility trials in high-impact anesthesia journals. Can J Anaesth. 2018;65(11):1180–95.

Kosa SD, Mbuagbaw L, Borg Debono V, Bhandari M, Dennis BB, Ene G, Leenus A, Shi D, Thabane M, Valvasori S, et al. Agreement in reporting between trial publications and current clinical trial registry in high impact journals: a methodological review. Contemporary Clinical Trials. 2018;65:144–50.

Zhang Y, Florez ID, Colunga Lozano LE, Aloweni FAB, Kennedy SA, Li A, Craigie S, Zhang S, Agarwal A, Lopes LC, et al. A systematic survey on reporting and methods for handling missing participant data for continuous outcomes in randomized controlled trials. J Clin Epidemiol. 2017;88:57–66.

Hernández AV, Boersma E, Murray GD, Habbema JD, Steyerberg EW. Subgroup analyses in therapeutic cardiovascular clinical trials: are most of them misleading? Am Heart J. 2006;151(2):257–64.

Samaan Z, Mbuagbaw L, Kosa D, Borg Debono V, Dillenburg R, Zhang S, Fruci V, Dennis B, Bawor M, Thabane L. A systematic scoping review of adherence to reporting guidelines in health care literature. J Multidiscip Healthc. 2013;6:169–88.

Buscemi N, Hartling L, Vandermeer B, Tjosvold L, Klassen TP. Single data extraction generated more errors than double data extraction in systematic reviews. J Clin Epidemiol. 2006;59(7):697–703.

Carrasco-Labra A, Brignardello-Petersen R, Santesso N, Neumann I, Mustafa RA, Mbuagbaw L, Etxeandia Ikobaltzeta I, De Stio C, McCullagh LJ, Alonso-Coello P. Improving GRADE evidence tables part 1: a randomized trial shows improved understanding of content in summary-of-findings tables with a new format. J Clin Epidemiol. 2016;74:7–18.

The Northern Ireland Hub for Trials Methodology Research: SWAT/SWAR Information [ https://www.qub.ac.uk/sites/TheNorthernIrelandNetworkforTrialsMethodologyResearch/SWATSWARInformation/ ]. Accessed 31 Aug 2020.

Chick S, Sánchez P, Ferrin D, Morrice D. How to conduct a successful simulation study. In: Proceedings of the 2003 winter simulation conference: 2003; 2003. p. 66–70.

Mulrow CD. The medical review article: state of the science. Ann Intern Med. 1987;106(3):485–8.

Sacks HS, Reitman D, Pagano D, Kupelnick B. Meta-analysis: an update. Mount Sinai J Med New York. 1996;63(3–4):216–24.

Areia M, Soares M, Dinis-Ribeiro M. Quality reporting of endoscopic diagnostic studies in gastrointestinal journals: where do we stand on the use of the STARD and CONSORT statements? Endoscopy. 2010;42(2):138–47.

Knol M, Groenwold R, Grobbee D. P-values in baseline tables of randomised controlled trials are inappropriate but still common in high impact journals. Eur J Prev Cardiol. 2012;19(2):231–2.

Chen M, Cui J, Zhang AL, Sze DM, Xue CC, May BH. Adherence to CONSORT items in randomized controlled trials of integrative medicine for colorectal Cancer published in Chinese journals. J Altern Complement Med. 2018;24(2):115–24.

Hopewell S, Ravaud P, Baron G, Boutron I. Effect of editors' implementation of CONSORT guidelines on the reporting of abstracts in high impact medical journals: interrupted time series analysis. BMJ. 2012;344:e4178.

The Cochrane Methodology Register Issue 2 2009 [ https://cmr.cochrane.org/help.htm ]. Accessed 31 Aug 2020.

Mbuagbaw L, Kredo T, Welch V, Mursleen S, Ross S, Zani B, Motaze NV, Quinlan L. Critical EPICOT items were absent in Cochrane human immunodeficiency virus systematic reviews: a bibliometric analysis. J Clin Epidemiol. 2016;74:66–72.

Barton S, Peckitt C, Sclafani F, Cunningham D, Chau I. The influence of industry sponsorship on the reporting of subgroup analyses within phase III randomised controlled trials in gastrointestinal oncology. Eur J Cancer. 2015;51(18):2732–9.

Setia MS. Methodology series module 5: sampling strategies. Indian J Dermatol. 2016;61(5):505–9.

Wilson B, Burnett P, Moher D, Altman DG, Al-Shahi Salman R. Completeness of reporting of randomised controlled trials including people with transient ischaemic attack or stroke: a systematic review. Eur Stroke J. 2018;3(4):337–46.

Kahale LA, Diab B, Brignardello-Petersen R, Agarwal A, Mustafa RA, Kwong J, Neumann I, Li L, Lopes LC, Briel M, et al. Systematic reviews do not adequately report or address missing outcome data in their analyses: a methodological survey. J Clin Epidemiol. 2018;99:14–23.

De Angelis CD, Drazen JM, Frizelle FA, Haug C, Hoey J, Horton R, Kotzin S, Laine C, Marusic A, Overbeke AJPM, et al. Is this clinical trial fully registered?: a statement from the International Committee of Medical Journal Editors*. Ann Intern Med. 2005;143(2):146–8.

Ohtake PJ, Childs JD. Why publish study protocols? Phys Ther. 2014;94(9):1208–9.

Rombey T, Allers K, Mathes T, Hoffmann F, Pieper D. A descriptive analysis of the characteristics and the peer review process of systematic review protocols published in an open peer review journal from 2012 to 2017. BMC Med Res Methodol. 2019;19(1):57.

Grimes DA, Schulz KF. Bias and causal associations in observational research. Lancet. 2002;359(9302):248–52.

Porta M, editor. A dictionary of epidemiology. 5th ed. Oxford: Oxford University Press; 2008.

El Dib R, Tikkinen KAO, Akl EA, Gomaa HA, Mustafa RA, Agarwal A, Carpenter CR, Zhang Y, Jorge EC, Almeida R, et al. Systematic survey of randomized trials evaluating the impact of alternative diagnostic strategies on patient-important outcomes. J Clin Epidemiol. 2017;84:61–9.

Helzer JE, Robins LN, Taibleson M, Woodruff RA Jr, Reich T, Wish ED. Reliability of psychiatric diagnosis. I. a methodological review. Arch Gen Psychiatry. 1977;34(2):129–33.

Chung ST, Chacko SK, Sunehag AL, Haymond MW. Measurements of gluconeogenesis and Glycogenolysis: a methodological review. Diabetes. 2015;64(12):3996–4010.

Sterne JA, Juni P, Schulz KF, Altman DG, Bartlett C, Egger M. Statistical methods for assessing the influence of study characteristics on treatment effects in 'meta-epidemiological' research. Stat Med. 2002;21(11):1513–24.

Moen EL, Fricano-Kugler CJ, Luikart BW, O’Malley AJ. Analyzing clustered data: why and how to account for multiple observations nested within a study participant? PLoS One. 2016;11(1):e0146721.

Zyzanski SJ, Flocke SA, Dickinson LM. On the nature and analysis of clustered data. Ann Fam Med. 2004;2(3):199–200.

Mathes T, Klassen P, Pieper D. Frequency of data extraction errors and methods to increase data extraction quality: a methodological review. BMC Med Res Methodol. 2017;17(1):152.

Bui DDA, Del Fiol G, Hurdle JF, Jonnalagadda S. Extractive text summarization system to aid data extraction from full text in systematic review development. J Biomed Inform. 2016;64:265–72.

Bui DD, Del Fiol G, Jonnalagadda S. PDF text classification to leverage information extraction from publication reports. J Biomed Inform. 2016;61:141–8.

Maticic K, Krnic Martinic M, Puljak L. Assessment of reporting quality of abstracts of systematic reviews with meta-analysis using PRISMA-A and discordance in assessments between raters without prior experience. BMC Med Res Methodol. 2019;19(1):32.

Speich B. Blinding in surgical randomized clinical trials in 2015. Ann Surg. 2017;266(1):21–2.

Abraha I, Cozzolino F, Orso M, Marchesi M, Germani A, Lombardo G, Eusebi P, De Florio R, Luchetta ML, Iorio A, et al. A systematic review found that deviations from intention-to-treat are common in randomized trials and systematic reviews. J Clin Epidemiol. 2017;84:37–46.

Zhong Y, Zhou W, Jiang H, Fan T, Diao X, Yang H, Min J, Wang G, Fu J, Mao B. Quality of reporting of two-group parallel randomized controlled clinical trials of multi-herb formulae: A survey of reports indexed in the Science Citation Index Expanded. Eur J Integrative Med. 2011;3(4):e309–16.

Farrokhyar F, Chu R, Whitlock R, Thabane L. A systematic review of the quality of publications reporting coronary artery bypass grafting trials. Can J Surg. 2007;50(4):266–77.

Oltean H, Gagnier JJ. Use of clustering analysis in randomized controlled trials in orthopaedic surgery. BMC Med Res Methodol. 2015;15:17.

Fleming PS, Koletsi D, Pandis N. Blinded by PRISMA: are systematic reviewers focusing on PRISMA and ignoring other guidelines? PLoS One. 2014;9(5):e96407.

Balasubramanian SP, Wiener M, Alshameeri Z, Tiruvoipati R, Elbourne D, Reed MW. Standards of reporting of randomized controlled trials in general surgery: can we do better? Ann Surg. 2006;244(5):663–7.

de Vries TW, van Roon EN. Low quality of reporting adverse drug reactions in paediatric randomised controlled trials. Arch Dis Child. 2010;95(12):1023–6.

Borg Debono V, Zhang S, Ye C, Paul J, Arya A, Hurlburt L, Murthy Y, Thabane L. The quality of reporting of RCTs used within a postoperative pain management meta-analysis, using the CONSORT statement. BMC Anesthesiol. 2012;12:13.

Kaiser KA, Cofield SS, Fontaine KR, Glasser SP, Thabane L, Chu R, Ambrale S, Dwary AD, Kumar A, Nayyar G, et al. Is funding source related to study reporting quality in obesity or nutrition randomized control trials in top-tier medical journals? Int J Obes. 2012;36(7):977–81.

Thomas O, Thabane L, Douketis J, Chu R, Westfall AO, Allison DB. Industry funding and the reporting quality of large long-term weight loss trials. Int J Obes. 2008;32(10):1531–6.

Khan NR, Saad H, Oravec CS, Rossi N, Nguyen V, Venable GT, Lillard JC, Patel P, Taylor DR, Vaughn BN, et al. A review of industry funding in randomized controlled trials published in the neurosurgical literature-the elephant in the room. Neurosurgery. 2018;83(5):890–7.

Hansen C, Lundh A, Rasmussen K, Hrobjartsson A. Financial conflicts of interest in systematic reviews: associations with results, conclusions, and methodological quality. Cochrane Database Syst Rev. 2019;8:Mr000047.

Kiehna EN, Starke RM, Pouratian N, Dumont AS. Standards for reporting randomized controlled trials in neurosurgery. J Neurosurg. 2011;114(2):280–5.

Liu LQ, Morris PJ, Pengel LH. Compliance to the CONSORT statement of randomized controlled trials in solid organ transplantation: a 3-year overview. Transpl Int. 2013;26(3):300–6.

Bala MM, Akl EA, Sun X, Bassler D, Mertz D, Mejza F, Vandvik PO, Malaga G, Johnston BC, Dahm P, et al. Randomized trials published in higher vs. lower impact journals differ in design, conduct, and analysis. J Clin Epidemiol. 2013;66(3):286–95.

Lee SY, Teoh PJ, Camm CF, Agha RA. Compliance of randomized controlled trials in trauma surgery with the CONSORT statement. J Trauma Acute Care Surg. 2013;75(4):562–72.

Ziogas DC, Zintzaras E. Analysis of the quality of reporting of randomized controlled trials in acute and chronic myeloid leukemia, and myelodysplastic syndromes as governed by the CONSORT statement. Ann Epidemiol. 2009;19(7):494–500.

Alvarez F, Meyer N, Gourraud PA, Paul C. CONSORT adoption and quality of reporting of randomized controlled trials: a systematic analysis in two dermatology journals. Br J Dermatol. 2009;161(5):1159–65.

Mbuagbaw L, Thabane M, Vanniyasingam T, Borg Debono V, Kosa S, Zhang S, Ye C, Parpia S, Dennis BB, Thabane L. Improvement in the quality of abstracts in major clinical journals since CONSORT extension for abstracts: a systematic review. Contemporary Clin trials. 2014;38(2):245–50.

Thabane L, Chu R, Cuddy K, Douketis J. What is the quality of reporting in weight loss intervention studies? A systematic review of randomized controlled trials. Int J Obes. 2007;31(10):1554–9.

Murad MH, Wang Z. Guidelines for reporting meta-epidemiological methodology research. Evidence Based Med. 2017;22(4):139.

METRIC - MEthodological sTudy ReportIng Checklist: guidelines for reporting methodological studies in health research [ http://www.equator-network.org/library/reporting-guidelines-under-development/reporting-guidelines-under-development-for-other-study-designs/#METRIC ]. Accessed 31 Aug 2020.

Jager KJ, Zoccali C, MacLeod A, Dekker FW. Confounding: what it is and how to deal with it. Kidney Int. 2008;73(3):256–60.

Parker SG, Halligan S, Erotocritou M, Wood CPJ, Boulton RW, Plumb AAO, Windsor ACJ, Mallett S. A systematic methodological review of non-randomised interventional studies of elective ventral hernia repair: clear definitions and a standardised minimum dataset are needed. Hernia. 2019.

Bouwmeester W, Zuithoff NPA, Mallett S, Geerlings MI, Vergouwe Y, Steyerberg EW, Altman DG, Moons KGM. Reporting and methods in clinical prediction research: a systematic review. PLoS Med. 2012;9(5):1–12.

Schiller P, Burchardi N, Niestroj M, Kieser M. Quality of reporting of clinical non-inferiority and equivalence randomised trials--update and extension. Trials. 2012;13:214.

Riado Minguez D, Kowalski M, Vallve Odena M, Longin Pontzen D, Jelicic Kadic A, Jeric M, Dosenovic S, Jakus D, Vrdoljak M, Poklepovic Pericic T, et al. Methodological and reporting quality of systematic reviews published in the highest ranking journals in the field of pain. Anesth Analg. 2017;125(4):1348–54.

Thabut G, Estellat C, Boutron I, Samama CM, Ravaud P. Methodological issues in trials assessing primary prophylaxis of venous thrombo-embolism. Eur Heart J. 2005;27(2):227–36.

Puljak L, Riva N, Parmelli E, González-Lorenzo M, Moja L, Pieper D. Data extraction methods: an analysis of internal reporting discrepancies in single manuscripts and practical advice. J Clin Epidemiol. 2020;117:158–64.

Ritchie A, Seubert L, Clifford R, Perry D, Bond C. Do randomised controlled trials relevant to pharmacy meet best practice standards for quality conduct and reporting? A systematic review. Int J Pharm Pract. 2019.

Babic A, Vuka I, Saric F, Proloscic I, Slapnicar E, Cavar J, Pericic TP, Pieper D, Puljak L. Overall bias methods and their use in sensitivity analysis of Cochrane reviews were not consistent. J Clin Epidemiol. 2019.

Tan A, Porcher R, Crequit P, Ravaud P, Dechartres A. Differences in treatment effect size between overall survival and progression-free survival in immunotherapy trials: a Meta-epidemiologic study of trials with results posted at ClinicalTrials.gov. J Clin Oncol. 2017;35(15):1686–94.

Croitoru D, Huang Y, Kurdina A, Chan AW, Drucker AM. Quality of reporting in systematic reviews published in dermatology journals. Br J Dermatol. 2020;182(6):1469–76.

Khan MS, Ochani RK, Shaikh A, Vaduganathan M, Khan SU, Fatima K, Yamani N, Mandrola J, Doukky R, Krasuski RA. Assessing the quality of reporting of harms in randomized controlled trials published in high impact cardiovascular journals. Eur Heart J Qual Care Clin Outcomes. 2019.

Rosmarakis ES, Soteriades ES, Vergidis PI, Kasiakou SK, Falagas ME. From conference abstract to full paper: differences between data presented in conferences and journals. FASEB J. 2005;19(7):673–80.

Mueller M, D’Addario M, Egger M, Cevallos M, Dekkers O, Mugglin C, Scott P. Methods to systematically review and meta-analyse observational studies: a systematic scoping review of recommendations. BMC Med Res Methodol. 2018;18(1):44.

Li G, Abbade LPF, Nwosu I, Jin Y, Leenus A, Maaz M, Wang M, Bhatt M, Zielinski L, Sanger N, et al. A scoping review of comparisons between abstracts and full reports in primary biomedical research. BMC Med Res Methodol. 2017;17(1):181.

Krnic Martinic M, Pieper D, Glatt A, Puljak L. Definition of a systematic review used in overviews of systematic reviews, meta-epidemiological studies and textbooks. BMC Med Res Methodol. 2019;19(1):203.

Analytical study [ https://medical-dictionary.thefreedictionary.com/analytical+study ]. Accessed 31 Aug 2020.

Tricco AC, Tetzlaff J, Pham B, Brehaut J, Moher D. Non-Cochrane vs. Cochrane reviews were twice as likely to have positive conclusion statements: cross-sectional study. J Clin Epidemiol. 2009;62(4):380–6 e381.

Schalken N, Rietbergen C. The reporting quality of systematic reviews and Meta-analyses in industrial and organizational psychology: a systematic review. Front Psychol. 2017;8:1395.

Ranker LR, Petersen JM, Fox MP. Awareness of and potential for dependent error in the observational epidemiologic literature: A review. Ann Epidemiol. 2019;36:15–9 e12.

Paquette M, Alotaibi AM, Nieuwlaat R, Santesso N, Mbuagbaw L. A meta-epidemiological study of subgroup analyses in cochrane systematic reviews of atrial fibrillation. Syst Rev. 2019;8(1):241.

Download references

Acknowledgements

This work did not receive any dedicated funding.

Author information

Authors and Affiliations

Department of Health Research Methods, Evidence and Impact, McMaster University, Hamilton, ON, Canada

Lawrence Mbuagbaw, Daeria O. Lawson & Lehana Thabane

Biostatistics Unit/FSORC, 50 Charlton Avenue East, St Joseph’s Healthcare—Hamilton, 3rd Floor Martha Wing, Room H321, Hamilton, Ontario, L8N 4A6, Canada

Lawrence Mbuagbaw & Lehana Thabane

Centre for the Development of Best Practices in Health, Yaoundé, Cameroon

Lawrence Mbuagbaw

Center for Evidence-Based Medicine and Health Care, Catholic University of Croatia, Ilica 242, 10000, Zagreb, Croatia

Livia Puljak

Department of Epidemiology and Biostatistics, School of Public Health – Bloomington, Indiana University, Bloomington, IN, 47405, USA

David B. Allison

Departments of Paediatrics and Anaesthesia, McMaster University, Hamilton, ON, Canada

Lehana Thabane

Centre for Evaluation of Medicine, St. Joseph’s Healthcare-Hamilton, Hamilton, ON, Canada

Population Health Research Institute, Hamilton Health Sciences, Hamilton, ON, Canada


Contributions

LM conceived the idea and drafted the outline and paper. DOL and LT commented on the idea and draft outline. LM, LP and DOL performed literature searches and data extraction. All authors (LM, DOL, LT, LP, DBA) reviewed several draft versions of the manuscript and approved the final manuscript.

Corresponding author

Correspondence to Lawrence Mbuagbaw.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Competing interests

DOL, DBA, LM, LP and LT are involved in the development of a reporting guideline for methodological studies.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article

Cite this article

Mbuagbaw, L., Lawson, D.O., Puljak, L. et al. A tutorial on methodological studies: the what, when, how and why. BMC Med Res Methodol 20, 226 (2020). https://doi.org/10.1186/s12874-020-01107-7


Received: 27 May 2020

Accepted: 27 August 2020

Published: 07 September 2020

DOI: https://doi.org/10.1186/s12874-020-01107-7


Keywords

  • Methodological study
  • Meta-epidemiology
  • Research methods
  • Research-on-research

BMC Medical Research Methodology

ISSN: 1471-2288


Research Methodology Workshop

Premium Google Slides theme, PowerPoint template, and Canva presentation template

The backbone of any scientific inquiry: the methodology. A systematic process of collecting, analyzing, and interpreting data to draw conclusions about a particular subject matter. However, many researchers struggle with selecting the appropriate research design, sampling methods, data collection, and analysis techniques. But your workshop is ready to tackle that issue and help researchers get started! With this template and its captivating design, you'll get everyone paying close attention to your explanations. Remember that you can customize everything!

Features of this template

  • 100% editable and easy to modify
  • 31 different slides to impress your audience
  • Contains easy-to-edit graphics such as graphs, maps, tables, timelines and mockups
  • Includes 500+ icons and Flaticon’s extension for customizing your slides
  • Designed to be used in Google Slides, Canva, and Microsoft PowerPoint
  • 16:9 widescreen format suitable for all types of screens
  • Includes information about fonts, colors, and credits of the resources used


Methodology for Written and Oral Presentation of Research Results

  • Journal of Professional Issues in Engineering Education and Practice 136(2):112-117

Saso Tomazic, University of Ljubljana

Veljko M. Milutinovic, University of Belgrade



Methodology

Research Methods | Definitions, Types, Examples

Research methods are specific procedures for collecting and analyzing data. Developing your research methods is an integral part of your research design . When planning your methods, there are two key decisions you will make.

First, decide how you will collect data . Your methods depend on what type of data you need to answer your research question :

  • Qualitative vs. quantitative : Will your data take the form of words or numbers?
  • Primary vs. secondary : Will you collect original data yourself, or will you use data that has already been collected by someone else?
  • Descriptive vs. experimental : Will you take measurements of something as it is, or will you perform an experiment?

Second, decide how you will analyze the data .

  • For quantitative data, you can use statistical analysis methods to test relationships between variables.
  • For qualitative data, you can use methods such as thematic analysis to interpret patterns and meanings in the data.

Table of contents

  • Methods for collecting data
  • Examples of data collection methods
  • Methods for analyzing data
  • Examples of data analysis methods
  • Other interesting articles
  • Frequently asked questions about research methods

Data is the information that you collect for the purposes of answering your research question . The type of data you need depends on the aims of your research.

Qualitative vs. quantitative data

Your choice of qualitative or quantitative data collection depends on the type of knowledge you want to develop.

For questions about ideas, experiences and meanings, or to study something that can’t be described numerically, collect qualitative data .

If you want to develop a more mechanistic understanding of a topic, or your research involves hypothesis testing , collect quantitative data .


You can also take a mixed methods approach , where you use both qualitative and quantitative research methods.

Primary vs. secondary research

Primary research is any original data that you collect yourself for the purposes of answering your research question (e.g. through surveys , observations and experiments ). Secondary research is data that has already been collected by other researchers (e.g. in a government census or previous scientific studies).

If you are exploring a novel research question, you’ll probably need to collect primary data . But if you want to synthesize existing knowledge, analyze historical trends, or identify patterns on a large scale, secondary data might be a better choice.


Descriptive vs. experimental data

In descriptive research , you collect data about your study subject without intervening. The validity of your research will depend on your sampling method .

In experimental research , you systematically intervene in a process and measure the outcome. The validity of your research will depend on your experimental design .

To conduct an experiment, you need to be able to vary your independent variable , precisely measure your dependent variable, and control for confounding variables . If it’s practically and ethically possible, this method is the best choice for answering questions about cause and effect.
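
As a concrete illustration of these ideas, here is a minimal Python sketch of a hypothetical two-group experiment: participants are randomly assigned to control or treatment (the independent variable), an outcome (the dependent variable) is simulated, and the groups are compared with an independent-samples t-test from scipy. All numbers and group sizes are invented for illustration only.

    import random
    from scipy import stats

    random.seed(42)  # fixed seed only so the sketch is reproducible

    # Randomly assign 60 hypothetical participants to control vs. treatment
    participants = list(range(60))
    random.shuffle(participants)
    control_ids, treatment_ids = participants[:30], participants[30:]

    # Simulated outcome measurements (the dependent variable)
    control_scores = [random.gauss(50, 10) for _ in control_ids]
    treatment_scores = [random.gauss(55, 10) for _ in treatment_ids]

    # Compare the two groups
    t_stat, p_value = stats.ttest_ind(treatment_scores, control_scores)
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")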



Research methods for collecting data

  • Experiment (primary, quantitative): to test cause-and-effect relationships.
  • Survey (primary, quantitative): to understand general characteristics of a population.
  • Interview/focus group (primary, qualitative): to gain more in-depth understanding of a topic.
  • Observation (primary, either): to understand how something occurs in its natural setting.
  • Literature review (secondary, either): to situate your research in an existing body of work, or to evaluate trends within a research topic.
  • Case study (either, either): to gain an in-depth understanding of a specific group or context, or when you don’t have the resources for a large study.

Your data analysis methods will depend on the type of data you collect and how you prepare it for analysis.

Data can often be analyzed both quantitatively and qualitatively. For example, survey responses could be analyzed qualitatively by studying the meanings of responses or quantitatively by studying the frequencies of responses.
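
As a small illustration of this point, the Python sketch below treats the same hypothetical survey two ways: counting the frequencies of a closed-ended rating item (a quantitative angle) and tallying thematic codes that a researcher has assigned by hand to open-ended answers (a qualitative angle). The responses and codes are invented purely for illustration.

    from collections import Counter

    # Hypothetical closed-ended item: 1-5 satisfaction ratings
    ratings = [4, 5, 3, 4, 5, 4, 2, 4]
    print(Counter(ratings))   # quantitative: frequency of each rating

    # Hypothetical open-ended answers with researcher-assigned thematic codes
    open_answers = [
        ("The session felt too short", "duration"),
        ("I wanted more hands-on practice", "practice"),
        ("Great practical demonstrations", "practice"),
        ("Please allow more time for questions", "duration"),
    ]
    print(Counter(code for _, code in open_answers))   # qualitative: themes and how often they appear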

Qualitative analysis methods

Qualitative analysis is used to understand words, ideas, and experiences. You can use it to interpret data that was collected:

  • From open-ended surveys and interviews , literature reviews , case studies , ethnographies , and other sources that use text rather than numbers.
  • Using non-probability sampling methods .

Qualitative analysis tends to be quite flexible and relies on the researcher’s judgement, so you have to reflect carefully on your choices and assumptions and be careful to avoid research bias .

Quantitative analysis methods

Quantitative analysis uses numbers and statistics to understand frequencies, averages and correlations (in descriptive studies) or cause-and-effect relationships (in experiments).

You can use quantitative analysis to interpret data that was collected either:

  • During an experiment .
  • Using probability sampling methods .

Because the data is collected and analyzed in a statistically valid way, the results of quantitative analysis can be easily standardized and shared among researchers.

Research methods for analyzing data

  • Statistical analysis (quantitative): to analyze data collected in a statistically valid manner (e.g. from experiments, surveys, and observations).
  • Meta-analysis (quantitative): to statistically analyze the results of a large collection of studies. Can only be applied to studies that collected data in a statistically valid manner.
  • Thematic analysis (qualitative): to analyze data collected from interviews or textual sources; to understand general themes in the data and how they are communicated.
  • Content analysis (either): to analyze large volumes of textual or visual data collected from surveys, literature reviews, or other sources. Can be quantitative (i.e. frequencies of words) or qualitative (i.e. meanings of words).


If you want to know more about statistics , methodology , or research bias , make sure to check out some of our other articles with explanations and examples.

  • Chi square test of independence
  • Statistical power
  • Descriptive statistics
  • Degrees of freedom
  • Pearson correlation
  • Null hypothesis
  • Double-blind study
  • Case-control study
  • Research ethics
  • Data collection
  • Hypothesis testing
  • Structured interviews

Research bias

  • Hawthorne effect
  • Unconscious bias
  • Recall bias
  • Halo effect
  • Self-serving bias
  • Information bias

Quantitative research deals with numbers and statistics, while qualitative research deals with words and meanings.

Quantitative methods allow you to systematically measure variables and test hypotheses . Qualitative methods allow you to explore concepts and experiences in more detail.

In mixed methods research , you use both qualitative and quantitative data collection and analysis methods to answer your research question .

A sample is a subset of individuals from a larger population . Sampling means selecting the group that you will actually collect data from in your research. For example, if you are researching the opinions of students in your university, you could survey a sample of 100 students.

In statistics, sampling allows you to test a hypothesis about the characteristics of a population.
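
For instance, a simple random sample like the one above could be drawn in Python as in the minimal sketch below; the enrolment list and the simulated ratings are hypothetical, and in a real survey the responses would come from the sampled students themselves.

    import random
    import statistics

    random.seed(1)
    population = [f"student_{i}" for i in range(1, 20001)]   # hypothetical enrolment list
    sample = random.sample(population, k=100)                # simple random sample of 100 students

    # Simulated 1-5 agreement ratings stand in for the survey responses
    responses = [random.randint(1, 5) for _ in sample]
    print(statistics.mean(responses), statistics.pstdev(responses))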

The research methods you use depend on the type of data you need to answer your research question .

  • If you want to measure something or test a hypothesis , use quantitative methods . If you want to explore ideas, thoughts and meanings, use qualitative methods .
  • If you want to analyze a large amount of readily-available data, use secondary data. If you want data specific to your purposes with control over how it is generated, collect primary data.
  • If you want to establish cause-and-effect relationships between variables , use experimental methods. If you want to understand the characteristics of a research subject, use descriptive methods.

Methodology refers to the overarching strategy and rationale of your research project . It involves studying the methods used in your field and the theories or principles behind them, in order to develop an approach that matches your objectives.

Methods are the specific tools and procedures you use to collect and analyze data (for example, experiments, surveys , and statistical tests ).

In shorter scientific papers, where the aim is to report the findings of a specific study, you might simply describe what you did in a methods section .

In a longer or more complex research project, such as a thesis or dissertation , you will probably include a methodology section , where you explain your approach to answering the research questions and cite relevant sources to support your choice of methods.


Other students also liked

  • Writing Strong Research Questions | Criteria & Examples
  • What Is a Research Design | Types, Guide & Examples
  • Data Collection | Definition, Methods & Examples

More interesting articles

  • Between-Subjects Design | Examples, Pros, & Cons
  • Cluster Sampling | A Simple Step-by-Step Guide with Examples
  • Confounding Variables | Definition, Examples & Controls
  • Construct Validity | Definition, Types, & Examples
  • Content Analysis | Guide, Methods & Examples
  • Control Groups and Treatment Groups | Uses & Examples
  • Control Variables | What Are They & Why Do They Matter?
  • Correlation vs. Causation | Difference, Designs & Examples
  • Correlational Research | When & How to Use
  • Critical Discourse Analysis | Definition, Guide & Examples
  • Cross-Sectional Study | Definition, Uses & Examples
  • Descriptive Research | Definition, Types, Methods & Examples
  • Ethical Considerations in Research | Types & Examples
  • Explanatory and Response Variables | Definitions & Examples
  • Explanatory Research | Definition, Guide, & Examples
  • Exploratory Research | Definition, Guide, & Examples
  • External Validity | Definition, Types, Threats & Examples
  • Extraneous Variables | Examples, Types & Controls
  • Guide to Experimental Design | Overview, Steps, & Examples
  • How Do You Incorporate an Interview into a Dissertation? | Tips
  • How to Do Thematic Analysis | Step-by-Step Guide & Examples
  • How to Write a Literature Review | Guide, Examples, & Templates
  • How to Write a Strong Hypothesis | Steps & Examples
  • Inclusion and Exclusion Criteria | Examples & Definition
  • Independent vs. Dependent Variables | Definition & Examples
  • Inductive Reasoning | Types, Examples, Explanation
  • Inductive vs. Deductive Research Approach | Steps & Examples
  • Internal Validity in Research | Definition, Threats, & Examples
  • Internal vs. External Validity | Understanding Differences & Threats
  • Longitudinal Study | Definition, Approaches & Examples
  • Mediator vs. Moderator Variables | Differences & Examples
  • Mixed Methods Research | Definition, Guide & Examples
  • Multistage Sampling | Introductory Guide & Examples
  • Naturalistic Observation | Definition, Guide & Examples
  • Operationalization | A Guide with Examples, Pros & Cons
  • Population vs. Sample | Definitions, Differences & Examples
  • Primary Research | Definition, Types, & Examples
  • Qualitative vs. Quantitative Research | Differences, Examples & Methods
  • Quasi-Experimental Design | Definition, Types & Examples
  • Questionnaire Design | Methods, Question Types & Examples
  • Random Assignment in Experiments | Introduction & Examples
  • Random vs. Systematic Error | Definition & Examples
  • Reliability vs. Validity in Research | Difference, Types and Examples
  • Reproducibility vs Replicability | Difference & Examples
  • Reproducibility vs. Replicability | Difference & Examples
  • Sampling Methods | Types, Techniques & Examples
  • Semi-Structured Interview | Definition, Guide & Examples
  • Simple Random Sampling | Definition, Steps & Examples
  • Single, Double, & Triple Blind Study | Definition & Examples
  • Stratified Sampling | Definition, Guide & Examples
  • Structured Interview | Definition, Guide & Examples
  • Survey Research | Definition, Examples & Methods
  • Systematic Review | Definition, Example, & Guide
  • Systematic Sampling | A Step-by-Step Guide with Examples
  • Textual Analysis | Guide, 3 Approaches & Examples
  • The 4 Types of Reliability in Research | Definitions & Examples
  • The 4 Types of Validity in Research | Definitions & Examples
  • Transcribing an Interview | 5 Steps & Transcription Software
  • Triangulation in Research | Guide, Types, Examples
  • Types of Interviews in Research | Guide & Examples
  • Types of Research Designs Compared | Guide & Examples
  • Types of Variables in Research & Statistics | Examples
  • Unstructured Interview | Definition, Guide & Examples
  • What Is a Case Study? | Definition, Examples & Methods
  • What Is a Case-Control Study? | Definition & Examples
  • What Is a Cohort Study? | Definition & Examples
  • What Is a Conceptual Framework? | Tips & Examples
  • What Is a Controlled Experiment? | Definitions & Examples
  • What Is a Double-Barreled Question?
  • What Is a Focus Group? | Step-by-Step Guide & Examples
  • What Is a Likert Scale? | Guide & Examples
  • What Is a Prospective Cohort Study? | Definition & Examples
  • What Is a Retrospective Cohort Study? | Definition & Examples
  • What Is Action Research? | Definition & Examples
  • What Is an Observational Study? | Guide & Examples
  • What Is Concurrent Validity? | Definition & Examples
  • What Is Content Validity? | Definition & Examples
  • What Is Convenience Sampling? | Definition & Examples
  • What Is Convergent Validity? | Definition & Examples
  • What Is Criterion Validity? | Definition & Examples
  • What Is Data Cleansing? | Definition, Guide & Examples
  • What Is Deductive Reasoning? | Explanation & Examples
  • What Is Discriminant Validity? | Definition & Example
  • What Is Ecological Validity? | Definition & Examples
  • What Is Ethnography? | Definition, Guide & Examples
  • What Is Face Validity? | Guide, Definition & Examples
  • What Is Non-Probability Sampling? | Types & Examples
  • What Is Participant Observation? | Definition & Examples
  • What Is Peer Review? | Types & Examples
  • What Is Predictive Validity? | Examples & Definition
  • What Is Probability Sampling? | Types & Examples
  • What Is Purposive Sampling? | Definition & Examples
  • What Is Qualitative Observation? | Definition & Examples
  • What Is Qualitative Research? | Methods & Examples
  • What Is Quantitative Observation? | Definition & Examples
  • What Is Quantitative Research? | Definition, Uses & Methods



Methodology PowerPoint Templates

The Methodology PowerPoint templates include a range of presentation slide decks and concept diagrams, covering several industry-wide technology models. The SlideModel catalog offers a collection of commonly used methodology models that support business presentations ranging from research to the systematic analysis of implemented procedures and techniques. The methodology PowerPoint templates also include research methodology slide decks for theses, for example to evaluate reliability and validity. All methodology PowerPoint templates contain creative graphic illustrations and compelling layouts, and they will make your presentation experience worthwhile.

  • Environmental Impact Report PowerPoint Template
  • 6-Step Head Diagram Morph Template for PowerPoint
  • 5S Plan Diagram Template for PowerPoint
  • OKR Slide Template for PowerPoint
  • Research Presentation PowerPoint Template
  • Master Thesis PowerPoint Template
  • RAD Methodology Model Template for PowerPoint
  • Sprint Retrospective PowerPoint Template
  • Lean Product Development Diagram for PowerPoint
  • Enneagram Personality System PowerPoint Diagram
  • Agile Process Lifecycle Diagram for PowerPoint
  • Cone of Uncertainty Diagram for PowerPoint

Explaining a methodology requires clarity and a logical flow, whether in a business or research context. Our methodology PPT templates are specifically designed to help you outline every phase of your process—data collection, analysis, implementation, and more—in a format that’s easy for your audience to follow. These templates are structured, visually appealing, and fully customizable to fit your presentation needs.

Why Use Methodology Slide Templates?

A methodology template PPT allows you to present complex steps and processes without overwhelming your audience. You don’t have to spend hours formatting slides when you have access to pre-designed templates that reflect the structure of typical methodologies. Whether you need to explain research methods, project workflows, or data analysis, these templates give you the framework you need. They include organized sections for each step of your methodology, allowing you to focus on content while still delivering a polished, professional presentation.

Methodology PowerPoint templates are incredibly versatile and can be used in various fields, such as research, business, and education. Researchers can use these templates to break down data collection processes or experimental methods. Project managers often rely on methodology slides to illustrate project phases, from planning to execution. Even educators can find them useful for teaching specific methodologies in class. No matter the context, having a clear and well-structured methodology presentation helps convey your approach effectively.

What is a Methodology Presentation template?

A methodology presentation template is a pre-designed PowerPoint layout that helps you clearly and visually present the steps and processes involved in a specific methodology, such as research or project planning.

How to create a Methodology PowerPoint template?

To create a methodology PowerPoint template, design slides for each phase of your process (e.g., planning, execution, analysis), using flowcharts and visuals. Alternatively, you can download ready-made methodology templates from SlideModel for a professional, structured format.

Where can I find a Free Methodology Presentation template?

You can find free methodology presentation templates at SlideModel, offering a variety of designs tailored for presenting research methods, workflows, and project approaches.



Features of Science Research Methods Lesson 33 AQA Psychology PowerPoint


Subject: Psychology

Age range: 16+

Resource type: Lesson (complete)

Social Science Store

Last updated

28 September 2024


Complete 1 hour lesson including starter, AO1, AO2, AFL, extension tasks, pair, group and class activities, plenary.

Carefully planned to meet the needs of all students.

Part of full unit bundle for easy teaching and revision.


  • Open access
  • Published: 29 September 2024

Comparison of lecture-based learning with presentation-assimilation-discussion method in occupational bloodborne exposure education of nursing students, a randomised trial

  • Heling Wen 1,
  • Rui Zhang 2,
  • Zhenke Zhou 3,
  • Min Hong 4,
  • Zheng Huang 2,
  • Yifeng Jiang 2,
  • Yu Chen 1 &
  • Lei Peng 5

BMC Nursing volume 23, Article number: 702 (2024)


Abstract

Background

Occupational Bloodborne Exposures (OBEs) are incidents where healthcare workers come into contact with blood or other potentially infectious materials, leading to risks of transmitting bloodborne pathogens. Nursing students, often in direct contact with patients, face heightened risks due to their duties.

Methods

First, we conducted a cross-sectional survey using an OBEs questionnaire to explore the knowledge, attitudes, practices, and needs regarding OBEs among nursing students. Subsequently, we used a randomized controlled trial (RCT) to compare the impact of the Presentation-Assimilation-Discussion (PAD) method with the traditional lecture-based learning (LBL) method on OBEs education for nursing students. A pre-test, post-test, and retention test were used to assess teaching effectiveness, and the students’ feedback on the teaching method was also collected.

Results

In the cross-sectional survey, we found that nursing students lacked sufficient knowledge and management skills regarding OBEs but recognized the importance of standard precautions and expressed a desire for systematic OBEs training during their education and internships. In the RCT, the total, theoretical, and practical scores of the PAD and LBL groups were comparable in the pre-test (56.70 ± 3.47 vs. 56.40 ± 3.95, 33.09 ± 3.39 vs. 33.33 ± 2.44, 23.61 ± 4.66 vs. 23.07 ± 4.84, p  > 0.05). After training, the PAD model demonstrated an advantage over the LBL model in immediate total (84.25 ± 4.06 vs. 78.95 ± 4.23, p  < 0.001), theoretical (54.32 ± 2.43 vs. 51.44 ± 2.58, p  < 0.001), and practical scores (29.93 ± 3.90 vs. 27.51 ± 4.33, p  < 0.01). It also showed superior retention of total (69.05 ± 3.87 vs. 65.77 ± 2.94, p  < 0.001) and theoretical scores (39.05 ± 3.05 vs. 36.23 ± 3.18, p  < 0.001). However, there was no significant difference in the retention of practical scores between the two groups (30.00 ± 4.76 vs. 29.53 ± 3.73, p  > 0.05). The PAD group benefited more across various learning dimensions but reported a higher study load.

Conclusions

Our study reveals that the PAD model could be a valuable approach for teaching OBEs to nursing students.


Introduction

Occupational Bloodborne Exposures (OBEs) are defined as percutaneous or mucocutaneous incidents in which an individual working in a healthcare setting or involved in related activities comes into contact with blood or other potentially infectious materials (OPIM) [ 1 ]. This type of exposure includes a variety of situations, such as needlestick injuries, cuts from sharp medical instruments, and contact of mucous membranes or compromised skin with materials that may contain bloodborne viruses, including the Human Immunodeficiency Virus (HIV), Hepatitis B Virus (HBV), and Hepatitis C Virus (HCV) [ 2 ]. According to the World Health Report, occupational exposure is estimated to account for 2.5% of HIV/AIDS cases and 40% of hepatitis B and C cases among healthcare professionals. It is also estimated that globally, occupational exposure to percutaneous injuries has resulted in 16,000 HCV, 66,000 HBV, and 1,000 HIV infections among healthcare workers [ 3 ].

A cross-sectional study revealed that within a one-month period, 1.53% of healthcare workers experienced various types of OBEs, with students during their internship exhibiting the highest incidence rate among all healthcare professionals [ 4 ]. Notably, nurses, physicians, and logistical staff who had not received relevant training demonstrated a higher incidence of OBEs compared to their trained counterparts. The study also highlights that occupational exposure can lead to impacts on healthcare workers beyond the risk of infection, including anxiety (57.7%), stress (24.2%), and insomnia (10.2%). These effects can significantly affect the mental and physical health of healthcare personnel [ 5 ]. Furthermore, another study indicates that healthcare personnel exhibit adverse emotional responses such as anxiety, anger, and guilt following OBEs. They may engage in active coping strategies, such as seeking immediate medical attention or reporting the incident to a monitoring system, or adopt passive coping methods, including avoiding reporting the event and harboring vague hopes of encountering no subsequent issues [ 6 ]. Healthcare institutions need to provide their staff with repeated education to familiarize them with guidelines for preventing OBEs and constantly stimulate their awareness of the risks associated with injuries.

Nurses, due to their clinical duties which often involve direct and frequent contact with patients, are at a heightened risk of occupational exposure to bloodborne pathogens compared to other healthcare providers, such as physicians [ 7 ]. This increased risk can be attributed to the primary role nurses play in administering injections and intravenous fluids. Nurses make up the majority of healthcare professionals who handle injections and sharp instruments. Additionally, hospitals typically have a higher proportion of nursing staff compared to other healthcare occupational groups. The majority of these exposures occur through percutaneous injuries, such as needlestick incidents, cuts from sharp instruments, and contact of non-intact skin or mucous membranes with blood or OPIM.

An online cross-sectional survey conducted across 31 provincial administrative regions in China revealed that more than half (52.1%) of the nurses experienced occupational exposure to blood or body fluids [ 7 ]. Similarly, a study in Thailand indicated that operating room nurses face a high risk of occupational exposure to bloodborne pathogens, including needlestick injuries (NSIs), sharps injuries (SIs), and blood and body fluid exposures (BBFEs), with inadequate training identified as a significant contributing factor to these exposures [ 8 ]. In clinical practice, nursing students participate in many clinical tasks similar to those of licensed nurses. However, they show significantly lower levels of skill and experience in handling needles and sharps, leading to a higher risk of OBEs [ 9 , 10 ]. More importantly, the knowledge, attitudes, practices, and needs (K-A-P-N) of Chinese nursing students, especially those during their internship, regarding OBEs are not well recognized. The OBEs questionnaire based on the “K-A-P-N” model among nursing students can assist Chinese nursing students in both theoretical studies and clinical practices related to OBEs.

Given this increased risk of OBEs for nursing students, it is crucial in nursing practice to adhere strictly to standard precautions, use appropriate personal protective equipment (PPE), and follow protocols immediately after exposure. Additionally, ongoing education and training in infection control and prevention strategies are essential in reducing nurses’ risk of OBEs. Studies indicate that high-quality training is the most important factor in preventing healthcare professionals from OBEs [ 11 ]. Nurses who do not participate in training sessions focused on the prevention and management of OBEs at their workplaces face a significantly higher risk of these injuries compared to those who undergo any form of training [ 12 ]. These studies suggest that finding an effective educational and training method is key to preventing nurses’ OBEs and protecting the health of nursing students.

The Lecture-Based Learning (LBL) paradigm, entrenched as a conventional instructional strategy in nursing education, continues to dominate due to its systematic presentation of theoretical constructs. A notable advantage of LBL lies in its ability to effectively transmit extensive knowledge to large cohorts in a constrained temporal window [ 13 ]. LBL underpins the standardization of teaching methods, thereby ensuring uniform knowledge dissemination among all learners, a cornerstone for maintaining educational consistency. However, the instructor-centered nature of LBL has been criticized for limiting learner engagement and the facilitation of active learning processes. These elements are crucial in developing analytical reasoning and problem-solving skills in nursing practice [ 14 ]. Moreover, this approach is marked by a lack of opportunities for practical application of theoretical knowledge, an essential aspect of nursing education, due to its primary focus on theory [ 15 ]. The absence of interactive elements in lectures may lead to reduced knowledge retention and lower student motivation. This is because passive learning environments are generally less effective in knowledge assimilation compared to more interactive methods [ 16 ].

In 2014, a Fudan University expert in Shanghai introduced the Presentation-Assimilation-Discussion (PAD) model, a two-pronged approach to classroom teaching [ 17 ]. The PAD model is an innovative teaching approach that divides classroom time equally between teacher-led lectures and student-driven knowledge assimilation and group discussions. In the first phase, the instructor delivers core content through lectures, laying the foundation for students to understand the course material. This is followed by a second phase where students individually reflect to absorb the knowledge, then engage in group and class discussions to further solidify their understanding of the information [ 17 , 18 ]. This approach not only emphasizes the teacher’s role in guiding and facilitating learning but also encourages active student participation, thereby enhancing their motivation and enthusiasm for learning [ 17 , 18 ]. However, some disadvantages of the PAD model have been observed in practice, as this method requires instructors to possess strong classroom management and interaction-guidance skills, which can be challenging for teachers accustomed to traditional lecture-based methods; additionally, some students may be reluctant to participate due to shyness, lack of confidence, or unfamiliarity with the discussion content, potentially impacting their learning outcomes [ 19 ]. Currently, research on the application of the PAD model in healthcare education is relatively scarce.

Therefore, through this study, we aim to answer two questions: First, what is the current state of nursing students’ knowledge, attitudes, practices, and needs regarding OBEs? Second, is the PAD model a more effective method for OBEs education compared to the LBL model, in terms of enhancing nursing students’ mastery of OBEs knowledge and skills?

Study design

This study was conducted in a training and research hospital during the academic year of 2021–2022. First, we conducted a cross-sectional study using a blood-borne infection control questionnaire survey to explore the knowledge, attitudes, practices, and needs regarding OBEs among nursing students. Subsequently, we used a randomized controlled trial to compare the impact of the PAD method with the traditional LBL method on OBEs education for nursing students. The whole study design is illustrated in Fig.  1 .

figure 1

Flowchart of the study design

Study participants

A total of 105 nursing students in the third year of nursing college, who were about to start their internships, were initially recruited. Fifteen students were excluded from the study: eight had previously participated in courses or training related to OBEs, and seven were unwilling to participate. Ultimately, 90 nursing students were enrolled in the study. These nursing students were then randomly divided into two groups using a digital randomization method: forty-five were trained using the presentation-assimilation-discussion (PAD) method as an experimental group, while forty-five were trained using the traditional lecture-based learning (LBL) method as a control group.
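
The allocation is described only as a "digital randomization method"; a minimal Python sketch of one common way to carry out such a 1:1 allocation is shown below. The student IDs and the random seed are purely illustrative and are not the procedure actually used in the study.

    import random

    students = [f"S{i:03d}" for i in range(1, 91)]   # 90 enrolled students (IDs are hypothetical)
    random.seed(2021)                                # fixed seed only so the sketch is reproducible
    random.shuffle(students)
    pad_group, lbl_group = students[:45], students[45:]
    print(len(pad_group), len(lbl_group))            # 45 45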

Study process

Part 1. The blood-borne infection control questionnaire survey: a cross-sectional study

After grouping, all participants completed a blood-borne infection control questionnaire. A 20-question anonymous structured questionnaire was used in this study; it was developed based on guidelines for the prevention of bloodborne infections from the WHO (World Health Organization) and the China CDC (Centers for Disease Control and Prevention), as well as previous studies on bloodborne infections [ 20 , 21 ]. The questions focused on students’ knowledge (K1-5), attitudes (A1-5), practice (P1-5) and training needs (N1-5) regarding OBEs (Supplemental Table S1 ). The 5-level Likert scoring method was adopted for each question, ranging from 1 point (very unclear/strongly disagreed/strong unwillingness) to 5 points (very clear/strongly agreed/strong willingness). Pilot testing of the questionnaire was conducted on 20 nursing students and the questions were revised accordingly. SPSS was used to analyze reliability, and Cronbach’s Alpha was 0.767. The content validity was analyzed by specialists from the hospital’s infection control department, and the construct validity was evaluated by confirmatory factor analysis (CFA). The validity and reliability were good.
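
The reliability analysis above was run in SPSS; for illustration, the following Python sketch implements the standard Cronbach's alpha formula on hypothetical Likert responses (the real pilot data are not available, so the numbers here are simulated).

    import numpy as np

    def cronbach_alpha(items):
        """items: 2-D array, rows = respondents, columns = questionnaire items."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_variances = items.var(axis=0, ddof=1)
        total_variance = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

    rng = np.random.default_rng(0)
    # Hypothetical 5-point Likert answers: 20 pilot respondents x 20 items
    responses = rng.integers(1, 6, size=(20, 20))
    print(round(cronbach_alpha(responses), 3))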

Part 2. Comparing the impact of the PAD method and the LBL method on OBEs education: a randomized controlled study

Based on the findings of this survey and historical data on OBEs provided by the hospital’s infection control department, a training program was developed. The training content primarily included blood-borne pathogens, standard precautions and procedural technique training, and post-exposure management procedures. The training course comprised three 90-minute sessions delivered over two weeks. The two groups received the same teaching syllabus and the same reference materials one week before training, and they were taught by the same faculty members: one instructor and four teaching assistants.

The PAD method is made up of three major sections. The specific steps of the PAD method are detailed below. To illustrate the application of the teaching approach, we have chosen post-exposure management procedures as an example in this study.

Presentation (P) The instructor briefly emphasized key fundamental concepts and theoretical frameworks using PowerPoint presentations, leaving room for students’ subsequent independent study and discussion. This approach aimed to encourage independent thinking and assist students in constructing a closed-loop learning system. In the chapter on post-exposure management, the lecture mainly covered core concepts and frameworks such as emergency response after exposure, reporting systems, post-exposure prophylaxis (including assessment of the exposure source, assessment of the exposed person, and post-exposure preventive measures), monitoring and follow-up, and psychological intervention after exposure. This stage lasted approximately 40 min, followed by a 10-minute break.

Assimilation (A) After the instructor’s presentation, each student was tasked with identifying and addressing their knowledge gaps through independent study, and creating a mind map on post-exposure management based on what they had learned. Students were also encouraged to propose questions for discussion. Through this process, the integration of learning and thinking was achieved, facilitating the internalization of knowledge. This step took about 15 min.

Discussion (D) The discussion stage was divided into group and class discussions. In the group discussion segment, students formed groups of 4 to 6 members to collectively review key points on post-exposure management and attempt to answer raised questions. In the class discussion segment, a representative from each group, randomly selected by the instructor, summarized the highlights and confusions of their group’s discussion, which were then addressed by members of other groups or the instructor. Each discussion phase lasted 10 min. Finally, in the last 5 min of the class, the instructor summarized the key points of the lesson, highlighted areas for student improvement, and praised groups with outstanding performance in the discussion segments.

The LBL method was implemented in the control group as follows. The instructor utilized PowerPoint to deliver lectures, each lasting 90 min, including a 10-minute mid-lecture break and a 5-minute question-and-answer session at the end. Both groups were encouraged to conduct pre-class preparation and post-class review based on the teaching syllabus to improve learning outcomes.

Demographic data included age and gender. The enrolled participants were required to complete three tests: one week before the training, and one week and six months after the training (the pre-test, post-test and retention test, respectively). Each test consisted of a theoretical section (60 points) and a practical section (40 points), with a total score of 100 points. The theoretical section included single-choice questions, multiple-choice questions, and true/false questions, with a time limit of 30 min. The questions in the three tests were evaluated by two different instructors to ensure consistency in difficulty levels. The practice section, i.e. the hand hygiene practice assessment, included performing the seven-step handwashing technique and answering related questions, with a time limit of 5 min. The same senior nurse, with at least 10 years of clinical and teaching experience, served as the scoring expert and was unaware of the study’s purpose, design, and group assignments. Afterward, the total score and the scores for the theoretical and practice sections were calculated for each student.

To assess the students’ feedback on the teaching method, a 9-question anonymous questionnaire survey was administered at the end of the course. These questions were designed to evaluate the students’ attitudes and learning experiences under each teaching method. The 5-level Likert scoring method was adopted for each question, ranging from 1 point (very dissatisfied/strongly disagreed) to 5 points (very satisfied/strongly agreed). The 9-question questionnaire utilized in this research was adapted from those used in earlier studies focused on student-centered teaching approaches [ 22 , 23 ]. It was modified and updated to include elements specific to the PAD model, ensuring that it accurately captured students’ feedback on that model (Supplemental Figure S1 ).

Ethical considerations

This study was conducted in accordance with the 2013 version of the Helsinki Declaration. This research received approval from the Ethics Committee of the Affiliated Tumor Hospital of Chengdu Medical College. Before the study began, we explained the purpose and protocol of the study to all participants, who then reviewed and signed informed consent forms. Additionally, all participants were informed that they could withdraw from the study at any time without any consequences.

Statistical analysis

The normality of continuous data was assessed using the Shapiro-Wilk test. Depending on the data distribution, results were presented as means ± standard deviations (SDs) or as medians with interquartile ranges (IQRs). Categorical variables were compared with the Chi-square test. Continuous variables with skewed distributions and ordinal variables were analyzed using the Mann-Whitney U test. The test scores of the two groups at different time points were evaluated with a general linear model for repeated measures. All statistical analyses were conducted using SPSS 26.0 (SPSS Inc., Chicago, USA); all tests were two-tailed, with significance set at p  < 0.05.
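
For illustration only, the sketch below shows how the main steps of such an analysis could be reproduced in Python with scipy.stats, using invented scores rather than the study's data (which were analyzed in SPSS and are not public); the repeated-measures model is noted only in a comment.

    from scipy import stats

    # Hypothetical post-test total scores for the two groups
    pad_scores = [84, 86, 79, 88, 83, 85, 81, 87]
    lbl_scores = [78, 80, 75, 82, 77, 79, 76, 81]

    # 1. Normality check (Shapiro-Wilk)
    print(stats.shapiro(pad_scores).pvalue, stats.shapiro(lbl_scores).pvalue)

    # 2. Categorical variable (e.g. gender) compared with a Chi-square test
    gender_table = [[10, 34], [12, 31]]   # hypothetical [male, female] counts per group
    chi2, p, dof, expected = stats.chi2_contingency(gender_table)
    print(p)

    # 3. Ordinal Likert feedback compared with the Mann-Whitney U test
    pad_likert = [5, 4, 5, 4, 5, 3, 4, 5]
    lbl_likert = [3, 4, 3, 4, 2, 3, 4, 3]
    print(stats.mannwhitneyu(pad_likert, lbl_likert).pvalue)

    # The general linear model for repeated measures (pre-, post- and retention tests)
    # could be fitted with, e.g., statsmodels' AnovaRM or a mixed model; omitted here.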

Demographic information of students

A total of 90 nursing students were enrolled in this study and were randomly divided into two groups equally. Six months later, one participant (1/45, 2.2%) from the PAD group and two (2/45, 4.4%) from the LBL group were lost to follow-up. Consequently, the final analysis included data from 44 participants in the PAD group and 43 in the LBL group. Distribution of demographic characteristics was balanced. The gender ratio and ages for the two groups were comparable ( p  > 0.05; Table  1 ).

Bloodborne infection control survey

Nursing students’ knowledge of OBEs was unsatisfactory: 64.4% of students were unfamiliar with post-exposure prophylaxis, 63.2% were unaware of bloodborne pathogen monitoring, and 67.8% lacked knowledge about medical waste disposal. In particular, they were almost completely unaware of correct post-exposure management, with 78.2% of students unfamiliar with reporting procedures, 70.1% unaware of sharps injury management, 85.1% unaware of mucosal exposure treatment, and 85.1% unfamiliar with post-exposure monitoring and follow-up. However, most students recognized the importance of standard precautions. For instance, 60.9% of students believed that understanding a patient’s medical history was important, 69% emphasized the need to check for skin lesions on their hands before contacting patients, 62.1% highlighted the importance of hand hygiene after contacting patients, and 100% agreed that wearing goggles or face shields is important for preventing blood or bodily fluid splashes. At the same time, 86.2% of students believed that it is necessary to offer specialized and systematic OBEs courses in nursing education, and 63.2% and 87.4% of students, respectively, thought that it is essential for schools and hospitals to provide OBEs training before and during internships. Moreover, 50.6% and 44.8% of students expressed a willingness or strong willingness to participate in such training (Supplemental Table S1 and Fig.  2 ). There was no difference between the two groups regarding knowledge, attitudes, practice, and training needs related to OBEs ( p  > 0.05; Supplemental Table S1 ).

figure 2

Bloodborne infection control survey. The questions focused on students’ knowledge (K1-5), attitudes (A1-5), practice (P1-5) and training needs (N1-5) regarding blood-borne occupational exposures. The 5-level Likert scoring method was adopted for each question. 1 very unclear/strongly disagreed/strong unwillingness; 2 unclear/disagreed/unwillingness; 3 neutral; 4 clear/agreed/willingness; 5 very clear/strongly agreed/strong willingness

Comparison of the test scores

One week before training, nursing students underwent a pre-test evaluation. In the PAD group, the mean total score was 56.70 ± 3.47, with scores for theoretical and practice being 33.09 ± 3.39 and 23.61 ± 4.66, respectively. The LBL group presented similar scores: 56.40 ± 3.95 for the total, 33.33 ± 2.44 for theoretical, and 23.07 ± 4.84 for practice. No significant differences were observed ( p  > 0.05), demonstrating comparable baseline characteristics for both groups.

One week after training, the PAD group demonstrated significant improvements in their mean total score, theoretical, and practice scores, increasing from 56.70 to 84.25, 33.09 to 54.32, and 23.61 to 29.93, respectively ( p  < 0.001). Similarly, the LBL group showed notable gains, with scores rising from 56.40 to 78.95, 33.33 to 51.44, and 23.07 to 27.51, respectively ( p  < 0.001). Further inter-group comparison indicated that the PAD group’s mean total score, theoretical, and practice scores were significantly superior to those of the LBL group after training (84.25 ± 4.06 vs. 78.95 ± 4.23, 54.32 ± 2.43 vs. 51.44 ± 2.58, p  < 0.001, and 29.93 ± 3.90 vs. 27.51 ± 4.33, p  < 0.01, respectively).

To assess the retention of theoretical knowledge and practical skills, a second post-test was conducted six months after training. While the mean total and theoretical scores for both groups significantly diminished ( p  < 0.001), the practice score for the PAD group remained unchanged ( p  > 0.999), and that of the LBL group exhibited a slight increase ( p  = 0.70). Further analysis comparing the two groups revealed that the PAD group’s mean total and theoretical scores continued to be significantly higher than those of the LBL group (69.05 ± 3.87 vs. 65.77 ± 2.94, 39.05 ± 3.05 vs. 36.23 ± 3.18, p  < 0.001). However, the difference in practice scores between the two groups was not statistically significant (30.00 ± 4.76 vs. 29.53 ± 3.73, p  > 0.05). The decline in theoretical scores for both groups underscores the importance of repeated OBEs training, while the maintenance of the seven-step handwashing technique may benefit from repetitive practice in clinical settings.

Additionally, when compared with the pre-test, both groups exhibited a marked enhancement in mean total, theoretical, and practice scores in the retention test ( p  < 0.001). This analysis demonstrates the enduring impact of the training program on the participants’ knowledge and skills (Fig.  3 ).

figure 3

The test scores of PAD ( n  = 44) and LBL ( n  = 43) groups at different time points. Different time points: pre-test: 1 week before training; post-test: 1 week after training; retention test: 6 months after training. A total score; B theoretical score; C practice score. Data were presented as mean ± standard deviation (SD). ** p  < 0.01, *** p  < 0.001

Students’ feedback on the teaching method survey

Compared with the LBL group, the PAD group exhibited significant enhancements across various dimensions, including learning enthusiasm, understanding of teaching content, student-instructor interaction, satisfaction with teaching mode, satisfaction with teaching effect, problem-solving ability, and interest in continued learning ( p  < 0.01 for two aspects and p  < 0.001 for the remaining aspects). However, it is important to note that a majority of students in the PAD group reported an increased study load ( p  < 0.001) and a reduction in the systematic organization of teaching content ( p  < 0.05) (Fig.  4 ).

figure 4

Five-level Likert scores of students’ feedback on the teaching method in PAD ( n  = 44) and LBL ( n  = 43) groups. A learning enthusiasm; B study load; C systematization of teaching content; D understanding of teaching content; E student-teacher interaction; F satisfaction of teaching mode; G satisfaction of teaching effect; H problem-solving ability improvement; I interest in continued learning. Data were presented as median values along with the interquartile range. * p  < 0.05, ** p  < 0.01, *** p  < 0.001

Discussion

The “K-A-P-N” OBEs questionnaire holds significant value in assessing the status of healthcare professionals regarding OBEs, and it can provide data support for developing more effective training and intervention strategies. In the first part of our study, a cross-sectional analysis was conducted using the questionnaire to examine students’ levels of knowledge, attitudes, practices, and needs concerning OBEs. Our data show that the majority of nursing students exhibit a significant lack of knowledge regarding OBEs. This is evident in their insufficient understanding of basic knowledge about blood-borne pathogens, their ignorance of preventive measures following OBEs, and their unfamiliarity with the standard handling of contaminated items. This finding is consistent with previous research [ 24 , 25 ]. The issue of inadequate OBEs knowledge is also prevalent among students in other disciplines, such as dentistry [ 21 ].

Our OBEs questionnaire revealed that most students demonstrated a high level of recognition of fundamental occupational protection knowledge, acknowledging its crucial role in reducing occupational exposures. Most students also recognized the importance of understanding patients’ medical histories before performing medical procedures, indicating an awareness of the link between patient background information and occupational exposure risk. In terms of routine practices, students showed a clear understanding of the need to inspect their hands for injuries and placed significant emphasis on hand hygiene after patient contact. They also believed that goggles or face shields were necessary when there was a risk of blood or body-fluid splashes. These positive attitudes suggest that students have a strong awareness of OBE risks and a high level of recognition of preventive measures, reflecting their concern for occupational safety. Similar awareness of OBEs and recognition of preventive measures have been observed in other studies of healthcare practitioners [26], and other research is likewise consistent with our finding that the majority of nurses hold a positive attitude towards standard precautions against OBEs [27].

In the “Practices” section of our OBEs questionnaire, we focused on students’ familiarity with procedures and handling methods related to OBEs. Most students reported being very unfamiliar or unfamiliar with these issues, reflecting a significant lack of practical ability in dealing with OBE incidents, particularly with respect to post-OBE reporting procedures, sharps-injury management, mucous-membrane exposure handling, and OBE monitoring and follow-up. Such unfamiliarity with post-OBE handling is also common among other healthcare practitioners [28]. These results suggest that OBE training should aim to enhance students’ ability to handle OBE incidents effectively, ensuring they can protect both their own health and that of their patients in the professional environment.

The results from the “Needs” section of our OBEs questionnaire indicated that students generally recognized the importance of training on OBEs and displayed a positive attitude toward participating in such training. This reflects their emphasis on self-protection and their awareness of occupational safety, and it suggests that they wish to acquire sufficient knowledge and skills to handle potential occupational risks before entering practical nursing work. This positive attitude provides strong support for educational and medical institutions to further refine OBE training programs. Schools and hospitals should seize this opportunity by integrating students’ needs and developing an effective OBE educational model at the outset of nursing students’ internships, thereby enhancing their sense of security and professional competence throughout their careers.

The second part of our study focused on the effectiveness of the PAD model as an innovative teaching method for nursing students in the context of OBEs. Our findings indicated that, compared with the traditional LBL method, the PAD model significantly enhanced students’ immediate theoretical knowledge and practical performance regarding OBEs. Another study examined the PAD model in dental education and found that, compared with traditional lecture-based teaching, the PAD model ignited students’ enthusiasm for learning and improved their theoretical test results and practical operation [18]. These findings align with our research, indicating that the PAD model might significantly improve students’ theoretical knowledge scores and practical skills in healthcare-related courses. The PAD model requires educators to distill the essence of the teaching material, moving away from traditional didactic methods and enabling students to become the protagonists of the classroom. Through student discussions, teachers can quickly grasp the students’ learning status, identify common issues, and address them in a targeted manner, thereby enhancing teaching effectiveness [29].

The key to long-term knowledge retention lies in students’ active engagement with the content and their taking greater responsibility for their learning [30]. The PAD model involves students actively in classroom activities and redistributes rights and responsibilities throughout the teaching process, enabling students to learn with proper guidance from teachers and encouraging them to assume responsibility for their learning [29]. These factors suggest that the PAD model can lead to better long-term knowledge retention. Investigating longer-term effects and knowledge retention provides evidence of the sustainable impact of new teaching methods, so we explored whether the PAD model is superior to the traditional LBL method in terms of knowledge retention after six months. Our findings suggest that, following the initial implementation of the PAD model, long-term retention of knowledge exceeds that achieved through the traditional LBL method. Moreover, in terms of practical skills, the retention-test scores of both the LBL and PAD groups were maintained at or slightly above their post-test levels; this can be attributed to the repeated practice of these skills during the six-month internship, which reinforced the students’ practical abilities.

Our study found that the PAD model, compared to the traditional LBL method, significantly enhanced students’ enthusiasm for learning, understanding of teaching content, satisfaction with teaching, problem-solving abilities, and interest in continued learning. This suggests that the PAD model, by enhancing students’ autonomy and engagement, contributes to deeper learning outcomes. These positive effects, particularly the enhancement of students’ self-directed learning abilities and satisfaction with teaching, have also been observed in other studies applying the PAD model in nursing education [ 19 ].

However, the student feedback questionnaires also highlighted an increased study load and certain deficiencies in the systematic organization of the teaching content. This may stem from the PAD model’s emphasis on self-directed exploration and learning, which can leave students feeling overwhelmed when faced with a large amount of information and tasks. Similar findings of increased study load have been reported for other non-traditional teaching methods, which share the common feature of transforming students from passive recipients of knowledge into active participants in the learning process [22, 31]. This suggests that future applications of the PAD model and similar methods may benefit from a more structured learning framework to reduce students’ cognitive load and help them grasp the systematic nature of the material. Teachers should also manage students’ study load and the systematic organization of the teaching content more carefully, enhancing the logical coherence and structure of the knowledge. Such improvements should offer students clearer learning pathways, leading to a better learning experience and improved educational outcomes.

Our study suggests that the majority of nursing students exhibit a significant lack of knowledge about OBEs and of the ability to manage them, while also demonstrating a strong willingness and need to learn about OBEs. Our study is the first to implement the PAD model in an OBEs course for nursing students. Compared with the traditional LBL method, the PAD model significantly stimulated nursing students’ enthusiasm and initiative for learning about OBEs. Moreover, the PAD model outperformed the LBL method in both immediate theoretical and practical scores and showed superior retention of theoretical knowledge, although there was no significant difference between the two groups in the retention of practical skills. Based on these findings, we suggest that the PAD model could be a valuable approach for teaching OBEs to nursing students.

The scope of the investigation was confined to nursing students, highlighting the necessity for subsequent studies to incorporate a broader range of participants, such as healthcare professionals, medical students, and public health students, to corroborate the efficacy of the proposed method. Additionally, the limited sample size of this study underscores the need for further research with an expanded cohort to thoroughly assess the method’s impact and reduce potential biases identified in our findings. Furthermore, the longitudinal effects of this innovative approach were not examined adequately. Future studies should, therefore, employ multi-center randomized controlled trials with extended follow-up durations to evaluate the sustained effectiveness of the method.

Data availability

Please contact the corresponding author for data availability.

References

1. Yasin J, Fisseha R, Mekonnen F, Yirdaw K. Occupational exposure to blood and body fluids and associated factors among health care workers at the University of Gondar Hospital, Northwest Ethiopia. Environ Health Prev Med. 2019;24(1):18.
2. Guilbert JJ. The world health report 2002 - reducing risks, promoting healthy life. Educ Health (Abingdon). 2003;16(2):230.
3. Pruss-Ustun A, Rapiti E, Hutin Y. Estimation of the global burden of disease attributable to contaminated sharps injuries among health-care workers. Am J Ind Med. 2005;48(6):482–90.
4. Lin J, Gao X, Cui Y, Sun W, Shen Y, Shi Q, Chen X, Hu B. A survey of sharps injuries and occupational infections among healthcare workers in Shanghai. Ann Transl Med. 2019;7(22):678.
5. Kasatpibal N, Whitney JD, Katechanok S, Ngamsakulrat S, Malairungsakul B, Sirikulsathean P, Nuntawinit C, Muangnart T. Practices and impacts post-exposure to blood and body fluid in operating room nurses: a cross-sectional study. Int J Nurs Stud. 2016;57:39–47.
6. Jeong JS, Son HM, Jeong IS, Son JS, Shin KS, Yoonchang SW, Jin HY, Han SH, Han SH. Qualitative content analysis of psychologic discomfort and coping process after needlestick injuries among health care workers. Am J Infect Control. 2016;44(2):183–8.
7. Zhang L, Li Q, Guan L, Fan L, Li Y, Zhang Z, Yuan S. Prevalence and influence factors of occupational exposure to blood and body fluids in registered Chinese nurses: a national cross-sectional study. BMC Nurs. 2022;21(1):298.
8. Kasatpibal N, Whitney JD, Katechanok S, Ngamsakulrat S, Malairungsakul B, Sirikulsathean P, Nuntawinit C, Muangnart T. Prevalence and risk factors of needlestick injuries, sharps injuries, and blood and body fluid exposures among operating room nurses in Thailand. Am J Infect Control. 2016;44(1):85–90.
9. Hambridge K, Nichols A, Endacott R. The impact of sharps injuries on student nurses: a systematic review. Br J Nurs. 2016;25(19):1064–71.
10. Cheung K, Ho SC, Ching SS, Chang KK. Analysis of needlestick injuries among nursing students in Hong Kong. Accid Anal Prev. 2010;42(6):1744–50.
11. Ashat M, Bhatia V, Puri S, Thakare M, Koushal V. Needle stick injury and HIV risk among health care workers in North India. Indian J Med Sci. 2011;65(9):371–8.
12. Nsubuga FM, Jaakkola MS. Needle stick injuries among nurses in sub-Saharan Africa. Trop Med Int Health. 2005;10(8):773–81.
13. Zheng QM, Li YY, Yin Q, Zhang N, Wang YP, Li GX, Sun ZG. The effectiveness of problem-based learning compared with lecture-based learning in surgical education: a systematic review and meta-analysis. BMC Med Educ. 2023;23(1):546.
14. Yoo MS, Park HR. Effects of case-based learning on communication skills, problem-solving ability, and learning motivation in nursing students. Nurs Health Sci. 2015;17(2):166–72.
15. Sangestani G, Khatiban M. Comparison of problem-based learning and lecture-based learning in midwifery. Nurse Educ Today. 2013;33(8):791–5.
16. Wen H, Hong M, Chen F, Jiang X, Zhang R, Zeng J, Peng L, Chen Y. CRISP method with flipped classroom approach in ECG teaching of arrhythmia for trainee nurses: a randomized controlled study. BMC Med Educ. 2022;22(1):850.
17. Zhang X. PAD class: a new attempt in university teaching reform. Fudan Educ Forum. 2014;5(12):5–10.
18. Zhai J, Dai L, Peng C, Dong B, Jia Y, Yang C. Application of the presentation-assimilation-discussion class in oral pathology teaching. J Dent Educ. 2022;86(1):4–11.
19. Lv H, Tang L, Luo G, Meng M, Luo Q. Rain Classroom and PAD class blended learning mode effectively improves teaching quality in a surgical nursing course. Am J Transl Res. 2024;16(1):200–7.
20. Mathewos B, Birhan W, Kinfe S, Boru M, Tiruneh G, Addis Z, Alemu A. Assessment of knowledge, attitude and practice towards post exposure prophylaxis for HIV among health care workers in Gondar, North West Ethiopia. BMC Public Health. 2013;13:508.
21. Saleem H, Waly N, Abdelgawad F. Knowledge, attitude, and practice (KAP) of post exposure prophylaxis for fifth year dental students at a private Egyptian university: a cross-sectional study. BMC Oral Health. 2023;23(1):167.
22. Rui Z, Lian-Rui X, Rong-Zheng Y, Jing Z, Xue-Hong W, Chuan Z. Friend or foe? Flipped classroom for undergraduate electrocardiogram learning: a randomized controlled study. BMC Med Educ. 2017;17(1):53.
23. Wen H, Xu W, Chen F, Jiang X, Zhang R, Zeng J, Peng L, Chen Y. Application of the BOPPPS-CBL model in electrocardiogram teaching for nursing students: a randomized comparison. BMC Med Educ. 2023;23(1):987.
24. Konlan KD, Aarah-Bapuah M, Kombat JM, Wuffele GM. TOPIC: the level of nurses’ knowledge on occupational post exposure to hepatitis B infection in the Tamale metropolis, Ghana. BMC Health Serv Res. 2017;17(1):254.
25. George M, Sharma T, Ahwal S, Rastogi A, Bansal A. A national level survey on knowledge, attitude and practices among Indian nurses on viral hepatitis. J Educ Health Promot. 2023;12:247.
26. Freeman BM, Chea S, Shobayo BI. Knowledge, attitude, and practice towards hepatitis B virus among healthcare workers: a cross-sectional, hospital-based study in Montserrado County, Liberia. Pan Afr Med J. 2023;46:77.
27. Zhu S, Kahsay KM, Gui L. Knowledge, attitudes and practices related to standard precautions among nurses: a comparative study. J Clin Nurs. 2019;28(19–20):3538–46.
28. Kochlamazashvili M, Kamkamidze G, McNutt LA, DeHovitz J, Chubinishvili O, Butsashvili M. Knowledge, attitudes and practice survey on blood-borne diseases among dental health care workers in Georgia. J Infect Dev Ctries. 2018;12(10):864–70.
29. Ouyang L, Zhang H, Zhang X, Wu H. Genomics course design and combined teaching strategy to enhance learning initiatives in classroom. Biochem Mol Biol Educ. 2019;47(6):632–7.
30. Taglieri C, Schnee D, Dvorkin Camiel L, Zaiken K, Mistry A, Nigro S, Tataronis G, Patel D, Jacobson S, Goldman J. Comparison of long-term knowledge retention in lecture-based versus flipped team-based learning course delivery. Curr Pharm Teach Learn. 2017;9(3):391–7.
31. Hu X, Zhang H, Song Y, Wu C, Yang Q, Shi Z, Zhang X, Chen W. Implementation of flipped classroom combined with problem-based learning: an approach to promote learning about hyperthyroidism in the endocrinology internship. BMC Med Educ. 2019;19(1):290.


Acknowledgements

This study was supported by the Medico-Engineering Cooperation Funds from University of Electronic Science and Technology of China (No.ZYGX2021YGLH223), Chengdu Science and Technology Bureau (2022-YF05-01940-SN), the grants from Sichuan Science and Technology Program (2024NSFSC0688), the “PRO•Run” Fund of the Nephrology Group of CEBM (KYJ202206-0003-1) and the Foundation of Science and Technology Department of Sichuan Province (No. 2023NSFSC0590). The funders had no role in the study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Author information

Heling Wen and Rui Zhang contributed equally to this work.

Authors and Affiliations

Institute of Cardiovascular Diseases, Department of Cardiology, School of Medicine, Sichuan Provincial People’s Hospital, University of Electronic Science and Technology of China, Chengdu, 610072, China

Heling Wen & Yu Chen

Department of Surgery, The Affiliated Tumor Hospital of Chengdu Medical College, Chengdu, 610044, China

Rui Zhang, Zheng Huang & Yifeng Jiang

Department of Infectious Disease, The Affiliated Tumor Hospital of Chengdu Medical College, Chengdu, 610021, China

Zhenke Zhou

Department of Fever Clinic, Sichuan Provincial People’s Hospital, Sichuan Academy of Medical Science, University of Electronic Science and Technology of China, Chengdu, 610072, China

Department of Nephrology, Institute of Nephrology, Sichuan Provincial People’s Hospital, School of Medicine, University of Electronic Science and Technology of China, Chengdu, 610072, China


Contributions

(I) Conception and design: Heling Wen, Lei Peng and Zheng Huang; (II) Administrative support: Rui Zhang and Zheng Huang; (III) Provision of study materials or nurses: Rui Zhang, Yifeng Jiang, Zhenke Zhou and Zheng Huang; (IV) Collection and assembly of data: Yu Chen, Heling Wen, Lei Peng, Rui Zhang and Zheng Huang; (V) Data analysis and interpretation: Heling Wen, Lei Peng, Min Hong and Yu Chen; (VI) Manuscript writing: Rui Zhang and Yu Chen; (VII) Final approval of manuscript: All authors.

Corresponding authors

Correspondence to Yu Chen or Lei Peng .

Ethics declarations

Ethics approval and consent to participate.

The Institutional Review Board and Ethics Committee of The Affiliated Tumor Hospital of Chengdu Medical College approved the study protocol and the written informed consent procedure (Ref: 2021-117). Before enrolment, informed consent was obtained from all participants and/or their legal guardians, who were informed that they could withdraw from the study freely and that choosing not to participate would carry no negative consequences. All methods were carried out in accordance with relevant guidelines and regulations, including the Chinese Prevention of Cruelty to Human Subjects Act and the Code of Practice for the Care and Use of Human Subjects for Scientific Purposes.

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary Material 1

Rights and permissions.

Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/ .


About this article

Cite this article.

Wen, H., Zhang, R., Zhou, Z. et al. Comparison of lecture-based learning with presentation-assimilation-discussion method in occupational bloodborne exposure education of nursing students, a randomised trial. BMC Nurs 23 , 702 (2024). https://doi.org/10.1186/s12912-024-02365-2


Received : 22 June 2024

Accepted : 23 September 2024

Published : 29 September 2024

DOI : https://doi.org/10.1186/s12912-024-02365-2


Keywords
  • Occupational bloodborne exposures
  • Lecture-based learning
  • Nursing student



Academic progress

Prof. Bin Wang and Prof. Zhiwen Li's team conducts a series of studies to develop experimental analysis methods for various environmental substances found in human biological samples

The harmful effects of environmental pollutants on human health have become a significant global concern. These pollutants include polycyclic aromatic hydrocarbons (PAHs), organochlorine pesticides (OCPs), polychlorinated biphenyls (PCBs), polybrominated diphenyl ethers (PBDEs), and toxic metals. They primarily originate from industrial production, agricultural activities, and fossil fuel combustion, and are widely present in the atmosphere, water bodies, soil, and food. Long-term exposure may lead to endocrine disruption, reproductive system abnormalities, neurological damage, weakened immune function, and even an increased risk of cancer. Additionally, the metabolites produced by these pollutants in the body may also significantly impact human health. Therefore, accurately measuring the exposure levels of organic pollutants and metal(loid)s, as well as the concentration of key metabolites in the population, is crucial for assessing the impact of environmental exposure on human health.

In the real world, humans face complex exposure to multiple pollutants, which can easily diffuse and distribute widely throughout the body once they enter. Due to the complex matrix of biological samples, developing methods for accurately measuring target pollutants is challenging. For instance, the available sample volume of human biological specimens is usually limited, the pre-treatment workload for experimental analysis of biological samples is substantial, and the sample size in environmental epidemiology studies is often large. This necessitates that the relevant pre-treatment methods be as rapid and efficient as possible.

To address these challenges, the Institute of Reproductive and Child Health at the School of Public Health, Peking University, has undertaken a series of initiatives. These include the simultaneous determination of PAHs and their metabolites, monohydroxy-PAHs (OH-PAHs), in various biological samples such as hair, blood, and urine; the simultaneous measurement of multiple persistent organic pollutants (including PAHs, OCPs, PCBs, and PBDEs) in serum samples; the simultaneous detection of typical halogenated endocrine disruptors (such as PAHs, PBDEs, OCPs, and some hydroxy metabolites) and metal(loid)s in hair; and the simultaneous measurement of PAHs, nicotine, cotinine, and metal(loid)s in hair. The development and application of these methods provide reliable technical support for subsequent environmental epidemiology studies with different objectives, aiding in a more comprehensive understanding of the impact of pollutants on human health.

Series of Work (1): Establishing a Framework for the Simultaneous Analysis of Polycyclic Aromatic Hydrocarbons and Their Monohydroxy Metabolites in Various Biological Samples (Hair, Blood, Urine) (Cover Article)

First affiliated institution: Peking University

Journal: Environmental Science Processes & Impacts

Article information: Jia X, Long M, Pang Y, An H, Jin Y, Jiang J, Li Z*, Wang B*. Exposure biomarker profiles of polycyclic aromatic hydrocarbons based on a rat model using a versatile analytical framework. Environ Sci Process Impacts. 2024 Aug 14;26(8):1268-1280. doi: 10.1039/d4em00109e. PMID: 38817199.

Main page: https://pubs.rsc.org/en/content/articlelanding/2024/em/d4em00109e


Figure 1. Journal Cover


Figure 2. Framework for the Analysis of Biomarkers of Polycyclic Aromatic Hydrocarbon (PAHs) Exposure

The research team published a cover article in the journal Environmental Science: Processes & Impacts (Figure 1), presenting a pre-treatment and detection method for the simultaneous analysis of PAHs and their monohydroxy metabolites in biological samples. This method was applied to PAH-exposed rats to identify sensitive biomarkers of PAH exposure (Figure 2). The study provides a versatile method applicable to various biological specimens, enabling the simultaneous analysis of parent PAHs and OH-PAHs, significantly reducing sample volume and pre-treatment costs.

Series of Work (2): Development of a Method for the Simultaneous Determination of Polycyclic Aromatic Hydrocarbons, Organochlorine Pesticides, Polychlorinated Biphenyls, and Polybrominated Diphenyl Ethers in Serum Samples

Journal: Environmental Pollution

Article information: Jia X, Yin S, Xu J, Li N, Ren M, Qin Y, Zhou J, Wei Y, Guo Y, Gao M, Yu Y, Wang B*, Li Z. An efficient method to simultaneously analyze multi-class organic pollutants in human serum. Environ Pollut. 2019 Aug;251:400-406. doi: 10.1016/j.envpol.2019.05.008. Epub 2019 May 3. PMID: 31100571.

Main page: https://www.sciencedirect.com/science/article/pii/S0269749118353600?via%3Dihub


The extent of human exposure to various organic pollutants (OPs), including PAHs, OCPs, PCBs, and PBDEs, can be determined by measuring their concentrations in human serum. However, large-scale measurement of such a wide range of compounds in serum poses challenges in terms of efficiency and cost. The team led by Prof. Bin Wang at the School of Public Health, Peking University, has developed an efficient method for extracting and purifying serum samples that uses gas chromatography-mass spectrometry (GC-MS) to simultaneously quantify the four classes of OPs listed above (Figure 3). The approach significantly reduces experimental costs and allows efficient batch processing of serum samples, making it well suited to epidemiological studies.


Figure 3: Illustration of the Simultaneous Determination of Multiple Organic Pollutants in Serum

Series of Work (3): Development of a Method for the Simultaneous Determination of Typical Halogenated Endocrine Disrupting Chemicals (Polychlorinated Biphenyls, Polybrominated Diphenyl Ethers, and Organochlorine Pesticides, along with Hydroxylated Metabolites) and Metal(loid)s in Human Hair

Journal: Science of the Total Environment

Article information: Ren M, Jia X, Shi J, Yan L, Li Z, Lan C, Chen J, Li N, Li K, Huang J, Wu S, Lu Q, Li Z*, Wang B*, Liu J. Simultaneous analysis of typical halogenated endocrine disrupting chemicals and metal(loid)s in human hair. Science of the Total Environment. Volume 718, 2020, 137300, https://doi.org/10.1016/j.scitotenv.2020.137300

Main page: https://www.sciencedirect.com/science/article/abs/pii/S004896972030810X


The research team has developed an experimental analytical method for the simultaneous determination of various environmental endocrine-disrupting organic compounds and inorganic metal(loid)s using a small amount of hair. The method largely removes surface contamination from the hair and achieves good recovery rates for hair exposure biomarkers (80–120%). As an example application, the authors conducted a cross-sectional study in which the method was used to measure exposure-biomarker levels in the hair of pregnant women, validating its feasibility and effectiveness (Figure 4).


Figure 4: Development of Hair Analysis Method for Assessing Human Exposure Levels to Environmental Endocrine-Disrupting Organic Pollutants and Metal(loid)s
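The 80–120% figure quoted above refers to the conventional spiked-sample recovery check used to validate such methods. As a minimal illustrative sketch (not the team's actual workflow, and with hypothetical numbers), percent recovery can be computed as follows:

```python
# Hedged sketch of a spike-recovery check: recovery (%) is the fraction of a
# known spiked amount that the method measures back, after subtracting the
# analyte already present in the unspiked sample. All numbers are hypothetical.
def spike_recovery(measured_spiked, measured_unspiked, spiked_amount):
    """Return percent recovery of the spiked analyte."""
    return (measured_spiked - measured_unspiked) / spiked_amount * 100.0

# Example: a hair sample containing ~0.8 ng/g of a PAH is spiked with 5.0 ng/g,
# and the method reads back 5.3 ng/g in total.
print(f"Recovery: {spike_recovery(5.3, 0.8, 5.0):.1f}%")  # -> 90.0%, within the 80-120% window
```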

Series of Work (4): Development of a Pretreatment Method for the Simultaneous Determination of Polycyclic Aromatic Hydrocarbons, Nicotine, and Cotinine in Hair Samples

Article information: Li Z, Wang B*, Ge S, Yan L, Liu Y, Li Z, Ren A. A simultaneous analysis method of polycyclic aromatic hydrocarbons, nicotine, cotinine and metals in human hair. Environ Pollut. 2016 Dec;219:66-71. doi: 10.1016/j.envpol.2016.09.045.

Main page: https://www.sciencedirect.com/science/article/pii/S0269749116312982?via%3Dihub


Figure 5: Development of a Pretreatment Method for the Simultaneous Determination of Polycyclic Aromatic Hydrocarbons, Nicotine, and Cotinine in Hair Samples

The research team conducted extensive studies and trials, ultimately selecting an organic base as the digestion solution and a mixed solvent with both polar and non-polar properties as the extraction agent. They developed a simple pretreatment method to simultaneously analyze PAHs, nicotine, cotinine, and metal(loid)s using a small amount of hair (Figure 5). This study provides a simple and low-cost analytical technique for environmental epidemiological research. The methodology has also been granted a national invention patent (Patent Title: Pretreatment Method and Kit for Simultaneous Analysis of Organic Pollutants and Metal(loid)s in Hair Samples, Patent Number: 201610321913.2).

Summary and Outlook:

Accurate, efficient, and low-cost experimental analytical techniques are crucial for etiological screening in environmental epidemiology. The analytical techniques developed by this research team have supported the execution of several national-level projects and have been widely cited and applied by peers both domestically and internationally. We anticipate that these methods will play a key role in broader environmental and health research, providing scientific support for the protection and improvement of human health.

Introduction to Major Team Members:


Bin Wang, Associate Professor/Researcher with tenure at the Institute of Reproductive and Child Health, School of Public Health/the College of Urban and Environmental Sciences, Peking University. His main research areas include environmental health, exposomics big data, and artificial intelligence. He employs multidisciplinary research methods to elucidate the impacts and mechanisms of environmental pollution exposure on reproductive health and to construct health risk assessment models. To date, he has led four projects funded by the National Natural Science Foundation of China (General and Youth Funds) and played a key role in three national key R&D Program projects. As first or corresponding author, he has published 56 papers in international authoritative journals such as Environmental Health Perspectives, Environmental Science & Technology, and The Innovation, with an H-index of 45 and over 6,500 citations. Since August 2024 he has served as an associate editor of Environmental Science & Technology, a leading journal in the field of environmental health. He has developed the undergraduate curriculum-reform course "Exposomics," the graduate curriculum-reform course "Environmental Exposomics," and the international advanced public health master's course "Environment & Health" in global health. Additionally, he leads the "Environment and Population Health" group on the China Cohort Consortium platform and is deputy secretary-general of the Environmental and Reproductive Health Committee of the Environmental Mutagen Society. He has received the Beijing Preventive Medicine Association's Second Prize for Science and Technology and the title of "Outstanding Individual in the National Scientific and Technological System for Fighting COVID-19."


Zhiwen Li, Professor/Researcher at the Institute of Reproductive and Child Health, School of Public Health, Peking University. He is mainly engaged in epidemiological research on the impact of environmental factors on birth defects, preterm birth, and other adverse pregnancy outcomes, as well as on women's health. He has led eight projects funded by the National Natural Science Foundation, national key R&D projects, and provincial and ministerial-level projects, and has been a key participant in two national key R&D projects. He has published over 120 papers as first or corresponding author in international authoritative journals such as J Hazard Mater, Environ Int, and Int J Epidemiol. He has developed the graduate courses "Reproductive and Perinatal Epidemiology" and "Typical Research Cases in Reproductive Health" and the undergraduate course "Birth Defects and Their Prevention," and has participated in teaching the graduate course "Informatization Management and Mining Utilization of Medical Big Data" and the undergraduate course "Autism Spectrum Disorders." He serves as Chair of the Environmental and Reproductive Health Branch of the China Eugenics Science Association, Deputy Chair of the Environmental and Reproductive Health Professional Committee of the Chinese Environmental Mutagen Society, and Deputy Chair and Secretary-General of the Children's Brain Science and Brain Health Promotion Professional Committee of the China Maternal and Child Health Association. He has received the Beijing Preventive Medicine Association's Second Prize for Science and Technology and the Third Prize of the Huaxia Medical Science and Technology Award. He holds one national invention patent and two utility model patents.


Xiaoqian Jia earned her Ph.D. from the Institute of Reproductive and Child Health at the School of Public Health, Peking University, and subsequently took a position at the same institution. Her primary research focus is reproductive health epidemiology, with special attention to the environmental causes and mechanisms of gestational diabetes and neural tube defects. She has been involved in the establishment of birth cohorts and the development of methods for detecting multiple environmental pollutants in biological samples. As first author, she has published seven papers in journals such as The Innovation (IF = 32.1), Environment International (IF = 11.8), and Environmental Pollution (IF = 8.9). She has participated in five projects, including the National Key R&D Program of the Ministry of Science and Technology and general projects of the National Natural Science Foundation.


Mengyuan Ren, earned her Ph.D. from the Institute of Reproductive and Child Health at the School of Public Health, Peking University, and is currently a postdoctoral researcher at Emory University in the United States. Her primary research focus is on assessing reproductive health risks associated with human exposure to mixed environmental pollutants. She is skilled in constructing reproductive health risk assessment and prediction models based on multi-omics data. To date, she has published 6 papers as the first author in journals including The Innovation (IF = 32.1), Environment International (IF = 11.8), Science of the Total Environment (IF = 9.8), and Hygiene and Environmental Health Advances. She has participated as a member in a total of 5 projects, including general projects of the National Natural Science Foundation, the National Key R&D Program of the Ministry of Science and Technology, and international scientific and technological innovation cooperation projects. She is a core member in the development of the ExposomeX platform (http://www.exposomex.cn/).


Zhenjiang Li, completed both his bachelor's and master's degrees at the School of Public Health, Peking University, and earned his Ph.D. from Emory University in the United States. He is currently a postdoctoral researcher at the Keck School of Medicine, University of Southern California. His primary research focus is on using multi-omics bioinformatics technologies, including metabolomics, proteomics, transcriptomics, and epigenomics, to explore the in vivo processes of common environmental pollutants and the bodily responses they trigger. This research aims to elucidate the impact of pollutants on the health outcomes of specific populations such as pregnant women and the elderly. To date, he has published 9 papers as the first author in journals including Alzheimer’s & Dementia (IF = 12.9), Environmental Science & Technology (IF = 11.4), and Science of The Total Environment (IF = 9.8). He has participated as a member in several R01 research grant projects funded by the National Institute of Environmental Health Sciences in the United States.

Contributed by | Professor Bin Wang's Team (Peking University)

Reviewed by | Bin Wang
