Questionnaire Design | Methods, Question Types & Examples

Published on July 15, 2021 by Pritha Bhandari. Revised on June 22, 2023.

A questionnaire is a list of questions or items used to gather data from respondents about their attitudes, experiences, or opinions. Questionnaires can be used to collect quantitative and/or qualitative information.

Questionnaires are commonly used in market research as well as in the social and health sciences. For example, a company may ask for feedback about a recent customer service experience, or psychology researchers may investigate health risk perceptions using questionnaires.

Table of contents

  • Questionnaires vs. surveys
  • Questionnaire methods
  • Open-ended vs. closed-ended questions
  • Question wording
  • Question order
  • Step-by-step guide to design
  • Frequently asked questions about questionnaire design

Questionnaires vs. surveys

A survey is a research method where you collect and analyze data from a group of people. A questionnaire is a specific tool or instrument for collecting the data.

Designing a questionnaire means creating valid and reliable questions that address your research objectives, placing them in a useful order, and selecting an appropriate method for administration.

But designing a questionnaire is only one component of survey research. Survey research also involves defining the population you’re interested in, choosing an appropriate sampling method, administering questionnaires, data cleansing and analysis, and interpretation.

Sampling is important in survey research because you’ll often aim to generalize your results to the population. Gather data from a sample that represents the range of views in the population for externally valid results. There will always be some differences between the population and the sample, but minimizing these will help you avoid several types of research bias, including sampling bias, ascertainment bias, and undercoverage bias.


Questionnaire methods

Questionnaires can be self-administered or researcher-administered. Self-administered questionnaires are more common because they are easy to implement and inexpensive, but researcher-administered questionnaires allow deeper insights.

Self-administered questionnaires

Self-administered questionnaires can be delivered online or in paper-and-pen formats, in person or through mail. All questions are standardized so that all respondents receive the same questions with identical wording.

Self-administered questionnaires can be:

  • cost-effective
  • easy to administer for small and large groups
  • anonymous and suitable for sensitive topics

But they may also be:

  • unsuitable for people with limited literacy or verbal skills
  • susceptible to a nonresponse bias (most people invited may not complete the questionnaire)
  • biased towards people who volunteer because impersonal survey requests often go ignored.

Researcher-administered questionnaires

Researcher-administered questionnaires are interviews that take place by phone, in-person, or online between researchers and respondents.

Researcher-administered questionnaires can:

  • help you ensure the respondents are representative of your target audience
  • allow clarifications of ambiguous or unclear questions and answers
  • have high response rates because it’s harder to refuse an interview when personal attention is given to respondents

But researcher-administered questionnaires can be limiting in terms of resources. They are:

  • costly and time-consuming to perform
  • more difficult to analyze if you have qualitative responses
  • likely to contain experimenter bias or demand characteristics
  • likely to encourage social desirability bias in responses because of a lack of anonymity

Open-ended vs. closed-ended questions

Your questionnaire can include open-ended or closed-ended questions or a combination of both.

Using closed-ended questions limits your responses, while open-ended questions enable a broad range of answers. You’ll need to balance these considerations with your available time and resources.

Closed-ended questions

Closed-ended, or restricted-choice, questions offer respondents a fixed set of choices to select from. Closed-ended questions are best for collecting data on categorical or quantitative variables.

Categorical variables can be nominal or ordinal. Quantitative variables can be interval or ratio. Understanding the type of variable and level of measurement means you can perform appropriate statistical analyses for generalizable results.

Examples of closed-ended questions for different variables

Nominal variables include categories that can’t be ranked, such as race or ethnicity. This includes binary or dichotomous categories.

It’s best to include categories that cover all possible answers and are mutually exclusive. There should be no overlap between response items.

In binary or dichotomous questions, you’ll give respondents only two options to choose from.

Example response options for a nominal question about race or ethnicity:

  • White
  • Black or African American
  • American Indian or Alaska Native
  • Asian
  • Native Hawaiian or Other Pacific Islander

Ordinal variables include categories that can be ranked. Consider how wide or narrow a range you’ll include in your response items, and their relevance to your respondents.

Likert scale questions collect ordinal data using rating scales with 5 or 7 points.

When you have four or more Likert-type questions, you can treat the composite data as quantitative data on an interval scale. Intelligence tests, psychological scales, and personality inventories use multiple Likert-type questions to collect interval data.

With interval or ratio scales, you can apply strong statistical hypothesis tests to address your research aims.
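
For illustration, here is a minimal Python sketch of combining several Likert-type items into a composite score that can then be treated as interval data. The item labels, the 5-point coding, and the simple averaging rule are assumptions for the example, not a prescribed scoring method.

    # Minimal sketch: combine Likert-type items into a composite score.
    # The labels and the sum-then-average rule are illustrative assumptions.
    LIKERT_5 = {
        "Strongly disagree": 1,
        "Disagree": 2,
        "Neither agree nor disagree": 3,
        "Agree": 4,
        "Strongly agree": 5,
    }

    def composite_score(responses):
        """Average the numeric codes of four or more Likert-type items."""
        if len(responses) < 4:
            raise ValueError("Use at least four Likert-type items for a composite.")
        codes = [LIKERT_5[answer] for answer in responses]
        return sum(codes) / len(codes)

    # One respondent's answers to four items measuring the same attitude:
    answers = ["Agree", "Strongly agree", "Neither agree nor disagree", "Agree"]
    print(composite_score(answers))  # 4.0 on a 1-5 scale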

Pros and cons of closed-ended questions

Well-designed closed-ended questions are easy to understand and can be answered quickly. However, you might still miss important answers that are relevant to respondents. An incomplete set of response items may force some respondents to pick the closest alternative to their true answer. These types of questions may also miss out on valuable detail.

To solve these problems, you can make questions partially closed-ended, and include an open-ended option where respondents can fill in their own answer.

Open-ended questions

Open-ended, or long-form, questions allow respondents to give answers in their own words. Because there are no restrictions on their choices, respondents can answer in ways that researchers may not have otherwise considered. For example, respondents may want to answer “multiracial” for the question on race rather than selecting from a restricted list.

  • How do you feel about open science?
  • How would you describe your personality?
  • In your opinion, what is the biggest obstacle for productivity in remote work?

Open-ended questions have a few downsides.

They require more time and effort from respondents, which may deter them from completing the questionnaire.

For researchers, understanding and summarizing responses to these questions can take a lot of time and resources. You’ll need to develop a systematic coding scheme to categorize answers, and you may also need to involve other researchers in data analysis for high reliability.
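
As a concrete illustration of checking coding reliability, here is a minimal Python sketch that compares how two coders categorized the same open-ended answers, using percent agreement and Cohen’s kappa (a common agreement statistic chosen for this sketch; the article itself does not prescribe a particular one). The category labels are hypothetical.

    # Minimal sketch: agreement between two coders who categorized the same
    # open-ended answers. The category labels are hypothetical examples.
    from collections import Counter

    coder_a = ["workload", "isolation", "workload", "tools", "isolation", "workload"]
    coder_b = ["workload", "isolation", "tools", "tools", "isolation", "workload"]

    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n  # percent agreement

    # Agreement expected by chance, from each coder's category frequencies
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)

    kappa = (observed - expected) / (1 - expected)  # Cohen's kappa
    print(f"Percent agreement: {observed:.2f}, Cohen's kappa: {kappa:.2f}")
    # Percent agreement: 0.83, Cohen's kappa: 0.75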

Question wording

Question wording can influence your respondents’ answers, especially if the language is unclear, ambiguous, or biased. Good questions need to be understood by all respondents in the same way (reliable) and measure exactly what you’re interested in (valid).

Use clear language

You should design questions with your target audience in mind. Consider their familiarity with your questionnaire topics and language and tailor your questions to them.

For readability and clarity, avoid jargon or overly complex language. Don’t use double negatives because they can be harder to understand.

Use balanced framing

Respondents often answer in different ways depending on the question framing. Positive frames are interpreted as more neutral than negative frames and may encourage more socially desirable answers.

  • Positive frame: Should protests of pandemic-related restrictions be allowed?
  • Negative frame: Should protests of pandemic-related restrictions be forbidden?

Use a mix of both positive and negative frames to avoid research bias, and ensure that your question wording is balanced wherever possible.

Unbalanced questions focus on only one side of an argument. Respondents may be less likely to oppose the question if it is framed in a particular direction. It’s best practice to provide a counterargument within the question as well.

  • Unbalanced: Do you favor…? → Balanced: Do you favor or oppose…?
  • Unbalanced: Do you agree that…? → Balanced: Do you agree or disagree that…?

Avoid leading questions

Leading questions guide respondents towards answering in specific ways, even if that’s not how they truly feel, by explicitly or implicitly providing them with extra information.

It’s best to keep your questions short and specific to your topic of interest.

  • The average daily work commute in the US takes 54.2 minutes and costs $29 per day. Since 2020, working from home has saved many employees time and money. Do you favor flexible work-from-home policies even after it’s safe to return to offices?
  • Experts agree that a well-balanced diet provides sufficient vitamins and minerals, and multivitamins and supplements are not necessary or effective. Do you agree or disagree that multivitamins are helpful for balanced nutrition?

Keep your questions focused

Ask about only one idea at a time and avoid double-barreled questions. Double-barreled questions ask about more than one item at a time, which can confuse respondents. For example, a single question asking whether the government should be responsible for providing both clean drinking water and high-speed internet to everyone combines two separate ideas.

This question could be difficult to answer for respondents who feel strongly about the right to clean drinking water but not high-speed internet. They might answer only about the topic they feel passionate about or provide a neutral answer instead, but neither option captures their true views.

Instead, you should ask two separate questions, each with its own rating scale (Strongly Agree / Agree / Undecided / Disagree / Strongly Disagree), to gauge respondents’ opinions:

Do you agree or disagree that the government should be responsible for providing clean drinking water to everyone?

Do you agree or disagree that the government should be responsible for providing high-speed internet to everyone?


Question order

You can organize the questions logically, with a clear progression from simple to complex. Alternatively, you can randomize the question order between respondents.

Logical flow

Using a logical flow to your question order means starting with simple questions, such as behavioral or opinion questions, and ending with more complex, sensitive, or controversial questions.

The question order that you use can significantly affect the responses by priming respondents in specific directions. Question order effects, or context effects, occur when earlier questions influence the responses to later questions, reducing the validity of your questionnaire.

While demographic questions are usually unaffected by order effects, questions about opinions and attitudes are more susceptible to them.

  • How knowledgeable are you about Joe Biden’s executive orders in his first 100 days?
  • Are you satisfied or dissatisfied with the way Joe Biden is managing the economy?
  • Do you approve or disapprove of the way Joe Biden is handling his job as president?

It’s important to minimize order effects because they can be a source of systematic error or bias in your study.

Randomization

Randomization involves presenting individual respondents with the same questionnaire but with different question orders.

When you use randomization, order effects will be minimized in your dataset. But a randomized order may also make it harder for respondents to process your questionnaire. Some questions may need more cognitive effort, while others are easier to answer, so a random order could require more time or mental capacity for respondents to switch between questions.
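
As an illustration, here is a minimal Python sketch of per-respondent randomization, assuming the questionnaire is simply a list of question strings (the example questions are invented for the sketch). Most survey platforms offer this as a built-in option.

    # Minimal sketch: show each respondent the same questions in an
    # independently shuffled order. Example questions are illustrative.
    import random

    questions = [
        "How satisfied are you with your daily commute?",
        "How many days per week do you currently work from home?",
        "Do you favor or oppose flexible work-from-home policies?",
    ]

    def question_order_for(respondent_id):
        """Return a shuffled copy of the questions, seeded per respondent so the order is reproducible."""
        rng = random.Random(respondent_id)
        order = questions.copy()
        rng.shuffle(order)
        return order

    print(question_order_for("respondent-001"))
    print(question_order_for("respondent-002"))  # usually a different order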

Step-by-step guide to design

Step 1: Define your goals and objectives

The first step of designing a questionnaire is determining your aims.

  • What topics or experiences are you studying?
  • What specifically do you want to find out?
  • Is a self-report questionnaire an appropriate tool for investigating this topic?

Once you’ve specified your research aims, you can operationalize your variables of interest into questionnaire items. Operationalizing concepts means turning them from abstract ideas into concrete measurements. Every question needs to address a defined need and have a clear purpose.

Step 2: Use questions that are suitable for your sample

Create appropriate questions by taking the perspective of your respondents. Consider their language proficiency and available time and energy when designing your questionnaire.

  • Are the respondents familiar with the language and terms used in your questions?
  • Would any of the questions insult, confuse, or embarrass them?
  • Do the response items for any closed-ended questions capture all possible answers?
  • Are the response items mutually exclusive?
  • Do the respondents have time to respond to open-ended questions?

Consider all possible options for responses to closed-ended questions. From a respondent’s perspective, a lack of response options reflecting their point of view or true answer may make them feel alienated or excluded. In turn, they’ll become disengaged or inattentive to the rest of the questionnaire.

Step 3: Decide on your questionnaire length and question order

Once you have your questions, make sure that the length and order of your questions are appropriate for your sample.

If respondents are not being incentivized or compensated, keep your questionnaire short and easy to answer. Otherwise, your sample may be biased with only highly motivated respondents completing the questionnaire.

Decide on your question order based on your aims and resources. Use a logical flow if your respondents have limited time or if you cannot randomize questions. Randomizing questions helps you avoid bias, but it can take more complex statistical analysis to interpret your data.

Step 4: Pretest your questionnaire

When you have a complete list of questions, you’ll need to pretest it to make sure what you’re asking is always clear and unambiguous. Pretesting helps you catch any errors or points of confusion before performing your study.

Ask friends, classmates, or members of your target audience to complete your questionnaire using the same method you’ll use for your research. Find out if any questions were particularly difficult to answer or if the directions were unclear or inconsistent, and make changes as necessary.

If you have the resources, running a pilot study will help you test the validity and reliability of your questionnaire. A pilot study is a practice run of the full study, and it includes sampling, data collection, and analysis. You can find out whether your procedures are unfeasible or susceptible to bias and make changes in time, but you can’t test a hypothesis with this type of study because it’s usually statistically underpowered.


Frequently asked questions about questionnaire design

A questionnaire is a data collection tool or instrument, while a survey is an overarching research method that involves collecting and analyzing data from people using questionnaires.

Closed-ended, or restricted-choice, questions offer respondents a fixed set of choices to select from. These questions are easier to answer quickly.

Open-ended or long-form questions allow respondents to answer in their own words. Because there are no restrictions on their choices, respondents can answer in ways that researchers may not have otherwise considered.

A Likert scale is a rating scale that quantitatively assesses opinions, attitudes, or behaviors. It is made up of 4 or more questions that measure a single attitude or trait when response scores are combined.

To use a Likert scale in a survey, you present participants with Likert-type questions or statements, and a continuum of items, usually with 5 or 7 possible responses, to capture their degree of agreement.

You can organize the questions logically, with a clear progression from simple to complex, or randomly between respondents. A logical flow helps respondents process the questionnaire more easily and quickly, but it may lead to bias. Randomization can minimize the bias from order effects.

Questionnaires can be self-administered or researcher-administered.

Researcher-administered questionnaires are interviews that take place by phone, in-person, or online between researchers and respondents. You can gain deeper insights by clarifying questions for respondents or asking follow-up questions.


Writing Good Survey Questions: 10 Best Practices


August 20, 2023


Unfortunately, there is no simple formula for cranking out good, unbiased questionnaires.

That said, there are certainly common mistakes in survey design that can be avoided if you know what to look for. Below, I’ve provided the 10 most common and dangerous errors that can be made when designing a survey and guidelines for how to avoid them.

In This Article:

  • 1. Ask About the Right Things
  • 2. Use Language That Is Neutral, Natural, and Clear
  • 3. Don’t Ask Respondents to Predict Behavior
  • 4. Focus on Closed-Ended Questions
  • 5. Avoid Double-Barreled Questions
  • 6. Use Balanced Scales
  • 7. Answer Options Should Be All-Inclusive and Mutually Exclusive
  • 8. Provide an Opt-Out
  • 9. Allow Most Questions to Be Optional
  • 10. Respect Your Respondents

1. Ask About the Right Things

Ask Only Questions That You Need Answered

One of the easiest traps to fall into when writing a survey is to ask about too much. After all, you want to take advantage of this one opportunity to ask questions of your audience, right?

The most important thing to remember about surveys is to keep them short . Ask only about the things that are essential for answering your research questions. If you don’t absolutely need the information, leave it out.

Don’t Ask Questions that You Can Find the Answer to

When drafting a survey, many researchers slip into autopilot and start by asking a plethora of demographic questions . Ask yourself: do you need all that demographic information? Will you use it to answer your research questions? Even if you will use it, is there another way to capture it besides asking about it in a survey? For example, if you are surveying current customers, and they are providing their email addresses, could you look up their demographic information if needed?

Don’t Ask Questions that Respondents Can’t Answer Accurately

Surveys are best for capturing quantitative attitudinal data . If you’re looking to learn something qualitative or behavioral, there’s likely a method better suited to your needs. Asking the question in a survey is, at best, likely to introduce inefficiency in your process, and, at worst, will produce unreliable or misleading data.

For example, consider a closed-ended question that asks respondents whether a particular button on a page stood out to them.

If I were asked this question, I could only speculate about what might make a button stand out. Maybe a large size? Maybe a different color, compared to surrounding content? But this is merely conjecture. The only reliable way to tell if the button actually stood out for me would be to mock up the page and show it to me. This type of question would be better studied with other research methods, such as usability testing or A/B testing , but not with a survey.

2. Use Language That Is Neutral, Natural, and Clear

Avoid Biasing Respondents

There are endless ways in which bias can be introduced into survey data, and it is the researcher’s task to minimize this bias as much as possible. For example, consider a question that opens by stating that the organization is committed to achieving a 5-star satisfaction rating and then asks the respondent how satisfied they were with their experience.

By initially providing the context that the organization is committed to achieving a 5-star satisfaction rating, the survey creators are, in essence, pleading with the respondent to give them one. The respondent may feel guilty providing an honest response if they had a less than stellar experience.

Note also the use of the word “satisfaction.” This wording subtly biases the participant into framing their experience as a satisfactory one.

An alternative wording of the question might remove the first sentence altogether, and simply ask respondents to rate their experience.

Use Natural, Familiar Language

We must always be on the lookout for jargon in survey design. If respondents cannot understand your questions or response options, you will introduce bad data into your dataset. While we should strive to keep survey questions short and simple, it is sometimes necessary to provide brief definitions or descriptions when asking about complex topics, to prevent misunderstanding. Always pilot your questionnaires with the target audience to ensure that all jargon has been removed.

Speak to Respondents Like Humans

For some reason, when drafting a questionnaire, many researchers introduce unnecessary formality and flowery language into their questions. Resist this urge. Phrase questions as clearly and simply as possible, as though you were asking them in an interview format.

3. Don’t Ask Respondents to Predict Behavior

People are notoriously unreliable predictors of their own behavior. For various reasons, predictions are almost bound to be flawed, leading Jakob Nielsen to remind us to never listen to users.

Yet, requests for behavioral predictions are rampant in insufficiently thought-out UX surveys. Consider the question: “How likely are you to use this product?” While a respondent may feel likely to use a product based on a description or a brief tutorial, their answer does not constitute a reliable prediction and should not be used to make critical product decisions.

Often, instead of future-prediction requests, you will see present-estimate requests: “How often do you currently use this product in an average week?” While this type of question avoids the problem of predictions, it still is unreliable. Users struggle to estimate based on some imaginary “average” week and will often, instead, recall outlier weeks, which are more memorable.

The best way to phrase a question like this is to ask for specific, recent memories: “Approximately how many times did you use this product in the past 7 days?” It is important to include the word “approximately” and to allow for ranges rather than exact numbers. Reporting an exact count of a past behavior is often either challenging or impossible, so asking for it introduces imprecise data. It can also make respondents more likely to drop off if they feel incapable of answering the question accurately.


4. Focus on Closed-Ended Questions

Surveys are, at their core, a quantitative research method. They rely upon closed-ended questions (e.g., multiple-choice or rating-scale questions) to generate quantitative data. Surveys can also leverage open-ended questions (e.g., short-answer or long-answer questions) to generate qualitative data. That said, the best surveys rely upon closed-ended questions, with a smattering of open-ended questions to provide additional qualitative color and support to the mostly quantitative data.

If you find that your questionnaire relies overly heavily on open-ended questions, it might be a red flag that another qualitative-research method (e.g., interviews ) may serve your research aims better.

On the subject of open-ended survey questions, it is often wise to include one broad open-ended question at the end of your questionnaire . Many respondents will have an issue or piece of feedback in mind when they start a survey, and they’re simply waiting for the right question to come up. If no such question exists, they may end the survey experience with a bad taste. A final, optional, long-answer question with a prompt like Is there anything else you’d like to share? can help to alleviate this frustration and supply some potentially valuable data.

5. Avoid Double-Barreled Questions

A double-barreled question asks respondents to answer two things at once. For example: “How easy and intuitive was this website to use?” Easy and intuitive, while related, are not synonymous, and, therefore, the question is asking the respondent to use a single rating scale to assess the website on two distinct dimensions simultaneously. By necessity, the respondent will either pick one of these words to focus on or try to assess both and estimate a midpoint “average” score. Neither of these will generate fully accurate or reliable data.

Therefore, double-barreled questions should always be avoided and, instead, split up into two separate questions.


6. Use Balanced Scales

Rating-scale questions are tremendously valuable in generating quantitative data in survey design. Often, a respondent is asked to rate their agreement with a statement on an agreement scale (e.g., Strongly Agree, Agree, Neutral, Disagree, Strongly Disagree), or otherwise to rate something using a scale of adjectives (e.g., Excellent, Good, Neutral, Fair, Poor).

You’ll notice that, in both of the examples given above, there is an equal number of positive and negative options (2 each), surrounding a neutral option. The equal number of positive and negative options means that the response scale is balanced and eliminates a potential source of bias or error.

In an unbalanced scale, you’ll see an unequal number of positive and negative options (e.g., Excellent, Very Good, Good, Poor, Very Poor ). This example contains 3 positive options and only 2 negative ones. It, therefore, biases the participant to select a positive option.


7. Answer Options Should Be All-Inclusive and Mutually Exclusive

Answer options for a multiple-choice question should include all possible answers (i.e., all-inclusive) and should not overlap (i.e., mutually exclusive). For example, consider the following question:

[Example question removed: an age question whose numeric ranges overlap around age 20 and offer no option above 50.]

In this formulation, some possible answers are skipped (anyone who is over 50 won’t be able to select an answer). Additionally, some answers overlap (a 20-year-old could fall into two of the ranges).

Always double-check your numeric answer options to ensure that all numbers are included and none are repeated.
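
The sketch below shows one way to automate that double-check in Python, assuming the numeric options are written as (low, high) brackets; the bracket values and the 18-120 bounds are illustrative assumptions.

    # Minimal sketch: check that numeric answer brackets are mutually exclusive
    # (no overlaps) and all-inclusive (no gaps, no missing top end).
    def check_brackets(brackets, minimum=18, maximum=120):
        issues = []
        ordered = sorted(brackets)
        if ordered[0][0] > minimum:
            issues.append(f"no option below {ordered[0][0]}")
        if ordered[-1][1] < maximum:
            issues.append(f"no option above {ordered[-1][1]}")
        for (lo1, hi1), (lo2, hi2) in zip(ordered, ordered[1:]):
            if lo2 <= hi1:
                issues.append(f"{lo1}-{hi1} overlaps {lo2}-{hi2}")
            elif lo2 > hi1 + 1:
                issues.append(f"gap between {hi1} and {lo2}")
        return issues

    print(check_brackets([(18, 24), (24, 34), (35, 50)]))
    # ['no option above 50', '18-24 overlaps 24-34']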

8. Provide an Opt-Out

No matter how carefully and inclusively you craft your questions, there will always be respondents for whom none of the available answers are acceptable. Maybe they are an edge case you hadn’t considered. Maybe they don’t remember the answer. Or maybe they simply don’t want to answer that particular question. Always provide an opt-out answer in these cases to avoid bad data.

Opt-out answers can include things like the following: Not applicable, None of the above, I don’t know, I don’t recall, Other, or Prefer not to answer. Any multiple-choice question should include at least one of these answers. However, avoid the temptation to include one catch-all opt-out answer containing multiple possibilities. For example, an option labeled I don’t know / Not applicable covers two very different responses with different meanings; combining them fogs your data.

9. Allow Most Questions to Be Optional

It is so tempting to make questions required in a questionnaire. After all, we want the data! However, the choice to make any individual question required will likely lead to one of two unwanted results:

  • Bad Data: If a respondent is unable to answer a question accurately, but the question is required, they may select an answer at random. These types of answers will be impossible to detect and will introduce bad data into your study, in the form of random-response bias.
  • Dropoffs: The other option available to a participant unable to correctly answer a required question is to abandon the questionnaire. This behavior will increase the effort needed to reach the desired number of responses.

Therefore, before deciding to make any question required, consider if doing so is worth the risks of bad data and dropoffs.

10. Respect Your Respondents

In the field of user experience, we like to say that we are user advocates. That doesn’t just mean advocating for user needs when it comes to product decisions. It also means respecting our users any time we’re fortunate enough to interact with them.

Don’t Assume Negativity

This is particularly important when discussing health issues or disability. Phrasings such as “Do you suffer from hypertension?” may be perceived as offensive. Instead, use objective wording such as “Do you have hypertension?”

Be Sensitive with Sensitive Topics

When asking about any topics that may be deemed sensitive, private, or offensive, first ask yourself: Does it really need to be asked? Often, we can get plenty of valuable information while omitting that topic.

Other times, it is necessary to delve into potentially sensitive topics. In these cases, be sure to choose your wording carefully. Ensure you’re using the current preferred terminology favored by members of the population you’re addressing. If necessary, consider providing a brief explanation for why you are asking about that particular topic and what benefit will come from responding.

Use Inclusive and Appropriate Wording for Demographic Questions

When asking about topics such as race, ethnicity, sex, or gender identity, use accurate and sensitive terminology. For example, it is no longer appropriate to offer a simple binary option for gender questions. At a minimum, a third option indicating an Other or Non-binary category is expected, as well as an opt-out answer for those who prefer not to respond.

An inclusive question is respectful of your users’ identities and allows them to answer only if they feel comfortable.


How Many Survey Questions Should I Use?


If you’re wondering how many survey questions you need to include in your questionnaire, the short answer is: as few as possible.

Keeping your survey question count low is crucial, because survey fatigue is a real danger for survey makers hoping to collect the best, most accurate data.

A few well-worded, well-designed survey questions are usually no problem for respondents to complete. But once a survey starts to get bogged down with page after page of radio buttons, essay boxes, and convoluted question phrasing, respondents either lose interest or become too frustrated to complete the rest of the survey.

Deciding the exact number of survey questions you need to reach your goals is, of course, more complicated. It depends largely on your purpose and audience. But, that’s not all.

The quick and dirty guidelines for determining how many questions to use in a survey are:

  • Get to the point: Ask only as many survey questions as you need to achieve your goal.
  • Stay on track: Every question you ask must be directly related to your survey’s purpose.
  • Respect their time: Your respondents are busy people! Faster is better for response rates.

In this post, I cover each of these considerations and give you tips for determining the optimal length for your next survey project.

How Your Goals Should Influence the Number of Survey Questions You Use

The first step for you to take, long before you start writing survey questions, is to determine your survey’s purpose and goals.

Ask yourself:

  • Why am I making this survey?
  • What kind of data am I looking for?

The answers to these questions will help you determine the kind of survey you are running, the survey question types you will use, and how many survey questions you need to ask to get you to where you want to be.

What follows is an example of a survey maker going through the purpose-setting process.

What is The Purpose of This Survey?

A small business owner wants to expand his current web design business to include new services. He has a few ideas of what offerings he could make, like mobile app development, copywriting, or digital marketing consulting, but before he makes the investment in new personnel, he wants to make sure his customers are interested.

So, he decides to make a survey.

The purpose of his survey is to determine which services existing customers would be most interested in seeing from his team.

The goal is to identify which service his business should develop next and, importantly, where he will be investing his time and money. He wants to make sure the survey data points him in the right direction!

What Kind of Data Am I Looking For in Response to My Survey Questions?

Now that he’s decided his survey’s purpose, he can dive right into picking survey question types, right?

Not exactly.

While it may seem like common sense, it’s important to take an extra moment to think about what kind of data you need to be able to act on your survey data after you have it.

In our imaginary business owner’s case, he is looking for concrete feedback from his existing customers.

In this case, he could put together a very simple survey centered around a check box question type asking customers to select any additional services they would be interested in. (Of course, he remembers to include an “Other – Write In” option so that customers can submit their own ideas.)

This is the simplest version of the survey that the business owner could make.

But, it’s likely that he would want more information, like how likely they would be to use a particular service if he provided it, and what kinds of projects, if any, they already have in mind.

This kind of information may give him greater insight into what his customers want versus what services they really need.

To collect this kind of data, it would be best to use more advanced question types like text boxes, Likert Scales, and even Drag & Drop Rankings to determine potential projects, likelihood of using the new services, and ordering which new services they would like to see unveiled first, second, and third.

Do you see how thinking about what kind of data you need really determines how many questions (and what kind of questions) you need to ask in your survey?

The basic version of our business owner’s new service survey could have been made in a question or two.

But when looking for more robust data, you need more questions and more advanced question types. That said, always keep in mind that your respondents’ time is valuable. If your purpose is broad, then consider breaking the survey down into multiple micro-topics.

Fighting Survey Fatigue with Micro-Surveys

Micro-surveys are bite-sized surveys that require very little time and may only involve a question or two. Because they are the very definition of short and sweet, they may be exactly what busy respondents need to give you their honest feedback without bogging down their day.

In the example of our business owner, he could choose to break up his new service research into smaller steps. The first would be to survey his customers to see which new services they would be interested in.

Let’s say that, of the customers that responded, most are looking for an app development service.

The business owner could then follow up with those exact customers for more information on project ideas and timelines.

This way, he will be able to gauge their interest, determine timeline, and discover exactly which skills he will need to look for in a new hire.

The Ideal Number of Survey Questions for Most Surveys

To be clear, there is no magic number for every survey and every audience.

But, in general and for most survey types, it’s best to keep the survey completion time under 10 minutes. Five-minute surveys will see even higher completion rates, especially with customer satisfaction and feedback surveys.

This means you should aim for 10 survey questions (or fewer, if you are using multiple text and essay box question types).
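
If you want a rough sanity check before fielding a survey, you can estimate completion time from the question mix. The per-question timings in this Python sketch are assumptions for illustration, not measured benchmarks or Alchemer’s own estimates.

    # Minimal sketch: rough completion-time estimate from the question mix.
    # Seconds-per-question values are illustrative assumptions.
    SECONDS_PER_QUESTION = {"multiple_choice": 15, "rating_scale": 20, "open_ended": 60}

    question_mix = {"multiple_choice": 6, "rating_scale": 3, "open_ended": 1}

    total_seconds = sum(SECONDS_PER_QUESTION[kind] * count
                        for kind, count in question_mix.items())
    print(f"Estimated completion time: {total_seconds / 60:.1f} minutes")  # 3.5 minutes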

When you start moving into long surveys with lots of questions and completion times of over 10 minutes, you may want to consider offering respondents an incentive to compensate them for their time. Online gift cards are popular, but you can also use custom prizes or coupons.

In the Alchemer application, we do our best to help you out with keeping surveys on track.

Under the “Test” tab, the Survey Diagnostic panel estimates how long your survey will take, how fatiguing it is, and how accessible it is to sight- and hearing-impaired audiences.

[Screenshot: completion-time estimate in the Alchemer Survey Diagnostic panel.]

While you should definitely have a real, live person run through the survey to catch any errors, this diagnostic panel is a great way to make sure you’re balancing your data needs with your respondents’ busy lives. It will also alert you to any potential problems within the survey itself.

Happy surveying!



Questionnaire – Definition, Types, and Examples


Definition:

A Questionnaire is a research tool or survey instrument that consists of a set of questions or prompts designed to gather information from individuals or groups of people.

It is a standardized way of collecting data from a large number of people by asking them a series of questions related to a specific topic or research objective. The questions may be open-ended or closed-ended, and the responses can be quantitative or qualitative. Questionnaires are widely used in research, marketing, social sciences, healthcare, and many other fields to collect data and insights from a target population.

History of Questionnaire

The history of questionnaires can be traced back to the ancient Greeks, who used questionnaires as a means of assessing public opinion. However, the modern history of questionnaires began in the late 19th century with the rise of social surveys.

The first social survey was conducted in the United States in 1874 by Francis A. Walker, who used a questionnaire to collect data on labor conditions. In the early 20th century, questionnaires became a popular tool for conducting social research, particularly in the fields of sociology and psychology.

One of the most influential figures in the development of the questionnaire was the psychologist Raymond Cattell, who in the 1940s and 1950s developed the personality questionnaire, a standardized instrument for measuring personality traits. Cattell’s work helped establish the questionnaire as a key tool in personality research.

In the 1960s and 1970s, the use of questionnaires expanded into other fields, including market research, public opinion polling, and health surveys. With the rise of computer technology, questionnaires became easier and more cost-effective to administer, leading to their widespread use in research and business settings.

Today, questionnaires are used in a wide range of settings, including academic research, business, healthcare, and government. They continue to evolve as a research tool, with advances in computer technology and data analysis techniques making it easier to collect and analyze data from large numbers of participants.

Types of Questionnaire

Types of Questionnaires are as follows:

Structured Questionnaire

This type of questionnaire has a fixed format with predetermined questions that the respondent must answer. The questions are usually closed-ended, which means that the respondent must select a response from a list of options.

Unstructured Questionnaire

An unstructured questionnaire does not have a fixed format or predetermined questions. Instead, the interviewer or researcher can ask open-ended questions to the respondent and let them provide their own answers.

Open-ended Questionnaire

An open-ended questionnaire allows the respondent to answer the question in their own words, without any pre-determined response options. The questions usually start with phrases like “how,” “why,” or “what,” and encourage the respondent to provide more detailed and personalized answers.

Close-ended Questionnaire

In a closed-ended questionnaire, the respondent is given a set of predetermined response options to choose from. This type of questionnaire is easier to analyze and summarize, but may not provide as much insight into the respondent’s opinions or attitudes.

Mixed Questionnaire

A mixed questionnaire is a combination of open-ended and closed-ended questions. This type of questionnaire allows for more flexibility in terms of the questions that can be asked, and can provide both quantitative and qualitative data.

Pictorial Questionnaire

In a pictorial questionnaire, instead of using words to ask questions, the questions are presented in the form of pictures, diagrams or images. This can be particularly useful for respondents who have low literacy skills, or for situations where language barriers exist. Pictorial questionnaires can also be useful in cross-cultural research where respondents may come from different language backgrounds.

Types of Questions in Questionnaire

The types of Questions in Questionnaire are as follows:

Multiple Choice Questions

These questions have several options for participants to choose from. They are useful for getting quantitative data and can be used to collect demographic information.

  • a. Red
  • b. Blue
  • c. Green
  • d. Yellow

Rating Scale Questions

These questions ask participants to rate something on a scale (e.g. from 1 to 10). They are useful for measuring attitudes and opinions.

  • On a scale of 1 to 10, how likely are you to recommend this product to a friend?

Open-Ended Questions

These questions allow participants to answer in their own words and provide more in-depth and detailed responses. They are useful for getting qualitative data.

  • What do you think are the biggest challenges facing your community?

Likert Scale Questions

These questions ask participants to rate how much they agree or disagree with a statement. They are useful for measuring attitudes and opinions.

How strongly do you agree or disagree with the following statement:

“I enjoy exercising regularly.”

  • a. Strongly Agree
  • b. Agree
  • c. Neither Agree nor Disagree
  • d. Disagree
  • e. Strongly Disagree

Demographic Questions

These questions ask about the participant’s personal information such as age, gender, ethnicity, education level, etc. They are useful for segmenting the data and analyzing results by demographic groups.

  • What is your age?

Yes/No Questions

These questions only have two options: Yes or No. They are useful for getting simple, straightforward answers to a specific question.

Have you ever traveled outside of your home country?

Ranking Questions

These questions ask participants to rank several items in order of preference or importance. They are useful for measuring priorities or preferences.

Please rank the following factors in order of importance when choosing a restaurant:

  • a. Quality of Food
  • c. Ambiance
  • d. Location

Matrix Questions

These questions present a matrix or grid of options that participants can choose from. They are useful for getting data on multiple variables at once.

  • The product is easy to use
  • The product meets my needs
  • The product is affordable

Dichotomous Questions

These questions present two options that are opposite or contradictory. They are useful for measuring binary or polarized attitudes.

Do you support the death penalty?
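
To make these distinctions concrete, here is a minimal Python sketch of how a few of these question types could be represented as plain data in a survey script. The field names and example prompts are assumptions for illustration, not any particular tool’s schema.

    # Minimal sketch: a few question types represented as plain data structures.
    # Field names ("type", "prompt", "options", "scale") are illustrative.
    questionnaire = [
        {"type": "multiple_choice",
         "prompt": "What is your favorite color?",  # hypothetical prompt
         "options": ["Red", "Blue", "Green", "Yellow"]},
        {"type": "likert",
         "prompt": "I enjoy exercising regularly.",
         "scale": ["Strongly Agree", "Agree", "Neither Agree nor Disagree",
                   "Disagree", "Strongly Disagree"]},
        {"type": "yes_no",
         "prompt": "Have you ever traveled outside of your home country?"},
        {"type": "open_ended",
         "prompt": "What do you think are the biggest challenges facing your community?"},
    ]

    for question in questionnaire:
        print(question["type"], "-", question["prompt"])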

How to Make a Questionnaire

Step-by-Step Guide for Making a Questionnaire:

  • Define your research objectives: Before you start creating questions, you need to define the purpose of your questionnaire and what you hope to achieve from the data you collect.
  • Choose the appropriate question types: Based on your research objectives, choose the appropriate question types to collect the data you need. Refer to the types of questions mentioned earlier for guidance.
  • Develop questions: Develop clear and concise questions that are easy for participants to understand. Avoid leading or biased questions that might influence the responses.
  • Organize questions: Organize questions in a logical and coherent order, starting with demographic questions followed by general questions, and ending with specific or sensitive questions.
  • Pilot the questionnaire : Test your questionnaire on a small group of participants to identify any flaws or issues with the questions or the format.
  • Refine the questionnaire : Based on feedback from the pilot, refine and revise the questionnaire as necessary to ensure that it is valid and reliable.
  • Distribute the questionnaire: Distribute the questionnaire to your target audience using a method that is appropriate for your research objectives, such as online surveys, email, or paper surveys.
  • Collect and analyze data: Collect the completed questionnaires and analyze the data using appropriate statistical methods. Draw conclusions from the data and use them to inform decision-making or further research.
  • Report findings: Present your findings in a clear and concise report, including a summary of the research objectives, methodology, key findings, and recommendations.

Questionnaire Administration Modes

There are several modes of questionnaire administration. The choice of mode depends on the research objectives, sample size, and available resources. Some common modes of administration include:

  • Self-administered paper questionnaires: Participants complete the questionnaire on paper, either in person or by mail. This mode is relatively low cost and easy to administer, but it may result in lower response rates and greater potential for errors in data entry.
  • Online questionnaires: Participants complete the questionnaire on a website or through email. This mode is convenient for both researchers and participants, as it allows for fast and easy data collection. However, it may be subject to issues such as low response rates, lack of internet access, and potential for fraudulent responses.
  • Telephone surveys: Trained interviewers administer the questionnaire over the phone. This mode allows for a large sample size and can result in higher response rates, but it is also more expensive and time-consuming than other modes.
  • Face-to-face interviews : Trained interviewers administer the questionnaire in person. This mode allows for a high degree of control over the survey environment and can result in higher response rates, but it is also more expensive and time-consuming than other modes.
  • Mixed-mode surveys: Researchers use a combination of two or more modes to administer the questionnaire, such as using online questionnaires for initial screening and following up with telephone interviews for more detailed information. This mode can help overcome some of the limitations of individual modes, but it requires careful planning and coordination.

Example of Questionnaire

Title of the Survey: Customer Satisfaction Survey

Introduction:

We appreciate your business and would like to ensure that we are meeting your needs. Please take a few minutes to complete this survey so that we can better understand your experience with our products and services. Your feedback is important to us and will help us improve our offerings.

Instructions:

Please read each question carefully and select the response that best reflects your experience. If you have any additional comments or suggestions, please feel free to include them in the space provided at the end of the survey.

1. How satisfied are you with our product quality?

  • Very satisfied
  • Somewhat satisfied
  • Somewhat dissatisfied
  • Very dissatisfied

2. How satisfied are you with our customer service?

3. How satisfied are you with the price of our products?

4. How likely are you to recommend our products to others?

  • Very likely
  • Somewhat likely
  • Somewhat unlikely
  • Very unlikely

5. How easy was it to find the information you were looking for on our website?

  • Very easy
  • Somewhat easy
  • Somewhat difficult
  • Very difficult

6. How satisfied are you with the overall experience of using our products and services?

7. Is there anything that you would like to see us improve upon or change in the future?

…………………………………………………………………………………………………………………………..

Conclusion:

Thank you for taking the time to complete this survey. Your feedback is valuable to us and will help us improve our products and services. If you have any further comments or concerns, please do not hesitate to contact us.

Applications of Questionnaire

Some common applications of questionnaires include:

  • Research : Questionnaires are commonly used in research to gather information from participants about their attitudes, opinions, behaviors, and experiences. This information can then be analyzed and used to draw conclusions and make inferences.
  • Healthcare : In healthcare, questionnaires can be used to gather information about patients’ medical history, symptoms, and lifestyle habits. This information can help healthcare professionals diagnose and treat medical conditions more effectively.
  • Marketing : Questionnaires are commonly used in marketing to gather information about consumers’ preferences, buying habits, and opinions on products and services. This information can help businesses develop and market products more effectively.
  • Human Resources: Questionnaires are used in human resources to gather information from job applicants, employees, and managers about job satisfaction, performance, and workplace culture. This information can help organizations improve their hiring practices, employee retention, and organizational culture.
  • Education : Questionnaires are used in education to gather information from students, teachers, and parents about their perceptions of the educational experience. This information can help educators identify areas for improvement and develop more effective teaching strategies.

Purpose of Questionnaire

Some common purposes of questionnaires include:

  • To collect information on attitudes, opinions, and beliefs: Questionnaires can be used to gather information on people’s attitudes, opinions, and beliefs on a particular topic. For example, a questionnaire can be used to gather information on people’s opinions about a particular political issue.
  • To collect demographic information: Questionnaires can be used to collect demographic information such as age, gender, income, education level, and occupation. This information can be used to analyze trends and patterns in the data.
  • To measure behaviors or experiences: Questionnaires can be used to gather information on behaviors or experiences such as health-related behaviors or experiences, job satisfaction, or customer satisfaction.
  • To evaluate programs or interventions: Questionnaires can be used to evaluate the effectiveness of programs or interventions by gathering information on participants’ experiences, opinions, and behaviors.
  • To gather information for research: Questionnaires can be used to gather data for research purposes on a variety of topics.

When to use Questionnaire

Here are some situations when questionnaires might be used:

  • When you want to collect data from a large number of people: Questionnaires are useful when you want to collect data from a large number of people. They can be distributed to a wide audience and can be completed at the respondent’s convenience.
  • When you want to collect data on specific topics: Questionnaires are useful when you want to collect data on specific topics or research questions. They can be designed to ask specific questions and can be used to gather quantitative data that can be analyzed statistically.
  • When you want to compare responses across groups: Questionnaires are useful when you want to compare responses across different groups of people. For example, you might want to compare responses from men and women, or from people of different ages or educational backgrounds.
  • When you want to collect data anonymously: Questionnaires can be useful when you want to collect data anonymously. Respondents can complete the questionnaire without fear of judgment or repercussions, which can lead to more honest and accurate responses.
  • When you want to save time and resources: Questionnaires can be more efficient and cost-effective than other methods of data collection such as interviews or focus groups. They can be completed quickly and easily, and can be analyzed using software to save time and resources.

Characteristics of Questionnaire

Here are some of the characteristics of questionnaires:

  • Standardization : Questionnaires are standardized tools that ask the same questions in the same order to all respondents. This ensures that all respondents are answering the same questions and that the responses can be compared and analyzed.
  • Objectivity : Questionnaires are designed to be objective, meaning that they do not contain leading questions or bias that could influence the respondent’s answers.
  • Predefined responses: Questionnaires typically provide predefined response options for the respondents to choose from, which helps to standardize the responses and make them easier to analyze.
  • Quantitative data: Questionnaires are designed to collect quantitative data, meaning that they provide numerical or categorical data that can be analyzed using statistical methods.
  • Convenience : Questionnaires are convenient for both the researcher and the respondents. They can be distributed and completed at the respondent’s convenience and can be easily administered to a large number of people.
  • Anonymity : Questionnaires can be anonymous, which can encourage respondents to answer more honestly and provide more accurate data.
  • Reliability : Questionnaires are designed to be reliable, meaning that they produce consistent results when administered multiple times to the same group of people.
  • Validity : Questionnaires are designed to be valid, meaning that they measure what they are intended to measure and are not influenced by other factors.

Advantages of Questionnaire

Some advantages of questionnaires are as follows:

  • Standardization: Questionnaires allow researchers to ask the same questions to all participants in a standardized manner. This helps ensure consistency in the data collected and eliminates potential bias that might arise if questions were asked differently to different participants.
  • Efficiency: Questionnaires can be administered to a large number of people at once, making them an efficient way to collect data from a large sample.
  • Anonymity: Participants can remain anonymous when completing a questionnaire, which may make them more likely to answer honestly and openly.
  • Cost-effective: Questionnaires can be relatively inexpensive to administer compared to other research methods, such as interviews or focus groups.
  • Objectivity: Because questionnaires are typically designed to collect quantitative data, they can be analyzed objectively without the influence of the researcher’s subjective interpretation.
  • Flexibility: Questionnaires can be adapted to a wide range of research questions and can be used in various settings, including online surveys, mail surveys, or in-person interviews.

Limitations of Questionnaire

The limitations of questionnaires are as follows:

  • Limited depth: Questionnaires are typically designed to collect quantitative data, which may not provide a complete understanding of the topic being studied. Questionnaires may miss important details and nuances that could be captured through other research methods, such as interviews or observations.
  • Response bias: Participants may not always answer questions truthfully or accurately, either because they do not remember or because they want to present themselves in a particular way. This can lead to response bias, which can affect the validity and reliability of the data collected.
  • Limited flexibility: While questionnaires can be adapted to a wide range of research questions, they may not be suitable for all types of research. For example, they may not be appropriate for studying complex phenomena or for exploring participants’ experiences and perceptions in-depth.
  • Limited context: Questionnaires typically do not provide a rich contextual understanding of the topic being studied. They may not capture the broader social, cultural, or historical factors that may influence participants’ responses.
  • Limited control : Researchers may not have control over how participants complete the questionnaire, which can lead to variations in response quality or consistency.

How many questions should be asked in a survey?

Of course, everyone wants to collect as much detailed data as possible. But while statisticians might love large amounts of data, when it comes to creating a survey, fewer questions are usually better – as long as they are the right questions.

Companies that already have some experience with surveys have likely noticed that asking more questions doesn’t necessarily lead to more insights. On the contrary, too many irrelevant questions may make it difficult to evaluate data. Check out the tips below to learn how to align the right questions with the right goals, and be efficient in your survey development process.

How many survey questions are too many?

Statistics show that most companies have room for improvement when it comes to the efficiency of their employee engagement surveys. In fact, only 22% of businesses are getting actionable results. Customer surveys pose yet another challenge. More than 70% of consumers perceive surveys as an unwelcome interruption in their user experience.

There’s no question the number of survey questions has an impact on the response rate. You can read more about how to improve that particular aspect of your survey in the article 5 keys to improving survey response rates. However, in this article, we want to focus on how the number of questions and length of a survey can help you keep your research goals focused and deliver significant results.

Determining one central survey question

What is the purpose of your survey? The core of your survey should be a single question, examined from several different perspectives. Ultimately, you are trying to gather insight about one specific point; all the remaining questions add nuance and detail to that answer.

While it might seem challenging to narrow down your focus, having a single objective will actually help you build and structure your survey. It is also important for respondents to understand the purpose of your survey so they do not become distracted. Using models such as logic and branching or multiple choice can help you group questions and increase the volume of data you collect while keeping the survey brief.

Track the average answering time

Although you might be focused on collecting the right data, you don’t want to miss one important yet often neglected metric: The average time respondents spend answering the questions.

According to customer case studies, 52% of survey participants tend to drop out after spending more than three minutes on a survey. Keeping track of how long it takes to answer your questions will help you to adjust the survey if necessary. Make sure to choose a survey tool that will provide you with these insights.
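To make this concrete, here is a minimal Python sketch of how average answering time and dropout could be computed from exported response data. The field names (started_at, submitted_at) and the records themselves are hypothetical, since every survey tool exports timestamps in its own format.

```python
from datetime import datetime

# Hypothetical export: one record per respondent, with ISO timestamps.
# 'submitted_at' is None when the respondent abandoned the survey.
responses = [
    {"started_at": "2024-03-01T09:00:00", "submitted_at": "2024-03-01T09:02:30"},
    {"started_at": "2024-03-01T09:05:00", "submitted_at": "2024-03-01T09:09:10"},
    {"started_at": "2024-03-01T09:10:00", "submitted_at": None},  # dropout
]

def duration_minutes(record):
    """Return how many minutes a completed response took."""
    start = datetime.fromisoformat(record["started_at"])
    end = datetime.fromisoformat(record["submitted_at"])
    return (end - start).total_seconds() / 60

completed = [r for r in responses if r["submitted_at"] is not None]
times = [duration_minutes(r) for r in completed]

average_time = sum(times) / len(times)
dropout_rate = 1 - len(completed) / len(responses)
slow_share = sum(t > 3 for t in times) / len(times)

print(f"Average answering time: {average_time:.1f} minutes")
print(f"Dropout rate: {dropout_rate:.0%}")
print(f"Completed responses taking over 3 minutes: {slow_share:.0%}")
```

Most survey platforms report these figures out of the box; the point of the sketch is simply to show which two numbers are worth watching together.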


Evaluate pulse surveys, NPS, and ESI

Some surveys cannot be short. Annual employee surveys or B2B partner surveys, for example, give respondents the opportunity to provide long-form answers, and this kind of detail is hard to capture in other ways. Surveys like this give companies a bigger and more complete picture of overall satisfaction, which helps them take immediate action.

But if a longer survey is not necessary, consider some shorter options. Pulse surveys typically consist of 5-15 questions and are dispatched more frequently than annual surveys, making them an effective way to collect critical data more often. Shorter formats, such as the Employee Satisfaction Index (ESI) or Net Promoter Score (NPS), are best for straightforward, real-time feedback.

Consider market research panels

Market research involves collecting a substantial amount of data to support informed decisions. However, market research surveys directed at people who are not yet customers also tend to have the lowest response rates. Consumer panels are therefore highly recommended. Reaching out to your target group via a panel provider allows you to collect data from a group of people with chosen demographics, and you can easily track demographic data in your analytics tool.

Ask the right survey questions

For survey newbies, or for those looking for a starting point for their next survey, modifiable survey templates are your best friend. Remember, you aren't the first to conduct a survey. A number of service providers can give you valuable advice on survey design and the best course of action. Which question formats have similar industries, organisation types, or target groups used? Why not leverage professional expertise and read case studies and success stories? These resources can help you choose the right survey tool and just might make you the next success story.

How Many Questions Should Be Asked in a Survey?


“How many questions should I include in my survey?”

“With limited survey questions, will I be able to collect enough information from my target audience?”

“Or, will a large number of questions tire my customers and make them drop out of my survey?”

You might have faced the above dilemmas while creating a survey.

As a survey creator, you want to collect as much information as possible from your target audience. But remember, your survey takers will not always be willing to take surveys, and most of them will have time constraints. They may find lengthy, open-ended questions challenging to fill out and abandon the survey midway. So, it is crucial to find a balance between your data collection needs and your survey takers' ease of response.

The ideal number of questions in a survey depends on many factors, ranging from your survey goals to your audience type. In this blog, let's explore tips and tricks for tackling the question every survey creator faces: "How many questions should be asked in a survey?"

What Are the Factors That Affect Your Survey Length

How long should a survey be? There is no definite answer to this question! It depends on many aspects, ranging from your survey goals and objectives to the type of audience who will take your survey.

Let’s dig deeper into these factors:

1. Survey Type

There are many types of surveys, like market research surveys, customer satisfaction surveys, employee engagement surveys, Net Promoter Score (NPS) surveys, and more. An NPS survey aims to understand customer loyalty with only one question on a scale of 0-10: "How likely are you to recommend our brand to your friends and family?" Hence, an NPS survey is a very short survey that can be completed in seconds.
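For context, the standard NPS calculation treats ratings of 9-10 as promoters, 7-8 as passives, and 0-6 as detractors, and subtracts the percentage of detractors from the percentage of promoters. The short Python sketch below illustrates this with made-up ratings:

```python
# Hypothetical answers to "How likely are you to recommend our brand?" (0-10 scale).
ratings = [10, 9, 8, 6, 10, 7, 3, 9, 10, 5]

promoters = sum(r >= 9 for r in ratings)   # ratings of 9 or 10
detractors = sum(r <= 6 for r in ratings)  # ratings of 0 to 6

# NPS = % promoters - % detractors, a whole number between -100 and +100.
nps = (promoters - detractors) / len(ratings) * 100
print(f"NPS: {nps:.0f}")  # 5 promoters and 3 detractors out of 10 -> NPS = 20
```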


On the other hand, customer satisfaction surveys help you understand the customer's happiness with your products and services. They could consist of five to ten open-ended and closed-ended questions, and hence take more time to answer.


A powerful survey maker tool like ProProfs Survey Maker comes with customized survey templates to create any type of survey with ease.

2. Target Audience

Who is your ideal target audience? Are they your customers, potential customers, or your employees? Based on your ideal survey base, you can include fewer or more questions in your survey.

For example, when you are surveying your employees, you may consider asking more standard survey questions. Since your employees are well aligned with your vision and goals, they may not mind answering in-depth questions.


On the other hand, when you want to know about your online visitor’s website experience, it is better to ask them a limited number of questions (1-2) in the form of a pop-up survey. Your website visitors easily get tired and frustrated with a lengthy survey. It is better to wait till they convert into customers before collecting and understanding their feedback about your brand.


3. Survey Objectives

Largely, business survey questions are designed with a specific goal in mind. As a startup owner, you may have the goal of understanding the needs of your target audience to serve them better. A series of market research questions designed to understand their choices will serve the purpose here. The survey can include different kinds of questions, including demographic questions, open-ended questions, and more.


Alternatively, the effort a customer makes while interacting with your brand can be measured with a customer effort score (CES) survey. This survey consists of no more than 2-3 questions. Usually, it is embedded at major touchpoints like product purchase or customer complaint resolution.
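How a CES is scored varies between organizations; one common convention is to average the responses to a single "how easy was it?" question on a 1-7 scale. The sketch below assumes that convention and uses invented responses purely for illustration:

```python
# Invented responses to "The company made it easy to handle my issue" (1-7 scale).
ces_responses = [7, 6, 5, 7, 4, 6, 7, 3]

average_ces = sum(ces_responses) / len(ces_responses)
# Some teams also report the share of "easy" answers (5 and above).
share_easy = sum(r >= 5 for r in ces_responses) / len(ces_responses)

print(f"Average CES: {average_ces:.1f} out of 7")
print(f"Respondents who found it easy: {share_easy:.0%}")
```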


Asking too many questions can reduce the quality of your online survey and affect the genuineness of the feedback collected.

Why Should You Avoid Asking Too Many Questions in Your Survey?

Asking too many questions in your survey brings about a lot of issues ranging from survey dropout to lack of accuracy in data collection. Here are a few reasons to keep the surveys short and crisp:

1. Avoid Survey Fatigue

Survey fatigue is a situation in which your survey takers feel bored or tired of taking the survey. Too many questions usually have a tiring effect on survey participants. For example, when there are many open-ended questions, survey takers need to think harder and frame in-depth answers. Most of them cannot spare more than 2-3 minutes from their busy schedule for a survey, and they might see the entire process as a waste of time.


2. Reduce Survey Dropout Rate

The chances of survey takers dropping out of the survey increase when the survey is long and exhausting. The higher the dropout rate, the less accurate the survey responses. Research shows that a survey that takes more than 25 minutes to complete has three times the dropout rate of a survey that can be completed in just 5 minutes.

3. Ensure High Data Quality

With high survey dropout rates, the quality of your survey data suffers. For example, if only 10 out of 60 invited respondents take your survey, you may not get a clear view of your audience's input. Maybe those 10 respondents only speak in favor of your new product. You need input on every aspect of your product or service to understand its challenges and serve your customers better.

Based on different survey types, you can frame varying questions to suit your business needs. Let’s see how.

What Are the Ideal Number of Questions For Different Survey Types?

How many questions should I ask in an online survey form? How long should an employee survey be? What is the ideal and maximum length for a web survey? Each type of survey has its own dynamics when it comes to survey length, question types, templates, customization options, and so on.

Let’s understand this one by one.

1. Customer Surveys

Research suggests that, ideally, customer surveys should have between 15 and 20 questions. This ensures that your customers do not get overburdened with too many questions and can respond within 2-5 minutes. Especially if you have a new customer, you don't want to lose their valuable feedback by overwhelming them with a lot of questions. After all, a new customer brings a fresh perspective to your products and services.

2. Employee Surveys

Unlike a customer survey, an employee survey gives you the privilege to ask more than 20 questions. Employees might have a number of concerns regarding daily responsibilities, pay, perks, work environment, and more. They will be more than happy to voice their concerns through a survey platform. 

3. Pulse Surveys

Pulse surveys are short surveys aimed at understanding the pulse of your employees and customers alike. These surveys are usually conducted on a more frequent basis like once a month or once in two months. Also, the questions are more specific and precise. 

With pulse surveys, you can collect quick and actionable responses from both your customers and employees. Usually, a pulse survey is a shorter version of an annual survey, with just 2-10 questions that focus on the most crucial issues concerning an organization.

4. Intercept Surveys

Intercept surveys are in-person surveys conducted at points of contact like malls, restaurants, public places like parks, and more. Here, you intercept people and ask them to give feedback on a product or service.

While creating intercept surveys, keep in mind that people do not have much time at hand. You might be interrupting their daily routine to ask for feedback. If there are too many questions, they may not be willing to stop and give you feedback. Hence, keep your questions as relevant as possible and limit them to 3-5.

Now that you know how many questions to ask in different kinds of surveys, the next section covers methods for determining the right number of survey questions.

Methods to Determine How Many Questions Should be Asked In a Survey

Survey length depends on a combination of factors, like the survey's purpose, the average response time, and the type of data to be collected. There is no fixed limit on how many questions a survey should have.

1. Determine the Purpose of Your Survey

What are the aims and objectives of conducting your survey? Is it to understand customer opinion about a newly launched product? Or is the aim to collect feedback after a customer touchpoint like a product purchase or a customer service call?

Based on the above survey objectives, your survey will vary a lot. For example, a customer service survey should have no more than 1-2 questions.


On the other hand, a product survey can have up to 5 questions to collect more detailed feedback about a newly launched product.

2. Identify the Type of Data You Wish to Collect Through the Survey

After identifying the objective of your survey, you need to identify the type of data you want to collect. Different survey questions help you collect varied information from your target audience. 

For example, with checkbox-type questions, you can collect bulk information from your audience. 


With a comment type of question, you can collect additional information from your target audience.


3. Use MicroSurveys in Case of More Questions

Micro surveys contain 1-2 questions. It is ideal to split a long survey into micro surveys to reduce the burden on your survey respondents.

For example, if you want to conduct a survey on a newly launched product feature, first conduct a micro survey with the question, "Did you like our new product feature?" If they say yes, you can conduct another follow-up micro survey with in-depth questions like "How has the new product feature solved your problem?", "How would you rate our new product feature?" and more.

4. Determine the Average Answering Time

Based on the survey type and the industry, you need to determine the average survey completion time. For example, pulse surveys require just 2-3 minutes to complete, while a detailed employee survey may take around 10-15 minutes. Also, make sure you tell survey takers how long the survey will take so they can be mentally prepared to complete it.
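If you want a rough starting point before timing a real pilot, you can estimate completion time from the mix of question types in your draft. The per-question timings in this Python sketch are illustrative assumptions, not measured benchmarks:

```python
# Illustrative assumptions: average seconds a respondent needs per question type.
SECONDS_PER_QUESTION = {
    "rating": 10,          # e.g. 1-5 or 0-10 scales
    "multiple_choice": 15,
    "open_ended": 60,
}

def estimate_minutes(question_counts):
    """Estimate total completion time in minutes for a draft survey."""
    total_seconds = sum(
        SECONDS_PER_QUESTION[question_type] * count
        for question_type, count in question_counts.items()
    )
    return total_seconds / 60

# A draft pulse survey: 6 rating questions, 2 multiple choice, 1 open-ended.
draft = {"rating": 6, "multiple_choice": 2, "open_ended": 1}
print(f"Estimated completion time: {estimate_minutes(draft):.1f} minutes")
```

An estimate like this is only a sanity check; the real completion time should still come from timing actual respondents.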

5. Make Use of Market Research Panels

Market research panels consist of a predefined group of individuals who have agreed to take part in the survey process. With a panel, you are assured of the requisite participation rate for your survey and can design the number of questions accordingly. You also get to decide parameters like the demographics to be tracked, the choices to be captured, and more.

6. Ask the Right Survey Question

Every survey type is characterized by a definite set of questions. For example, the Net Promoter Score (NPS) survey is based on customer loyalty questions while the  employee satisfaction survey takes into account an employee’s overall satisfaction with their job. 

The easiest way to present the right questions to your target audience is to access survey templates from a good survey maker tool . Most of the tools provide customized templates to get started with different types of surveys ranging from a customer satisfaction survey to an employee engagement survey.

7. Test Your Survey

Until you test your survey, you can't see how it performs first-hand. Ideally, get a third party to test and review your survey. When a fresh pair of eyes evaluates the survey, you get a fair perspective on the different question types: Are the questions too long or too short? Is each question precise enough? Will your survey respondents be able to complete the survey in the stipulated time?

Be Market Ready by Asking the Right Questions at the Right Time!

Deciding the number of questions to be asked in a survey is necessary to ensure higher survey response rates. You need to take into account different factors like the type of survey, your target audience, and your survey objectives.

Asking the right number of questions involves careful research and planning. Firstly, you need to brainstorm your survey goals and the type of data you want to capture. Once you have decided this, it is time to determine the average time to answer based on your survey type. For example, a pulse survey is very short and needs just 2-3 minutes of your survey respondent’s time. Lastly, test your survey with a third party to understand its efficacy.

Are you looking to create a survey with the right number of questions? ProProfs Survey Maker gives you access to the right survey templates with the exact number of impactful survey questions.

Designing and validating a research questionnaire - Part 1

Priya Ranganathan

Department of Anaesthesiology, Tata Memorial Centre, Homi Bhabha National Institute, Mumbai, Maharashtra, India

Carlo Caduff

Department of Global Health and Social Medicine, King’s College London, London, United Kingdom

Questionnaires are often used as part of research studies to collect data from participants. However, the information obtained through a questionnaire is dependent on how it has been designed, used, and validated. In this article, we look at the types of research questionnaires, their applications and limitations, and how a new questionnaire is developed.

INTRODUCTION

In research studies, questionnaires are commonly used as data collection tools, either as the only source of information or in combination with other techniques in mixed-method studies. However, the quality and accuracy of data collected using a questionnaire depend on how it is designed, used, and validated. In this two-part series, we discuss how to design (part 1) and how to use and validate (part 2) a research questionnaire. It is important to emphasize that questionnaires seek to gather information from other people and therefore entail a social relationship between those who are doing the research and those who are being researched. This social relationship comes with an obligation to learn from others , an obligation that goes beyond the purely instrumental rationality of gathering data. In that sense, we underscore that any research method is not simply a tool but a situation, a relationship, a negotiation, and an encounter. This points to both ethical questions (what is the relationship between the researcher and the researched?) and epistemological ones (what are the conditions under which we can know something?).

At the start of any kind of research project, it is crucial to select the right methodological approach. What is the research question, what is the research object, and what can a questionnaire realistically achieve? Not every research question and not every research object are suitable to the questionnaire as a method. Questionnaires can only provide certain kinds of empirical evidence and it is thus important to be aware of the limitations that are inherent in any kind of methodology.

WHAT IS A RESEARCH QUESTIONNAIRE?

A research questionnaire can be defined as a data collection tool consisting of a series of questions or items used to collect information from respondents and thus learn about their knowledge, opinions, attitudes, beliefs, and behavior. Informed by a positivist philosophy of the natural sciences that considers methods mainly as a set of rules for the production of knowledge, questionnaires are frequently used instrumentally as a standardized and standardizing tool to ask a set of questions to participants. Outside of such a positivist philosophy, questionnaires can be seen as an encounter between the researcher and the researched, where knowledge is not simply gathered but negotiated through a distinct form of communication that is the questionnaire.

STRENGTHS AND LIMITATIONS OF QUESTIONNAIRES

A questionnaire may not always be the most appropriate way of engaging with research participants and generating knowledge that is needed for a research study. Questionnaires have advantages that have made them very popular, especially in quantitative studies driven by a positivist philosophy: they are a low-cost method for the rapid collection of large amounts of data, even from a wide sample. They are practical, can be standardized, and allow comparison between groups and locations. However, it is important to remember that a questionnaire only captures the information that the method itself (as the structured relationship between the researcher and the researched) allows for and that the respondents are willing to provide. For example, a questionnaire on diet captures what the respondents say they eat and not what they are eating. The problem of social desirability emerges precisely because the research process itself involves a social relationship. This means that respondents may often provide socially acceptable and idealized answers, particularly in relation to sensitive questions, for example, alcohol consumption, drug use, and sexual practices. Questionnaires are most useful for studies investigating knowledge, beliefs, values, self-understandings, and self-perceptions that reflect broader social, cultural, and political norms that may well diverge from actual practices.

TYPES OF RESEARCH QUESTIONNAIRES

Research questionnaires may be classified in several ways:

Depending on mode of administration

Research questionnaires may be self-administered (by the research participant) or researcher administered. Self-administered (also known as self-reported or self-completed) questionnaires are designed to be completed by respondents without assistance from a researcher. Self-reported questionnaires may be administered to participants directly during hospital or clinic visits, mailed through the post or E-mail, or accessed through websites. This technique allows respondents to answer at their own pace and simplifies research costs and logistics. The anonymity offered by self-reporting may facilitate more accurate answers. However, the disadvantages are that there may be misinterpretations of questions and low response rates. Significantly, relevant context information is missing to make sense of the answers provided. Researcher-reported (or interviewer-reported) questionnaires may be administered face-to-face or through remote techniques such as telephone or videoconference and are associated with higher response rates. They allow the researcher to have a better understanding of how the data are collected and how answers are negotiated, but are more resource intensive and require more training from the researchers.

The choice between self-administered and researcher-administered questionnaires depends on various factors such as the characteristics of the target audience (e.g., literacy and comprehension level and ability to use technology), costs involved, and the need for confidentiality/privacy.

Depending on the format of the questions

Research questionnaires can have structured or semi-structured formats. Semi-structured questionnaires allow respondents to answer more freely and on their terms, with no restrictions on their responses. They allow for unusual or surprising responses and are useful to explore and discover a range of answers to determine common themes. Typically, the analysis of responses to open-ended questions is more complex and requires coding and analysis. In contrast, structured questionnaires provide a predefined set of responses for the participant to choose from. The use of standard items makes the questionnaire easier to complete and allows quick aggregation, quantification, and analysis of the data. However, structured questionnaires can be restrictive if the scope of responses is limited and may miss potential answers. They also may suggest answers that respondents may not have considered before. Respondents may be forced to fit their answers into the predetermined format and may not be able to express personal views and say what they really want to say or think. In general, this type of questionnaire can turn the research process into a mechanical, anonymous survey with little incentive for participants to feel engaged, understood, and taken seriously.

STRUCTURED QUESTIONS: FORMATS

Some examples of closed-ended question formats include:

  • Single-choice questions, e.g., “Please indicate your marital status,” with a fixed set of options that may include “Prefer not to say.”
  • Multiple-response questions, e.g., “Describe your areas of work (circle or tick all that apply),” with options such as “Clinical service” and “Administration.”
  • Agreement (Likert-type) scales, with response options ranging from “Strongly agree” to “Strongly disagree.”
  • Numerical scales, e.g., “Please rate your current pain on a scale of 1–10, where 1 is no pain and 10 is the worst imaginable pain.”
  • Symbolic scales, e.g., the Wong-Baker FACES scale to rate pain in older children.
  • Ranking questions, e.g., “Rank the following cities as per the quality of public health care, where 1 is the best and 5 is the worst.”

A matrix questionnaire consists of a series of rows with items to be answered with a series of columns providing the same answer options. This is an efficient way of getting the respondent to provide answers to multiple questions. The EORTC QLQ-C30 is an example of a matrix questionnaire.[ 1 ]
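To make the matrix structure concrete, here is a minimal Python sketch of how such a block could be represented: every row is an item, and all rows share the same column of response options. The instruction, items, and labels are generic placeholders, not taken from any validated instrument.

```python
# A matrix block: several items answered on one shared response scale.
matrix_block = {
    "instruction": "Thinking about your last clinic visit, how much do you agree with each statement?",
    "response_options": ["Strongly disagree", "Disagree", "Agree", "Strongly agree"],
    "items": [
        "The staff explained things clearly",
        "The waiting time was acceptable",
        "The facility was clean",
    ],
}

# Render the block as rows (items) that share the same columns (options).
print(matrix_block["instruction"])
print(" " * 40 + " | ".join(matrix_block["response_options"]))
for item in matrix_block["items"]:
    cells = " | ".join("( )".center(len(o)) for o in matrix_block["response_options"])
    print(f"{item:<40}" + cells)
```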

For a more detailed review of the types of research questions, readers are referred to a paper by Boynton and Greenhalgh.[ 2 ]

USING PRE-EXISTING QUESTIONNAIRES VERSUS DEVELOPING A NEW QUESTIONNAIRE

Before developing a questionnaire for a research study, a researcher can check whether there are any pre-existing validated questionnaires that might be adapted and used for the study. The use of validated questionnaires saves the time and resources needed to design a new questionnaire and allows comparability between studies.

However, certain aspects need to be kept in mind: is the population/context/purpose for which the original questionnaire was designed similar to that of the new study? Is cross-cultural adaptation required? Is any permission needed to use the questionnaire? In many situations, the development of a new questionnaire may be more appropriate, given that any research project entails both methodological and epistemological questions: what is the object of knowledge and what are the conditions under which it can be known? It is important to understand that the standardizing nature of questionnaires contributes to the standardization of objects of knowledge. Thus, the seeming similarity in the object of study across diverse locations may be an artifact of the method. Whatever method one uses, it will always operate as the ground on which the object of study is known.

DESIGNING A NEW RESEARCH QUESTIONNAIRE

Once the researcher has decided to design a new questionnaire, several steps should be considered:

Gathering content

This involves creating a conceptual framework to identify all the relevant areas for which the questionnaire will be used to collect information. It may require a scoping review of the published literature, appraising other questionnaires on similar topics, or the use of focus groups to identify common themes.

Create a list of questions

Questions need to be carefully formulated with attention to language and wording to avoid ambiguity and misinterpretation. Table 1 lists a few examples of poorly worded questions that could have been phrased in a more appropriate manner. Other important aspects to be noted are:

Examples of poorly phrased questions in a research questionnaire

  • Original: “Like most people here, do you consume a rice-based diet?” Issue: leading question. Rephrased: “What type of diet do you consume?”
  • Original: “What type of alcoholic drink do you prefer?” Issue: loaded or assumptive question (assumes that the respondent consumes alcohol). Rephrased: “Do you consume alcoholic drinks? If yes, what type of alcoholic drink do you prefer?”
  • Original: “Over the past 30 days, how many hours in total have you exercised?” Issue: difficult-to-recall information. Rephrased: “On average, how many days in a week do you exercise? And how many hours per day?”
  • Original: “Do you agree that not smoking is associated with no risk to health?” Issue: double negative. Rephrased: “Do you agree that smoking is associated with risk to health?”
  • Original: “Was the clinic easy to locate and did you like the clinic?” Issue: double-barreled question. Rephrased: split into two separate questions: “Was the clinic easy to locate?” and “Did you like the clinic?”
  • Original: “Do you eat fries regularly?” Issue: ambiguous; the term “regularly” is open to interpretation. Rephrased: “How often do you eat fries?”
  • Provide a brief introduction to the research study along with instructions on how to complete the questionnaire
  • Allow respondents to indicate levels of intensity in their replies, so that they are not forced into “yes” or “no” answers where intensity of feeling may be more appropriate
  • Collect specific and detailed data wherever possible – this can later be coded into categories. For example, age can be captured in years and later classified as <18 years, 18–45 years, and 46 years and above. The reverse is not possible
  • Avoid technical terms, slang, and abbreviations. Tailor the reading level to the expected education level of respondents
  • The format of the questionnaire should be attractive with different sections for various subtopics. The font should be large and easy to read, especially if the questionnaire is targeted at the elderly
  • Question sequence: questions should be arranged from general to specific, from easy to difficult, from facts to opinions, and sensitive topics should be introduced later in the questionnaire.[3] Usually, demographic details are captured first, followed by questions on other aspects
  • Use contingency questions: these are questions that need to be answered only by the subgroup of respondents who give a particular answer to a previous question. This ensures that participants only respond to relevant sections of the questionnaire, for example: Do you smoke? If yes, how long have you been smoking? If not, please go to the next section. (A minimal sketch of this kind of branching logic follows this list.)
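As referenced above, here is a minimal Python sketch of how the smoking contingency question could be implemented in an electronic questionnaire, so that the follow-up is shown only to respondents who answer yes; the prompts and flow are illustrative only.

```python
def ask(question, options):
    """Prompt until the respondent picks one of the allowed options."""
    answer = ""
    while answer not in options:
        answer = input(f"{question} {options}: ").strip().lower()
    return answer

# Contingency question: the follow-up is shown only to smokers.
smokes = ask("Do you smoke?", ("yes", "no"))
if smokes == "yes":
    duration = input("How long have you been smoking (in years)? ")
else:
    print("Please go to the next section.")
```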

TESTING A QUESTIONNAIRE

A questionnaire needs to be valid and reliable, and therefore, any new questionnaire needs to be pilot tested in a small sample of respondents who are representative of the larger population. In addition to validity and reliability, pilot testing provides information on the time taken to complete the questionnaire and whether any questions are confusing or misleading and need to be rephrased. Validity indicates that the questionnaire measures what it claims to measure – this means taking into consideration the limitations that come with any questionnaire-based study. Reliability means that the questionnaire yields consistent responses when administered repeatedly even by different researchers, and any variations in the results are due to actual differences between participants and not because of problems with the interpretation of the questions or their responses. In the next article in this series, we will discuss methods to determine the reliability and validity of a questionnaire.

Conflicts of interest.

There are no conflicts of interest.


How to pretest and pilot a survey questionnaire


It’s important to test your survey questionnaire before using it to collect data. Pretesting and piloting can help you identify questions that don’t make sense to participants, or problems with the questionnaire that might lead to biased answers. This guide explains how to conduct basic pretesting and piloting for a survey.

This advice is for:

  • Basic quantitative surveys such as feedback forms, needs assessments, simple baseline and endline surveys, etc.

This advice is NOT for:

  • Complex baseline and endline surveys or research studies.
  • Developing new measurement instruments for use in research (e.g. psychological instruments for measuring concepts such as confidence, motivation, etc).
  • Qualitative focus groups or interviews.

Any testing is better than no testing

People often think that testing a survey takes a long time. They think they don’t have the time or resources for it, and so they end up just running the survey without any testing. This is a big mistake. Even testing with one person is better than no testing at all. So if you don’t have the time or resources to do everything in this guide, just do as much as you can with what you have available.

As a general rule, you should aim to pretest all your surveys and forms with at least 5 people. Even with this small number of people you’ll be surprised how many improvements you can make. Piloting is only really needed for large or complex surveys, and it takes significantly more time and effort.

Find 5-10 people from your target group

Once you’ve finished designing your survey questionnaire, find 5-10 people from your target group to pretest it. If you can’t get people from your exact target group then find people who are as close as possible. I once designed a survey that was going to be completed by garment factory workers in another province. There wasn’t enough budget available for us to travel to that province to pretest it, so we found some garment factory workers in our own province to test it.

Try to get a range of different people who are representative of your target group. For example, if your target group is young people aged 15-25, try to include some who are younger, some who are older, and boys and girls from different socioeconomic backgrounds.


Although 5-10 people might not sound like many, you will usually find that most of them have the same problems with the survey. So even with this small number of people you should be able to identify most of the major issues. Adding more people might identify some additional smaller issues, but it also makes pretesting more time consuming and costly.

Ask them to complete the survey while thinking out loud

Once you’ve found your testers, ask them to complete the survey one at a time (they shouldn’t be able to watch each other complete it). The testers should complete the survey the same way that it will be completed in the actual project. So if it’s an online survey they should complete it online, if it’s a verbal survey you should have a trained interviewer ask them the questions.

While they are completing the survey ask them to think out loud. Each time they read and answer a question they should tell you exactly what comes into their mind. Take notes on everything they say.


Observe how they complete the survey

You should also observe them completing the survey. Look for places where they hesitate or make mistakes. This is an indication that the survey questions and layout are not clear enough and need to be improved. Keep notes on what you observe.


Make improvements based on the results

Once all the testers have completed the survey review your notes from each session. At this point it’s normally clear what the major problems are so you can go about improving the survey to address those problems. Normally this is all that’s needed. However, if major changes are needed to the questions or structure it might be necessary to repeat the pretesting exercise with different people before starting the survey.

Select the pilot sample

For large or complex surveys it’s a good idea to do a full pilot before starting actual data collection. To do a pilot you need to test all the survey steps from start to finish with a reasonably large sample. The size of the pilot sample depends on how big your actual sample is, and how many data collectors you have. For a typical baseline or endline survey a sample of around 30-50 people is usually enough to identify any major bugs in the system.

Implement all the steps from start to finish

Start by training your data collectors, if you have them. Then distribute and collect the survey exactly as you would in practice. Enter the completed surveys into the database that you plan to use and then test the analysis that you plan to perform.


Make improvements

Assuming that the survey was pretested, piloting will normally identify practical problems with implementation, rather than problems with the survey design. For example, lack of staff training, challenges with the logistics of distributing and collecting the survey, or errors in data entry. These can then be fixed before you do the actual survey.


How to Develop a Questionnaire for Research

Last Updated: July 21, 2024


A questionnaire is a technique for collecting data in which a respondent provides answers to a series of questions.[1] To develop a questionnaire that will collect the data you want takes effort and time. However, by taking a step-by-step approach to questionnaire development, you can come up with an effective means to collect data that will answer your unique research question.

Designing Your Questionnaire

Step 1 Identify the goal of your questionnaire.

  • Come up with a research question. It can be one question or several, but this should be the focal point of your questionnaire.
  • Develop one or several hypotheses that you want to test. The questions that you include on your questionnaire should be aimed at systematically testing these hypotheses.

Step 2 Choose your question type or types.

  • Dichotomous question: this is generally a “yes/no” question, but may also be an “agree/disagree” question. It is the quickest and simplest question type to analyze, but is not a highly sensitive measure.
  • Open-ended questions: these questions allow the respondent to respond in their own words. They can be useful for gaining insight into the feelings of the respondent, but can be a challenge when it comes to analysis of data. It is recommended to use open-ended questions to address the issue of “why.”[2]
  • Multiple choice questions: these questions consist of three or more mutually exclusive categories and ask for a single answer or several answers.[3] Multiple choice questions allow for easy analysis of results, but may not offer the answer the respondent actually wants to give.
  • Rank-order (or ordinal) scale questions: this type of question asks your respondent to rank items or choose items in a particular order from a set. For example, it might ask your respondents to order five things from least to most important. These types of questions force discrimination among alternatives, but do not address why the respondent made these discriminations.[4]
  • Rating scale questions: these questions allow the respondent to assess a particular issue based on a given dimension. You can provide a scale that gives an equal number of positive and negative choices, for example, ranging from “strongly agree” to “strongly disagree.”[5] These questions are very flexible, but also do not answer the question “why.”

Step 3 Develop questions for your questionnaire.

  • Write questions that are succinct and simple. You should not be writing complex statements or using technical jargon, as it will only confuse your respondents and lead to incorrect responses.
  • Ask only one question at a time. This will help avoid confusion.
  • Asking demographic or other sensitive questions usually requires you to anonymize or encrypt the demographic data you collect.
  • Determine if you will include an answer such as “I don’t know” or “Not applicable to me.” While these can give your respondents a way of not answering certain questions, providing these options can also lead to missing data, which can be problematic during data analysis.
  • Put the most important questions at the beginning of your questionnaire. This can help you gather important data even if you sense that your respondents may be becoming distracted by the end of the questionnaire.

Step 4 Restrict the length of your questionnaire.

  • Only include questions that are directly useful to your research question.[8] A questionnaire is not an opportunity to collect all kinds of information about your respondents.
  • Avoid asking redundant questions. This will frustrate those who are taking your questionnaire.

Step 5 Identify your target demographic.

  • Consider if you want your questionnaire to collect information from both men and women. Some studies will only survey one sex.
  • Consider including a range of ages in your target demographic. For example, you can consider young adults to be 18-29 years old, adults to be 30-54 years old, and mature adults to be 55+. Providing an age range will help you get more respondents than limiting yourself to a specific age (one way to group responses into these bands is sketched after this list).
  • Consider what else would make a person a target for your questionnaire. Do they need to drive a car? Do they need to have health insurance? Do they need to have a child under 3? Make sure you are very clear about this before you distribute your questionnaire.
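As mentioned in the list above, here is a minimal Python sketch of how collected ages could be grouped into those example bands during analysis; the cut-offs simply mirror the ranges given above.

```python
def age_band(age):
    """Classify an age in years into the example bands used above."""
    if 18 <= age <= 29:
        return "young adult (18-29)"
    if 30 <= age <= 54:
        return "adult (30-54)"
    if age >= 55:
        return "mature adult (55+)"
    return "outside the target demographic"

# Invented ages from a batch of responses.
for age in [19, 34, 57, 16, 45]:
    print(age, "->", age_band(age))
```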

Step 6 Ensure you can protect privacy.

  • Consider an anonymous questionnaire. You may not want to ask for names on your questionnaire. This is one step you can take to protect privacy; however, it is often possible to figure out a respondent’s identity from other demographic information (such as age, physical features, or zip code).
  • Consider de-identifying the identity of your respondents. Give each questionnaire (and thus, each respondent) a unique number or word, and only refer to them using that new identifier. Shred any personal information that can be used to determine identity.
  • Remember that you do not need to collect much demographic information to be able to identify someone. People may be wary of providing this information, so you may get more respondents by asking fewer demographic questions (if that is possible for your questionnaire).
  • Make sure you destroy all identifying information after your study is complete.
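The de-identification step above can be as simple as the following sketch (plain Python with hypothetical field names): each respondent gets a random study ID, the table linking IDs to names is kept separately and destroyed after the study, and only the de-identified records are analyzed or shared.

import csv
import secrets

# Hypothetical raw responses that still contain identifying information
raw_responses = [
    {"name": "Jane Doe", "age_band": "30-54", "q1": "Yes"},
    {"name": "John Roe", "age_band": "18-29", "q1": "No"},
]

key_table = {}       # study_id -> name; store separately and destroy after the study
deidentified = []
for row in raw_responses:
    study_id = secrets.token_hex(4)          # e.g. "9f1c2ab4"
    key_table[study_id] = row["name"]
    deidentified.append({"study_id": study_id, "age_band": row["age_band"], "q1": row["q1"]})

# Only the de-identified file is used for analysis
with open("responses_deidentified.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["study_id", "age_band", "q1"])
    writer.writeheader()
    writer.writerows(deidentified)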

Writing your questionnaire

Step 1 Introduce yourself.

  • My name is Jack Smith and I am one of the creators of this questionnaire. I am part of the Department of Psychology at the University of Michigan, where I focus on cognitive development in infants.
  • I’m Kelly Smith, a 3rd year undergraduate student at the University of New Mexico. This questionnaire is part of my final exam in statistics.
  • My name is Steve Johnson, and I’m a marketing analyst for The Best Company. I’ve been working on questionnaire development to determine attitudes surrounding drug use in Canada for several years.

Step 2 Explain the purpose of the questionnaire.

  • I am collecting data regarding the attitudes surrounding gun control. This information is being collected for my Anthropology 101 class at the University of Maryland.
  • This questionnaire will ask you 15 questions about your eating and exercise habits. We are examining the relationship between healthy eating, frequency of exercise, and the incidence of cancer in mature adults.
  • This questionnaire will ask you about your recent experiences with international air travel. There will be three sections of questions that will ask you to recount your recent trips and your feelings surrounding these trips, as well as your travel plans for the future. We are looking to understand how a person’s feelings surrounding air travel impact their future plans.

Step 3 Reveal what will happen with the data you collect.

  • Be aware that if you are collecting information for a university or for publication, you may need to check with your institution's Institutional Review Board (IRB) for permission before beginning. Most research universities have a dedicated IRB staff, and their information can usually be found on the school's website.
  • Remember that transparency is best. It is important to be honest about what will happen with the data you collect.
  • Include an informed consent form if necessary. Note that you cannot guarantee confidentiality, but you will make all reasonable attempts to protect respondents' information. [11]

Step 4 Estimate how long the questionnaire will take.

  • Time yourself taking the survey, then allow for respondents who will be faster or slower than you were (see the sketch after this list).
  • Provide a time range instead of a specific time. For example, it's better to say that a survey will take between 15 and 30 minutes than to say it will take 15 minutes and have some respondents quit halfway through.
  • Use this as a reason to keep your survey concise! You will feel much better asking people to take a 20 minute survey than you will asking them to take a 3 hour one.
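For example, here is a minimal sketch in Python (with made-up pilot timings) of turning a handful of timed test runs into the kind of range you would quote to respondents.

# Completion times, in minutes, from a few hypothetical pilot runs of the questionnaire
pilot_minutes = [14, 19, 23, 17, 26]

low, high = min(pilot_minutes), max(pilot_minutes)
print(f"This survey should take between {low} and {high} minutes to complete.")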

Step 5 Describe any incentives that may be involved.

  • Incentives can attract the wrong kind of respondent. You don't want to incorporate responses from people who rush through your questionnaire just to get the reward at the end. This is a danger of offering an incentive. [12]
  • Incentives can encourage people to respond to your survey who might not have responded without a reward. This is a situation in which incentives can help you reach your target number of respondents. [13]
  • Consider the strategy used by SurveyMonkey. Instead of directly paying respondents to take their surveys, they offer 50 cents to the charity of their choice when a respondent fills out a survey. They feel that this lessens the chances that a respondent will fill out a questionnaire out of pure self-interest. [14]
  • Consider entering each respondent into a drawing for a prize if they complete the questionnaire. You can offer a $25 gift card to a restaurant, a new iPod, or a movie ticket. This makes it less tempting to respond to your questionnaire for the incentive alone, while still offering the chance of a pleasant reward.

Step 6 Make sure your questionnaire looks professional.

  • Always proofread. Check for spelling, grammar, and punctuation errors.
  • Include a title. This is a good way for your respondents to understand the focus of the survey as quickly as possible.
  • Thank your respondents. Thank them for taking the time and effort to complete your survey.

Distributing Your Questionnaire

Step 1 Do a pilot study.

  • Was the questionnaire easy to understand? Were there any questions that confused you?
  • Was the questionnaire easy to access? (Especially important if your questionnaire is online).
  • Do you feel the questionnaire was worth your time?
  • Were you comfortable answering the questions asked?
  • Are there any improvements you would make to the questionnaire?

Step 2 Disseminate your questionnaire.

  • Use an online site, such as SurveyMonkey.com. This site lets you build your own questionnaire with its survey builder and offers additional options, such as buying a target audience and using its analytics to analyze your data. [18]
  • Consider using the mail. If you mail your survey, always make sure you include a self-addressed stamped envelope so that the respondent can easily mail their responses back. Make sure that your questionnaire will fit inside a standard business envelope.
  • Conduct face-to-face interviews. This can be a good way to ensure that you are reaching your target demographic and can reduce missing information in your questionnaires, as it is more difficult for a respondent to avoid answering a question when you ask it directly.
  • Try using the telephone. While this can be a more time-effective way to collect your data, it can be difficult to get people to respond to telephone questionnaires.

Step 3 Include a deadline.

  • Make your deadline reasonable. Giving respondents up to 2 weeks to answer should be more than sufficient. Anything longer and you risk your respondents forgetting about your questionnaire.
  • Consider providing a reminder. A week before the deadline is a good time to send a gentle reminder about returning the questionnaire. Include a replacement copy of the questionnaire in case the original has been misplaced.

References

  • ↑ https://www.questionpro.com/blog/what-is-a-questionnaire/
  • ↑ https://www.hotjar.com/blog/open-ended-questions/
  • ↑ https://www.questionpro.com/a/showArticle.do?articleID=survey-questions
  • ↑ https://surveysparrow.com/blog/ranking-questions-examples/
  • ↑ https://www.lumoa.me/blog/rating-scale/
  • ↑ http://www.sciencebuddies.org/science-fair-projects/project_ideas/Soc_survey.shtml
  • ↑ http://www.fao.org/docrep/W3241E/w3241e05.htm
  • ↑ http://managementhelp.org/businessresearch/questionaires.htm
  • ↑ https://www.surveymonkey.com/mp/survey-rewards/
  • ↑ http://www.ideafit.com/fitness-library/how-to-develop-a-questionnaire
  • ↑ https://www.surveymonkey.com/mp/take-a-tour/?ut_source=header

Summary

To develop a questionnaire for research, identify the main objective of your research to act as the focal point for the questionnaire. Then choose the types of questions you want to include, and write succinct, straightforward questions that gather the information you need. Keep your questionnaire as short as possible, identify the target demographic you would like to answer the questions, and keep responses as anonymous as possible to protect your respondents.


Top tips for research questionnaire design

Master question types, prioritise user experience, and boost data quality.

A well-thought-out questionnaire is the backbone of any great piece of research. However, this is often overlooked as people tend to focus on the output rather than the process of how to get there. Refining and perfecting your questionnaire with the output in mind will make the process much easier when it comes to analysing the data and will also ensure that this data is robust.

A good questionnaire can help you build a story before you even have any data, ensure high data quality, and maximise completion rates.

But how can you do this?

Top six tips for questionnaire design

Tip 1 – Know the headlines you are looking for

Having a set of desired headlines that you’d like to get from the survey as a base makes writing the questionnaire much easier. Do you have a hypothesis to validate? Do you have an impact or maturity model to create? If so, make sure you create questions that lend themselves to that specific headline or model.

The questions should support the story you’re looking to tell or the headlines you need for your campaign, and it will be easier to get the desired angles to explore in the data. The type of question used here will also be crucial in dictating the kind of data that the research generates.

Tip 2 – Understand which question type is appropriate

Use ‘select one’ when you want to know exactly which is the ‘most important’ (or equivalent), or when it’s appropriate (e.g. on an agreement scale). However, bear in mind that if you ask respondents to select one from a long answer list, it can not only be tricky to answer but can result in very split results with no clear ‘winner’.

In these situations, using ‘select all that apply’ or ‘select up to three’ is the best option. This allows you to get higher percentages whilst also providing a rank order of answers (i.e. you can still see which comes out as first, second, and third from the list). You can see a list of good and bad questions below after ‘Key Takeaways’.
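As an illustration of why a 'select all that apply' (or 'select up to three') question still gives you a rank order, here is a minimal sketch in Python (hypothetical options and responses) that counts how often each option was chosen and reports the percentages in descending order.

from collections import Counter

# Each list is one respondent's selections from a "select all that apply" question
responses = [
    ["Speed", "Cost"],
    ["Speed", "Support"],
    ["Cost"],
    ["Speed", "Cost", "Support"],
]

counts = Counter(option for answer in responses for option in answer)
total_respondents = len(responses)

# Percentages can sum to more than 100% because respondents may pick several options
for option, count in counts.most_common():
    print(f"{option}: {count / total_respondents:.0%}")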

Tip 3 – Be mindful of the respondent’s experience

To make it as easy as possible, try to avoid ranking questions: they are more complex for respondents to answer than they seem and don't tend to provide high-quality, interesting data. Instead, we recommend using Likert scales (from strongly agree to strongly disagree) or asking respondents to select their top 3 challenges. This provides better quality responses and can still be reported as the most popular or most preferred option when doing PR research, for example.

Try not to ask more than 25 questions (roughly a 10-minute survey), as this can mean people lose interest and engagement, which impacts the quality of responses as the survey progresses.

Tip 4 – Cover all the bases in question options to ensure data quality

Use closed questions that have a list of ready-made responses (i.e. code frames) that respondents can tick. Understanding and researching the topic or market before creating the code frame is key to ensuring a comprehensive list. If you're not sure you have captured all the potential options, add an 'other' category, particularly if you want to understand the impact of a problem or a list of information sources.

We also strongly recommend including neutral and 'don't know' options, as this allows for a larger breadth of responses rather than forcing participants to select options that might not reflect the truth, resulting in skewed data. Forcing respondents to choose an option they do not necessarily align with can also lead to higher dropout rates.
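One way to picture Tip 4 is as a question definition that carries its code frame, an 'Other' write-in, and an exclusive 'don't know' option. The sketch below in Python uses a hypothetical structure and option labels (not any particular survey tool's format) and includes a check that the exclusive option is never combined with other answers.

# Hypothetical definition of a closed question with a code frame
question = {
    "id": "Q7",
    "text": "Which information sources do you use? Select all that apply.",
    "options": ["Industry reports", "Trade press", "Peers", "Social media"],
    "allow_other": True,                   # adds "Other (please specify)"
    "exclusive_options": ["Don't know"],   # cannot be combined with anything else
}

def is_valid(selected):
    """An answer is invalid if an exclusive option is mixed with other choices."""
    picked_exclusive = any(o in question["exclusive_options"] for o in selected)
    return not (picked_exclusive and len(selected) > 1)

print(is_valid(["Peers", "Don't know"]))   # False
print(is_valid(["Don't know"]))            # True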

Tip 5 – Pick the correct format and ensure variety throughout

There are many ways to ensure an engaging and varied setup for the respondent: mix short and long questions, and include images or video where relevant. Put some simple-to-answer questions among a set that requires more thought.

Also, make sure to keep the formatting of the questions clear and interesting. Don’t just ask lots of grid questions (e.g., rating questions), as respondents easily get bored with these.

Tip 6 – Watch the flow of your questionnaire and know your audience

Asking questions in the right order can be the difference between good and bad data. Asking related questions consecutively and grouping questions into sections means that respondents will be 'in the zone' for that type of question before moving on to the next topic.

Try to be sensitive in the way that the questions are posed and the topics that are handled. Ask yourself how you would feel if you were in the respondent’s shoes and asked that question. Adding instructional text or pages with definitions can also aid the respondent’s experience.

Key Takeaways

There are numerous benefits to having a good questionnaire but there are two key outcomes you should focus on:

  • Setting yourself up to succeed – The goal of any research project is to have a great output that meets your objectives. It is pivotal to frame your questionnaire with headlines in mind and in a way that meets these objectives. The types of questions used and the context of the questions asked are very important and can vary depending on what it is you want to find out.
  • Improving data quality – Here at Sapio Research, our purpose is to provide data confidence for all. A major part of this is about optimising data quality, and a good questionnaire can significantly influence this by increasing respondent engagement, ensuring that respondents' views are covered, and reducing dropout rates. Quality data means quality results and should always be at the forefront of any research project.

Example Questions

Yes/no questions:

We recommend splitting the Yes/No options to provide more insight.

Don’t do this ❌

Q1. Do you currently know how to write a good questionnaire? Select one

  • Yes
  • No

Do this instead ✅

Q1. Do you currently know how to write a good questionnaire? Select one  

  • Yes, I am proficient
  • Yes, but I would like more training
  • No, but I would like to get better
  • No, and I don’t think I need to know this

Select all questions:

  • Ensure there are enough options for the respondent to choose from
  • Check whether a ‘select up to three’ question would be more appropriate to avoid respondents selecting all the options, which can result in flat data
  • Give respondents an option if none of the options apply to them

Don't do this ❌

Q2. Which of the following are the key elements of a good questionnaire? Select all that apply.

  • Interesting topic
  • Different types of questions
  • Short length

Do this instead ✅

Q2. Which of the following are the key elements of a good questionnaire? Select up to three.

  • Plenty of options to choose from for each question
  • Opportunities to write in own answers
  • Understandable language
  • Other (Please specify) 
  • I don’t know the key elements of a good questionnaire (Exclusive) 

Select one questions:

  • Ensure the options do not overlap 
  • Make sure all options are covered
  • Give respondents another option if none of the options apply to them

Don't do this ❌

Q3. What is your current level of expertise on questionnaire writing? Select one

  • Our organisation is very good at it
  • We make our best effort to write good questionnaires
  • Questionnaire writing is a time-consuming task
  • Questionnaire writing is fun

Do this instead ✅

Q3. What is your current level of expertise on questionnaire writing? Select one

  • Expert – Our questionnaire writing is the best
  • Competent – We make the best effort to write good questionnaires but can still improve
  • Novice – We have had some training but can definitely improve further
  • Not applicable – Questionnaire writing is something we are not trained in at all

Numerical questions:

  • Ensure the bands do not overlap 
  • Where possible, try and keep the bands consistent (one-year gap, for example)

Don't do this ❌

Q4. How long have you been writing questionnaires? Select one

  • 3-6 years 
  • I have never written a questionnaire
  • Less than a year
  • 3-4 years 
  • More than 5 years

Scale / Grid questions:

  • Make sure the questions are not leading 
  • Try to keep the options neutral to avoid bias 
  • Try to be more specific and avoid using subjective terminology 
  • Allow for answering neither agree nor disagree or don’t know to avoid false data and high dropout rates

Q5. To what extent do you agree with the following statements? Select one per row

Statements (rows):

  • My team is well-trained in questionnaire writing
  • My team finds it difficult to pick up new skills
  • My business likes to support staff with personal development
  • My team prefers to stick with the tasks that employees are experienced in

Scale (columns): Strongly agree / Somewhat agree / Neither agree nor disagree / Somewhat disagree / Strongly disagree
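For analysis, responses to a grid question like Q5 are usually stored as one record per respondent per statement, with the agreement scale coded numerically. A minimal sketch in Python (hypothetical responses) of that layout and of averaging agreement per statement:

from collections import defaultdict
from statistics import mean

# Agreement scale coded 1-5
SCALE = {
    "Strongly disagree": 1,
    "Somewhat disagree": 2,
    "Neither agree nor disagree": 3,
    "Somewhat agree": 4,
    "Strongly agree": 5,
}

# Each tuple is (respondent, statement, answer)
grid_responses = [
    ("R1", "My team is well-trained in questionnaire writing", "Somewhat agree"),
    ("R2", "My team is well-trained in questionnaire writing", "Strongly agree"),
    ("R1", "My team finds it difficult to pick up new skills", "Somewhat disagree"),
    ("R2", "My team finds it difficult to pick up new skills", "Neither agree nor disagree"),
]

by_statement = defaultdict(list)
for _respondent, statement, answer in grid_responses:
    by_statement[statement].append(SCALE[answer])

for statement, scores in by_statement.items():
    print(statement, "->", round(mean(scores), 2))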

How can we help?

Below are the three services we provide when it comes to questionnaire design:

Full design:   If your team is low on time and/or if you would love our expertise, we can build a questionnaire from scratch using your objectives, ideal headlines, business problems or stories for a campaign, and feedback rounds. This is also very useful when the questionnaire builds on a previous stage, such as the desk research or the qualitative stage. 

Partial design: We'd recommend this for people who have some good ideas of what questions they would like to ask but need a bit of support structuring them and their ideas. We will feed in suggested questionnaire edits that support your objectives and carefully review at every stage to make sure that your needs are being met with our expertise.

Questionnaire advice: This is the best approach if you have a lot of experience creating questionnaires in-house and are confident in writing questions, with little or no support needed. We’ll flag any potential changes that could improve the questionnaire that your team can choose to implement (or not), and support on developing the screening questions to define the survey audience.

And if you have any questions, feel free to contact us here.


How to prepare a questionnaire for qualitative research


Qualitative research is what we consider the first real technical step of market research. The qualitative questionnaire (also called an interview guide) is instrumental to the success of this phase. In this post we explain EXACTLY how to prepare it.

As we've explained in other posts, market research is NOT only administering a survey. It's much more than that (get an overview of market research tools and techniques in our online guide).

Today we'd like to discuss what we consider to be the very first step towards better understanding your market and your future customers: conducting interviews.

Table of contents

Introduction

  • Step 1: define your topics
  • Step 2: rank your topics
  • Step 3: define your questions
  • Our secret tip for framing better questions


Interviews are just one of many qualitative research tools available to the market researcher. You could also run focus groups or carry out participatory or non-participatory observations. The range is very wide. If you want to know more, we recommend this book by D. Silverman, who has featured my work in its latest edition. Choosing the right qualitative technique (or the right combination of techniques) is a task in itself. We have prepared a fairly complete list of marketing problems you may face, along with the methodological approaches we recommend.

But whatever you choose to do, make sure you do it right. Don't do what many people we've met do: ask a few questions without even taking the time to write them down, and consider the job finished after interviewing 5 people. Qualitative research is more than that. Do it the right way and you'll be rewarded.

On the funnel below (which represents the 7-step market research approach detailed here), qualitative market research (to which qualitative interviews belong) is step #5.

[Image: the 7-step market research funnel; qualitative research is step #5]

Before you even start: the literature review

Preparing an interview guide (or qualitative questionnaire) requires that you master the topic you will cover. In particular, you need to do your homework to understand what has already been said, written, and researched on the topic. You could look for professional reports to gain that knowledge. However, for the sake of quality, we highly recommend you use only academic sources. We have explained all you need to know about desk research. It's one of our most comprehensive articles of the last 10 years, so make sure you read it.

We suggest you use Google Scholar to find the latest academic articles on your topic. If you want to conduct qualitative interviews about organic food consumption, a search in Google Scholar will return results like the ones shown below.

[Screenshot: Google Scholar results for "organic food consumption", with freely available articles highlighted in yellow]

As you can see, I've entered "organic food consumption" as keywords and have not used any filters. I already get interesting results, and some articles are freely available (I've highlighted them in yellow), which means you won't have to pay to access them. Normally, most scientific articles can be accessed only if you pay, unless you are a student or a researcher, in which case you can access them through your library.

STEP 1 : define your topics

Before even starting out, you have to define which topics you want to cover. Two or three topics is the maximum; don't try to cover more than that. It won't work.

If you want to investigate organic food consumption, you need to focus on that. You won't have time to explore other topics in depth (non-organic food consumption or non-food organic consumption, for instance). You really need to dedicate all your energy and focus to that one topic. Along the way you will probably touch on related topics anyway.

Your main topic (organic food consumption) will break down into sub-themes. Here are a few examples that come to mind:

  • The patterns of organic shopping
  • The patterns of organic food consumption
  • The reasons behind organic food consumption
  • How respondents’ behaviors have changed since they adopted organic food

These sub-themes will come from your literature review (that's why it's important to do one), and you'll have to order them in your qualitative questionnaire.

[Image: the funnel method — order topics from the most general to the most particular]

STEP 2 : rank your topics

Now that your topics are defined, you have to rank them. Use the funnel method (hence the image above): start with the most general topic, then move to more particular ones. This ordering ensures your interview won't feel disjointed when conducted; it will follow a natural path that will not unsettle the interviewee.

Just as in a quantitative questionnaire, you also need to end your interview with the questions that require the least cognitive energy. If your qualitative interview includes creative techniques (collage, image ranking, …), use them at the beginning.

From a strategic viewpoint it's also important not to end with the most important themes: you want your respondents to be at full cognitive capacity when they answer your most important questions.

STEP 3 : define your questions

For each theme you should have a set of questions at your disposal to cover the different possible aspects. Keep this in mind: it is impossible to address one topic (especially a complex one) with a single question. Usually, you'll need three or more questions. Here again, use the funnel technique: start with a general question, then ask a more focused open-ended one, and finish with a directed question.

Defining and framing the questions is obviously a big factor of success. The good news is that other researchers have faced the same issues before you and have tried to define and validate what are called "scales of measure". Scales are intrinsically quantitative instruments that allow the measurement of a construct. Although they are primarily used in quantitative research, scales offer great insight into which questions to ask. Why not use them to define the questions of your qualitative questionnaire too?

Our secret tip for better questions: the book of scales

This is a tip probably no one else will give you. You can get access to a book of scales for free (at least partially) and define qualitative questions that will make the difference. Use the table of contents to find the construct(s) you need to evaluate, then go to the dedicated chapter to see which questions you need to ask. A book of scales is worth $150, and you'll get it partially for free below.

Here's the rule to remember for your next qualitative interview: 3×3=9. You should address no more than 3 themes, and cover each theme with 3 questions. In the end, 9 questions should be sufficient to conduct the interview. The art lies in maintaining a good dialogue with the respondent and making sure the answers provided are as comprehensive as possible.
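To make the 3×3=9 rule concrete, here is a minimal sketch in Python (hypothetical themes and questions, based on the organic food example above) of an interview guide as a data structure: at most three themes, each covered by three questions ordered from general to directed (the funnel technique).

# Hypothetical 3x3 interview guide: 3 themes x 3 questions, general -> directed
interview_guide = {
    "Shopping patterns": [
        "How do you usually do your grocery shopping?",               # general
        "What role does organic food play in those shopping trips?",  # open-ended
        "How often do you buy organic products in a typical week?",   # directed
    ],
    "Consumption patterns": [
        "Tell me about a typical meal at home.",
        "When do you choose organic ingredients over conventional ones?",
        "Which organic products do you consume most often?",
    ],
    "Reasons behind consumption": [
        "What does 'organic' mean to you?",
        "What made you start buying organic food?",
        "How important is price in that decision?",
    ],
}

# Enforce the 3x3=9 rule
assert len(interview_guide) <= 3
assert all(len(questions) == 3 for questions in interview_guide.values())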



Survey Questions — Types & Examples



Survey questions

Surveys are the doorway to understanding — the pulse of your market, the sentiment of your employees, and the satisfaction of your customers. But what makes this doorway effective? A well-crafted question. With a myriad of types and categories at your disposal, creating a compelling survey can feel like navigating through a maze. Fear not, curious explorer! In this guide, we'll equip you with everything you need to formulate the perfect survey questions. Remember, every step we take in this journey is geared towards a single goal — making your voice heard, and more importantly, understanding the voices that respond.

Survey questions: List of types and categories

The diversity in survey questions is what makes them a potent tool in your research arsenal. Let's uncover the various types that you can leverage, each with its unique flavor and purpose:

Dichotomous Questions (Yes or No)

These questions are simple and straightforward, requiring just a "yes" or "no" response. For instance, a tech company might ask, "Have you used our new mobile app?"

Multiple-Choice Questions

Great for when there are several potential answers but respondents need to select just one. An online store could ask, "Which method of payment did you use for your most recent purchase? (Credit Card, Debit Card, PayPal, Cash on Delivery)"

Numerical Questions

Numerical questions require respondents to provide a number as their answer, often related to age, quantity, or ranking. A fitness app might ask, "How many days per week do you exercise?"

Nominal and Ordinal Questions

Nominal questions offer categories with no inherent order, like "Which social media platform do you use the most? (Facebook, Twitter, Instagram, LinkedIn)". Ordinal questions provide ordered choices, such as "How would you rate your experience with our customer service? (Excellent, Good, Fair, Poor)".

Rating Scale Questions

Rating scale questions allow respondents to rate an experience on a scale, such as 1-5 or 1-10. For instance, a restaurant might ask, "On a scale of 1-10, how would you rate your dining experience?"

Ranking Order Questions

When you need to gauge preference or importance among options, ranking order questions are the way to go. A software company might ask, "Please rank these software features in order of importance: Speed, User-friendliness, Cost, and Customer Support."

Likert Scale Questions

Likert scale questions allow respondents to indicate their agreement or disagreement with a series of statements. An e-commerce website might ask, "I find the website easy to navigate: Strongly Agree, Agree, Neutral, Disagree, Strongly Disagree." You can also check out our article on Likert-Scales here.

Matrix Questions

Matrix questions let you collect multiple data points in one question. For instance, a product satisfaction survey might include a matrix question like "Please rate the following characteristics of our product: Price, Quality, Appearance, Packaging, etc."

Dropdown Questions

Dropdown questions are perfect when you have a long list of options. A survey about automobiles might include a dropdown question like, "From the dropdown menu, please select the make of your current vehicle."

Demographic Questions

Demographic questions offer insights about respondents like age, gender, income, and education. An example would be "What is your age bracket? (18-24, 25-34, 35-44, 45-54, 55+)".

Image Choice Questions

Image choice questions let respondents express their opinions using visuals. A clothing brand could include images of different styles and ask, "Which of these styles do you prefer?"

Benchmarkable Questions

Benchmarkable questions let you compare your data with industry standards. An example is the Net Promoter Score (NPS) question, "On a scale of 0-10, how likely are you to recommend our company to a friend or colleague?" In case you want to read more about the NPS, check out our article on this topic.
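For reference, here is a minimal sketch in Python (with made-up scores) of how the Net Promoter Score mentioned above is calculated: the percentage of promoters (scores 9-10) minus the percentage of detractors (scores 0-6).

# Hypothetical answers to "How likely are you to recommend us?" (0-10)
scores = [10, 9, 8, 7, 6, 9, 10, 3, 8, 9]

promoters = sum(1 for s in scores if s >= 9)
detractors = sum(1 for s in scores if s <= 6)
nps = (promoters - detractors) / len(scores) * 100

print(f"NPS: {nps:.0f}")   # 5 promoters, 2 detractors out of 10 -> NPS 30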

Matrix Table Questions

Matrix table questions offer a more detailed view of different aspects of a single topic. An airline might ask you to rate your satisfaction with several aspects of your flight, from booking to arrival, on a scale of 1-5.

Open- and Closed-ended Questions

Closed-ended questions provide fixed options for respondents, such as "Did you find what you were looking for today? (Yes, No)". Open-ended questions, on the other hand, allow for open-text responses like, "What improvements would you suggest for our website?"

We'll dig deeper into the practical applications of these question types in the next section. Stay tuned for more enlightening insights!

Ideas on what to ask

Crafting the right survey questions is both an art and a science. It's a skill that requires understanding your objectives and your audience. Below are some fundamental considerations to stimulate your thought process:

Identify Your Goals: Before jotting down your questions, take a moment to define what you intend to achieve with your survey. Are you seeking customer feedback about a product? Do you want to understand your employees' job satisfaction levels? Or, are you trying to gauge the effectiveness of a recent event?

Know Your Audience: Understand who will be answering your questions. The language, tone, and type of questions you use should be tailored to fit your respondents. For instance, the questions you ask your employees would be different from those you ask your customers.

Keep it Simple and Relevant: Keep your questions clear, simple, and relevant to your survey goals. Avoid technical jargon and ensure that each question contributes to achieving your survey's objectives.

Strike a Balance: Include a mix of open-ended, closed-ended, and scaled questions. This way, you not only obtain specific data but also invite respondents to share their thoughts and experiences.

Test Your Questions: Before sending out your survey, test it with a small group to ensure the questions are understood as intended. This will help you catch any confusing or leading questions.


Examples of common survey questions

Now that we've covered the basics, let's jump into examples for different scenarios. Below, we'll share some common survey questions for various sectors and purposes:

Survey Questions for Market Research

  • How did you learn about our product/service?
  • How likely are you to purchase our product/service again?
  • What do you like most about our product/service?
  • What improvements would you suggest?

Survey Questions for Employees

  • On a scale of 1-10, how satisfied are you with your job?
  • How strongly do you agree with this statement: "I feel valued at work."
  • Do you feel your work contributes to the company's goals?
  • What suggestions do you have for improving the workplace?

Survey Questions for Students

  • On a scale of 1-5, how would you rate the effectiveness of the teaching methods used in the course?
  • What did you find most challenging about this course?
  • What suggestions do you have for improving the course?

Survey Questions for Universities

  • How well does our program meet your educational goals?
  • How would you rate the quality of teaching provided in your course?
  • What improvements would you suggest for our course structure?
  • How effective do you think the current course assessment methods are?
  • Are the learning resources provided, including library and online resources, sufficient and helpful?

Survey Questions for Schools and Teachers

  • How satisfied are you with the learning environment at school?
  • On a scale of 1-10, how would you rate your teacher's teaching effectiveness?
  • Do you feel your concerns are addressed promptly and effectively?
  • How comfortable do you feel voicing your opinions in class?
  • What would you suggest to make the school environment more engaging?

Survey Questions for Events

  • How did you hear about our event?
  • How satisfied were you with the event's organization?
  • Would you attend a similar event in the future?
  • What did you like most about the event?
  • What suggestions do you have for improving future events?

Survey Questions for Businesses

  • How often do you use our product/service?
  • What factors influence your decision to choose our product/service?
  • Is there anything we could do to improve your experience with our product/service?
  • What additional features would you like to see in our product/service?
  • How would you compare our product/service with others in the market?

Survey Questions for Marketing

  • Where do you usually find out about our new products/services?
  • How well does our marketing communicate the benefits of our product/service?
  • Are our marketing messages clear and easy to understand?
  • How much do our marketing efforts impact your decision to purchase our products/services?
  • What type of marketing content do you find most appealing or persuasive?

Survey Questions to Ask About a Product

  • What do you like most about our product?
  • Is there anything you dislike about our product?
  • What improvements would you suggest for our product?
  • How does our product meet your needs compared to alternative products?
  • If you could change one thing about our product, what would it be?

Survey Questions for Customer Satisfaction

  • On a scale of 1-10, how satisfied are you with our product/service?
  • How likely are you to recommend our product/service to a friend?
  • How can we improve your experience?
  • What aspect of our service exceeds your expectations?
  • What aspect of our service could be improved?

Survey Questions About Social Media

  • How often do you interact with our posts on social media?
  • How useful do you find the information we share on social media?
  • What type of content would you like to see more of on our social media platforms?
  • Do you feel engaged with our brand on social media?
  • How often would you like to see updates/posts from us on social media?

Survey Questions for Kids

  • What is your favorite activity in school?
  • Who is your favorite character in our program/book?
  • What would make our program/book more enjoyable for you?
  • If you could change something about school, what would it be?
  • What do you like most about our book/program?

Survey Questions for Health Care and Hospital Satisfaction

  • How would you rate the quality of care you received?
  • How satisfied were you with the communication from our medical staff?
  • How can we improve our service?
  • How would you rate the comfort and cleanliness of our facility?
  • Did you feel cared for and respected by our staff?

Wording best practices: How to write survey questions

Crafting questions for your survey is both an art and a science. The power of the right questions can unlock rich insights, while unclear or biased questions can lead to skewed results. So, let's dive in to discover the best practices to pen your survey questions.

Keep It Simple, Smarty (KISS)

Your questions need to be straightforward and simple. Avoid jargon, acronyms, or complex words. The goal is to make the respondent understand the question quickly, without having to read it twice.

Example: Instead of asking "How would you appraise our service?" opt for "How would you rate our service?"

Be Specific

Broad questions can lead to broad answers, which might not give you the specific data you're looking for. Make sure your questions are targeted and clear.

Example: Instead of "Do you like our products?" ask "Do you like our new spring collection?"

Avoid Double-Barreled Questions

Double-barreled questions ask about two topics but allow for only one response. This can confuse respondents and skew your data.

Example: Instead of "Do you like our pricing and product quality?" break it down into "Do you like our pricing?" and "Do you like our product quality?"

Avoid Leading and Loaded Questions

Leading questions point respondents in a specific direction, while loaded questions contain an assumption. Both types can bias your survey results.

Example: Instead of "Don't you think our app is user-friendly?" ask "How would you rate the user-friendliness of our app?"

Provide a Neutral Option

Sometimes respondents don't have a strong opinion either way. By providing a neutral option, you give them a choice without forcing them to lean in a direction they don't genuinely feel.

Consider Using Open-Ended Questions

These types of questions allow respondents to provide more detailed feedback. However, use them sparingly as they require more effort to answer.

Example: "What features would you like to see added to our product?"

Test Your Questions

Finally, test your questions with a small group before sending out the survey. This can help you spot confusing or poorly worded questions.

How many questions should be in a survey?

Finding the Goldilocks number of questions for your survey - not too many, not too few - can be a tricky task. The optimal number depends on your survey's complexity, the time you expect respondents to have, and the type of questions asked. As a rule of thumb, a survey should take no longer than 5-10 minutes to complete, which typically equates to around 10-20 questions.

What question order is best?

Ordering your questions correctly can have a significant impact on response rates and the quality of feedback you receive. Here are a few tips to consider:

Start with broad and general questions. These serve as a warm-up and are typically easier for respondents to answer.

Move to more specific questions. Once you have set the stage, you can delve into the specifics.

Place sensitive or potentially off-putting questions near the end. This ensures that you don't alienate respondents early.

End with demographic questions. These questions are often seen as less interesting, but they are essential for data segmentation.

Survey questions about personal information

Gathering personal information in your survey can help you segment your data and understand your audience better. But it's important to respect your respondents' privacy. Only ask for information that's absolutely necessary and always inform respondents why you're asking for it. Examples of such questions include "What is your age range?" or "What is your employment status?"

"What is your gender?" Survey question

Asking about gender in a survey can be a delicate matter, and it's essential to approach this question with sensitivity. An inclusive way to ask is to provide options beyond just 'male' and 'female', such as 'prefer not to say' and 'other (please specify)'.

Example: "Which of the following best describes your gender? (1) Male (2) Female (3) Prefer not to say (4) Other (please specify)"

Survey question templates

Templates can be a great time-saver when creating a survey. Here are a few basic templates to get you started:

Satisfaction questions:

  • On a scale of 1-10, how satisfied are you with [product/service]?
  • How would you rate your overall experience with [product/service]?
  • How likely are you to recommend [product/service] to a friend or colleague?
  • How satisfied were you with our customer service interaction?
  • Are you satisfied with the quality of [product/service]?

Usage questions:

  • How often do you use [product/service]?
  • How frequently do you purchase [product/service]?
  • On average, how many times a week do you use our [website/app]?
  • How often do you use [specific feature] in our [product/service]?
  • How often would you say you need to use our [product/service]?

Comparison questions:

  • How does [product/service] compare to similar options on the market?
  • In comparison to our competitors, how would you rate the value for money of our [product/service]?
  • How would you compare the quality of our [product/service] to others you have used?
  • Would you say our [product/service] meets your needs better than other options you've tried?
  • How does our customer service compare to that of other companies you've interacted with?

Improvement questions:

  • What can we do to improve [product/service]?
  • Are there any features you would like us to add to our [product/service]?
  • What changes would most improve our [product/service]?
  • How could we make our [product/service] more useful for you?
  • If you could change one thing about our [product/service], what would it be?
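
To show how these templates might be used in practice, here is a minimal sketch that fills in the bracketed [product/service] placeholder programmatically. The fill_template helper and the example product name are illustrative assumptions, not part of any survey tool.

```python
# Minimal sketch: filling in the bracketed [product/service] placeholder from
# the templates above. The fill_template helper and product name are
# illustrative, not part of any survey tool.
TEMPLATES = [
    "On a scale of 1-10, how satisfied are you with [product/service]?",
    "How often do you use [product/service]?",
    "What changes would most improve our [product/service]?",
]

def fill_template(template: str, product: str) -> str:
    return template.replace("[product/service]", product)

for question in TEMPLATES:
    print(fill_template(question, "the new spring collection"))
```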

Survey questions generator

If you're struggling to come up with the perfect questions for your survey, LimeSurvey has got your back! Our powerful Survey Questions Generator can help you create compelling, effective questions in no time.

There you have it! The art of creating powerful survey questions demystified. Remember, the key to a successful survey is not only asking the right questions but asking them the right way. So, now that you're armed with these tips and tricks, it's time to create your own knockout survey!



Survey Sample Sizes: How Many People Should I Ask?


One of the first questions I ask when looking at survey results is: who was the audience? The second: how large was the sample surveyed? Both are critical when interpreting survey data and making inferences from it. You might have the best survey in the world, but if you didn’t get enough responses, you can’t generalize the answers to your audience.

We’ve featured a couple of insights into sample sizes in the past. First, we have a post about obtaining a “nationally representative” sample. In that post, the author talks about using quotas to set up a nationally representative sample, but cautions that the way you set your quotas and the way you analyze the results should match exactly for best results. In another post, a different author goes more in-depth into sample sizes and argues that more is better, especially if you want to look at behavioral metrics.

Here, I’m taking a step back and looking at basic sample sizes. These are mathematical sample sizes, and, really, are best for obtaining general data. Let’s start with defining your audience.


Getting your audience right is critical in market research. For example, say you have a questionnaire all about preschools in your area: which are most popular, best rated, and least expensive. You would not want to ask people without preschool-aged children about the topic, nor people whose youngest children are in college. Instead, your audience should include parents and caregivers of children ages 3-6 (asking those whose children are slightly older than preschool age will also reach people who have already put their children through preschool, which can be relevant here if they used the preschools you are asking about in your study).

To make sure that these individuals are the ones answering your survey, there are two things you can do. First, you can purchase respondents via a panel. Second, you can ask screening questions such as, “How old are your children?”, “Are your children currently enrolled in preschool?” and “How long ago did your children attend preschool?”
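
As an illustration of how those screening questions could feed a qualify/disqualify rule, here is a minimal sketch. The field names, the 3-6 age window, and the two-year cut-off are assumptions made for the example, not rules from the original post.

```python
# Hypothetical screening rule built from the example questions above.
# Field names, the 3-6 age window, and the two-year cut-off are illustrative.
def qualifies(children_ages, enrolled_in_preschool, years_since_preschool):
    """Return True if the respondent belongs to the target audience."""
    has_preschool_age_child = any(3 <= age <= 6 for age in children_ages)
    recently_finished = (not enrolled_in_preschool
                         and years_since_preschool is not None
                         and years_since_preschool <= 2)
    return has_preschool_age_child or enrolled_in_preschool or recently_finished

print(qualifies([4], True, None))   # parent of an enrolled 4-year-old -> True
print(qualifies([15], False, 10))   # youngest child long past preschool -> False
```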

In addition to making sure you have the right audience for your study, you also need to track how many responses you’re receiving, to be sure you’re getting enough to provide valid insights. For this, I use a handy, free online sample size calculator any time I need to work out a sample size. The site has two sections: one lets you enter the confidence level you want (how sure you want to be about your data) to determine the sample size you need; the other calculates the confidence interval for your data, based on the sample size you intend to use and the size of the population you’re sampling from (such as the parents and caregivers of preschool-aged children in your area).

The section I use most is the first section – sample size I need based on the population I want to study and the confidence level I would like for my research. What this tells me is that I need at least that many responses, not that I need to survey at least that many people. If you get the number of responses equal to the sample size, you can infer predictions about the entire population you’re studying. If you get fewer responses, you can describe your sample, but you can’t necessarily apply the information to the rest of your audience.
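
For readers who prefer to see the arithmetic behind such calculators, here is a minimal sketch using the standard formula (z² · p(1−p) / e², followed by a finite population correction). It assumes the conventional z-scores for common confidence levels and the worst-case proportion p = 0.5; a given online calculator may round slightly differently.

```python
import math

# Minimal sketch of the formula behind typical online sample-size calculators:
# n0 = z^2 * p(1-p) / e^2, then a finite population correction.
def required_sample_size(population, confidence=0.95, margin_of_error=0.05, p=0.5):
    z = {0.90: 1.645, 0.95: 1.96, 0.99: 2.576}[confidence]   # common z-scores
    n0 = (z ** 2) * p * (1 - p) / margin_of_error ** 2       # infinite-population size
    n = n0 / (1 + (n0 - 1) / population)                     # finite population correction
    return math.ceil(n)

# e.g. parents and caregivers of preschool-aged children in a mid-sized city
print(required_sample_size(population=20_000))  # about 377 responses at 95% / ±5%
```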

Stay tuned for next week, when we’ll look at best practices for how many surveys you need to distribute in order to ensure you get the number of responses you need, based on the distribution method you intend to use (paper, online, email, social media).



Versta Research

How Many Questions in a 10-Minute Survey?

It all depends on what counts as a “question,” and also on how complex those questions are. A grid question with multiple rows takes longer to answer than a simple yes-or-no question. A question that allows you to select multiple items from a laundry list of options takes longer than a standard 4-point scale. Ranking, in order, the top three to five items from that laundry list will take even longer.

Not that anyone is vying for the prestigious honor of being a survey-timing-guru, but we did feel proud to discover that Versta Research is now cited in the Federal Register as providing authoritative guidance on survey length. A notice filed by the Department of Transportation in Vol. 81, No. 193, October 5, 2016 says:

The estimate time for survey completion was calculated using Versta Research’s methodology for calculating an estimate of survey length, where each question is given a number of points based on the estimated burden required to respond to the question (for example, simple multiple choice questions are 1 point, whereas short answer questions are 3 points per expected short phrase). The total number of points for all questions is then divided by eight (the number of simple questions a user can respond to online in 1 minute) to determine the estimate required length for finishing the survey.

You can find the details of our method in the Versta Research Winter 2011 Newsletter. Basically, each simple question takes 7½ seconds, so the trick is to translate all the standard question types (many of which are not simple) into a number of “points” that are simple-question equivalents. The method is easy to learn and easy to implement. Once you get the hang of it, it becomes second nature and you’ll know the length of any survey you write in just a few minutes.
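
As a rough illustration of the point-counting approach described above, here is a minimal sketch. The 1-point and 3-point values come from the quoted Federal Register notice; the values for grid rows and ranking items are assumptions added for this example, not Versta's published figures.

```python
# Sketch of the point-based length estimate described above. The 1-point and
# 3-point values come from the quoted notice; grid and ranking values are
# assumptions added for this example.
POINTS_PER_ITEM = {
    "multiple_choice": 1,   # simple multiple-choice question (from the notice)
    "short_answer": 3,      # per expected short phrase (from the notice)
    "grid_row": 1,          # assumption: each grid row behaves like a simple question
    "ranking_item": 2,      # assumption: ranking an item takes longer than choosing one
}

def estimated_minutes(question_counts, points_per_minute=8):
    """Total points divided by eight simple-question equivalents per minute."""
    total = sum(POINTS_PER_ITEM[qtype] * n for qtype, n in question_counts.items())
    return total / points_per_minute

survey = {"multiple_choice": 20, "grid_row": 12, "short_answer": 2}
print(estimated_minutes(survey))  # (20 + 12 + 6) / 8 = 4.75 minutes
```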

That being said, and knowing that all the different question types tend to average themselves out in most of the surveys we write, here is what we generally proffer as the number of questions you can ask in a survey:

5-minute survey: 10 to 15 questions

10-minute survey: 20 to 30 questions

15-minute survey: 30 to 45 questions

If you take a look at the DOT’s Federal Register entry you will also note they used our method to estimate that their survey would take respondents about 2 hours and 12 minutes to complete. Yikes! Unless you are surveying government officials who are well paid and required to fill out informational surveys like this, don’t try this at home. For most surveys, 20 minutes is about the maximum you can go before respondent attention lags and data quality deteriorates.

By Joe Hopper, Ph.D.


Ask the Right Questions: Why You Should Enable Your Team to Challenge


Challenge is more likely to happen in a relaxed, inclusive environment

The importance of challenge is often underestimated in business. In fact, many leaders do not like to be challenged by their teams, fearing that challenge might make them look incompetent, uninformed or even weak.

Yet challenge is something that leaders should welcome, suggests research from Imperial College Business School’s Centre for Responsible Leadership. That’s because a culture of positive, constructive challenge can help to mitigate risks and improve decision making.

Professor Celia Moore, academic director of the Centre for Responsible Leadership at Imperial and lead author of the white paper, said: “The consequences of a corporate culture where teams are hesitant to challenge leaders can be dire. This can include financial damage and service failure.”

She adds that as staff can hesitate to offer challenge if they feel vulnerable, leaders “need to be clear that speaking up and disagreeing will not incur risk.”

Seek out opposing views

Evidence presented in the report highlights the importance of leaders asking the right questions to generate meaningful challenge from employees. The report found that open-ended questions and general queries were less likely to result in challenge.


Instead, leaders should focus on questions that specifically ask for disagreement. Examples could include: “Does anyone think there is a better idea?” or “What would stop you from taking this option?”

The report also revealed that leaders need to acknowledge challenges as legitimate, while focusing on the idea at hand. For example, they could say “that’s a fair challenge, and we could definitely go with that option.” Acknowledgement that is too general or strays into gratitude was found to be much less effective.

Additionally, the research found that individuals are more likely to speak up in an inclusive environment, where they feel comfortable and relaxed. It is also essential to allow time for thorough debate and challenge of ideas since meetings risk shutting down discussion before ideas can be properly explored.

Finally, team members are more likely to provide healthy challenge when they are held accountable for their views. For example, they could be asked to commit to a specific idea through a question such as “which do you prefer, option A or B?” Alternatively, they could be asked to vote on specific ideas.

Agree to disagree?

Challenge – and its connotations of disagreement – may seem a risky concept to leaders who want to build a happy, cohesive team. Nevertheless, it is important to bear in mind that those who provide the healthiest levels of challenge may also be their leaders’ biggest supporters, who are keen to help them achieve their business objectives.

Separate research by Durham University Business School found that employees who share the same goals as their leader are much more likely to contribute their ideas, concerns or feedback. This is a boost for organizations, says Dr Janey Zheng, professor of leadership at Durham University Business School, since employees who feel they have a voice “bring new perspectives, ideas and insight” and are also more likely to be happy in their roles.


Sally Percy



About half of TikTok users under 30 say they use it to keep up with politics, news

The Pew-Knight Initiative supports new research on how Americans absorb civic information, form beliefs and identities, and engage in their communities.

Pew Research Center is a nonpartisan, nonadvocacy fact tank that informs the public about the issues, attitudes and trends shaping the world. Knight Foundation is a social investor committed to supporting informed and engaged communities.

TikTok has been so popular among young Americans that presidential campaigns are using it for voter outreach. And some young adults are using TikTok to keep up with politics or get news, a March Pew Research Center survey shows.

Pew Research Center conducted this analysis to understand age differences in TikTok users’ views and experiences on the platform. The questions are drawn from a broader survey exploring the views and experiences of TikTok, X, Facebook and Instagram users. For this analysis, we surveyed 10,287 adult internet users in the United States from March 18 to 24, 2024.

Everyone who took part in the survey is a member of the Center’s American Trends Panel (ATP), an online survey panel that is recruited through national, random sampling of residential addresses. This way, nearly all U.S. adults have a chance of selection. The survey was weighted by combining the sample of internet users with data from ATP members who do not use the internet and weighting the combined dataset to be representative of all U.S. adults by gender, race, ethnicity, partisan affiliation, education and other categories. This analysis is based on those who use TikTok. Read more about the ATP’s methodology .
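
For readers unfamiliar with survey weighting, here is a minimal one-variable post-stratification sketch of the general idea: respondents in under-represented groups are weighted up so the sample matches known population shares. The ATP weighting described above rakes across many variables at once; the age categories and population shares below are made-up illustrations.

```python
from collections import Counter

# One-variable post-stratification sketch; the real ATP weighting rakes across
# many variables at once. The population shares below are made-up numbers.
sample = ["18-29", "18-29", "30-49", "30-49", "30-49", "50-64", "65+"]
population_share = {"18-29": 0.20, "30-49": 0.33, "50-64": 0.25, "65+": 0.22}

counts = Counter(sample)
n = len(sample)
weights = {group: population_share[group] / (counts[group] / n) for group in counts}
print(weights)  # respondents in under-represented groups get weights above 1
```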

Here are the questions used for this analysis , along with responses, and the survey methodology .

This is a Pew Research Center analysis from the Pew-Knight Initiative, a research program funded jointly by The Pew Charitable Trusts and the John S. and James L. Knight Foundation. Find related reports online at https://www.pewresearch.org/pew-knight/ .

Our survey explored various reasons people might use TikTok and other social media platforms. Young TikTok users stand out from their older peers on several of these reasons, including:

A bar chart showing that young adults stand out in using TikTok to keep up with politics and get news.

Keeping up with politics or political issues. For 48% of TikTok users ages 18 to 29, this is a major or minor reason why they’re on the platform.

By comparison, 36% of those ages 30 to 49 and even smaller shares of older users say the same:

  • 22% of those 50 to 64
  • 24% of those 65 and older

Getting news. We also asked TikTok users if getting news in general is a reason they use the platform – regardless of whether that’s political news or another topic entirely. About half of those under 30 say getting news is a major or minor reason they use TikTok.

That compares with 41% of TikTok users ages 30 to 49 who say getting news is a reason they’re on it. The shares of older users saying so are even smaller:

  • 29% of those 50 to 64
  • 23% of those 65 and older

TikTok has increasingly become a destination for news, bucking trends on other social media sites. A 2023 Center study showed more Americans – and especially young Americans – regularly get news on the platform compared with a few years ago. 

For more on what motivates TikTok use – like entertainment, which is a major draw for most TikTok users – read our deep dive into why and how people use the platform .

What people see and share on TikTok

A bar chart showing that TikTok users under 30 are more likely than those 50 and older to say they see at least some political content there.

Seeing political content

Nearly half of all TikTok users (45%) say they see at least some content about politics or political issues on the platform. That includes 6% of users who say political content is all or most of what they see.

Half of users under 30 say they see at least some political content on TikTok. That’s higher than the 39% of those 50 and older who say the same. However, the shares of 18- to 29-year-old users and 30- to 49-year-old users who say this are statistically similar.

Sharing political content

As on other platforms we’ve studied , far smaller shares post about politics than see political content on TikTok. About one-in-ten users ages 18 to 29 (7%), 30 to 49 (8%) and 50 to 64 (8%) post at least some political content there. That compares with just 2% of TikTok users 65 and older.

But many users – 63% – post nothing at all.

Only 36% of TikTok users say they ever post or share on the platform. Users ages 30 to 49 are most likely to say this, at 44%. That compares with 37% of those 18 to 29, 26% of those 50 to 64 and 15% of those 65 and older.

Seeing news-related content

A bar chart showing that TikTok users under 30 stand out in seeing breaking news, opinions about current events.

Regardless of whether TikTok users say getting news is a reason they’re there, most see humor and opinions about news on the platform:

  • 84% say they ever see funny posts that reference current events on TikTok
  • 80% ever see people expressing opinions about current events
  • 57% ever see news articles posted, reposted, linked or screenshotted
  • 55% ever see information about a breaking news event as it’s happening

Users under 50 are more likely than older users to say they ever see each of these.

And TikTok users under 30 stand out further in seeing opinions about current events and information about breaking news. They are more likely than any other age group to ever see these two kinds of content.

TikTok and democracy

Debates around TikTok’s impact on the political environment in the United States – including for young voters specifically – are squarely in the national spotlight. We wanted to understand: Do TikTok users think the platform impacts democracy, and how?


Overall, TikTok users are roughly twice as likely to think it’s mostly good for American democracy as they are to think it’s mostly bad (33% vs. 17%). But the largest share of users (49%) think it has no impact on democracy.

TikTok users under 30 are more positive, however – 45% of this group say it’s mostly good for democracy. That compares with:

  • 30% of users ages 30 to 49
  • 23% of users 50 to 64
  • 15% of users 65 and older

Even among users under 30, 39% say the platform has no impact on democracy. That share increases to 66% among users 65 and older.

The March survey found only minor differences by political party among TikTok users in views of its impact on democracy. Still, as lawmakers attempt to ban TikTok over national security concerns , other Center research has found that views of banning the platform have been sharply divided by political party among the general public.

To learn more about how Americans view and experience TikTok, X (formerly Twitter), Facebook and Instagram, read these companion reports:

How Americans Navigate Politics on TikTok, X, Facebook and Instagram

How Americans Get News on TikTok, X, Facebook and Instagram

These Pew Research Center reports and this analysis are from the Pew-Knight Initiative, a research program funded jointly by The Pew Charitable Trusts and the John S. and James L. Knight Foundation.


Colleen McClain is a senior researcher focusing on internet and technology research at Pew Research Center .


Polls show a changed, close 2024 race heading into Labor Day

Two words sum up the national and battleground state polls released ahead of Labor Day weekend, with fewer than 10 weeks to go until Election Day: changed and close.

Changed, because most of the surveys — conducted after President Joe Biden’s exit from the 2024 race, after the Democratic convention, and after independent Robert F. Kennedy Jr. endorsed former President Donald Trump — show Vice President Kamala Harris with narrow leads nationally and in key battlegrounds. 

That’s compared with polling that mostly showed Trump with a narrow edge before Biden’s departure. 

And close, because almost all of Harris’ leads are within the polls’ margins of error. And given the polling errors of 2016 and especially 2020, a candidate holding a 1-, 2-, or 3-point advantage in surveys doesn’t guarantee victory — far from it. 
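
As a reminder of what "within the margin of error" means, here is a minimal sketch of the usual approximation for a poll proportion under simple random sampling. The 1,000-voter sample and the 48% figure are hypothetical, not taken from any specific poll cited here.

```python
import math

# Approximate 95% margin of error for a poll proportion (simple random sample).
def margin_of_error(p, n, z=1.96):
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical poll of 1,000 likely voters with a candidate at 48%
print(f"±{margin_of_error(0.48, 1000) * 100:.1f} points")  # roughly ±3.1 points
```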

Nationally, almost every recent survey shows Harris doing better than Trump by a handful of points. The latest  Wall Street Journal poll  finds Harris getting support from 48% of registered voters, while Trump gets 47%, well within the poll’s margin of error. The  previous Wall Street Journal poll , conducted immediately after Biden’s exit, had Trump ahead by 2 points, 49% to 47% — again within the margin of error. 

In addition, a  national Quinnipiac University poll  shows Harris ahead by 1 point among likely voters, 49% to 48%. It’s Quinnipiac’s first poll measuring likely voters, so there isn’t a past apples-to-apples comparison. But prior Quinnipiac polls of registered voters found Trump narrowly ahead of Biden  in June  and 2 points ahead of Harris  in July . 

And a  USA Today/Suffolk poll  looking at a multicandidate field has Harris ahead of Trump by 5 points among likely voters — again within the margin of error. 

In the battleground states, meanwhile, a set of  Bloomberg News/Morning Consult polls  have Harris and Trump tied in Arizona and North Carolina; Harris ahead within the margin of error in Georgia, Michigan, Nevada and Pennsylvania; and Harris ahead outside the margin of error in Wisconsin. 

But the battleground-state polling picture is more varied: an EPIC-MRA poll of Michigan shows Trump with a narrow 1-point lead over Harris, 47% to 46% among likely voters. Still, that’s a change from this poll back in June, when Trump enjoyed a 4-point lead over Biden. 

Here are other key takeaways and observations from the recent polling: 

The Sun Belt is more than in play for Harris

This might be the most significant polling change since Biden’s exit. When Biden was in the race, the states of Arizona, Georgia, Nevada and North Carolina seemed out of reach for the president.

But they are more than in reach with Harris at the top of the ticket.

Before the June 27 Biden-Trump debate, Biden was trailing Trump by thin margins in the Great Lakes swing states and by wider gaps in the Sun Belt. Not only is Harris doing better everywhere, but the gap between her margins in the Great Lakes states and the Sun Belt appears to have narrowed with her leading the ticket.

Is this Harris’ peak?

The timing of these polls also is important. They come nearly six weeks after Biden bowed out of the 2024 contest, after the Democratic convention, and after what’s been a political honeymoon for Harris. 

Does that momentum, which has coincided with  increased Democratic enthusiasm , last? Or will Harris eventually come back to Earth? 

It’s pretty easy to explain why either of those scenarios could be true. The only way to find out for sure is to wait and see. 

Trump’s near-constant 47%

Notice a pattern in Trump’s ballot share in these recent polls? He’s at 47% nationally in the Wall Street Journal poll; 47% in that EPIC-MRA Michigan poll; and 47% in Georgia and Michigan, per the Bloomberg/Morning Consult surveys. 

As it turns out, 47% was Trump’s popular-vote share in the 2020 election (which he lost), and it was 46% in 2016 (which he won). 

The third-party vote shrinks

The major reason why Trump’s 46% was a winning number in 2016 and why 47% wasn’t in 2020 was the  size of the third-party vote . 

In 2016, the third-party vote share was 6%. But four years later, it was just 2%. 

And the most recent polls — with Kennedy out of the contest — show third-party candidates receiving a combined 2% in  Quinnipiac’s national poll , and getting a combined 4% in the  USA Today/Suffolk poll . 

By comparison, when Kennedy was in the race, that third-party share was bigger, even after falling from higher heights in the early summer.


Mark Murray is a senior political editor at NBC News.


Major Sites Are Saying No to Apple’s AI Scraping


Less than three months after Apple quietly debuted a tool for publishers to opt out of its AI training , a number of prominent news outlets and social platforms have taken the company up on it.

WIRED can confirm that Facebook, Instagram, Craigslist, Tumblr, The New York Times, The Financial Times, The Atlantic, Vox Media, the USA Today network, and WIRED’s parent company, Condé Nast, are among the many organizations opting to exclude their data from Apple’s AI training. The cold reception reflects a significant shift in both the perception and use of the robotic crawlers that have trawled the web for decades. Now that these bots play a key role in collecting AI training data, they’ve become a conflict zone over intellectual property and the future of the web.

This new tool, Applebot-Extended, is an extension to Apple’s web-crawling bot that specifically lets website owners tell Apple not to use their data for AI training. (Apple calls this “controlling data usage” in a blog post explaining how it works.) The original Applebot, announced in 2015, initially crawled the internet to power Apple’s search products like Siri and Spotlight. Recently, though, Applebot’s purpose has expanded: The data it collects can also be used to train the foundational models Apple created for its AI efforts.

Applebot-Extended is a way to respect publishers' rights, says Apple spokesperson Nadine Haija. It doesn’t actually stop the original Applebot from crawling the website—which would then impact how that website’s content appeared in Apple search products—but instead prevents that data from being used to train Apple's large language models and other generative AI projects. It is, in essence, a bot to customize how another bot works.

Publishers can block Applebot-Extended by updating a text file on their websites known as the Robots Exclusion Protocol, or robots.txt. This file has governed how bots go about scraping the web for decades—and like the bots themselves, it is now at the center of a larger fight over how AI gets trained. Many publishers have already updated their robots.txt files to block AI bots from OpenAI, Anthropic, and other major AI players.
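
For illustration, an opt-out entry in robots.txt would look something like the snippet below, which follows the standard Robots Exclusion Protocol syntax; publishers should confirm the exact user-agent token against Apple's documentation.

```
# Example robots.txt entry: opt out of Apple's AI training while leaving the
# original Applebot (used for Apple's search features) untouched.
User-agent: Applebot-Extended
Disallow: /
```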

Robots.txt allows website owners to block or permit bots on a case-by-case basis. While there’s no legal obligation for bots to adhere to what the text file says, compliance is a long-standing norm. (A norm that is sometimes ignored: Earlier this year, a WIRED investigation revealed that the AI startup Perplexity was ignoring robots.txt and surreptitiously scraping websites.)

Applebot-Extended is so new that relatively few websites block it yet. Ontario, Canada–based AI-detection startup Originality AI analyzed a sampling of 1,000 high-traffic websites last week and found that approximately 7 percent—predominantly news and media outlets—were blocking Applebot-Extended. This week, the AI agent watchdog service Dark Visitors ran its own analysis of another sampling of 1,000 high-traffic websites, finding that approximately 6 percent had the bot blocked. Taken together, these efforts suggest that the vast majority of website owners either don’t object to Apple’s AI training practices or are simply unaware of the option to block Applebot-Extended.


In a separate analysis conducted this week, data journalist Ben Welsh found that just over a quarter of the news websites he surveyed (294 of 1,167 primarily English-language, US-based publications) are blocking Applebot-Extended. In comparison, Welsh found that 53 percent of the news websites in his sample block OpenAI’s bot. Google introduced its own AI-specific bot, Google-Extended, last September; it’s blocked by nearly 43 percent of those sites, a sign that Applebot-Extended may still be under the radar. As Welsh tells WIRED, though, the number has been “gradually moving” upward since he started looking.
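
A check like the ones behind these analyses can be sketched with Python's standard-library robots.txt parser. The domains below are placeholders, and a real crawl would need error handling and rate limiting.

```python
from urllib import robotparser

# Rough sketch of the kind of robots.txt check behind the analyses above.
# Domains are placeholders; a real crawl needs error handling and rate limiting.
def blocks_bot(domain: str, user_agent: str = "Applebot-Extended") -> bool:
    rp = robotparser.RobotFileParser()
    rp.set_url(f"https://{domain}/robots.txt")
    rp.read()  # fetches and parses the live robots.txt
    # If the bot may not fetch the homepage, treat the site as blocking it.
    return not rp.can_fetch(user_agent, f"https://{domain}/")

for domain in ["example.com", "example.org"]:
    print(domain, blocks_bot(domain))
```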

Welsh has an ongoing project monitoring how news outlets approach major AI agents. “A bit of a divide has emerged among news publishers about whether or not they want to block these bots,” he says. “I don't have the answer to why every news organization made its decision. Obviously, we can read about many of them making licensing deals, where they're being paid in exchange for letting the bots in—maybe that's a factor.”

Last year, The New York Times reported that Apple was attempting to strike AI deals with publishers. Since then, competitors like OpenAI and Perplexity have announced partnerships with a variety of news outlets, social platforms, and other popular websites. “A lot of the largest publishers in the world are clearly taking a strategic approach,” says Originality AI founder Jon Gillham. “I think in some cases, there's a business strategy involved—like, withholding the data until a partnership agreement is in place.”

There is some evidence supporting Gillham’s theory. For example, Condé Nast websites used to block OpenAI’s web crawlers. After the company announced a partnership with OpenAI last week, it unblocked the company’s bots. (Condé Nast declined to comment on the record for this story.) Meanwhile, Buzzfeed spokesperson Juliana Clifton told WIRED that the company, which currently blocks Applebot-Extended, puts every AI web-crawling bot it can identify on its block list unless its owner has entered into a partnership—typically paid—with the company, which also owns the Huffington Post.

Because robots.txt needs to be edited manually, and there are so many new AI agents debuting, it can be difficult to keep an up-to-date block list. “People just don’t know what to block,” says Dark Visitors founder Gavin King. Dark Visitors offers a freemium service that automatically updates a client site’s robots.txt, and King says publishers make up a big portion of his clients because of copyright concerns.

Robots.txt might seem like the arcane territory of webmasters—but given its outsize importance to digital publishers in the AI age, it is now the domain of media executives. WIRED has learned that two CEOs from major media companies directly decide which bots to block.

Some outlets have explicitly noted that they block AI scraping tools because they do not currently have partnerships with their owners. “We’re blocking Applebot-Extended across all of Vox Media’s properties, as we have done with many other AI scraping tools when we don’t have a commercial agreement with the other party,” says Lauren Starke, Vox Media’s senior vice president of communications. “We believe in protecting the value of our published work.”

Others will only describe their reasoning in vague—but blunt!—terms. “The team determined, at this point in time, there was no value in allowing Applebot-Extended access to our content,” says Gannett chief communications officer Lark-Marie Antón.

Meanwhile, The New York Times, which is suing OpenAI over copyright infringement, is critical of the opt-out nature of Applebot-Extended and its ilk. “As the law and The Times' own terms of service make clear, scraping or using our content for commercial purposes is prohibited without our prior written permission,” says NYT director of external communications Charlie Stadtlander, noting that the Times will keep adding unauthorized bots to its block list as it finds them. “Importantly, copyright law still applies whether or not technical blocking measures are in place. Theft of copyrighted material is not something content owners need to opt out of.”

It’s unclear whether Apple is any closer to closing deals with publishers. If or when it does, though, the consequences of any data licensing or sharing arrangements may be visible in robots.txt files even before they are publicly announced.

“I find it fascinating that one of the most consequential technologies of our era is being developed, and the battle for its training data is playing out on this really obscure text file, in public for us all to see,” says Gillham.


NEWS EXPLAINER, 28 August 2024

Mpox is spreading rapidly. Here are the questions researchers are racing to answer

  • Sara Reardon


Coloured transmission electron micrograph of mpox (previously monkeypox) virus particles (orange) within an infected cell (yellow).

Monkeypox virus particles (shown in this coloured electron micrograph) can spread through close contact with people and animals. Credit: NIAID/Science Photo Library

When the World Health Organization (WHO) declared a public-health emergency over mpox earlier this month , it was because a concerning form of the virus that causes the disease had spread to multiple African countries where it had never been seen before. Since then, two people travelling to Africa — one from Sweden and one from Thailand — have become infected with that type of virus, called clade Ib, and brought it back to their countries.


Although researchers have known about the current outbreak since late last year, the need for answers about it is now more pressing than ever. The Democratic Republic of the Congo (DRC) has spent decades grappling with monkeypox clade I virus — the lineage to which Ib belongs. But in the past, clade I infections usually arose when a person came into contact with wild animals, and outbreaks would fizzle out.

Clade Ib seems to be different, and is spreading largely through contact between humans, including through sex . Around 18,000 suspected cases of mpox, many of them among children, and at least 600 deaths potentially attributable to the disease have been reported this year in the DRC alone.

How does this emergency compare with one declared in 2022, when mpox cases spread around the globe? How is this virus behaving compared with the version that triggered that outbreak, a type called clade II? And will Africa be able to rein this one in? Nature talks to researchers about information they are rushing to gather.

Is clade Ib more deadly than the other virus types?

It’s hard to determine, says Jason Kindrachuk, a virologist at the University of Manitoba in Winnipeg, Canada. He says that the DRC is experiencing two outbreaks simultaneously. The clade I virus, which has been endemic in forested regions of the DRC for decades, circulates in rural regions, where people get it from animals. That clade was renamed Ia after the discovery of clade Ib. Studies in animals suggest that clade I is deadlier than clade II 1 — but Kindrachuk says that it’s hard to speculate on what that means for humans at this point.

Even when not fatal, mpox can trigger fevers, aches and painful fluid-filled skin lesions.


Although many reports state that 10% of clade I infections in humans are fatal, infectious-disease researcher Laurens Liesenborghs at the Institute of Tropical Medicine in Antwerp, Belgium, doubts that this figure is accurate. Even the WHO’s latest estimate of a 3.5% fatality rate for people with mpox in the DRC might be high.

There are many reasons that fatality estimates might be unreliable, Liesenborghs says. For one, surveillance data capture only the most severe cases; many people who are less ill might not seek care at hospitals or through physicians, so their infections go unreported.

Another factor that can confound fatality rates is a secondary health condition. For example, people living with HIV — who can represent a large proportion of the population in many African countries — die from mpox at twice the rate of the general population 2 , especially if their HIV is untreated. And the relatively high death rate among children under age 5 could be partly because of malnutrition, which is common among kids in rural parts of the DRC, Liesenborghs says.
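
A quick back-of-the-envelope calculation shows why under-reporting of mild cases matters. Using the roughly 600 deaths and 18,000 suspected cases mentioned above, and an assumed (made-up) 50% ascertainment rate purely for illustration:

```python
# Naive case-fatality rate from the figures reported above, and the same
# calculation if only some infections are ever reported. The 50% ascertainment
# figure is a made-up assumption for illustration.
deaths = 600
reported_cases = 18_000

print(f"naive CFR: {deaths / reported_cases:.1%}")           # about 3.3%

ascertainment = 0.5                                          # assume half of infections go unreported
estimated_infections = reported_cases / ascertainment
print(f"adjusted CFR: {deaths / estimated_infections:.1%}")  # about 1.7%
```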

Is clade Ib more transmissible than other types?

The clade Ib virus has garnered particular attention because epidemiological data suggest that it transmits more readily between people than previous strains did, including through sexual activity, whereas clade Ia mostly comes from animals. An analysis posted ahead of peer review on the preprint server medRxiv 3 shows that clade Ib’s genome contains genetic mutations that seem to have been induced by the human immune system, suggesting that it has been in humans for some time. Clade Ia genomes have fewer of these mutations.

But Liesenborghs says that the mutations and clades might not be the most important factor in understanding how monkeypox virus spreads. Although distinguishing Ia from Ib is useful in tracking the disease, he says, the severity and transmissibility of the disease could be affected more by the region where the virus is circulating and the people there. Clade Ia, for instance, seems to be more common in sparsely populated rural regions where it is less likely to spread far. Clade Ib is cropping up in densely populated areas and spreading more readily.

Jean Nachega, an infectious-disease physician at the University of Pittsburgh in Pennsylvania, says that scientists don’t understand many aspects of mpox transmission — they haven’t even determined which animal serves as a reservoir for the virus in the wild, although rodents are able to carry it. “We have to be very humble,” Nachega says.

How effective are vaccines against the clade I virus?

Just as was the case during the COVID-19 pandemic, health experts are looking to vaccines to help curb this mpox outbreak. Although there are no vaccines designed specifically against the monkeypox virus, there are two vaccines proven to ward off a related poxvirus — the one that causes smallpox. Jynneos, made by biotechnology company Bavarian Nordic in Hellerup, Denmark, contains a type of poxvirus that can’t replicate but can trigger an immune response. LC16m8, made by pharmaceutical company KM Biologics in Kumamoto, Japan, contains a live — but weakened — version of a different poxvirus strain.


Still, it’s unclear how effective these smallpox vaccines are against mpox generally. Dimie Ogoina, an infectious-disease specialist at Niger Delta University in Wilberforce Island, Nigeria, points out that vaccines have been tested only against clade II virus in European and US populations, because these shots were distributed by wealthy nations during the 2022 global outbreak . And those recipients were primarily young, healthy men who have sex with men, a population that was particularly susceptible during that outbreak. One study in the United States found that one dose of Jynneos was 80% effective at preventing the disease in at-risk people, whereas two doses were 82% effective 4 ; the WHO recommends getting both jabs.

People in Africa infected with either the clade Ia or Ib virus — especially children and those with compromised immune systems — might respond differently. However, one study in the DRC found that the Jynneos vaccine generally raised antibodies against mpox in about 1,000 health-care workers who received it 5 .

But researchers are trying to fill in some data gaps. A team in the DRC is about to launch a clinical trial of Jynneos in people who have come into close contact with the monkeypox virus — but have not shown symptoms — to see whether it can prevent future infection, or improve outcomes if an infection arises.

Will the vaccines help to rein in the latest outbreak?

Mpox vaccines have been largely unavailable in Africa, but several wealthy countries have pledged to donate doses to the DRC and other affected African nations. The United States has offered 50,000 Jynneos doses from its national stockpile, and the European Union has ordered 175,000, with individual member countries pledging extra doses. Bavarian Nordic has also added another 40,000. Japan has offered 3.5 million doses of LC16m8 — for which only one jab is recommended instead of two.


None of them have arrived yet, though, says Espoir Bwenge Malembaka, an epidemiologist at the Catholic University of Bukavu in the DRC. Low- and middle-income nations cannot receive vaccines until the WHO has deemed the jabs safe and effective. And the WHO has not given its thumbs up yet. It is evaluating data from vaccine manufacturers, delaying donors’ ability to send the vaccines.

Even when the vaccines arrive, Bwenge Malembaka says, “it’s really a drop in the bucket”. The Africa Centres for Disease Control and Prevention in Addis Ababa, Ethiopia, estimates that 10 million doses are needed to rein in the outbreak.

Bwenge Malembaka says that the uncertainty over vaccine arrival has made it difficult for the government to form a distribution plan. “I don’t know how one can go about this kind of challenge,” he says. Bwenge Malembaka suspects that children are likely to receive doses first, because they are highly vulnerable to clade I, but officials haven’t decided which regions to target. It’s also unclear how the government would prioritize other vulnerable populations such as sex workers, who have been affected by clade Ib. Their profession is criminalized in the DRC, so they might not be able to come forward for treatment.

Researchers lament that public-health organizations didn’t provide vaccines and other resources as soon as the clade I outbreak was identified, especially given lessons learnt from the 2022 global mpox outbreak. “The opportunity was there a couple months ago to cut this transmission chain, but resources weren’t available,” Liesenborghs says. “Now, it will be more challenging to tackle this outbreak, and the population at risk is much broader.”

Nature 633 , 16-17 (2024)

doi: https://doi.org/10.1038/d41586-024-02793-9

1. Americo, J. L., Earl, P. L. & Moss, B. Proc. Natl Acad. Sci. USA 120, e2220415120 (2023).

2. Yinka-Ogunleye, A. et al. BMJ Glob. Health 8, e013126 (2023).

3. Kinganda-Lusamaki, E. et al. Preprint at medRxiv https://doi.org/10.1101/2024.08.13.24311951 (2024).

4. Yeganeh, N. et al. Vaccine 42, 125987 (2024).

5. Priyamvada, L. et al. Vaccine 40, 7321–7327 (2022).



COMMENTS

  1. Questionnaire Design

    Revised on June 22, 2023. A questionnaire is a list of questions or items used to gather data from respondents about their attitudes, experiences, or opinions. Questionnaires can be used to collect quantitative and/or qualitative information. Questionnaires are commonly used in market research as well as in the social and health sciences.

  1. Writing Good Survey Questions: 10 Best Practices

    4. Focus on Closed-Ended Questions. Surveys are, at their core, a quantitative research method. They rely upon closed-ended questions (e.g., multiple-choice or rating-scale questions) to generate quantitative data. Surveys can also leverage open-ended questions (e.g., short-answer or long-answer questions) to generate qualitative data.

  2. How short or long should a questionnaire be for any research?

    Response rate is defined as the number of people who responded to a question divided by the number of total potential respondents. Response rate, which is a crucial factor in determining the quality and generalizability of the outcome of the survey, depends indirectly on the length and number of questions in a questionnaire. [7,8] Several studies have been conducted to assess the ...

  3. How Many Survey Questions Should I Use?

    This means you should aim for 10 survey questions (or fewer, if you are using multiple text and essay box question types). When you start moving into long surveys with lots of questions and completion times of over 10 minutes, you may want to consider offering respondents an incentive to compensate them for their time.

  4. Questionnaire

    Definition: A questionnaire is a research tool or survey instrument that consists of a set of questions or prompts designed to gather information from individuals or groups of people. It is a standardized way of collecting data from a large number of people by asking them a series of questions related to a specific topic or research objective.

  5. Question and Questionnaire Design (PDF)

    ... the topic of the survey, as it was described to the respondent prior to the interview. 3. Questions on the same topic should be grouped together. 4. Questions on the same topic should proceed from general to specific. 5. Questions on sensitive topics that might make respondents uncomfortable should be placed at the end of the questionnaire. ...

  6. Designing a Questionnaire for a Research Paper: A Comprehensive Guide

    A questionnaire is an important instrument in a research study to help the researcher collect relevant data regarding the research topic. It is important to ensure that the design of the ...

  7. How many questions should be asked in a survey?

    Pulse surveys typically consist of 5-15 questions and are dispatched at a higher frequency than annual surveys. Pulse surveys are also an effective way to collect critical data more often. Shorter surveys, such as Employee Satisfaction Index or Net Promoter Score surveys, are best for providing straightforward, real-time feedback.

  8. How Many Questions Should Be Asked in a Survey?

    Usually, this is a shorter version of the annual surveys with just 2-10 questions. These 2-10 questions focus on the most crucial issues concerning an organization. 4. Intercept Surveys. Intercept surveys are in-person surveys conducted at points of contact like malls, restaurants, public places like parks, and more.

  9. Designing and validating a research questionnaire

    However, the quality and accuracy of data collected using a questionnaire depend on how it is designed, used, and validated. In this two-part series, we discuss how to design (part 1) and how to use and validate (part 2) a research questionnaire. It is important to emphasize that questionnaires seek to gather information from other people and ...

  10. How to pretest and pilot a survey questionnaire

    To do a pilot, you need to test all the survey steps from start to finish with a reasonably large sample. The size of the pilot sample depends on how big your actual sample is, and how many data collectors you have. For a typical baseline or endline survey, a sample of around 30-50 people is usually enough to identify any major bugs in the system.

  11. How to Develop a Questionnaire for Research: 15 Steps

    Come up with a research question. It can be one question or several, but this should be the focal point of your questionnaire. Develop one or several hypotheses that you want to test. The questions that you include on your questionnaire should be aimed at systematically testing these hypotheses. ...

  12. Top tips for research questionnaire design

    Master question types, prioritise user experience, and boost data quality. A well-thought-out questionnaire is the backbone of any great piece of research. However, this is often overlooked, as people tend to focus on the output rather than the ...

  13. How long should a survey be? What is the ideal survey length?

    For surveys longer than 30 questions, the average amount of time respondents spend on each question is nearly half of that on surveys with fewer than 30 questions. In addition to the decreased time spent answering each question as surveys grew in length, we saw survey abandonment rates increase for surveys that took more than 7-8 ...

  14. How to prepare a questionnaire for qualitative research

    STEP 3: define your questions. For each theme, you should have a set of questions at your disposal to cover the different possible aspects. Keep this in mind: it is impossible to address one topic (especially when it's complex) with one question alone. Usually, you'll need three or more questions.

  15. Survey Questions

    The diversity in survey questions is what makes them a potent tool in your research arsenal. Let's uncover the various types that you can leverage, each with its unique flavor and purpose: dichotomous questions (yes or no) are simple and straightforward, requiring just a "yes" or ...

  16. How to Estimate the Length of a Survey

    So instead, stay with the point system outlined above. Divide by 8, then multiply by 1.5 (if you can ask 5⅓ questions in one minute, you can ask 8 questions in one and a half minutes). Presto, you get the number of minutes it will take most survey respondents to complete your survey by phone. (A worked sketch of this arithmetic appears after this list.)

  17. How many questions are needed for qualitative research? Is there any ...

    When I conducted my qualitative research, I opted to interview 2 people with around 15 questions in my questionnaire. However, you can choose to do more if you wish to get a richer quality of ...

  18. Questionnaires and Surveys (PDF)

    First, in terms of data collection, the data for this study were obtained using a questionnaire method, that is, participants' self-report, which is the most common and popular method for quantitative research ...

  19. How Many Questions in a 10-Minute Survey?

    That being said, and knowing that all the different question types tend to average themselves out in most of the surveys we write, here is what we generally proffer as the number of questions you can ask in a survey: 5-minute survey: 10 to 15 questions. 10-minute survey: 20 to 30 questions. 15-minute survey: 30 to 45 questions.

  20. How many questions is too many in a questionnaire, and how long in time is ...

    The time respondents need to fill in your questionnaire depends on the number of constructs/variables and the total number of questions to be answered, the number of points on the rating scales, the clarity of ...

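To make the rules of thumb quoted above concrete, here is a minimal sketch in Python of the arithmetic they describe: the "divide by 8, then multiply by 1.5" estimate for phone surveys, a rough question budget based on the roughly 2-3 closed-ended questions per minute implied by the 10-minute-survey ranges, and the response-rate ratio. The function names, the default of 2.5 questions per minute, and the example numbers are illustrative assumptions, not formulas or tooling from any of the cited sources.

```python
# Rough survey-length and response-rate arithmetic based on the rules of thumb
# quoted in the excerpts above. Constants and function names are illustrative
# assumptions, not a standard or validated formula.

def estimated_phone_minutes(question_points: float) -> float:
    """Estimate phone completion time: divide question points by 8,
    then multiply by 1.5 (the phone-survey rule of thumb quoted above)."""
    return question_points / 8 * 1.5

def question_budget(target_minutes: float, questions_per_minute: float = 2.5) -> int:
    """Rough question budget for a target length, assuming roughly 2-3 simple
    closed-ended questions per minute (consistent with the quoted ranges)."""
    return round(target_minutes * questions_per_minute)

def response_rate(completed: int, invited: int) -> float:
    """Response rate = number of respondents / total potential respondents."""
    return completed / invited

if __name__ == "__main__":
    print(estimated_phone_minutes(40))       # 40 question points -> 7.5 minutes by phone
    print(question_budget(10))               # ~25 questions for a 10-minute survey
    print(f"{response_rate(312, 1000):.1%}") # 312 completions of 1,000 invitations -> 31.2%
```

A quick sanity check against the quoted figures: a 10-minute budget of about 25 questions sits inside the "20 to 30 questions" range above, so the assumed 2.5 questions per minute is only a midpoint, and you should adjust it for your own question types and audience.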