Avoid These Common Survey Mistakes

6 Common Survey Mistakes

Surveys are an excellent source of information that helps organizations understand the communities they serve. Unfortunately, not all surveys generate quality data, often because of common survey mistakes and ineffective survey designs.

Here are 6 survey mistakes that organizations commonly make:

Mistake #1: Using All Qualitative or All Quantitative Questions

Using only qualitative or only quantitative survey questions prevents organizations from gathering comprehensive data. For example, when only qualitative questions are included, respondents may experience survey fatigue, leading to vague or missing data. Qualitative questions also take longer to synthesize, interpret, and report. Yet asking only quantitative questions can lull respondents into answering on autopilot, giving little thought or consideration to each question. Further, asking only quantitative questions limits the richness of the data collected; key information may be missing because it was never asked. Using only one type of survey question prevents capturing the most comprehensive information.

Mistake #2: Double-Barreled Questions

A double-barreled question asks about more than one topic. This can be problematic with either qualitative or quantitative questions. Questions like, “How has your happiness and financial stability changed because of this program?” ask respondents to provide answers about two different concepts. Respondents may focus only on the component most important or noteworthy to them, ignoring the other part of the question. Double-barreled questions lead to missing or incomplete data. They can also confuse respondents, leading to inaccurate information.
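As a rough illustration, the sketch below (in Python, with hypothetical question IDs and a simplistic keyword check that is not drawn from this post) shows the example above split into two single-topic items, so each response maps to exactly one concept.

```python
# Hypothetical question records; IDs and structure are illustrative only.
double_barreled = {
    "id": "q1",
    "text": "How has your happiness and financial stability changed because of this program?",
}

# The same content split into one question per concept.
split_questions = [
    {"id": "q1a", "text": "How has your happiness changed because of this program?"},
    {"id": "q1b", "text": "How has your financial stability changed because of this program?"},
]

def looks_double_barreled(question: dict) -> bool:
    """Rough heuristic: flag wording that joins two ideas with 'and' or 'or'.

    This will over-flag (e.g., 'friends and family' is one concept), so treat
    matches as prompts for human review, not as errors.
    """
    text = question["text"].lower()
    return " and " in text or " or " in text

for q in [double_barreled] + split_questions:
    print(q["id"], "needs review" if looks_double_barreled(q) else "ok")
```

A keyword check like this is only a first pass; a human reviewer still decides whether a flagged question truly mixes two concepts.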

Mistake #3: Asking Biased Survey Questions

Biased questions influence respondents to answer in a way that favors a specific outcome. Leading questions and loaded questions are two common types of biased questions. Leading questions nudge respondents in a particular direction, swaying their response toward a desired outcome. A question like, “How wonderful was our program for you?” encourages respondents to answer more favorably. Loaded questions make an assumption about the respondent that may or may not be true. For example, the question, “Where do you enjoy playing basketball?” assumes that the respondent plays basketball and enjoys doing so. While this question may be appropriate for a group of basketball players, it may be difficult to answer for someone who does not play basketball. Both question types lead to inaccurate or biased information and should be avoided.

Mistake #4: Writing Complicated Survey Questions

Writing complicated questions is a common survey mistake, especially with youth or other harder-to-survey populations. Complicated questions ask about an idea in a confusing way. For example, a question such as, “What is the best meal to eat, based on what time of day it is?” does not give the respondent a clear path to an answer. The question is indirect and hard to answer. Ambiguous questions can also confuse respondents. These questions are broad, allowing respondents to interpret words or phrases in different ways. For example, a question that asks, “How well is the government helping you?” does not indicate which government is meant or what counts as help.

Non-specific questions give respondents no clear path to an answer, making it difficult to respond. Questions that use inaccessible, academic, or unfamiliar language can also prevent respondents from providing meaningful answers. Without clear, simple language, respondents struggle to understand and accurately answer questions.

Mistake #5: Providing Confusing Item Responses

Effective surveys use the item response options that most accurately measure what you intend to learn. When item responses are confusing, poorly organized or presented, illogical, or out of alignment with the survey questions, the result can be bad data. For instance, some questions ask respondents to rate items on a Likert scale. Being intentional about how much each point on the scale represents is vital. Options like “A Little Bit” and “Slightly” can be difficult for respondents to differentiate and harder to interpret after analysis. Confusing item responses restrict respondents’ ability to clearly distinguish between answer choices, making it harder for them to accurately report their opinions and perspectives.
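As one way to picture this, here is a minimal sketch (in Python, with hypothetical labels and names not taken from the post) of a balanced five-point Likert scale whose anchors are easy to tell apart, mapped to numeric codes for analysis.

```python
# A balanced 5-point agreement scale with clearly differentiated anchors;
# ambiguous near-duplicates like "A Little Bit" vs. "Slightly" are avoided.
AGREEMENT_SCALE = {
    1: "Strongly disagree",
    2: "Disagree",
    3: "Neither agree nor disagree",
    4: "Agree",
    5: "Strongly agree",
}

def label_for(value: int) -> str:
    """Return the label for a response code, or raise if it is off-scale."""
    if value not in AGREEMENT_SCALE:
        raise ValueError(f"Response {value} is not on the 1-5 scale")
    return AGREEMENT_SCALE[value]

print(label_for(4))  # -> "Agree"
```

Defining the scale once and reusing it across items also keeps the response options consistent throughout the survey.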

Mistake #6: Creating a Disorganized Survey Design Structure

Effective surveys are easy to follow and complete. When questions are disorganized, out of order, or vacillate between topics, it is difficult for respondents to answer accurately and completely. Missing instructions, poor labels, and inconsistencies in the survey’s look and feel can distract respondents and lead to bad data. For example, asking respondents about the outcomes of a program before reminding them of the different program components can alter their answers. Question order matters and can introduce response bias if ignored. A disorganized survey structure may leave respondents feeling lost and confused, unable to focus on the content of the questions.
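To make the ordering point concrete, here is a minimal sketch (in Python, with hypothetical section names, instructions, and questions) that groups items into labeled sections, placing the reminder about program components before the outcome questions.

```python
# Hypothetical survey outline: each section carries its own instructions,
# and participation questions come before outcome questions.
survey = [
    {
        "section": "Program participation",
        "instructions": "Think about the workshops and mentoring sessions you attended this year.",
        "questions": ["Which program components did you take part in?"],
    },
    {
        "section": "Program outcomes",
        "instructions": "Answer based on the components you listed above.",
        "questions": ["How has your confidence changed since joining the program?"],
    },
]

for section in survey:
    print(section["section"].upper())
    print(section["instructions"])
    for question in section["questions"]:
        print(" -", question)
```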

In sum, this blog has presented 6 common survey mistakes. Read Part 2 for solutions to these mistakes. If you have specific follow-up questions about this blog post, or any other research and evaluation needs, reach out to REC!

Sources

Better Together. (n.d.). Engaging with Hard-to-Reach Groups and Individuals. Better Together – Government of South Australia. https://www.bettertogether.sa.gov.au/planning-tools/prepare/engaging-with-hard-to-reach-groups-and-individuals

DiLeonardo, A., Lauricella, T., & Schaninger, B. (2021). Survey Fatigue? Blame the Leader, Not the Question. McKinsey & Company. https://www.mckinsey.com/capabilities/people-and-organizational-performance/our-insights/the-organization-blog/survey-fatigue-blame-the-leader-not-the-question

Ed Tech Books. (n.d.). Response Options. https://edtechbooks.org/designing_surveys/response_scale_devel

Editorial Team. (n.d.). Leading Questions: Definition, Examples, and Why You Should Avoid Them. Delighted by Qualtrics. https://delighted.com/blog/leading-questions

Effectiviology. (n.d.). Loaded Questions: What They Are and How to Respond to Them. https://effectiviology.com/loaded-question/

Great Brook. (n.d.). Ambiguous Questions: The Biggest Threat to Survey Findings. https://greatbrook.com/ambiguous-questions-biggest-mistake-survey-design/

Nolinske, T. (n.d.). Questionnaire Context, Order, and Meaning. National Business Research Institute. https://www.nbrii.com/customer-survey-white-papers/questionnaire-context-order-and-meaning/

Opinion Stage. (n.d.). Biased Survey Questions: What Is a Double-Barreled Question & How to Avoid It. Opinion Stage Ltd. https://www.opinionstage.com/survey/double-barreled-question/

Shtivelband, A. (2018). 6 Tips to Collect Quality Data. Research Evaluation Consulting LLC. https://researchevaluationconsulting.com/6-tips-to-collect-quality-data/

Shtivelband, A. (2021). 5 Tips for Creating a Survey. Research Evaluation Consulting LLC. https://researchevaluationconsulting.com/tips-survey-creation/

SurveyMonkey. (n.d.). What Is a Likert Scale? https://www.surveymonkey.com/mp/likert-scale/

