The Risks of Poor Evaluation Research
Imagine this: Your organization launches a new program based on what seems like a great idea. You invest time, money, and other resources, but after a year, the results are underwhelming. You’re left wondering: What went wrong? This scenario happens all too often when research isn’t used effectively to guide decisions.
Leaders often recognize the value of research, but many organizations struggle to collect, interpret, and apply data in meaningful ways. Poor research, or a lack of research, can lead to flawed decisions, wasted resources, and missed opportunities for real impact.
Here are six common research challenges in organizations that can hinder progress, and how to recognize them before they derail your efforts.
Challenge 1. Unclear Research Questions
Many organizations dive into research without a clearly defined guiding question. Instead of focusing on a specific issue, they collect broad or unfocused data, leading to insights that are difficult to interpret or act upon.
Example: A nonprofit wants to understand why community participation in its programs is low. They launch a survey asking participants, “How do you feel about our services?” The responses are vague and don’t pinpoint the barriers to engagement.
Why It Matters: A well-defined research question sets the foundation for meaningful data collection and analysis that suggest relevant and effective actions. Without it, research and evaluation efforts can become scattered and unproductive.
Challenge 2. Poor Data Quality
Research is only as good as the data it’s built on. Poorly designed surveys, inconsistent data collection methods, or biased sampling can lead to misleading conclusions.
Example: A company conducts employee satisfaction research but only surveys staff who have remained with the company for over five years, missing critical feedback from newer employees and from those who have left.
Why It Matters: Low-quality data can lead organizations to make decisions based on false assumptions, ultimately undermining progress.
Challenge 3. Unrepresentative Samples
If research only captures a narrow segment of the target audience, the results will not be generalizable. Too often, organizations collect feedback only from their most engaged or easiest-to-reach stakeholders, causing them to miss key perspectives.
Example: A statewide healthcare organization wants to improve services for all of its patients but collects feedback primarily from urban residents.
Why It Matters: If your sample isn’t representative, your conclusions won’t reflect the experiences of the broader population, which can lead to flawed strategies.
Challenge 4. Lack of Time and Resources
Research takes time, expertise, and funding. Many organizations, especially nonprofits and small businesses, struggle to allocate the necessary resources. This often results in rushed or incomplete research.
Example: A school district wants to study student engagement but only has funding for a small, one-time survey. The results provide a limited snapshot rather than a comprehensive understanding of long-term trends.
Why It Matters: Inadequate research can lead to partial or misleading conclusions, which can be worse than having no research at all.
Challenge 5. Misinterpreting Data
Even when research is well-executed, organizations often misinterpret the results. For instance, leaders may see correlations and mistake them for causation, leading to incorrect conclusions.
Example: A company finds that employees who take part in wellness programs report higher productivity. They assume that wellness programs cause productivity gains without considering other factors like job role or work culture.
Why It Matters: Misreading research findings can lead organizations to invest in ineffective strategies while overlooking the real drivers of success.
Challenge 6. Failing to Apply Research Findings
Research is most valuable when it drives action, yet many organizations struggle to turn insights into meaningful change.
Example: A nonprofit conducts a thorough study on community needs but leaves the findings in a report rather than integrating them into program planning.
Why It Matters: Without a plan for applying research, even the best studies will sit on a shelf, unused and forgotten.
Conclusion
Research challenges are common across organizations, but quality data is a powerful decision-making tool when it is collected and applied well. By identifying and addressing these common research challenges, leaders can ensure their efforts lead to smarter strategies, stronger programs, and greater impact.
Need More Help?
Be sure to contact REC. We support organizations like yours in conducting quality projects that avoid research challenges.
Related Posts
Collecting Quality Data – 4 Questions
6 Tips to Collect Quality Data
Sources
Creswell, J. W., & Creswell, J. D. (2018). Chapter 7: Research questions and hypotheses. In Research design: Qualitative, quantitative, and mixed methods approaches (5th ed.). SAGE Publications.
Education Development Center (EDC) Solutions. (2018, March). Evaluation tool – Sample representativeness and nonresponse bias: Frequently asked questions. https://solutions.edc.org/sites/default/files/Sample_Representativeness_Nonresponse_Bias_FAQs_0_0.pdf
Fowler, F. J. (2014). Survey research methods (5th ed.). SAGE Publications.
Gugerty, M. K., & Karlan, D. (2018). Ten reasons not to measure impact – and what to do instead. Stanford Social Innovation Review, 16(3), 41-47. https://doi.org/10.48558/2A2K-0K07
Luca, M., & Edmondson, A. C. (2024). Where data-driven decision-making can go wrong: Five pitfalls to avoid. Harvard Business Review, September-October issue. https://hbr.org/2024/09/where-data-driven-decision-making-can-go-wrong
Shtivelband, A. (2024, July 17). Survey bias exposed: How to spot and prevent them. Research Evaluation Consulting. https://researchevaluationconsulting.com/survey-bias-exposed-how-to-spot-and-prevent-them/
Shtivelband, A. (2024, February 23). 4 tips to get to know your target audience. Research Evaluation Consulting. https://researchevaluationconsulting.com/4-tips-reach-target-audience/
U.S. Centers for Disease Control and Prevention (CDC). (2024, August 18). CDC Program Evaluation Action Guide: Step 6 – Act on findings. https://www.cdc.gov/evaluation/php/evaluation-framework-action-guide/step-6-act-on-findings.html
Wingate, L., & Schroeter, D. (2007). Evaluation questions checklist for program evaluation. The Evaluation Center at Western Michigan University. http://wmich.edu/evaluation/checklists