At the design stage you will define your evaluation questions, identify an appropriate methodology, and plan and budget for the evaluation activities. Use the following guidelines to help design and plan your impact evaluation.
Impact evaluations assess the changes in development outcomes that are caused by a particular project, program, or policy. To establish a causal relationship, impact evaluations rely on a set of experimental and quasi-experimental methods. The following links offer an overview of the main methodologies:
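To make the experimental logic concrete, the following sketch (not part of the toolkit; written in Python with hypothetical data) estimates an average treatment effect as the difference in mean outcomes between randomly assigned treatment and control groups, with a large-sample standard error:

```python
import random
import statistics

def difference_in_means(treated, control):
    """Estimate the average treatment effect (ATE) as the difference in mean
    outcomes between randomized treatment and control groups, together with
    a large-sample standard error for that difference."""
    ate = statistics.mean(treated) - statistics.mean(control)
    se = (statistics.variance(treated) / len(treated)
          + statistics.variance(control) / len(control)) ** 0.5
    return ate, se

# Hypothetical program that raises outcomes by about 5 units on average.
random.seed(0)
control = [random.gauss(50, 10) for _ in range(500)]
treated = [random.gauss(55, 10) for _ in range(500)]

ate, se = difference_in_means(treated, control)
print(f"Estimated impact: {ate:.2f} (SE {se:.2f})")
```

Because assignment is random, the control group's mean outcome is a valid counterfactual for the treatment group; quasi-experimental methods instead reconstruct that counterfactual from non-randomized data.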
To implement your impact evaluation you may require technical assistance from evaluators and data specialists. This section gives you a head start in identifying and contracting service providers.
EXPRESSIONS OF INTEREST
TERMS OF REFERENCE
Terms of Reference (TORs) establish the roles and responsibilities, activities, products and schedules of the parties involved in an impact evaluation. The following examples can be adapted based on the needs of each particular program:
Examples of TORs:
- Comprehensive TORs
- Data collection firm TORs
- Principal investigator TORs
- Technical support and supervision TORs (QA)
ROSTERS OF EVALUATORS
High-quality data are a key input for impact evaluations. The survey materials below can be adapted to your data collection needs following the accompanying instructions. See also our questionnaire designer manual and data entry manual for guidance on preparing your survey.
Once the data required to conduct an impact analysis are available, they are analyzed using statistical software and an appropriate estimation strategy. In this section you can find useful code and guidelines to assist you with your data analysis.
STEPS TO ANALYZE IMPACT EVALUATION DATA
1. Validate survey data and assess potential attrition
- Article on attrition analysis of surveys in developing countries: "Attrition in Longitudinal Household Survey Data: Some Tests for Three Developing-Country Samples"
- An example of attrition analysis in a long-term panel survey: "An Analysis of Sample Attrition in Panel Data: The Michigan Panel Study of Income Dynamics"
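A basic attrition diagnostic, in the spirit of the articles above, is to test whether loss to follow-up differs between arms. The sketch below (hypothetical figures, standard two-proportion z-test) illustrates the check:

```python
import math

def differential_attrition(n_treat, lost_treat, n_ctrl, lost_ctrl):
    """Two-proportion z-test for whether attrition rates differ between the
    treatment and control arms of a follow-up survey. Differential attrition
    threatens internal validity even when overall attrition is modest."""
    p_t = lost_treat / n_treat
    p_c = lost_ctrl / n_ctrl
    p = (lost_treat + lost_ctrl) / (n_treat + n_ctrl)  # pooled attrition rate
    se = math.sqrt(p * (1 - p) * (1 / n_treat + 1 / n_ctrl))
    z = (p_t - p_c) / se
    return p_t, p_c, z

# Hypothetical follow-up: 60 of 800 treated and 88 of 800 controls not re-surveyed.
p_t, p_c, z = differential_attrition(800, 60, 800, 88)
print(f"attrition: treatment {p_t:.1%}, control {p_c:.1%}, z = {z:.2f}")
```

Here |z| exceeds 1.96, so attrition is significantly higher in the control arm, and the analysis should probe whether leavers differ systematically from stayers.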
2. Check the internal validity of the data
- Balancing check
- Placebo analysis, as used in clinical trials
- Checking for the absence of spillover effects: SPD’s guide “Program Evaluation and Spillover Effects”
- Case study of a deworming project: "Worms: Identifying Impacts on Education and Health in the Presence of Treatment Externalities"
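The balance check above can be sketched as a covariate-by-covariate comparison of baseline means across arms. The example below (hypothetical baseline data) computes a Welch t-statistic for one covariate; under successful randomization, |t| should rarely exceed about 1.96:

```python
import random
import statistics

def balance_t_stat(x_treat, x_ctrl):
    """Welch t-statistic comparing the mean of a baseline covariate across
    treatment and control arms. Large values flag a failure of balance."""
    num = statistics.mean(x_treat) - statistics.mean(x_ctrl)
    den = (statistics.variance(x_treat) / len(x_treat)
           + statistics.variance(x_ctrl) / len(x_ctrl)) ** 0.5
    return num / den

random.seed(1)
# Hypothetical baseline household income, drawn from the same distribution
# in both arms, as randomization should deliver.
treat = [random.gauss(100, 20) for _ in range(400)]
ctrl = [random.gauss(100, 20) for _ in range(400)]
t = balance_t_stat(treat, ctrl)
print(f"t = {t:.2f}")
```

In practice this test is run for every baseline covariate, and a handful of marginally significant differences is expected by chance alone.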
3. Impact estimation
- Robustness checks: “A Practitioner’s Guide to Cluster-Robust Inference”
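When treatment is assigned at the group level (villages, schools), standard errors must account for within-cluster correlation, as the guide above discusses. One simple approach to sketch the idea, shown below with hypothetical clustered data, is a cluster bootstrap that resamples whole clusters with replacement:

```python
import random
import statistics

def cluster_bootstrap_se(clusters_treat, clusters_ctrl, reps=500, seed=0):
    """Standard error of the treatment/control difference in means via a
    cluster bootstrap: whole clusters are resampled with replacement, so
    within-cluster correlation is preserved in every bootstrap replicate."""
    rng = random.Random(seed)
    estimates = []
    for _ in range(reps):
        t = [y for c in rng.choices(clusters_treat, k=len(clusters_treat)) for y in c]
        c_ = [y for c in rng.choices(clusters_ctrl, k=len(clusters_ctrl)) for y in c]
        estimates.append(statistics.mean(t) - statistics.mean(c_))
    return statistics.stdev(estimates)

# Hypothetical clustered design: 20 villages per arm, 25 households each,
# with a shared village-level shock inducing intra-cluster correlation.
rng = random.Random(42)
def village(mean):
    shock = rng.gauss(0, 5)  # common shock shared by the whole village
    return [mean + shock + rng.gauss(0, 10) for _ in range(25)]

treat_villages = [village(55) for _ in range(20)]
ctrl_villages = [village(50) for _ in range(20)]
se = cluster_bootstrap_se(treat_villages, ctrl_villages)
print(f"cluster-bootstrap SE: {se:.2f}")
```

Ignoring the clustering here (treating all 1,000 observations as independent) would understate the standard error substantially; analytical cluster-robust estimators are the more common choice in practice.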
4. Results presentation
- Report template: outline of key content for a full impact evaluation report
- Policy brief template: outline of the content included in a short brief intended for policy makers and consumers of evaluation results
EXAMPLES OF IMPACT EVALUATION ANALYSIS
J-PAL offers data and code for many of its impact evaluations.
Once the data are analyzed, the results are reported through presentations, evaluation reports, papers, and policy briefs. In addition, evaluation results are increasingly communicated using multimedia. Examples of these dissemination strategies are presented below.
TOOLS FOR REPORTING EVALUATION RESULTS
- Policy brief template: outline of the content included in a short brief intended for policy makers and consumers of evaluation results
- Report template: outline of key content for a full impact evaluation report
- Examples of infographics for impact evaluations:
- Videos and multimedia:
a. Introductory readings:
- Impact evaluation in practice (Gertler et al.; World Bank, 2010).
- Evaluating the impact of development projects on poverty: a handbook for practitioners (Baker, Judy; World Bank, 2000).
b. Intermediate readings:
- Evaluating anti-poverty programs (Ravallion, Martin; World Bank, 2005).
- Program evaluation and spillover effects (Angelucci and Di Maro; IDB, 2010).
- Handbook on impact evaluation: quantitative methods and practices (Khandker et al.; World Bank, 2009).
- Recent developments in the econometrics of impact evaluation (Imbens and Wooldridge; Journal of Economic Literature, 2009).
- The mystery of vanishing benefits: an introduction to impact evaluation (Ravallion, Martin; World Bank Economic Review, Vol. 15, No. 1).
a. Experimental methods:
- In pursuit of balance. Randomization in practice in development field experiments (Bruhn and McKenzie; World Bank, 2008).
- Using randomization in development economics research: a toolkit (Duflo et al.; Center for Economic Policy Research, 2007).
b. Quasi-experimental methods:
- A primer for applying propensity-score matching (Heinrich et al; IDB, 2010).
- Instrumental variables and the search for identification: from supply and demand to natural experiments (Angrist and Krueger; Journal of Economic Perspectives, 2001).
- Regression discontinuity designs: a guide to practice (Imbens and Lemieux; National Bureau of Economic Research, 2007).
Designing impact evaluations for agricultural projects (Winters et al.; IDB, 2010).
Early childhood development
Methodologies to evaluate early childhood development programs (Behrman et al.; World Bank, 2007).
Impact evaluation for school-based management reform (Gertler et al.; World Bank, 2007).
Guidelines for impact evaluation in education using experimental design (Bando, Rosangela; IDB, 2013).
Impact evaluation for slum upgrading interventions (Field and Kremer; World Bank, 2008).
Evaluating the impact of cluster development programs (Giuliani et al.; IDB, 2013).
Building in an evaluation component for active labor market programs: a practitioner’s guide (Card et al.; IDB, 2011).
Impact evaluation for land property rights reforms (Conning and Deb; World Bank, 2007).
Impact evaluation for microfinance (Karlan and Goldberg; World Bank, 2007).
Methodologies to evaluate the impact of large scale nutrition programs (Habicht et al.; World Bank, 2009).
Evaluating the impact of regional development programs (Winters and Sitja; IDB, 2010).
Science and technology
Evaluating the impact of science, technology and innovation programs: a methodological toolkit (Crespi et al.; IDB, 2011).
Technical guidelines for evaluating the impacts of tourism using simulation models (Taylor, Edward; IDB, 2010).
Conducting impact evaluations in urban transport (Boarnet, Marlon; World Bank, 2007).
Impact evaluation of rural roads projects (Van de Walle, Dominique; World Bank, 2008).
Water and sanitation
A guide to water and sanitation sector impact evaluations (Poulos et al.; World Bank, 2006).
- Budget template: a tool for estimating the costs of an impact evaluation.
- Concept note template: a document that describes the details of the impact evaluation methodology.
- Design template: a blank presentation to guide the main steps and components of an impact evaluation design.
- Impact evaluation checklist: a list of core activities to be carried out during an impact evaluation.
The credibility of an impact evaluation must be considered at the design stage of the evaluation cycle. Below are tools to facilitate compliance with the ethical and transparency protocols that apply to impact evaluation practice.
Prior to launching data collection, it is imperative to consider the safety and protection of survey participants. Institutional Review Boards (IRBs) help ensure that research involving human participants is conducted ethically. This includes verifying that risks to participants are minimized, that their selection is equitable, and that they are fully informed of what the survey entails and understand the potential risks and benefits. Information about IRBs, as well as available services, is included in the following links:
- IRB FAQs
- Definition of IRB
- NIH IRB Training
- Association for the Accreditation of Human Research Protection Programs (AAHRPP). Provides training and certifies IRBs. The webpage includes a list of currently certified IRBs.
- Biomed IRB.
- Chesapeake IRB. Independent IRB services
- Copernicus Group. Independent IRB services
- Institutional Review Board Services. Independent IRB services
- NRC (National Research Center). Provides services for survey implementation, program evaluation, performance measurement, policy development, and more; IRB review is included in its consultancy.
- Office of Human Research Protections (OHRP) at the United States Department of Health and Human Services (HHS). Located within HHS, this office leads on human subjects protection and provides a list of laws, regulations, and guidelines on human subjects protection in over 100 countries.
- Pearl IRB. Independent IRB services
- Quorum Review IRB. Independent IRB services (US and Canada only)
- Sterling IRB. Independent IRB services
- Western IRB. Independent IRB services (international)
- Open Science Framework
- Berkeley Initiative for Transparency in the Social Sciences
- Meta Research Innovation Center at Stanford
- ClinicalTrials.gov: registry and results database
- The American Economic Association’s registry for RCTs
- Equator Network: Enhancing the QUAlity and Transparency Of health Research
- 3ie’s Registry for International Development Impact Evaluations (RIDIE)
Further examples of survey questions, as well as questionnaires, can be found at:
- Demographic and Health Surveys (DHS)
- Family Life Surveys: Mexican Family Life Survey (MXFLS), Guatemala Survey of Family Health (EGSF)
- Fundación SEPI: Survey on business strategy
- Living Standards Measurement Surveys (LSMS)
- International Household Survey Network (IHSN) Catalog of Survey Questionnaires
- OAS Victimization Surveys
- Standardized Tests
- Early Childhood Development
- Designing Household Survey Questionnaires for Developing Countries: Lessons from 15 Years of the Living Standards Measurement Study (Grosh and Glewwe; World Bank, 2000): a comprehensive handbook with insights for customizing surveys and improving data quality.
- The Survey Quality Assessment Framework (SQAF) and reference guide (Spanish) assist survey managers with good practices for planning, implementing and documenting surveys. The guidelines can be used to develop new surveys and evaluate existing surveys. SQAF advice for impact evaluations can be found here.