Impact Evaluation Toolkit

This web-based toolkit offers a step-by-step guide on how to evaluate the impact of interventions, especially those related to maternal and child health and those involving results-based financing (RBF). According to its developer, the World Bank Human Development Network, the guide can also be easily adapted for impact evaluation (IE) in other fields. The toolkit includes:
- Guidelines that feature best practices for each stage of the IE cycle, such as how to choose evaluation questions, build a team, design an evaluation, and collect and analyse data.
- Over 60 Tools, including terms of reference for team members and survey firms, household and health facility questionnaires, data-entry programmes, and materials for training enumerators and supervising field work. These standardised tools can facilitate cross-country comparisons of the results of RBF projects.
The website is designed so that even users with limited bandwidth can navigate the toolkit following the impact evaluation cycle, which is split into modules represented by website tabs. Both Guidelines and Tools can easily be downloaded from each module of the website. There are eight modules:
- Choosing Evaluation Questions: This module explores why it is important to evaluate RBF interventions. Users will find out how to establish a theory of change and a results chain and how these help develop hypotheses that an IE can test through evaluation questions. It includes first-generation questions on whether RBF works and second-generation questions on how RBF can work better. By the end of this module, users should know how to develop IE questions relevant to the context in which they work. These questions should respond to key aspects of interventions and fill knowledge gaps on particular topics of interest.
- Building the IE Team: Users of this module will find out how to set up an IE team, as well as define the roles and time commitments of each team member. By the end of this module, users are expected to be ready to start building and contracting a team that can achieve needed tasks.
- Designing the IE: Steps include:
- defining output and outcome indicators of interest that translate research questions into observable variables;
- determining an identification strategy, including the "arms" of a study that respond to research questions and guarantee a valid comparison group;
- making sure the sample has sufficient power to detect changes across a representative population;
- knowing which data will be needed to measure indicators of interest;
- aligning the timeline, team composition, and budget, and adjusting them together as plans evolve; and
- planning for dissemination activities.
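The power calculation in the design steps above can be illustrated with the standard two-arm sample-size formula for detecting a difference in means. This is a minimal sketch by way of example, not code from the toolkit; the function name and the default values (5% significance, 80% power) are the author's assumptions.

```python
import math
from statistics import NormalDist

def sample_size_per_arm(delta, sigma, alpha=0.05, power=0.80):
    """Sample size per study arm to detect a mean difference `delta`
    between two arms with common standard deviation `sigma`, using the
    normal-approximation formula:
        n = 2 * ((z_{1-alpha/2} + z_{power}) * sigma / delta) ** 2
    """
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # critical value, two-sided test
    z_power = z.inv_cdf(power)          # quantile for the desired power
    n = 2 * ((z_alpha + z_power) * sigma / delta) ** 2
    return math.ceil(n)                 # round up to whole respondents

# Detecting a 0.5 standard-deviation effect at 5% significance, 80% power:
print(sample_size_per_arm(delta=0.5, sigma=1.0))  # 63 per arm
```

Note that RBF evaluations typically sample clusters (for example, health facilities), so a real design would inflate this figure by a design effect.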
- Preparing the Data Collection: Steps include:
- scheduling data collection;
- reaching formal agreements with stakeholders, ex ante, on data ownership;
- protecting survey participants by defining a research protocol and meeting research ethics requirements;
- determining a data-entry strategy and what is needed from survey firm(s);
- hiring one or more survey firm(s); and
- adapting and preparing data collection instruments and entry tools.
- Implementing the Data Collection: Users of this module will dig into data collection activities and data quality safeguards, exploring how to define the sample and sampling frame in the field and create unique identifiers for geographical areas and study arms. The module explores the benefits of pre-testing data collection instruments and of proper data-entry management plans. It also teaches users to plan for fieldwork supervision, recruit and train field teams, conduct a pilot test, manage fieldwork, and ensure regular reporting on fieldwork activities. By the end of this module, users are expected to know how to provide sufficient training, support, supervision, and communication channels to ensure quality fieldwork and data.
- Storing and Accessing Data: This module explains why properly storing and documenting data matters, how this can be done, and who is responsible for it. Users will be introduced to the World Bank data catalog and its key features. They will also learn to distinguish which types of data should be stored, where to store them, and who should have access to them. By the end of this module, users are expected to know how to ensure that survey respondents are not put at risk, how to keep the survey data safe with minimal risk of damage or loss, and how to provide easy access for the impact evaluation research team.
- Analysing Data and Disseminating Results: This module examines the functions of each IE report (baseline, midline, endline), in addition to providing hints on how to clean the data, create variables including indicators of interest, and validate the design of the IE at baseline. Users will receive guidance on impact analysis to supplement the methods presented in Impact Evaluation in Practice (Gertler et al. 2011). Finally, they will learn how to structure the analysis with efficient dissemination to key stakeholders in mind.
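As one concrete illustration of the kind of impact analysis covered in Impact Evaluation in Practice, a difference-in-differences estimate compares the baseline-to-endline change in the treatment group with the change in the comparison group. The sketch below, including its hypothetical coverage figures, is the author's own minimal example, not material from the toolkit:

```python
from statistics import mean

def diff_in_diff(treated_baseline, treated_endline,
                 control_baseline, control_endline):
    """Difference-in-differences: the change in the treated group's mean
    outcome minus the change in the comparison group's mean outcome."""
    treated_change = mean(treated_endline) - mean(treated_baseline)
    control_change = mean(control_endline) - mean(control_baseline)
    return treated_change - control_change

# Hypothetical facility-level coverage rates (%) at baseline and endline:
impact = diff_in_diff(
    treated_baseline=[40, 44], treated_endline=[60, 64],
    control_baseline=[41, 43], control_endline=[49, 51],
)
# A 20-point rise for treated facilities against an 8-point rise for
# comparison facilities gives an estimated impact of 12 points.
print(impact)
```

In practice such estimates are produced with regression models that add covariates and clustered standard errors, but the arithmetic above is the core of the identification strategy.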
- Monitoring and Documenting the Intervention: This module looks at how monitoring and documenting programme implementation can help inform the relevance of IE design and analysis. It includes examples of programme implementation adjustments that can affect the IE and why it is important to monitor programme implementation. Users will also learn how complementary data sources can help save monitoring costs and time. By the end of this module, it is expected that users will know what to monitor, what can be used to monitor interventions, and how documenting results can help ensure that the IE stays relevant over time.
The toolkit was developed with funding from the Health Results Innovation Trust Fund (HRITF), whose objective is to design, implement, and evaluate sustainable RBF pilot programmes that improve maternal and child health outcomes and accelerate progress towards Millennium Development Goals (MDGs) 1, 4, and 5.
Editor's note:
- Disclaimer: "The Toolkit was developed by Christel Vermeersch (World Bank), Elisa Rothenbühler (World Bank), and Jennifer Sturdy (Millennium Challenge Corporation), with substantial inputs from World Bank regional teams. The authors' views expressed in this publication do not necessarily reflect the views of the World Bank."
- Sources: emails from the World Bank Task Team and Elisa Rothenbühler to The Communication Initiative on July 27 2012 and October 24 2012, respectively.