Locating evaluations does not ensure that the evidence they contain will be used. CDCS development team members, busy with other tasks, may not all find time to read them. Having one person or a small team summarize the most important findings of relevant evaluations can benefit the whole team.
For this purpose, a simple, optional evidence template rather than a long report may be useful. If evidence is provided in a table of this sort, include a URL link to each report cited.
Evaluation Evidence Summary
- Evaluation: Evaluation of Trade Hubs Located in Accra, Ghana; Gaborone, Botswana; and Nairobi, Kenya. Africa Trade Hubs Export Promotion Evaluation 2013
- Specific Evidence: The Trade Hubs throughout Africa met or surpassed their targets, despite having different performance-based contracts and terms of reference. Trade Hubs, with varying degrees of success, have facilitated intra-regional trade through programs on transport corridors, customs modernization and harmonization, sanitary and phyto-sanitary (SPS) training, etc., and have integrated activities well with different stakeholders when appropriate. Their programs and activities advanced the policy objectives of the AGOA. However, communications strategies in the Hubs were not robust enough to facilitate implementation of multi-faceted, multi-agency programs such as the TRADE Initiative.
- Evidence Strength: Document reviews; site visits; key informant interviews; online client and partner surveys (responses: 81; response rate 23%); survey instruments and tabulations included.
What is Evidence Strength?
USAID's interest in bringing high-quality evaluation evidence to bear on decisions about future programs is evident throughout its CDCS Guidance. It is also reflected in the distinction USAID's Evaluation Policy draws between the strong evidence that rigorous impact evaluations can produce and the relatively weak evidence sometimes found in older USAID evaluations. When evaluation evidence about a particular country or development hypothesis is summarized, it is useful to include a brief note indicating how strong a particular piece of evidence is and, therefore, how seriously it should be taken. A review of an evaluation's methodology will usually reveal the degree to which it used formal social science research methods, the number and types of data sources, the data series used, and other elements that affect the quality of the evidence it provides. USAID's Education Office (E3), together with the World Bank, DFID, and UNICEF, recently released a volume on assessing the strength of evidence; it is highlighted in the "Featured" section of this page.
For additional information on approaches USAID uses to examine the quality of evaluations and the strength of the evidence they produce, see the Monitoring Evaluation Quality Over Time pages on this website.
See Also
Evidence Gap Maps: while time intensive, evidence gap maps (EGMs) can be an important tool for informing strategic policy, program, and research priorities. EGMs consolidate what we know about what works in particular development sectors or thematic areas.
For trade in Africa, the Systematic Reviews of Best Available Evidence examine the relationship between the African Growth and Opportunity Act (AGOA) and increased trade in the countries it covers.