DRG Learning CoP: Guidance on Addressing Learning and Evaluations Challenges

May 03, 2024

Fri | 01:00 PM - 02:00 PM EDT

Help us validate guidance on some of the most common DRG learning and evaluation challenges.

During this CoP session, we'll hear from Social Impact's Luis Camacho, Cloudburst's Kate Marple-Cantrell, and USAID's Daniel Sabet about a draft version of this guidance, and we'll have a chance to provide feedback, weigh in on some of the more difficult challenges, and fill remaining gaps. Come with your ideas and examples.

Many evaluations and assessments involve a team of researchers conducting a large number of key informant interviews (KIIs), group interviews, and focus group discussions (FGDs) at a single point in time over a three- to four-week period of fieldwork. Yet such studies regularly draw complaints from commissioners, evaluators, and implementers about the accuracy and reliability of their findings and, in turn, the usefulness of the study. In response, Social Impact (SI), The Cloudburst Group, and the United States Agency for International Development's Bureau for Democracy, Human Rights, and Governance (USAID/DRG) have been developing minimum expectations and good-practice guidance to address seven common challenges and pain points in qualitative evaluation work:

1. Case and site selection for small-n studies
2. Selection of respondents
3. Social desirability bias
4. Qualitative data capture
5. Qualitative data analysis
6. Evidentiary support for statements
7. Clarity of findings to facilitate use