Deadline: 04-Jun-2024
The United States Agency for International Development (USAID) is seeking applications from all eligible organizations to implement the project, Promoting Impact and Learning with Cost-Effectiveness Evidence (PILCEE).
USAID seeks qualified organizations to proactively and intentionally advance inclusive development to ensure the participation and inclusion of all people in programming, including those who have been historically marginalized.
USAID’s independent Office of the Chief Economist (OCE) supports the Agency in bringing strong economic theory, evidence, and tools to bear to improve the Agency’s programmatic effectiveness and broader global engagement. Two of the Office’s strategic objectives are promoting the use of existing cost-effectiveness evidence in Agency decision-making and the generation of new cost-effectiveness evidence to fill important evidence gaps.
This activity aims to address multiple gaps within USAID as well as in the broader evidence ecosystem.
- Gaps within USAID
- Despite a shift within USAID towards applying cost-effectiveness evidence to improve the design and implementation of activities, many activity planners and Agreement Officer’s Representatives or Contracting Officer’s Representatives (A/CORs), for various reasons, miss opportunities to integrate lessons from the existing, contextually relevant base of impact evaluation evidence into their activity designs and implementation.
- Although USAID’s Evaluation Policy outlines USAID’s commitment to impact evaluation and identifies randomized evaluations, when viable, as the strongest method thereof, USAID rarely conducts impact evaluations, including randomized evaluations. Most existing monitoring, evaluation, and learning (MEL) mechanisms are designed to accommodate a wide scope of MEL activities, and are not designed around the specific personnel and process requirements of high-quality, randomized evaluations.
- Gaps in the broader evidence ecosystem
- Despite growth in the global impact evaluation literature over the past two decades, many development and humanitarian challenges face critical evidence gaps with respect to effectiveness and cost-effectiveness of relevant possible interventions and policies. Such gaps relate to, for example, context, outcomes, time horizon, intervention design, population, vulnerability to shocks, and combinations thereof.
- Despite growth in the global impact evaluation literature over the past two decades, long-term measurement remains a persistent gap in the global evidence base for most, if not all, interventions and policies.
Funding Information
- The award’s Total Estimated Amount (TEA) sets a maximum award ceiling of $74,900,000, structured as follows (an illustrative tally of these figures appears after this list):
- A $50,000,000 Leader Award will allow the successful applicant to manage a portfolio of evidence activities as described under the “Performance Objectives”.
- $24,900,000 of potential additional funding may be awarded noncompetitively by USAID Missions or other Operating Units (OUs) to support additional activities that fall within the technical scope of the award. This may include:
- A maximum of $14,900,000 in potential Associate Awards
- A maximum of $10,000,000 in potential Buy-Ins
- Potential applicants should note that the TEA is only an estimated maximum, and not the same as actual obligations of funding into this award. All actual obligations are subject to available funding.
- Total Estimated Ceiling of the Leader Award: $50,000,000
- The estimated start date will be September 30, 2024. The anticipated period of performance is five (5) years.
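As a point of reference, the ceiling figures above can be tallied directly: the potential Associate Awards and Buy-Ins sum to the $24,900,000 in additional funding, which together with the $50,000,000 Leader Award reaches the $74,900,000 TEA. The sketch below simply restates that arithmetic; the variable names are illustrative and not part of the NOFO.

```python
# Illustrative tally of the published award ceiling figures (USD).
leader_award = 50_000_000      # Leader Award ceiling
associate_awards = 14_900_000  # maximum potential Associate Awards
buy_ins = 10_000_000           # maximum potential Buy-Ins

additional = associate_awards + buy_ins
assert additional == 24_900_000  # matches the stated additional funding

total_estimated_amount = leader_award + additional
assert total_estimated_amount == 74_900_000  # matches the stated TEA

print(f"TEA ceiling: ${total_estimated_amount:,}")
```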
Intermediate Results
- Through this activity, OCE seeks to achieve the Intermediate Results (IRs) below, addressing these gaps across development and humanitarian challenges:
- IR 1: Increase the use of contextually relevant cost-effectiveness evidence in USAID strategic planning, policy-making, activity design, and implementation.
- IR 2: Increase the generation of cost-effectiveness evidence through high-quality randomized evaluations of USAID programs. Randomized evaluations under this activity will yield both actionable cost-effectiveness evidence on the programs being evaluated, including over longer timeframes, when appropriate, and knowledge that expands the global evidence base, serving as a global public good from which other development and humanitarian actors can learn.
- Under IR2, OCE will prioritize randomized evaluations that fit one or more of the below criteria:
- The relevant design and implementation team is proposing a novel or understudied approach to a development or humanitarian challenge or outcome that USAID funds at a high dollar value;
- A randomized evaluation is feasible and an appropriate tool to answer an impact or operational question;
- Plans to collect cost data to enable cost-effectiveness analysis;
- Plans to collect data over the longer term to understand, for example, whether impacts fade out over time; and
- The evaluation could form part of multi-site research.
- To these ends, OCE anticipates that academics from the social and behavioral sciences (and other disciplines, when appropriate) will be involved across the range of tasks envisioned as part of this activity. Academic involvement is valuable for three reasons:
- To ensure that commissioned work is of the technical quality necessary not only to serve USAID’s informational needs but also to contribute to the global public good of knowledge;
- To bring in the most recent innovations in measurement, econometrics, and experimental design so as to improve the quality of the knowledge generated; and
- To foster genuine and potentially long-lasting partnerships with local researchers at local universities, or the equivalent.
Performance Objectives
- Recognizing that evidence use and generation work best in tandem, the PILCEE activity will be capable of engaging academics and non-academic research, policy, and learning institutions with several activity design and implementation teams concurrently, at different stages, across the following Performance Objectives:
- Strategic Planning and Policy-making
- Provide expertise from leading academics in the social and behavioral sciences (including development economics) to shape strategic planning and policy-making on systemic issues, such as climate change or public health
- Provide related training, convenings, and other fora towards economics-informed strategic planning and policy-making by USAID leaders and partners
- Activity Design and Implementation
- As described, PILCEE intends to support evidence activities that improve the design and implementation of USAID activities. Efforts that may support this include, but are not limited to:
- Synthesize existing, relevant evidence from high-quality impact evaluations in the social and behavioral sciences, together with cost data, to identify the likely relative cost-effectiveness of different interventions in achieving a particular development or humanitarian outcome, and identify contextual and implementation details that may influence that relative cost-effectiveness (an illustrative cost-per-outcome comparison appears after this list);
- Contextualize existing, relevant evidence to specific USAID operating environments, objectives, and target populations. This key step in the evidence use process requires combining detailed knowledge of local institutions, infrastructure, cultural norms, market opportunities, and other understanding of context with global knowledge on human behavior and causal mechanisms linking interventions to outcomes to customize recommendations to the specific context and population targeted by a particular USAID activity;
- Improve the designs and implementation of activities of other USAID OUs by working with them to understand and apply evidence on which approaches are most likely to achieve desired outcomes in a given context, i.e. taking these four steps:
- Defining the specific mechanism underlying a potential approach to achieving the desired outcome;
- Assessing the strength of the global evidence base for the general mechanism;
- Using a thorough, documented process to assess whether the theorized mechanism is likely to apply in the specific local context for the activity being designed; and,
- Using a thorough and documented process to assess whether the potential approach under consideration can feasibly be implemented with fidelity to the global evidence base in the local context, i.e. that the implementers can adapt the interventions and replicate the mechanism locally, thereby yielding a roughly similar level of effectiveness.
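To make the evidence synthesis task above concrete, the sketch below shows one minimal way to combine impact estimates from evaluations with per-participant cost data into a cost-per-outcome ranking. The interventions, effect sizes, and costs are hypothetical placeholders, not findings from any USAID activity, and a real analysis would also account for uncertainty and contextual differences.

```python
# Minimal sketch of a cost-effectiveness comparison: combine impact estimates
# from evaluations with per-participant cost data to rank hypothetical
# interventions by cost per unit of outcome gained. All numbers are illustrative.

interventions = {
    # name: (effect per participant in outcome units, cost per participant in USD)
    "intervention_a": (0.15, 12.0),
    "intervention_b": (0.40, 55.0),
    "intervention_c": (0.25, 18.0),
}

def cost_per_unit_outcome(effect: float, cost: float) -> float:
    """Cost to produce one unit of the outcome (lower is more cost-effective)."""
    return cost / effect

ranked = sorted(
    ((name, cost_per_unit_outcome(effect, cost))
     for name, (effect, cost) in interventions.items()),
    key=lambda pair: pair[1],
)

for name, ratio in ranked:
    print(f"{name}: ${ratio:,.2f} per unit of outcome")
```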
- Impact Evaluation Scoping and Related Research Design
- With the involvement of academics where appropriate, consult with OUs and, whenever possible, implementing partners (IPs) and other stakeholders on research questions and impact evaluation objectives;
- In consultation with OUs, IPs, and other relevant stakeholders, assess feasibility of a randomized evaluation;
- Where beneficial, and in consultation with OCE and relevant OUs, conduct orientations to or trainings on randomized evaluations, including implementation considerations, for USAID Missions, OUs, IPs, and other stakeholders;
- Draft evaluation Scopes of Work (SOWs) as appropriate, including proposed evaluation methodology, sampling design, minimum detectable effect size, statistical power, data collection plan, analysis plan, evaluation team composition, and other relevant components (an illustrative minimum detectable effect calculation appears after this list);
- If the relevant OU(s) move forward with a randomized evaluation, then the academics also lead the evaluation as part of the research leadership team (i.e., those who provide intellectual leadership and serve as coauthors on any academic outputs) under the Impact Evaluation objective below;
- Data collection should be designed to answer key impact and cost-effectiveness questions, and, where possible and desired by OUs, inform timely analyses of intermediate results and/or provide actionable analysis on operational questions
- Where relevant to the intervention and research questions, the design should include long-term data collection or otherwise tracking variables over a long period of time;
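The sampling design, minimum detectable effect (MDE), and statistical power components of an evaluation SOW typically rest on a standard power calculation. The sketch below computes the MDE for a two-arm, individually randomized design under conventional assumptions (5% significance, 80% power, equal allocation); it ignores clustering and covariate adjustment, and all inputs are illustrative.

```python
# Minimum detectable effect (MDE) for a two-arm, individually randomized
# evaluation with equal allocation. Ignores clustering and covariate
# adjustment; all inputs are illustrative.
from statistics import NormalDist

def minimum_detectable_effect(n_per_arm: int,
                              outcome_sd: float,
                              alpha: float = 0.05,
                              power: float = 0.80) -> float:
    """Smallest true effect detectable with the given sample size and power."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided test
    z_power = NormalDist().inv_cdf(power)
    standard_error = outcome_sd * (2 / n_per_arm) ** 0.5
    return (z_alpha + z_power) * standard_error

# Example: 1,000 individuals per arm, outcome standard deviation of 1.0
print(f"MDE: {minimum_detectable_effect(1000, 1.0):.3f} standard deviations")
```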
- Impact Evaluation
- Draft the work plan and analysis plan, and submit them to an Institutional Review Board (IRB);
- Coordinate with Mission and IP on Activity Monitoring and Evaluation Plans (AMELPs) to ensure complementarity to and utility for the impact evaluation;
- Monitor activity implementation;
- Manage data collection during the evaluation period;
- Place an embedded research coordinator with the activity IP, wherever feasible and with formal agreement with the IP, to facilitate coordination, data-sharing, and fidelity to the randomization design;
- Evidence Dissemination and Capacity-building
- Organize logistics and present at internal and external events, including academic conferences, workshops, seminars, and other learning and dissemination events;
- Fund publication fees for open access to otherwise-gated papers published in journals, based on evaluations supported under this activity;
- Build capacity of USAID staff, implementers, host-country partners (including government and non-government implementers and researchers), and other stakeholders on impact evaluation methodologies, especially randomized evaluations, as well as the evidence base on humanitarian and development objectives;
- Build capacity of stakeholders on cost analysis and cost data collection, synthesis of evidence, adaptation, and contextualization of evidence to new contexts;
Capabilities Needed
- The successful applicant should have extensive experience and expertise in:
- All USAID sector areas: in terms of subject matter, the activity scope will cover all USAID humanitarian and development objectives, including agriculture and food security; anti-corruption; conflict prevention and stabilization; democracy, human rights, and governance; economic growth and trade; education; environment, energy, and infrastructure; gender equality and women’s empowerment; global health; humanitarian assistance; nutrition; and water and sanitation. The applicant, however, should propose focus areas for evidence use and generation that include activity types that are common in USAID and have a relatively weak evidence base;
- Scoping, conducting, and disseminating findings of randomized evaluations in the development and/or humanitarian space;
- Designing and implementing evaluations that include long-term data collection (including intentional tracking protocols and necessary infrastructure);
- Building partnerships between academics in the social and behavioral sciences and policymakers, practitioners, and implementers around conducting randomized evaluations, synthesizing evidence, and applying insights from that evidence to policy and practice.
Activity Principles
- Value of randomizing intervention(s) for measuring impact
- Randomized treatment assignment, when appropriate, is the highest-quality method that current social and behavioral science offers for measuring the causal impact of a given treatment or treatments on outcomes. Randomized evaluations are the default for both using and producing evidence under this activity. OCE expects that the bulk of work under this activity will be based on, or be, randomized evaluations, because most of the work will seek to use or produce high-quality estimates of impact of interventions or of alternative intervention designs.
- Evaluation for learning
- OCE views impact evaluation as a tool to generate knowledge that is relevant to inform programming decisions (within and, ideally, beyond USAID). Each evaluation should be designed to inform a specific decision within the Mission/OU, to generate a specific recommendation to the relevant government counterpart(s), to inform approaches and strategic priorities at the Agency level, and/or to contribute to the global evidence base by testing a novel or innovative intervention or theory that has yet to be experimentally evaluated. PILCEE should focus on producing high-quality, decision-relevant evidence on interventions, causal chains, and human behavior, ultimately contributing to improved programming across humanitarian and development sectors.
- Early engagement in the Program Cycle
- It is imperative that evidence use and generation activities undertaken through PILCEE be incorporated as early as possible in the Program Cycle, with particular attention to timelines for activity design and implementation. Existing evidence from the global impact evaluation literature should be brought to bear early on in the activity design process to inform intervention selection and design.
- Beyond single evaluations to bodies of evidence
- No single evaluation should be overly relied upon to determine policy and practice decisions. Instead, the PILCEE activity will promote synthesis and meta-analysis, from either individually conducted but related studies, or from coordinated efforts to answer common questions across multiple sites. Commissioning of such work should be informed by first identifying key knowledge gaps that would inform the programs, policies, and strategies of USAID and other partners, including local governments.
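As one concrete illustration of the synthesis this principle calls for, the sketch below pools effect estimates from several related studies using inverse-variance weighting with a DerSimonian-Laird random-effects adjustment. The study estimates and standard errors are hypothetical placeholders, not results from actual evaluations.

```python
# Minimal random-effects meta-analysis (DerSimonian-Laird) over hypothetical
# effect estimates and standard errors from related studies.
import math

# (effect estimate, standard error) for each hypothetical study
studies = [(0.12, 0.05), (0.30, 0.08), (0.18, 0.06), (0.05, 0.10)]

effects = [e for e, _ in studies]
weights_fixed = [1 / se**2 for _, se in studies]  # inverse-variance weights

# Fixed-effect pooled estimate and Cochran's Q heterogeneity statistic
pooled_fixed = sum(w * e for w, e in zip(weights_fixed, effects)) / sum(weights_fixed)
q_stat = sum(w * (e - pooled_fixed) ** 2 for w, e in zip(weights_fixed, effects))

# DerSimonian-Laird estimate of between-study variance (tau^2)
df = len(studies) - 1
c = sum(weights_fixed) - sum(w**2 for w in weights_fixed) / sum(weights_fixed)
tau_sq = max(0.0, (q_stat - df) / c)

# Random-effects weights and pooled estimate
weights_re = [1 / (se**2 + tau_sq) for _, se in studies]
pooled_re = sum(w * e for w, e in zip(weights_re, effects)) / sum(weights_re)
se_re = math.sqrt(1 / sum(weights_re))

print(f"Pooled effect (random effects): {pooled_re:.3f} ± {1.96 * se_re:.3f}")
```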
- Contextualization of the global evidence
- There is a common misperception that using global evidence can be at odds with using local information and knowledge. OCE believes these need not be at odds and, in fact, are essential to use together. Strong global evidence provides guidance on which contextual factors need to be understood in order to confidently use the global evidence for local policy or to design new, high-quality evidence generation efforts.
- Localization
- OCE encourages the engagement of local research and data collection organizations, as well as local academics, to the greatest extent possible, with the latter in leading or joint decision-making roles, in order to strengthen both evidence use and evidence generation globally over the long term. Applicants should propose approaches that achieve these goals as much as possible, incentivizing the meaningful involvement of local academics and organizations, particularly through the research analysis and writing phases.
- Diversity, Equity, Inclusion, and Accessibility
- Related to the prior principle, critical to achieving all USAID goals is fully prioritizing, embodying, and advancing diversity, equity, inclusion, and accessibility (DEIA) among people, partners, and programs, at home and abroad. For example, this may include recruiting research teams that combine a range of academic profiles (e.g., tenured professors, graduate students, post-doctoral fellows, or others), demographic characteristics (e.g., expatriate versus local researchers), and other dimensions.
- Gender
- With logic similar to that of the DEIA principle, and recognizing both the impact of funding research on career advancement as well as the importance of diverse perspectives in design and interpretation of evidence, the PILCEE activity also prioritizes advancing women in academia through purposeful inclusion on research leadership teams
- Research Ethics
- In line with USAID’s Scientific Research Policy, when applicable, PILCEE research should first undergo and pass review by a properly constituted ethical review committee (ERC) or institutional review board (IRB, which is most common in the United States) for human subjects protections.
- Data Use and Publication
- In accordance with ADS 579, the PILCEE implementer should, at minimum, upload any research data they collect to USAID’s Development Data Library (DDL) or the forthcoming Consolidated Digital Library (CDL), or reference the data if already uploaded to another publicly accessible database, in compliance with ADS 578, Information Quality Guidelines, where applicable.
Performance Indicators
- USAID/OCE expects a small number of indicators to be useful for managing performance of this Leader with Associates award. Applicants should propose indicators aligned to the theory of change underpinning their proposed approach. Illustrative indicators include:
- Number and dollar value of USAID activities for which engagement with the PILCEE activity successfully influenced intervention selection, design, or implementation decisions toward more cost-effective approaches
- Number of academics engaged, including as research leadership team members, disaggregated by gender and by whether the academic is local
- Number of evidence reviews that meet agreed-upon standards
- Number and dollar value of USAID activities for which engagement with the PILCEE activity delivered a high-quality randomized evaluation
- Number of peer-reviewed academic publications that were based on evidence or meta-analysis conducted under a PILCEE activity
Eligibility Criteria
- Eligibility for this NOFO is not restricted. USAID welcomes applications from organizations that have not previously received financial assistance from USAID, including but not limited to nontraditional partners such as local actors, organizations representing or led by marginalized groups, U.S. small businesses, faith-based organizations, cooperatives, and civil society organizations.
- Faith-based organizations are eligible to apply for federal financial assistance on the same basis as any other organization and are subject to the protections and requirements of Federal law.
- Public International Organizations (PIOs) may not apply for funding under this opportunity.
For more information, visit Grants.gov.