Learning and Evaluation
The Endowment is building a culture of learning where we routinely use data, evidence and experience to adapt and improve strategy and performance. The Evaluation team collaborates with Endowment staff and grantees to support effective grantmaking. We divide our work into four streams:
Strategy development. To ensure that all our grantmaking is done with clear intention, purpose and expectations, the Evaluation team coordinates with the program areas to develop and support a collaborative process for creating theories of change to guide grantmaking decisions. We adopt language and create tools that foster consistency and support collaboration while accommodating the varying circumstances and approaches of the different program areas.
Assessment. This traditional domain of evaluation involves designing and carrying out studies to assess the quality or impact of an intervention, and it remains an important part of our work. We consult on selected projects, often in collaboration with external evaluators, to help build credible evidence of what works.
Data and analytics. Grantmakers depend on data to understand their respective fields and to plan strategically. Their work requires timely data and informative visualizations for tracking and supporting ongoing projects and for understanding their impact. The Evaluation team supports the grantmaking areas in all these functions.
Learning. The Evaluation team promotes practices and tools to embed learning into our work, focusing on:
- Deepening understanding and use of evidence.
- Creating opportunities to identify and share insights across teams.
- Sharing what we’re learning with the field.
These efforts are intended to support strategy that is clear and testable, promote data- and evidence-informed decisions, and foster collaborative learning.
Focusing Externally and Internally
The Endowment has a role in measuring and supporting the effectiveness of the programs we fund. Our work includes planning, implementing and assessing our own overall grantmaking strategies, which may encompass many individual projects and continue over several years. Responsible use of our resources requires that Trustees and staff regularly re-examine these grantmaking strategies to see that they remain grounded in our founder’s Indenture of Trust but are also relevant to the changing contexts of the people and institutions of North Carolina and South Carolina.
The Duke Endowment’s Six Guiding Principles for Evaluation
- Evaluation is integrated into all of our grantmaking. It touches planning, implementation and assessment and focuses both externally and internally.
- Our focus is on evaluation as a tool for learning. We also recognize the importance of accountability and improving performance.
- To ensure that results are used for learning, evaluation must be conducted in an open spirit of inquiry, and results must be communicated widely.
- We use a mix of evaluation methods. For evaluations of grantee projects, the choice of method follows the purpose of the evaluation, which in turn follows the purpose of the project.
- To avoid undue burdens on grantees and inappropriate use of resources, we will measure only what we will use, and use everything that we measure.
- Whenever appropriate, evaluations should be participatory and grantee-controlled.
The Duke Endowment commissions evaluations for several reasons. We sometimes support evaluations as an integral part of an overall grantmaking strategy for an area of work. For example, evaluations of innovative technologies or programs may help build evidence about what works or expand the range of effective interventions available to a field.
Evaluations may also be commissioned as a way of enhancing the impact of important initiatives. Such evaluations are focused on gathering data as a program is implemented, facilitating learning and continuous quality improvement.
In this section, you can access reports on evaluations commissioned or supported by the Endowment as well as those focused on supported programs, even if funded by another source. By providing access to this material, we hope to highlight evidence for the impact of successful programs and also share lessons learned.