Evaluation covers a broad range of activities:
- Performance management, e.g. reporting and monitoring
- Strategic learning
Here, the term is used to cover the full suite of foundations’ evaluation-related activities, focusing on the use of, and demand for, evaluative information rather than solely on evaluation itself.
The Evaluation Roundtable is a network of foundation leaders in the UK, the US and Canada. The Roundtable aims to improve evaluative practice in foundations by infusing it with cutting-edge ideas, and by providing foundation staff with an opportunity to refine and deepen their thinking and practice.
It is a resource for information on what other foundations are doing, as well as for ideas about where and how foundations might develop their practice. The Roundtable is uniquely positioned to help experienced evaluation leaders refine their practice and consider how evaluation’s role, positioning, and focus within a foundation might best be designed.
Who is the Roundtable for?
Leaders in evaluation, including CEOs; directors of evaluation, learning or impact; and senior programme and grants officers with the authority to effect change in their organisations.
A warm thank you to all those who fund our work and to CCLA who host Roundtable events.
Learning & development
We run twice-yearly seminars with facilitated discussions on a range of topics, including performance and knowledge management, and organisational and strategic learning.
Each seminar focuses on a specific theme and one or more members share a teaching case related to an experience within their organisation.
Participants share and generate insights that help them to manage evaluation within their organisations more effectively. Collectively, they also contribute to developing evaluation practice across the sector.
How do grantees benefit?
The focus of improving evaluation within foundations should always be on the benefits it will bring to grantees, in terms of strengthening the impact of their activities and helping to capture learning in a way that is mindful of grantees’ own time and capacity.
02 February 2017
Learning in responsive grant making – Thinking out loud
In recent months we have been looking into theory and ideas that might help us figure out just what good learning looks like.
01 September 2016
Building a culture & structure for learning
At Pears Foundation, we place a strong emphasis on learning: both on our own learning, and the learning of our partners. It sits at the heart of our funding model, to 'commit, learn and refine', as well as supporting and informing the long-term, core-funding relationships that make up the majority of our giving.
Roundtable focus 2016/17
Commissioning Evaluation, with teaching cases presented by Children in Need and Pears Foundation
May 2017: How to be a learning organisation, with a teaching case from Esmée Fairbairn
Evaluation Within UK Trusts and Foundations
Tanya Beer, Ben Cairns, Julia Coffman, Rebecca Moran
This report presents the first ever picture of evaluation within larger trusts and foundations in the UK.
Improving Evaluation Design
Ben Cairns, Katie Turner
This paper, produced for the Evaluation Roundtable event on 27th April 2016, offers a practical resource for foundation staff in assessing and/or improving their processes for the design stage of commissioning external evaluations.
Guide: Improving Evaluation Design
Gina Crane, Communications and Learning Manager at Esmée Fairbairn Foundation, shares her experience of using the new Improving Evaluation Design guide.
‘Put together by a group of funders, including me, “Improving Evaluation Design” is based not on academic theory but on the direct practical experience of how UK charitable foundations learn from the work they support.
As someone who doesn’t commission much external evaluation, I wasn’t sure how relevant the guide would be for me, but after it was published I used it twice in the first week! It is surprisingly useful to have a document that takes you back to the basics of evaluation. It’s short, to the point, and can be used to structure a discussion with partners, or as a reference guide to check back with once things are underway.
It was a helpful starting point for Esmée when considering how to learn from a new project – most importantly it prompted us to question the assumption that we must commission an evaluation. Why? Who for? Who would use it?
It is frighteningly easy for an evaluation brief to expand, and become unrealistic, especially where there are many funding partners. The guide has helped me ask tough questions about what we really need from the evaluation.’
The theme of the 2017 Roundtable convening is learning in responsive grant-making. Resources will include a literature review on the idea of a ‘learning organisation’; interviews with around 30 foundations on their experiences of organisational learning; and a teaching case that will tell the story of how learning has been thought about, developed and organised within Esmée Fairbairn Foundation over the last 15 years.
By responsive grant-making we mean foundations that typically define, to some extent, what is to be addressed but allow significant latitude in how that issue is tackled, taking the view that a funder’s role is to support action in a particular area, or at the grassroots, more than to work towards any particular outcome. Such an approach places significant emphasis on the relationship between funder and funded partner. Agendas should be explicit but flexible, with room for adjustment based on learning.
Responsive grant makers, lacking tightly bound programme or initiative strategies, face a particular challenge: how to make the most of the data available to them when that data is gathered or offered from quite diverse settings and contexts. Without a proactive strategy and a set of target outcomes, there are fewer, if any, big questions around which more focused learning (and thus data collection) can be organised. This gives rise to the question: what does learning look like in responsive grant making?
Learn more about IVAR