Closing well: Ending the work of a ‘spend-out’ trust

The Trust was designed from the outset as a spend-out organisation, and from day one we have been planning for closure. Working as a time-limited organisation creates opportunities, but can also present challenges. Closing an organisation, and ending partnerships with grantees, can be complex and time-consuming. Honest, open dialogue, combined with a clear plan and a willingness to be flexible, can help ensure that grant partners, and the programmes they deliver, are left in a strong position.

Sustainability has always been at the core of the Trust’s programmes, ensuring that the work we deliver with our partners will continue long after we close – a lasting legacy in honour of The Queen. As such, our approach has focused on integrating our programmes into government policies and supporting work that will be able to continue into the future. We worked with established partners to deliver a number of programmes. Concluding these partnerships efficiently and effectively was one of our priorities as we approached closure.

We found it important to state, clearly and unequivocally, that we intended to spend all of our funds and close. We wanted to avoid any uncertainty on the part of our partners. It was most helpful to talk through all the details of our closure plans with partners from an early stage. The more fully partners were able to understand the logistical and legal intricacies of closure, the more they were able to prepare. Each step was discussed several months in advance to allow each partner’s financial and legal teams to feed into the process. Although we had a standard process, each relationship was unique and required a bespoke approach.

We chose to close all the Trust’s programmes six months before the Trust’s public closure. This provided us with sufficient time to address any challenges and complete the grant closure process. Our experience showed that we needed the full six months to complete closure responsibly. The Trust had a comparatively small portfolio of 28 grants, although several of these involved a significant number of organisations working in consortia across multiple countries.

We maintained frequent contact with all our grant partners throughout the grant period to track spending and ensure that all funds would be responsibly spent by the time the Trust closed. In the months running up to the planned closure date, we needed to be flexible to allow partners to reallocate small amounts of funding. This ensured that all funds could be spent effectively on programme priorities within the Trust’s lifetime.

Our lawyers helped us to put together grant closure letters that summarised various legal, reporting, communications and data issues in one document. We then shared a draft with each grant partner to allow time for their own legal teams to suggest any edits. This process helped both sides to understand their rights and responsibilities. It also highlighted any outstanding issues, such as ownership of intellectual property, which were still to be resolved.

The overall process of closing all of our grants was intended to be comprehensive but straightforward to implement. Final reports from partners would be reviewed at the Trust, followed by a meeting or call to discuss the overall impact and whether any issues remained. Once all parties were satisfied that the programme had been completed, a letter would be signed by both parties, whereupon the grants would be considered officially closed.

Operating as a spend-out organisation has provided us with a clear focus on what we wanted to achieve. This approach has guided our strategic and operational decisions throughout our lifetime. We have remained focused on our mission, and we have been forced to make sometimes difficult decisions about how we can create the greatest impact with the time and resources available to us. Having sufficient time and resource dedicated to closing our programmes, and concluding our relationships with our partners, has enabled us to leave the stage confident in the legacy of our work and hopeful for the future of the Commonwealth.

Taking a strategic learning approach to evaluation

Nick Wilsdon, Learning & Evaluation Manager at the National Foundation for Youth Music, shares his experience of embedding a strategic learning approach to evaluation. He highlights the journey of change in his organisation, and concludes with helpful tips for anyone embarking on a similar journey.

Context

Youth Music is a national charity, investing in music-making projects for children and young people experiencing challenging circumstances. Our projects help young people to develop musically, but they also yield positive personal and social outcomes. At any one time, we have between 350 and 400 active projects, working with somewhere in the region of 75,000 children and young people annually.

Evidence-based funding practice sits at the heart of our organisation. Impact is central to our business plan, and evaluation and learning are crucial to our mission. We are dedicated to measuring the impact of our work and disseminating learning that informs both our own funding practice and music-making practice in the sector, through the generation of evidence-based resources and outputs.

From accountability to strategic learning

Prior to the inaugural meeting of the UK Evaluation Roundtable in 2014, the Institute for Voluntary Action Research (IVAR) published a framing paper which posited three main uses for evaluation: accountability, demonstrating impact and strategic learning.

Over Youth Music’s short history, it is possible to trace a trajectory that runs from using evaluation for accountability purposes, through demonstrating impact, towards strategic learning.

Our first monitoring forms, from around the turn of the century, sought to identify whether funding had been allocated in line with funding agreements – in short, accountability.

By 2008, we had established an internal Research & Evaluation department, which introduced an outcomes approach and published our first annual impact report shortly after. It was clear at this stage that we were using evaluation to demonstrate impact and understand how our funding had made a difference.

By 2012, we had absorbed this ethos into the roles of those managing grants, creating the Grants & Learning Officer role during an organisational restructure and embedding learning within the management of grants. We had also established a consistent outcomes framework across our programme. At the time of writing, we are in a position to use evaluation in (nearly!) real time to inform the decisions we make, and we regularly adapt our strategies in response to the changing circumstances around us; strategic learning has become a very real part of how we operate as an organisation.

Shifting mindset

There are numerous practical guides about adopting impact practice (see, for example, NCVO’s excellent wiki on ‘How to build an impact culture’ or NPC’s equally good ‘Four pillar approach’) and, as many organisations will attest, this is not something that happens overnight. Envisioning significant organisational change may appear straightforward on the surface, but enacting it involves substantial organisational commitment. Moreover, organisational buy-in is a prerequisite to successfully building an impact culture, so it is vital to engage senior leaders, trustees and colleagues in the process. Resources like Inspiring Impact’s Measuring Up! tool provide an excellent framework for reflecting on your organisation’s current impact practice with colleagues, and can help identify and prioritise areas for development. Highlighting the long-term benefits is essential in order to gain the necessary buy-in to overcome the challenges in the short term.

On a day-to-day basis, our priorities for handling information are driven by the three guiding principles which are crucial to strategic learning (as highlighted by IVAR):

  1. Asking the right questions and getting the right data
  2. Structuring the work to enable regular use of data
  3. Effectively processing and using the data

We are a relatively small charity with access to a comparatively substantial volume of information from our grant holders alone, which makes efficiency a necessity. Likewise, our portfolio contains many small, grassroots organisations, often with limited resources – as such, it is essential that we collect quantitative and qualitative data focused on the questions that we need to answer in line with our business planning.

Likewise, we seek to make our data as portable as possible. Our quantitative data is published annually, and we are currently developing mechanisms to run automated reports which allow for more frequent and detailed analysis. By coding against our outcomes framework in qualitative analysis software, we have created an indexed, searchable database of the rich range of experiences of all those involved in our projects. This has allowed us to ground everything from internal strategy documents to external guidance and resources in our evidence base, ensuring it is relevant to our stakeholders.

Cycle of learning

An openness to learning helps nurture an organisational ethos that is open to change. By mapping the ebb and flow of knowledge, both internally between teams and externally across stakeholders, inefficiencies and missed opportunities can readily be highlighted. For example, we noticed that our Grants & Learning Officers’ daily consumption of information through direct contact with our portfolio was sizeable, yet only a fraction of this was formally captured.

By recognising our Grants & Learning Officers as the gatekeepers of this information, and adopting an organisational approach to learning (see Crossan et al.), we devised a light-touch mechanism to support individual intuition, team interpretation and organisational integration in a feedforward process. Our Grants & Learning Officers meet on a regular basis to discuss issues of interest to the organisation, distilling key information for dissemination to the wider staff team in all-staff sessions. The resultant documentation is indexed for internal use, and our Communications team prepare extracts for external distribution.

This process allows for the transfer of intelligence beyond the individual, ensuring that key information exists beyond each member of the team.

As Crossan et al. identify, the process works in both directions: the feedforward process supports exploration (i.e. assimilation of new learning) and feedback processes allow exploitation (i.e. making use of what has already been learned). By supporting learning across the organisation, and nurturing the tension between exploration and exploitation, it is possible to build an impact culture that becomes rewarding and close to self-sustaining. Through this process, organisations can build an openness to change which ultimately supports strategic renewal.

Tips

  1. Map the flow of knowledge – Who has access to what information? Who does not have access to beneficial information? Who are the gatekeepers of knowledge? How can you easily share that knowledge both internally and externally?
  2. Engage your senior leadership team and trustees in the process – Demonstrate your assets and highlight the untapped potential
  3. Reflect on your organisation’s impact practice – Tools like Inspiring Impact’s Measuring Up! can help identify strengths and highlight areas for improvement
  4. Optimise periods of change – Significant organisational changes can be stressful times, but they can also provide opportunities to lay the foundations for new ways of working
  5. Seek out light-touch ways of capturing knowledge – Hold team and all staff sharing sessions and think about the potential audiences for all information to maximise its potential
  6. Nurture a culture of learning and allow the organisation ownership over it
  7. Create resources in accessible places and refer people to them at every opportunity. Index your data where possible, and create structure that allows you to cut it in many different ways
  8. Ensure that you have the appropriate skills within your staff resources to process, interpret and analyse the data you collect

www.youthmusic.org.uk
@youthmusic
@nick_wilsdon

First published on the ACF website

Getting on the impact bandwagon

Impact is all

In recent years, focussing on ‘impact’ has become de rigueur in the charity world. As funding gets harder to come by, demonstrating impact has moved to centre stage. Everyone is obsessing about it, worrying about it, trying to prove it, producing toolkits about it. Passion and good intentions are no longer enough, says Dan Corry, the CEO of New Philanthropy Capital: ‘In the charity sector, impact is everything’[1]. As charities try to adapt to ‘payment by results’ and come to terms with ‘social return on investment’, advising voluntary organisations on how to measure impact is a growing industry.

At a time when extra financial resources to achieve more outcomes are unlikely to be forthcoming, making the resources we have work harder and more effectively is a must. NPC, 2012

You know it makes sense

It is hard not to go along with this. It is not only funders (including individual donors) who want to know they have made a difference. So do we, as voluntary organisation staff, volunteers and trustees. Finding an effective way of demonstrating impact is the holy grail when it comes to attracting supporters as well as funders, and maybe reaching potential beneficiaries too. We need to know what works and what doesn’t. We want to be accountable to those who have invested time and resources into our organisation or project.

Or does it?

So, off we go on an internet search to find the best toolkit. And there is plenty of choice. A quick Google search throws up countless guides – from funders, consultancies, specific programmes (it seems everyone wants to invent their own) – and several dedicated websites. But in amongst these, there are quite a few health warnings: ‘Let’s be realistic about measuring impact’[2], says one; ‘Three reasons why I hate impact measurement’[3], says another; ‘Talk of charities “proving their impact” is dangerous and misleading’[4], says a third. Is ‘impact’ just another term in the management lexicon – another process to compete for time alongside performance management, quality assurance and evaluation? Is it different from evaluation? And there is a lot of talk about theory of change – where does that fit in?

What are the pitfalls?

Measuring impact is not easy – and it is certainly not a quick fix.

First, there is the question of how to define impact. What do we mean by it – and do we all mean the same thing? Are we looking for something we can measure in the short term? If we want to convince funders, we probably are – funding cycles are often short – but in the complex worlds in which most of us operate it is going to take a while to make a difference, and there will be hiccups on the way. If we focus on early deliverables, this could well get in the way of the longer-term outcomes we really want.

Then we have to think about methods. How are we going to measure impact? There are any number of challenges to demonstrating that what we are doing is making a difference.

First there is the question of what can be measured. Not everything that ‘counts’ can be counted, and few exercises can measure impact directly – so impact is often confused with milestones, outputs and outcomes. Most of us are engaged in complex initiatives, which cannot be boiled down into a few simple indicators. The nature of change is more complex and multilayered than numbers and statistics can capture. So focussing on targets that are relatively easy to measure may distract us from what really needs to be done. Measures need to be tailored to the specific intervention, not taken off a shelf, if they are to be meaningful. And relying on data may replace the trust and dialogue that learning requires.

It is vital that organisations identify and assess impacts that are truly relevant to their work, not simply transferred or taken from elsewhere…

There are growing concerns that funders’ and commissioners’ requirements are shaping and dominating approaches to impact measurement in the third sector over the needs of service users, beneficiaries and TSOs themselves…

Key questions are what implicit values underpin and characterise this environment, and whether and how public sector performance regimes reflect and capture the specific impacts of TSOs. Birmingham.ac.uk

This raises the question of resources: the more rigorous the measurement of impact is, the more resource intensive it is.

It’s complicated, expensive and often impractical for early-stage enterprises… It requires a level of research expertise, commitment to longitudinal study, and allocation of resources that are typically beyond the capabilities of implementing organizations. It is crucial to identify when it makes sense to measure impacts and when it might be best to stick with outputs – especially when an organization’s control over results is limited, and causality remains poorly understood. Alnoor Ebrahim, HBR, March 13, 2013

Even when millions are invested in evaluation, complete with control groups, the outcomes can look fairly modest – the evaluation of the New Deal for Communities being a case in point. The control group approaches that some funders still want are often highly resource intensive, methodologically difficult and ethically dubious. A rigorous design focusing on outcomes and impacts determined at the start might squeeze out unanticipated developments and unanticipated outcomes. And what happens if you want to change your approach midstream? Change is not linear – A does not necessarily lead directly to B.

Then there is the time factor – when can we expect changes to take effect? In early years work we might have to wait for more than a decade. Too often funders want evidence of impact before a particular programme or intervention has even finished. Sometimes a sensible question is simply asked at the wrong time[5]. Rarely is the process of impact measurement introduced right at the start, so there is no baseline to measure progress against.

After all this, there is also the question of attribution. If we do manage to demonstrate some sort of impact, is it really down to us? Can we really say it is what we are doing that is making the difference? And if we don’t, does that mean we have failed? There are many factors that contribute to change. Most of us are part of a wider pattern of provision. And the rest of the world doesn’t stand still. Can we isolate what we have done from all the other factors and interventions that have affected people’s lives? If nothing changes, it may be because things way beyond our control have got in the way. Floods, population changes, austerity or the loss of a local employer may have far more of an impact than what we have done. Sometimes standing still in the face of adversity is an achievement in itself.

…development and change are the results of wider processes that are the product of many social, economic, political, historical and environmental factors – including power struggles between different interest groups. Understanding these processes is important, if the changes brought about by a given project or programme are to be properly situated in their broader context. Save the Children Fund (2003) Toolkits: a practical guide to planning, monitoring, evaluation and impact assessment, p. 127

So should we forget about impact?

Of course not. Despite all this, thinking about impact is important. It forces us to be clear about what we are trying to achieve and how. It focuses us on ends as well as means; it directs our attention to long-term as well as short-term aims; it can help us to learn and adapt, to test our assumptions about what needs to be done to effect change.

But it needs to be done in a way that is realistic, proportionate and fit for purpose.

Proof is a big concept and in social science – which is what impact research is – it is almost never found. Scientific investigations rarely prove things – disproving things is much easier – but they nonetheless help us understand how things work and hence make educated decisions. They inform our judgement by reducing uncertainty – but not to nil. Third Sector

Thinking about impact should be about learning and communication. It can be done in a way that is rooted in your values and the way your organisation works, and enables you to have a genuine dialogue with your funder and other stakeholders from which you can all learn. This is where theory of change approaches come in – getting everyone round the table at the beginning to think about what change they want, how they want to make it happen and how they will know whether they are on the right path. Be imaginative about how you can tell your story.

It is often useful to have an introductory workshop to discuss what the different interest groups mean by impact. This can help create a shared understanding of the process and help decide which areas of change to focus on.

Save the Children Fund (2003) Toolkits: a practical guide to planning, monitoring, evaluation and impact assessment

This involves both you and your funder thinking about change as a journey:

  1. Are we clear about where we are going and why? Make sure that everyone is on the same page, but first allow time for honest discussion so that you are clear about where people are coming from.

  2. Be realistic about what you can find out and what you can do within the resources and time available. We don’t all have to do randomised controlled trials – it is possible to agree indicators that are meaningful to all parties and can provide a plausible account of how the intervention is likely to lead to the intended impact.

  3. Be clear about why you are doing it and make sure that it is relevant and worthwhile. Use the process as an opportunity for reflection, dialogue and learning throughout – an opportunity to take stock together along the way.

  4. Be honest – we learn from what doesn’t work as well as what does. This means funders need to respect that honesty too.

  5. Combine resources. If you are a small organisation with limited resources, it may work to team up with others like you to learn together; if you are a funder, bringing projects together to work on assessing impact with their peers can be a powerful way of learning.

  6. Agree what you are going to do with it.



[1] www.theguardian.com/voluntary-sector-network/2014/feb/24/charity-impact-measurement-results-outcomes.

[2] hbr.org/2013/03/lets-be-realistic-about-measuremnet.html

[3] blogs.ncvo.org.uk/2015/05/18/three-reasons-why-i-hate-impact-measurement/

[4] http://www.thirdsector.co.uk/talk-charities-proving-impact-dangerous-misleading/article/1376815

[5] blogs.ncvo.org.uk/2015/05/18/three-reasons-why-i-hate-impact-measurement/