My cheat sheet on ‘Value for Money: Lessons for NGOs’

Today sees the launch of Assessing and Managing Value for Money: Lessons for NGOs – 68 pages documenting five years of blood, sweat and tears trying to work out what ‘value for money’ (VfM) means for development and humanitarian work.

The report, written by the Value for Money Learning Partnership and Mango, draws on the experience of 28 NGOs which have a Programme Partnership Arrangement (PPA) with DFID. (* See note below for an explanation of the PPA.)

Skimming through the report, it strikes me that VfM boils down to two things: data and decisions.

Data…

Data. Yikes! That’s more than a ‘thing’. There’s a whole airport of baggage to sift through. The executive summary is helpful here.

And before you get lost in the report, as one of those people who has sweated (if not quite cried) over VfM under the PPA, here’s my take on how to navigate what could otherwise be a lengthy lesson.

Like all good reads, start at the back. The one-page Annex encapsulates the framework: four dimensions (leadership, integrating VfM into working practices, skills, tools), fleshed out with most of the report’s main points.

Then maybe flick through the report – each section has a handy, shaded box of ‘key points’ – so the essence of those 68 pages gets distilled into just five and a half pages.

… and decisions

Before you delve in deeper, remember – data and decisions:

  • What are you measuring? What is your value? Some nice illustrations on pages 44–5 of how British Red Cross and World Vision have gone about this.
  • How can you bring your performance and financial data together? OK, I’m skimming over a myriad of systems issues here. Ultimately you will need to steer your own path. But don’t despair – it needn’t be an expensive, tortuous, comprehensive systems overhaul. Work out some indicators based on what you have (this means going beyond the logframe). Box 17 (page 53) gives a few examples of VfM indicators developed by British Red Cross. PPA peers – are there more examples out there?
  • Are you being self-critical? For example, do you consider other options at the outset (a VfM basic)? Do you understand your cost drivers (another basic)? There’s a great example on pages 63–4 of how VSO brought together programme and finance teams to achieve this.

And as we’re talking about decisions, I should also recommend a quick flick to HelpAge’s decision making framework (Box 8, page 38).

Won’t we need an expert?

Perhaps the key point that’s missing from the Annex is don’t be put off by thinking you need fancy economic analysis. Yes, there’s a different tool for different occasions (the box on tool key points is useful here) – but you don’t need to dress up in these to be part of the VfM club.

The skills section is quite generic and could apply to any skill set (ie, recommending a constructive, questioning culture, and recognition of when and where specialist expertise is required). But note the hidden gem: around half of those 28 agencies made use of external consultants (either to develop a VfM approach or to undertake specific VfM programme evaluations). Presumably, around half didn’t.

Hopefully this guide will mean that in future even fewer agencies will feel that they can only work out their own approach to VfM by bringing in the ‘experts’. There’s a lot of peer learning now available.

How about investing in VfM programme evaluations? Social Return on Investment was the one notable example where external skills were brought in – something which the report notes is best used when there is ‘a large level of spend and a rich source of data available’.

In other words, in the vast majority of instances VfM analysis needn’t be expensive and elaborate (see Box 11, page 43, for how WWF sought to build on existing good practice).

What it boils down to

A final note: the report may come across as flipping between the very VfM-specific (eg, Action Aid trying out all VfM tools) and the generic (eg, Sightsavers finding out its portfolio didn’t match its strategy – though Box 10, page 42, explains the core approach Sightsavers took to integrating VfM into its systems).

This reflects many of the early discussions of the VfM Learning Group: “Isn’t VfM just good programme management?”

Well, yes, it does boil down to good financial and programme management. After all, the third E is ‘effectiveness’. The challenge now is to prove that our work is what we say it is. Effective. And that takes some thought around data and decisions.

 

* The PPA – or Programme Partnership Arrangement – has been one of the principal mechanisms through which DFID funds civil society organisations. In the current funding round (2011–16), DFID provided a total of £120 million a year to 41 organisations, with grants ranging from £151,000 to £11.2m a year. Through the PPAs, DFID supported civil society organisations that share its objectives and have strong delivery capacity. It provided organisations with ‘unrestricted’ funding, giving them the flexibility to follow agreed strategic priorities. For more on PPAs see the Independent Commission on Aid Impact’s 2013 review of the PPA funding mechanism or this post on the Guardian’s secret aid worker’s diary.
