How far does a $5000 grant go, and how much evaluation can we ask for in return?

By Jen Riley, chief impact officer, SmartyGrants

Small grants play a large role in keeping our communities ticking over. Small grants pay for things like air-conditioning in the community hall, uniforms for the netball team, music stands for the senior citizens band, and the security system in the local art gallery. Small grants are what enable many community-based organisations to run events and connect people. When it comes to having a social impact, small grants punch above their weight.

SmartyGrants chief impact officer Jen Riley

At SmartyGrants, one of our main aims is to help funders to understand what works. If you want to improve a community’s health, should you fund small grants that will go towards nutrition education, a food bank, exercise classes, or all of the above? And how will you know when communities are healthier? How are you going to measure health?

Around $80 billion is given out in grants each year in Australia, according to our best calculations at SmartyGrants. Yet in many instances no one is quite sure what the purpose of that funding is, and no one knows what it has achieved.

This is true of $1 million grants and it’s also true of $5000 grants. Recently we’ve heard from several SmartyGrants users who operate at the lower end of that scale and who want to know:

  • What outcomes can we reasonably expect from our $5000 grants?
  • What measurement and evaluation can we reasonably ask recipients to carry out in return for a $5000 grant?

With small grants, the challenge is getting the right balance between merely reporting outputs (e.g. the number of uniforms given out to netball players or the number of street festivals run) and collecting useful and appropriate outcomes data, while not overburdening grantees. Keep in mind that when we’re talking about $5000 grants, we’re usually (though not always) talking about small community organisations run entirely by volunteers. Volunteers typically don’t have degrees in statistics, and their organisations don’t employ paid outcomes measurement specialists. The hours that volunteers put in come on top of all their commitments to paid work, caring responsibilities and family life.

How far does your funding go?

There is generally a strong correlation between significant, long-lasting outcomes and large budgets. Similarly, there’s a strong correlation between shorter-term effects and small budgets. (The holy grail of change-making is long-lasting outcomes that can be brought about by low-cost inputs.)

Typically, outcomes ranging from short-term to long-term come with the following price tags attached:

[Table: typical funding levels associated with outcomes ranging from short-term to long-term]

Now let’s look in more detail at a grant in the range $2,000 to $15,000. Below, we’ve listed some examples of potential long-term outcomes (beyond the scope of a small grant), shorter-term outcomes (directly related to a small grant), outcome statements and metrics associated with such a grant.

[Table: example long-term outcomes, shorter-term outcomes, outcome statements and metrics for a grant of $2,000 to $15,000]

It's perfectly reasonable for a grantmaker to expect to see some change for a budget of $5000. You would be unlikely to achieve all of the outcomes listed, but you could make some decent headway on at least one of them for a carefully defined and targeted cohort of participants.

If your ambition is to change behaviours, attitudes, social norms or social systems, then you will almost certainly need to provide more than $15,000 in funding. The work required to achieve these “deeper” outcomes is likely to require a higher “dose” of your intervention (e.g. multiple financial literacy classes), a higher quality intervention (e.g. not just education classes, but maybe an experience) and a wider reach for your intervention (e.g. making the initiative more accessible in order to reach more people, including those who need it most).

Small grants: the burden of reporting

The next question is about measurement and evaluation. What information should you be asking recipients of small grants to provide on this front?

In international aid, many practitioners use the 10–20% rule for monitoring and evaluation. That is, for every $100 spent on projects, $10–$20 is spent on the monitoring and evaluation of that project. This rule applies to all NGO funding under Australia’s Department of Foreign Affairs and Trade (DFAT) grant program.

For example, if DFAT granted $300,000 for a project designed to help war widows in Afghanistan to establish small businesses, the department would expect that $240,000 to $270,000 would be spent on the program itself, and $30,000 to $60,000 would be directed towards establishing measures, gathering data, evaluating outcomes, and reporting.

So, what reporting can we expect for our $5000 project? Applying the lower end of that rule (10%), it is reasonable to expect $500 to be spent on collecting and reporting on evaluation data. In reality, $500 doesn’t go far, but the money could be added to the pool that the organisation uses to employ an evaluation specialist (if an organisation receives 20 grants a year, these small amounts add up). Or that $500 could be used as part-payment for an admin person to contact people who participated in the program, ask them some questions and collate the results.
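For readers who like to see the arithmetic spelled out, here is a minimal sketch of the 10–20% rule in Python. The function name `evaluation_budget` is ours, invented for illustration; the percentages are the ones described above.

```python
# A hypothetical helper illustrating the 10-20% monitoring-and-evaluation
# rule: for every $100 granted, $10-$20 goes to measurement and reporting.

def evaluation_budget(grant_amount, low=0.10, high=0.20):
    """Return the (minimum, maximum) dollars a funder might expect
    to be spent on monitoring and evaluation for a given grant."""
    return grant_amount * low, grant_amount * high

# The DFAT-style example above: a $300,000 project
print(evaluation_budget(300_000))  # (30000.0, 60000.0)

# A $5000 small grant
print(evaluation_budget(5_000))    # (500.0, 1000.0)
```

Run against the $500,000 fund discussed below, the same arithmetic gives the $50,000 to $100,000 reporting expectation at the whole-of-fund level.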

However, data like this – discrete and very limited in scope – creates a quandary for grantmakers. That small grant of $5000 is likely to be just one slice of a much larger fund – let’s say it’s worth $500,000. As a grantmaker working to the reporting ratios mentioned above, you would expect to obtain $50,000 to $100,000 worth of reporting against your overall fund. But because of the way the funding is distributed, grantees will (and should) provide only a limited amount of data. The whole will not be equal to the sum of the parts.

To overcome this problem, as grantmaker you could increase the size of each grant, and give out fewer of them. Giving five recipients a grant of $100,000 each would result in five relatively comprehensive reports that could be combined into something much more substantial than you would get from combining the disparate data of 100 recipients of $5000 grants.

However, fulfilling your purpose as grantmaker is your first priority. If your aim is to improve netball accessibility and participation across many communities, then it makes sense to offer a uniform support grant of $5000 to 100 communities. Changing those grants to $100,000 each across five communities would not achieve the same outcomes.

The other option is for the granting organisation to bear the cost and do the work of evaluation itself. The challenge with this, of course, is the missed opportunity for the grantees to evaluate and learn about their own work.

Grant managers often feel caught at the intersection between their grantees and their board, manager or auditor (whoever they are accountable to). If you’re in this situation, check in with your management stakeholders for guidance on what reporting they expect. Their expectations might not be as demanding as you’d think. One SmartyGrants user told us the story of asking the question and finding out that his stakeholders wanted to know how many people were employed through the programs they funded – nothing more. He set up a schedule to report those numbers and a system to record the data, and now everybody is happy.

On the other hand, if the expectations of the board or trustees are disproportionate to the size of the grants you are giving out, particularly in terms of the work required to collect the data, then you as grant manager may need to educate them about what is possible given the structure of your grants program. Just as you wouldn’t (we hope) require applicants to fill out a 20-page application form to have a chance of winning a $5000 grant, you shouldn’t require them to provide 200 pages of analysis of the local and national impact of that same $5000 grant. Your board or trustees might need to reset their expectations or change the structure of the grants program and therefore the reporting.

Providing evaluation support to grantees

We’re often asked what responsibilities grantmakers have to help or enable grantees to undertake evaluation. There are many things grant managers can do on this front:

  • When creating forms, limit the number of outcomes that grantees can report against
  • Provide some measurement and evaluation training to potential applicants, such as an outcomes 101 session
  • Work with grantees to help them ensure that the board of the recipient organisation has reasonable expectations when it comes to reporting
  • In collaboration with grantees, identify realistic outcomes and metrics that grantees can report against (or establish and communicate these)
  • Be very clear about your reporting expectations, providing as much guidance and training as possible, with examples.

All of these things hold true for grants of all sizes.

For small grants, though, the greatest act of support you can show to your grantees is understanding. Show them that you’re aware that every dollar you ask them to spend on evaluation, and every hour you ask a volunteer to dedicate to the task, is a dollar or an hour they no longer have for other things. At the same time show them, with compelling examples, the importance of the task, and how they can carry it out in a proportionate way. Measurement and evaluation shouldn’t be onerous for grants of any size, but especially not for small ones.

Ask Jen more about this topic

SmartyGrants’ chief impact officer Jen Riley has more than 20 years’ experience in the social sector, having worked with government and large not-for-profits of all kinds in that time, and been part of leading firm Clear Horizon Consulting. She’s a specialist in social sector change with skills in strategic planning and in program and product design and management. If you’ve got a pressing question about evaluation and outcomes measurement, ask here! You’ll find the answers on the SmartyGrants forum (available to SmartyGrants users).
