Anecdotal evidence

Anecdotes are stories people tell about their experiences. When we talk about anecdotal evidence, we mean that someone is trying to prove something based only on the stories they happen to have come across. The trouble is, those stories might not represent the whole picture. This is why it’s best not to rely only on anecdotal evidence, but to gather information in a more systematic way – checking that we are hearing from all sorts of people and asking them questions in the same way.


Approach

In this guide we’ve used the word ‘approach’ to mean the way you set about measuring impact. We use this word instead of ‘tool’ or ‘framework’ because those words usually relate to ‘named approaches’ like Social Return on Investment, Social Accounting and Audit, The SOUL Record and The Outcomes Star. Named approaches are only half the story. You can also work out a ‘bespoke’ way of exploring impact – that is, an approach tailored to your needs that builds on general social research methods.


Assumptions

Assumptions are the things that have to be true for your findings to be true. Calling these underlying things assumptions rather than facts can sound bad, but it recognises that one person’s fact can be quite different to another’s. People’s understanding can also change over time when new evidence comes to light. To make sure everyone knows what underlies your judgements, it’s good practice to be clear about what your assumptions are.


Baseline

An indication of how things stand before you carry out your activity. For instance, if you want to know whether you’ve had an impact on people’s eating habits, you have to know what their eating habits were before you tried to change them. Without this basic knowledge of what came before, there is no way of telling whether anything has actually changed.


Bespoke

Tailored to your organisation’s needs (see ‘Approach’ above).


Coding

The process of looking at a large amount of unstructured information – such as notes or transcripts from interviews – and noting down whether the contents relate to recurring topics. By doing this you start to see which topics are common to all instances and which are particular to individuals. Coding can be more complicated than this, but it’s essentially about identifying themes in a mass of information.
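As a minimal sketch of what this looks like once the reading and noting-down has been done (the interviews and theme names below are invented for illustration), you can tally how often each theme recurs across your notes:

```python
from collections import Counter

# Hypothetical example: each interview transcript has been read and
# hand-coded with the recurring topics it touches on.
coded_interviews = {
    "interview_1": ["confidence", "cooking skills", "friendship"],
    "interview_2": ["cooking skills", "saving money"],
    "interview_3": ["confidence", "cooking skills"],
}

# Tally how often each theme appears across all the interviews.
theme_counts = Counter(
    theme for themes in coded_interviews.values() for theme in themes
)

# Themes mentioned by everyone stand out from one-off remarks.
for theme, count in theme_counts.most_common():
    print(f"{theme}: mentioned in {count} of {len(coded_interviews)} interviews")
```

Here ‘cooking skills’ appears in every interview, so it is a common theme, while ‘friendship’ and ‘saving money’ are particular to individuals.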

Community Food Initiative

We use the term ‘community food initiatives’ to cover a wide range of organisations, associations, projects and enterprises. Community food initiatives can be charities, social enterprises, private businesses, community groups, public initiatives and projects. They can work on farming or growing activities, food processing or retailing, catering, training or awareness-raising, or activities to support community members in general or particular members of the community facing a health or social disadvantage. The thread running through all of these organisations is that their activity involves food and that they aim to have an effect on a particular set of people and/or a place.


Engagement

Engaging with people is more than just asking them for specific information. In the context of measuring impact, it’s about asking people what they think without setting limits on the type of thing they can tell you, listening carefully to the answers, taking the feedback on board and asking people to be involved in an ongoing process. In a wider context, engagement means involving people in the development of your activities.


Ethics

The rules and good practice to follow to make sure that what you are doing does not harm you or the people you are studying.


Evaluation

Using data – whether monitoring, quality or outcomes data – to make a judgement about how the initiative or project is doing. Impact evaluation focuses on change, whereas process evaluation focuses on things like satisfaction, met expectations and communication.


Facilitator

Someone – whether from inside or outside the initiative – who makes sure that group sessions run smoothly by checking that people are in the right place, are doing what they should be and have all they need to carry out their tasks. The role is about being objective and ensuring that everyone gets a chance to have their say.


Framework

A framework tells you what each step in your approach to assessing impact should look like, but it doesn’t tell you which exact data collection methods to use or questions to ask. Social Return on Investment is a framework approach.


Impact

An understanding of the effect of all the different outcomes created by your organisation as a whole – encompassing positive, negative, expected and unexpected change. In this guide we focus on ‘social impact’ – what happens as a result of your initiative for people, organisations and society as a whole. You might be measuring environmental impact as well.


Indicators

Indicators are propositions that you can test to help pin down whether something has changed. For instance, you might be hoping to help people overcome agoraphobia. Comparing the number of times a person leaves their house after your initiative with the number of times they left it before their contact with you will tell you whether they are overcoming agoraphobia. That comparison is an indicator.
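A sketch of how testing an indicator like this might look with the data collected, using invented names and numbers for how often each (hypothetical) person leaves the house per week before and after the initiative:

```python
# Hypothetical data: how many times each person left their house per week,
# recorded before and after taking part in the initiative.
before = {"Alia": 1, "Ben": 0, "Cara": 3}
after = {"Alia": 4, "Ben": 2, "Cara": 3}

# The indicator: the change in how often each person goes out.
change = {person: after[person] - before[person] for person in before}

for person, diff in change.items():
    if diff > 0:
        verdict = "goes out more often"
    elif diff < 0:
        verdict = "goes out less often"
    else:
        verdict = "no change"
    print(f"{person}: {before[person]} -> {after[person]} times per week ({verdict})")
```

Without the ‘before’ figures (the baseline), the ‘after’ figures on their own could not tell you whether anything had changed.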


Monitoring data

Monitoring data is the information you collect about what has actually gone on within your initiative – visitor/customer/beneficiary numbers, information on gender, age and ethnicity, course numbers and types, weight of vegetables sold and so on. While this gives you a good, solid understanding of your activity, it’s important to remember that it doesn’t tell you what happened as a result. To know that, you need data on change – outcomes data (see ‘Outcomes’ below).

Named approach

See ‘Approach’ above.


Non-intrusive

A method of data collection that doesn’t feel like hard work or a drain on time, either for the people being asked to give their feedback or for the people asked to process the feedback once it’s been collected.


Outcomes

The individual changes that happen as a result of your initiative. These can be things like:
• Volunteers eat more vegetables
• Customers buy more local food
• Supported employees improve their catering skills to professional level
• People cook more meals from scratch


Processing

In this guide we often refer to ‘processing’ data rather than just analysing it. This recognises that before you can count things, calculate statistics or draw out themes, the raw feedback people provide needs to be collected together in one place and organised – in a spreadsheet or a word-processing document – so that the data can be worked with. It’s important to assign responsibility within your initiative to someone who will process the data in this way – having someone willing to do the analysis will be no good if the data is still sitting in a cupboard on 100 separate paper copies of a questionnaire.
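As an illustration of what ‘collected together in one place and organised’ can mean in practice, this sketch (with invented questionnaire answers) gathers typed-up responses into a single CSV table that can be opened in any spreadsheet program:

```python
import csv
import io

# Hypothetical raw feedback: one record per paper questionnaire,
# typed up as each one is transcribed.
responses = [
    {"respondent": 1, "ate_more_veg": "yes", "comment": "Loved the courses"},
    {"respondent": 2, "ate_more_veg": "no", "comment": "Too far to travel"},
    {"respondent": 3, "ate_more_veg": "yes", "comment": ""},
]

# Write everything into one CSV table so counting and analysis
# can happen in a single place rather than across 100 paper copies.
output = io.StringIO()
writer = csv.DictWriter(output, fieldnames=["respondent", "ate_more_veg", "comment"])
writer.writeheader()
writer.writerows(responses)
print(output.getvalue())
```

In real use you would write to a file rather than to `io.StringIO`; the point is that once the answers sit in one structured table, the counting and theming described elsewhere in this glossary becomes straightforward.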


Robust

A word that’s used a lot to describe research that you can rely on. How the research is carried out, the assumptions people make and the methods used to gain feedback all contribute to how robust the research is. Being able to justify your decisions and explain your assumptions when carrying out the research is part of making the case that your approach is robust.


Satisfaction

Quality assurance and quality standards are about the way things are done in an organisation. Most commonly they are not also about the results of that work, but there are a few tools and frameworks available (for instance PQASSO) which combine the two. People quite often mistake information on satisfaction (quality data) for information on impact (outcomes data).


Stakeholders

Literally, people who have a stake in your organisation. Your stakeholders are:
• The people who get something out of your organisation, including pay (your staff), skills and experience (your volunteers), support or training (your beneficiaries) and goods or services (your customers).
• The people who put something into your organisation, including money (funders, investors, customers), time (volunteers and partners), goods in kind (donors) and referrals (referral or partner agencies).
• Anyone these groups interact with who also experiences change or puts in effort to make the initiative happen.


Themes

See also ‘Coding’ above. Themes are recurring topics that become apparent when you read through a mass of complicated information and start to make sense of what it is telling you.


Tool

See ‘Approach’ above.


Transparency

When you write up your findings, you need to let your reader know how you reached your conclusions. This includes writing down:
• what you collected data on
• the methods you used and the questions you asked
• how many people you could have contacted, how many you did contact and how many replied
• the numbers of people who said what
• the way you processed and analysed the data
• any method that you suspect did not produce clear feedback
• what you intend to do with the findings now you have them.
In this way, your research methods and assumptions are transparent.
