Our three golden rules for reporting

Street League’s fantastic new campaign calls for more charities to address low public confidence in the sector by being more transparent around impact.

To truly support those in need, we have to accept that we don’t always get the results we want, and we have a duty to report this with accuracy and transparency. Balancing the powerful stories of our young people with the bare facts is a key part of this, which is why, like Street League, we pledge to uphold the three golden rules for reporting:

1. We will never overclaim what we do:

We all need to be alert to this in how we use statistics. There is increasing pressure from funders to appear bigger and to work with larger numbers.

We specifically target young people most at risk of future unemployment, who need the most support in order to make a successful transition from school to sustained employment. Nearly three years ago we decided to halve the number of young people each Progression Coach works with, so that they can provide the intensive support this high-risk group needs.

Rather than claiming to work with large numbers, we focus on the ‘distance travelled’ by each young person and the intensity of support needed to achieve it.

We work with external, independent evaluators to undertake studies of the impact of ThinkForward. We have recently received the results of two evaluations, and will be sharing some of the findings and learnings from these in blogs over the next few days.

2. All percentages are backed up by absolute numbers to avoid being misleading:

We have a huge range of data on the more than 1,400 young people we have supported over the past five years. This includes information on their improved attendance and behaviour at school, results in qualifications, progression to college or employment, and whether this was sustained. We report on this to funders and other stakeholders, and look to provide both absolute numbers and context (e.g. survey response rates) when doing so.
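To illustrate why this rule matters, here is a minimal sketch of reporting a percentage alongside the absolute numbers behind it. The function name and all figures are hypothetical, chosen only to show the principle; a bare “75%” hides whether it rests on 4 respondents or 400, and on what response rate.

```python
def report_percentage(successes: int, respondents: int, cohort: int) -> str:
    """Format an outcome percentage with the absolute numbers behind it."""
    pct = 100 * successes / respondents
    response_rate = 100 * respondents / cohort
    return (f"{pct:.0f}% ({successes} of {respondents} respondents; "
            f"{response_rate:.0f}% response rate)")

# Hypothetical figures for illustration only:
print(report_percentage(90, 120, 300))
# 75% (90 of 120 respondents; 40% response rate)
```

Stated this way, a reader can immediately judge how much weight the headline percentage deserves.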

3. All our outcomes could be independently audited:

We ask all our Coaches to collect evidence from third parties regarding outcomes. This might be letters from employers confirming jobs, copies of pay slips or exam results, or school data on changes in attendance or behaviour.

Previously, a large proportion of our funding came through a social impact bond, which required us to submit the above evidence to receive funding. This ended in 2015, but we still feel it is vital to retain this external evidence of the impact of our work.

Using external organisations to undertake evaluations of the impact of ThinkForward helps ensure these studies are independent.

 

ThinkForward reports on the impact of our activities in a range of ways consistent with Street League’s Call for Clarity, and with the broader responsibility which we, like Street League, believe the charity sector has to be clear and transparent when reporting impact.

– Article written by Luke McCarthy, Programme Development Manager at ThinkForward

ThinkForward Evaluation: cost benefit analysis

The cost benefit analysis explored the financial implications of the outcomes achieved by young people supported by ThinkForward. It drew on data from ThinkForward London schools and used the New Economy Manchester model to estimate the financial implications of those outcomes. The study took a very conservative approach to assessing potential savings, for example only treating young people who had previously had two or more exclusions from school as being at risk of permanent exclusion.

The evaluation concluded that every £1 spent resulted in cost savings of £2.48. The biggest projected savings, averaged per participant over the course of their working lives, are:

  • £768 to schools/local authorities due to young people being less likely to be permanently excluded
  • £544 to the Department for Work and Pensions through a reduction in Jobseeker’s Allowance payments
  • £410 to the criminal justice system through a reduction in crime-related costs, as young people are less likely to be involved in criminal activity
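The headline figure is simply total savings divided by programme cost per participant. A minimal sketch of that arithmetic, using the per-participant savings from the bullets above (note the per-participant programme cost here is a hypothetical placeholder — the post reports only the resulting £2.48 ratio from the full model):

```python
# Per-participant savings reported by the evaluation (GBP):
savings = {
    "schools/local authorities (exclusions avoided)": 768,
    "Department for Work and Pensions (Jobseeker's Allowance)": 544,
    "criminal justice system (crime-related costs)": 410,
}

def benefit_cost_ratio(total_savings: float, cost_per_participant: float) -> float:
    """Pounds saved for every pound spent on the programme."""
    return total_savings / cost_per_participant

total = sum(savings.values())   # 1722
hypothetical_cost = 700.0       # illustrative placeholder, not from the post
print(round(benefit_cost_ratio(total, hypothetical_cost), 2))  # 2.46
```

The full evaluation also discounts savings accruing over participants’ working lives, which is why the simple sum above is only a rough sketch of the method.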

This evaluation was undertaken as a Capstone project by a candidate on the Executive Master of Public Administration in Global Public Policy and Management, offered jointly by UCL in London and NYU in New York, and was supervised by an expert in cost benefit analysis at NYU.

 

ThinkForward Evaluation: outcomes study

We have completed two studies into the outcomes that ThinkForward achieves.

Education Endowment Foundation Randomised Control Trial:

ThinkForward took part in the randomised control trial (RCT) to help the EEF test the effectiveness of two different RCT methodologies, which would then inform the design of a larger randomised control trial. The RCT was conducted over two academic years, with randomisation at both school and pupil level. Regarding methodology, the trial found that randomisation was most effective at school level. On educational attainment, the study did not find evidence of improvement in GCSE results. However, teachers, Coaches and young people across both schools reported that they believed the programme was beneficial, and the study notes improvements in young people’s behaviour.

For more information on the RCT please click here.

London School of Economics Outcomes Study:

Following the RCT, we ran a second outcomes study exploring the correlations between behaviour, attendance, attainment, Ready for Work capabilities, and baseline factors such as gender and ethnicity.

We ran this across three ThinkForward schools where we have implemented an updated programme design, in which our Progression Coaches work with the ten most at-risk young people per year group, starting from the beginning of year 9. This change was based on our analysis of the first two years of delivery and a “Theory of Change” process, applying lessons learned to redesign the programme to deliver even better outcomes for young people.

The key findings of this study were:

  • statistically significant increase in attendance in year 9 of 8.5 percentage points (from 79.3% to 87.8%)
  • statistically significant improvement in behaviour points in year 11
  • statistically significant worsening of behaviour at one school in year 9 (we are exploring the reasons for this with the school’s Progression Coach, including whether particular factors contributed to the observed change or whether the school’s approach to behaviour shifted during the period)
  • increases in all Ready for Work capabilities in year 9

This evaluation was undertaken by a postgraduate student studying Management Science at London School of Economics, supervised by an experienced statistician and with input from Project Oracle. We are currently working with Project Oracle to get this evaluation validated against their ‘standards of evidence’.