
2. Evaluation Methodology

initial K10 score. For this category, significance is measured at the 5% level. A clinically significant change occurs when the change in K10 is both reliably significant and also moves the headspace client below or above the threshold K10 that represents a benchmark for the general population. A clinically significant improvement can be regarded as a change sufficient to revert the client to a level of psychological functioning that is consistent with that of a functional population.

Thresholds for clinical significance require information on the distribution of K10 scores for a general population as a comparison. The CSC analysis presented in this report derives threshold K10 scores from information in the comparison surveys. The analysis presented in Table 4.1 shows the mean change in K10 scores for each category of change. This analysis shows, for example, that those young people in the clinically significant improvement category had a mean reduction between first and last recorded K10 of 14.6 points; those in the insignificant improvement category had a mean reduction of 4.8 K10 points; and those in the clinically significant decline category had a mean increase in psychological distress, as measured by the K10, of 14.1 points. Results of these methods of analysis (DID and CSC) are reported in Chapter 4.
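To make the classification concrete, the sketch below implements a Jacobson-Truax style categorisation of the kind described above in Python. The reliability, standard deviation and population threshold values are placeholder assumptions for illustration only; the evaluation derives its actual thresholds from the comparison surveys, and only the category labels mentioned in the text are taken from the report.

```python
import math

# Placeholder assumptions for illustration; not the evaluation's actual figures.
K10_RELIABILITY = 0.90   # assumed test-retest reliability of the K10
K10_SD = 8.0             # assumed SD of K10 scores in a reference population
K10_THRESHOLD = 20.0     # assumed benchmark separating clinical from general population

SE_MEASUREMENT = K10_SD * math.sqrt(1 - K10_RELIABILITY)
SE_DIFFERENCE = math.sqrt(2) * SE_MEASUREMENT   # standard error of a change score

def classify_change(first_k10: float, last_k10: float) -> str:
    """Classify change between first and last K10 using reliable change at the
    5% level plus crossing of the population threshold (clinical significance)."""
    change = last_k10 - first_k10                          # positive = more distress
    reliable = abs(change) / SE_DIFFERENCE > 1.96          # reliably significant at 5%
    crossed_down = first_k10 >= K10_THRESHOLD > last_k10   # fell below the benchmark
    crossed_up = first_k10 < K10_THRESHOLD <= last_k10     # rose above the benchmark

    if change < 0:  # distress decreased
        if reliable and crossed_down:
            return "clinically significant improvement"
        return "significant improvement" if reliable else "insignificant improvement"
    if change > 0:  # distress increased
        if reliable and crossed_up:
            return "clinically significant decline"
        return "significant decline" if reliable else "insignificant decline"
    return "no change"

print(classify_change(32, 17))  # a 15-point drop crossing the threshold
```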

2.5 Evaluation Limitations

As in all evaluations of complex human service programs, a number of unanticipated challenges were encountered during the course of the evaluation. It is important to consider these challenges and limitations when interpreting the findings presented throughout this report. This section provides a summary of the issues that were of particular significance for the evaluation.

Attribution

Attribution is a challenge in any evaluation, particularly those without an experimental design such as a fully specified Randomised Controlled Trial (RCT). Due to the diversity of the headspace treatment, clients, and service providers, it is neither feasible nor reasonable for an RCT to be conducted to assess the overall impact of headspace services on young people accessing headspace centres across Australia. Given this, one approach to attribution is to exploit the existence of a ‘natural experiment’. This method compares the relative progress of a ‘comparison’ group of young people, who can be considered similar in their economic and social circumstances and with similar presenting conditions, to the headspace treatment group. To achieve a closer degree of alignment between the treatment and comparison groups, the evaluation team matched the two samples on a set of observed characteristics using propensity score methods (see Appendix C for further information). This can provide some degree of identification of effectiveness in principle, although there are limitations with this approach relative to full RCT methods.
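A minimal sketch of the matching step is given below, assuming a pandas DataFrame with a binary treatment indicator and a small set of observed covariates. The covariate names (age, k10_wave1, female) and the synthetic data are illustrative assumptions, not the evaluation's actual matching variables or sample.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def match_nearest_neighbour(df: pd.DataFrame, covariates: list[str],
                            treatment_col: str = "treated") -> pd.DataFrame:
    """1:1 nearest-neighbour matching on an estimated propensity score."""
    # Estimate each person's probability of being in the treatment group
    # from the observed covariates (the propensity score).
    model = LogisticRegression(max_iter=1000)
    model.fit(df[covariates], df[treatment_col])
    df = df.assign(pscore=model.predict_proba(df[covariates])[:, 1])

    treated = df[df[treatment_col] == 1]
    control = df[df[treatment_col] == 0]

    # For each treated person, pick the control with the closest propensity score.
    nn = NearestNeighbors(n_neighbors=1).fit(control[["pscore"]])
    _, idx = nn.kneighbors(treated[["pscore"]])
    matched = control.iloc[idx.ravel()].reset_index(drop=True)
    return pd.concat([treated.reset_index(drop=True).add_suffix("_t"),
                      matched.add_suffix("_c")], axis=1)

# Illustrative use with synthetic data and made-up covariate names.
rng = np.random.default_rng(0)
demo = pd.DataFrame({
    "age": rng.integers(12, 26, 500),
    "k10_wave1": rng.integers(10, 50, 500),
    "female": rng.integers(0, 2, 500),
    "treated": rng.integers(0, 2, 500),
})
pairs = match_nearest_neighbour(demo, ["age", "k10_wave1", "female"])
print(pairs.head())
```

As the report notes, matching of this kind can only balance the covariates that are observed and included in the model; any unobserved differences between groups remain.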

Comparison groups

As outlined above, the young people surveys were completed by a sample of headspace clients and two comparison groups: a sample of 12-17 year olds who participated in Young Minds Matter, a national survey of children’s health and wellbeing; and a sample of 18-25 year olds sourced through a national online panel. For purposes of the evaluation, the comparison group was separated into a ‘no treatment’ group of young people from the general population who had not accessed headspace or any other treatment for a mental health or drug and alcohol condition, and an ‘other treatment’ group who received alternative forms of mental health care between the two waves of data collection. Due to data limitations, the evaluators are not able to assess the type, intensity or duration of the alternative treatment received by young people in the ‘other treatment’ group.

In order to attribute changes in the intervention group to headspace, the comparison group should be as representative as possible of the headspace population in terms of demography and wave 1 levels of psychological distress. Comparative analysis of demographic data showed that the 18-25 year old comparison group was somewhat different to the headspace population. To address this issue, the evaluators undertook propensity score matching of survey groups. This method allows for a closer comparison between the ‘headspace treatment’ and comparison cohorts, but it is not a perfect comparison. The evaluators were unable to match on more than four variables without significant differences in distributions, and the method does not account for unobserved differences between the treatment and comparison cohorts.
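A common way of checking whether matched groups remain balanced on a covariate, of the kind behind the four-variable limit noted above, is the standardised mean difference. The short sketch below, under the same illustrative assumptions as the earlier example, shows how such a check might be computed.

```python
import numpy as np
import pandas as pd

def standardised_mean_difference(treated: pd.Series, control: pd.Series) -> float:
    """Difference in group means scaled by the pooled standard deviation.
    Values near zero indicate balance; a large absolute SMD is commonly
    read as a sign that the covariate's distributions still differ."""
    pooled_sd = np.sqrt((treated.var() + control.var()) / 2)
    return (treated.mean() - control.mean()) / pooled_sd

# e.g. compare an illustrative covariate between matched groups
print(standardised_mean_difference(pd.Series([18, 20, 22]), pd.Series([19, 21, 24])))
```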

