Level 3 Learning and Development - Month 6
Evaluating the Impact of Learning – Delivering on Business Needs
Month 6 - Theme 3
Level 3 Learning and Development Practitioner
Apprenticeship Standard
CONTENT

Introduction to Month 6
Reviewing learning from Month 5
Why Bother with Evaluation?
Back to the Basics of Evaluation
Evaluation Theory and Thinking
Evaluation Processes and Practices
Planning your Approach to Evaluation
Getting Creative and Flexible with Evaluation
Ensuring your Chosen Tools are Fit for Purpose
Collecting Information on Learning Transfer & Performance Improvement
Presenting your Findings and Feedback
Over to You - Your Tasks to Complete This Month

Introduction to Month 6
Welcoming you to Month 6 of your L&D Journey

Welcome to Month 6 of your Learning & Development Practitioner journey where, this month, we unpick the evaluation process and explore the theory and common practices behind the evaluation stage.

This month you will explore:
• Why we bother to evaluate
• How and what we should be evaluating
• Things to consider when analysing the data and presenting your findings

Much like Learning Needs Analysis (LNA), the evaluation process can often be overlooked by practitioners and undervalued by stakeholders, yet it is a critical piece in understanding the value of an intervention.

If we have not resolved the declared problem or met the need, we have not moved the organisation towards achieving its declared objectives. If we have not checked learning transfer or performance improvement, then we have wasted our resources: expertise, time and money.
Icon Key

To support you in understanding all the elements of the programme, everything has an icon to denote what kind of activity it is:
• Taught Content to Learn/Information
• Questions
• Tips
• Research
Level 3 Learning and Development Practitioner | Month 6
All rights reserved copyright © 2021 Remit Group
Let’s do some light-touch evaluation of your own learning!

So, in service of checking learning transfer, let’s begin by checking in on some of your learning from last month. Reflect back and answer the following on designing interventions:

• There are nine key stages in the design process; name them and describe which resonated most with you, and why.
• How would you describe the difference between Aims and Objectives?
• We discussed three levels of objectives; can you describe what they were and how they are all useful in their own way?
• How does David Rock’s SCARF model impact learning design?
• What were the two key takeaways that you took from Module 5, why were they impactful for you, and how will you transfer these back into the workplace?

Capture your responses in your Learner Journal.
Why Bother with Evaluation?

If you think back to the Training Cycle, evaluation is literally at the heart of the training process. Yet, much like LNA, it is frequently missed or relegated to a marginalised role, possibly because, like most of an iceberg, it is less visible.

The CIPD factsheet on evaluation is a great summary of evaluation processes and practices, but more insightful is the honest article by Jo Faragher. And this is the reality of evaluation: there is a process and some tools, but the next level of learning around evaluation centres on the associated challenges and the tricks to making it work for you.

You can learn the “what” pretty easily, and we will. The challenge is developing your thinking around the “so what”, so you can optimise its impact for you, the organisation and the learner.

Evaluation can, understandably, be seen as dry and disengaging; it can also be poorly understood. So, if our experience of evaluation processes and their success is negative or marginal, it is understandable why there is limited appetite to engage with it more rigorously.

Yet evaluation is the mechanism through which we ensure learning, and its effective transfer, in order to improve performance. Not only does it ensure that the need has been met, but also that the resources expended have been effectively used.

A more active and inclusive way of ensuring evaluation is alive and well, and front and centre in your thoughts, is to have it as an integral check-in point at each stage of the training cycle: during the LNA, design and delivery stages. It makes sense, and it removes some of the stigma around seeing evaluation as a dry, transactional experience.

It remains ironic that, with time and money being premium resources and organisations needing to work hard to engage, enable and retain their people, evaluation either remains an afterthought or is a transactional experience focusing on performance in role.
This raises two thoughts:

1. Should we be bothering with evaluation at all?
2. Can we do it differently and reset our collective relationship to evaluation?

Matthew Channell presents an interesting argument for the need to reset our view of evaluation as the Holy Grail.
It is without doubt that evaluation presents benefits, such as:

• Measuring impact on performance, knowledge and skills
• Identifying where deltas still exist, and therefore what needs to be addressed
• Targeting better use of resources
• L&D doing its part to meet business objectives
• Supporting employee engagement and retention

But if we are not measuring the right things accurately, there is an inevitable argument to be had as to whether we should aim to evaluate interventions at all, or at least in the way we traditionally have.
Further Research:

Undertake some research of your own. Explore current thinking on the internet about the role evaluation plays as a useful part of the training cycle, and capture your thoughts on the following:

1. What are the key themes you are observing about evaluation?
2. Talk to a colleague in your organisation and understand, practically, which evaluation exercises are and are not undertaken, and why.
3. What thoughts are emerging for you on evaluation and its changing role within the L&D function?

Capture your responses in your Learner Journal.
Let’s Go Back to the Basics of Evaluation

Evaluation within the L&D environment is about measuring and reviewing different aspects of an intervention to establish whether it was effective and valuable in delivering a need. It can cover aspects such as its impact, its value and the extent to which it was engaging.

There is an argument to see it as the final stage of the training cycle, looking back at what has been undertaken. A more dynamic and relevant approach is to consider it both real-time and as a stage which is future-focused, much like LNA.

Evaluation does not need to be seen as a series of spreadsheets and data; it can also be seen as insight collected real-time in sessions, which is then used to inform adjustments in the design activities to ensure learning transfer takes place and is continuously improved. Seeing it as part of a continuous process of adjustment moves your evaluation practice into a more agile and manageable space, which is more engaging, relevant and impactful for you, the learner and your stakeholders.

If evaluation is about measuring and reviewing aspects of an intervention, there are other processes which it is useful to also understand, such as:

• Validation – Establishes a truth or corroborates something; for instance, was an objective met. Evaluation generally includes validation, but evaluation is larger in scope.
• Assessment – Centres on progress and measurement against set criteria; for instance, an exercise to measure learning level and retention before and after development.
• Testing – A more controlled and exact type of assessment.
• Monitoring and reviewing – A series of activities to understand a learner’s progress on a journey. A more dynamic and less exact approach, but likely to offer rich insight.
• Return on Investment (ROI) – A measure of financial return. A higher level of evaluation, with mixed success in being able to easily and accurately assess success.
• Return on Expectation (ROE) – Measures stakeholder satisfaction and whether the intervention met their expectations, which may be varied. An important approach, and one which requires early effort and ongoing stakeholder management.
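The ROI calculation itself is simple arithmetic; the difficulty lies in attributing a monetary benefit to the learning in the first place. As a minimal sketch, using the standard net-benefit-over-cost formula (the programme figures below are invented purely for illustration):

```python
def training_roi(benefit: float, cost: float) -> float:
    """Classic ROI formula: net benefit expressed as a percentage of cost."""
    return (benefit - cost) / cost * 100

# Hypothetical programme: £12,000 delivery cost, £18,000 of
# performance benefit attributed to the learning in the first year.
print(training_roi(18_000, 12_000))  # → 50.0
```

The arithmetic is the easy part; the contested step is always the benefit figure, which is why ROI is described above as having mixed success.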
The key message here is that there are numerous ways to capture data, which can be transformed into insight for use. If it is not the right insight, or if it is not subsequently used to effect, then the process of evaluation is redundant.
Let’s Look at Evaluation Theory and Thinking

The main character in the history of evaluation is Donald Kirkpatrick and his Model of Evaluation, and it is his model you will typically hear referred to within L&D circles. There are four levels of evaluation: Reaction, Learning, Behaviour and Results. The model is a helpful structure, which in recent years has been extended by Phillips and Phillips to suggest a fifth level, representing financial ROI. Whilst not perfect, it remains a helpful guide to realistically track your approach to evaluation.

Separate to this is Anderson’s Value of Learning model, which focuses on ensuring the learning intervention is aligned with, and focused on, the strategic needs of the business. A three-staged process, it aims to suggest best-fit-for-purpose interventions to meet needs.

Over recent years, alternative models have also been developed, such as Robert O. Brinkerhoff’s Success Case Model, which moves more towards using qualitative data. Interestingly, it first surveys learners to identify transfer of learning, and then uses storytelling for learners to unpick their actual learning transfer, rooted in their workplace experiences.

What is interesting about the use of storytelling and direct links to experiences is that it is engaging, directly relevant to the learner’s situation and demonstrates impact: some of the key themes in adult learning theory. Whilst not quantitative, it does allow for that aspect of enjoyment and a more human aspect to learning. If you’re interested in this, here’s a short article on the Success Case Model.
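The five levels are easy to hold in mind as a simple ordered mapping. The sketch below is illustrative only: the level names follow Kirkpatrick and Phillips, but the question wording against each level is our own paraphrase, not official model text:

```python
# Kirkpatrick's four levels, plus the Phillips ROI extension,
# each paired with the question it asks (wording is paraphrased).
EVALUATION_LEVELS = {
    1: ("Reaction", "Did learners find the intervention engaging and relevant?"),
    2: ("Learning", "Did knowledge, skills or attitudes actually improve?"),
    3: ("Behaviour", "Is the learning being applied back in the workplace?"),
    4: ("Results", "Did the business outcomes we targeted improve?"),
    5: ("ROI (Phillips)", "Did the financial return justify the cost?"),
}

for level, (name, question) in EVALUATION_LEVELS.items():
    print(f"Level {level} – {name}: {question}")
```

Note how the effort of evidence-gathering rises with each level: a Reaction questionnaire is cheap, whereas Results and ROI demand business data collected well after the intervention.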
Further Research:

Make yourself familiar with the three models mentioned above and share your initial thoughts about the benefits and disadvantages of each in comparison to the others.

Capture your thoughts in your Learner Journal.
Let’s Look at some Evaluation Processes and Practices

In a bid to make evaluation useful, engaging and valued, the trick is to keep it relevant and simple and, importantly, to keep it a dynamic, real-time experience.

Typically, the CIPD suggests a ten-point plan for planning and implementing evaluation practices:

Planning Evaluation
1. Identify the activity to be evaluated, i.e. the “scope”
2. Be clear on the purpose of the evaluation, i.e. what you are trying to achieve (the criteria) and why
3. Work out what information you will need to establish each criterion
4. Choose the right method and the right time to evaluate
5. Design a tool, e.g. survey, interview, observation
6. Embed this evaluation plan into your learning intervention plan and engage your stakeholders to agree your approach

Implementing Evaluation
7. Collate your insight or information
8. Analyse it as per your structured criteria
9. Draw your conclusions and proposed actions for improvement
10. Present your findings and recommendations to your stakeholders

It’s a sensible approach within which you can overlay your insight into the context and the organisation to establish the most effective way forwards. Let’s look at the steps in turn.
Planning your Approach to Evaluation

As we know, evaluation should not be an afterthought, but an ongoing part of your thought processes. When planning it, the key elements to consider are:

1. Scoping out what to evaluate

You can choose what it is you want to evaluate. It might be a single skill or practice, whether you are running a single session or a programme. The size of an intervention does not necessarily dictate a need to evaluate it all.

For instance, I was running a programme of Skill Bites across a community, but because my driver was primarily to re-engage these employees as part of a retention strategy, the primary evaluation focus was on engagement and wellbeing, not the skill build or learning transfer.

Scope is important as it allows you to focus in on the element to be evaluated without distraction or influence. It also provides clear context when you are presenting your findings. It is not unusual for stakeholders and business leaders to become excited about insight and seek more; setting boundaries prevents curiosity creating scope creep.

2. Clarifying your purpose

More often than not, the main purpose is to see if your Aims and Objectives have been met, which infers the desired impact on business performance. As you know, there needs to be a direct correlation from the LNA, through the aims and objectives, to the evaluation criteria.

Using the example above, the LNA presented a disengaged, disempowered, disenfranchised workforce. The Aim was to “make them feel loved, valued and invested in”. The single common Objective across the suite of nine Skill Bites was “to empower the individual with skills to feel empowered to manage their challenges in relation to xxx”. It followed that, although individuals were enhancing knowledge and skills in nine different areas, the evaluation exercise centred on their ability to feel empowered and therefore engaged.

Having clarity around scope and purpose, the evaluation criteria were:
• The impact of the development investment on their level of engagement with work
• The extent to which they feel empowered to take action in challenging circumstances
• The impact on retention of our people over time

More typically, the evaluation would focus on one of the skill or knowledge builds. For instance, if one Skill Bite was “How to Build Trust”, the purpose of the evaluation might be:
• The extent to which the Aims and Objectives of that session were met
• The learner’s response to the development and the methods used
• The cost and efficiency of the learning activity
From our scope and purpose, we establish our evaluation criteria. Using the Trust example, criteria might include:
• The extent to which the behavioural changes relating to building trust have been exhibited
• The resulting workplace improvements

These evaluation criteria direct us to the information we will need to collect and the questions we will need to ask. The most appropriate methods for gathering data depend on the context within which you are operating, accepted norms, digitalisation and culture.

A cautionary note continues throughout this module: evaluation is not an exact science, and it is sensible to manage your own expectations, and those of others, accordingly. It is also useful for you, as a Practitioner, to be confident enough to stand your ground when you are clear about, and have chosen, what you plan to evaluate and how.

The example of empowering and engaging a disenfranchised workforce as the primary driver was met with resistance from the Executive of an organisation accustomed to a more obvious aim and outcome from L&D interventions, until I reinforced the context and highlighted the critical need to have their people feel invested in.

3. Establishing the information you will need

With your scope and purpose clear, you have your criteria established. The information needed can then be confirmed.

Types of Information

For instance, using the re-engage, empower and retain example, we would need to gather data for the three criteria. To ensure we can show causal impact, we would need to show the following:
• Levels of engagement before and after the programme
• Levels of perceived worth and empowerment before and after the programme
• Attrition or “likely stay rates” before and after the programme

With this example, qualitative interviews, engagement and wellbeing surveys, and pulse “stay rate” surveys were the most appropriate approach to data gathering.
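Before-and-after comparisons like these need only simple descriptive statistics to show whether the needle has moved. A minimal sketch, with invented pulse-survey figures (1–10 engagement ratings; the numbers are illustrative, not real programme data):

```python
from statistics import mean

# Hypothetical pulse-survey results: 1-10 engagement ratings
# from the same six learners before and after the programme.
before = [4, 5, 3, 6, 4, 5]
after = [7, 6, 5, 8, 6, 7]

shift = mean(after) - mean(before)
print(f"Mean engagement before: {mean(before):.1f}")  # 4.5
print(f"Mean engagement after:  {mean(after):.1f}")   # 6.5
print(f"Average shift:          {shift:+.1f}")        # +2.0
```

A shift on its own does not prove the programme caused it; that is why the before-and-after framing above, applied consistently across all three criteria, matters.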
Sources of Information

Information can come from multiple sources, depending on the organisation. Equally, the nature of the issue and the culture can play a part in deciding where to gather the data. Rich sources of data can include:

• The learners themselves: their experience and the impact of their learning
• Line managers: what they observe
• The L&D Practitioner: what they see, test and experience with the learner
• Business intelligence and financials
• Customer or user experience surveys/feedback, qualitative and quantitative
• Peers and stakeholders, both internal and external, on the experience they have
• System and process adherence

It is also useful to look at what has and has not worked before in an organisation. Talk to colleagues, mentors and external providers to get others’ views on what has worked for them and why. Don’t be afraid to consider alternative approaches, and resist measuring everything simply because you can.
4. Picking the right evaluation method

Again, let’s keep it simple, impactful and engaging; we want people to become advocates of evaluation! Let’s look at a few methods available to you:

• The programme-end questionnaire – the one you used to complete as you were running out of the room; the “happy clappy sheet”
• Pre- and post-testing or assessment – be they tests or survey feedback
• Learner self-assessment – providing rich, directed information on skill enhancements, gaps, confidence and the ongoing management of barriers to further learning
• Line manager assessment – where structured and consistently applied, this can provide insight and subtle observational feedback on knowledge, skills and behaviours
• Surveys – usable in a plethora of ways, they offer qualitative and quantitative data which can be tracked over a period of time
• Focus or Learning Groups – useful real-time insight in a structured or unstructured format
• Metrics and Research – depending on the organisational setup and data capture practices, data is often readily available, or can be manipulated to allow you to extract insight

Whilst these are all established sources, consider other ways of capturing data or insight. Speak to experts and consider what else is available to you. Don’t be constrained by current practice, and be comfortable with data that is qualitative rather than clinical. Insight is insight.
Getting Creative and Flexible with Evaluation

In order to engage yourself and others in the evaluation process, don’t be afraid to get creative; and don’t get hung up on being quantitatively driven.

• Having learners self-assess their levels of engagement or knowledge around topics before and after, using emojis, selfies or cut-out images, gets them active, creative and reflective.
• A learning equivalent of a pub quiz, with teams competing and answering questions in groups to check learning.
• Undertake the 3-2-1 test, where the learner shares three aspects of their learning which impacted them and why, two people they plan to discuss their learning with, and one thing they plan to do differently tomorrow as a consequence of their learning.
• Ask the cohort to produce a single feedback questionnaire between them on learning, application in the workplace and continuing gaps for further self-development.
• Evaluation cafés are engaging, seemingly free-style sessions where learners and their managers unpick learning that has resonated and been successfully applied in the workplace, the impact on their performance and expertise, and continuing gaps for further development.
Whilst these may not signal obvious measurables, you will be getting qualitative feedback and encouraging further reflection, which helps consolidate learning, often in a social setting. Whilst you always need to make sure the approaches are appropriate for the group and the competence you are looking to evaluate, remember it is also about assessing learners’ levels of engagement and interest in the learning experience.

When you open your mind to the broader range of evaluative approaches, you can also release yourself from the constraints of when to evaluate learning. Recognising that the options include before learning, at the beginning of learning, during and immediately after the learning intervention, as well as months later, you can be as creative in choosing when to evaluate as you can be with how to evaluate.

Considered at the design phase, you may need to assess learning before moving to the next stage of learning, as previously mentioned, or simply to mix things up and retain engagement and stimulation.
Questions for you:

If you were creating three 90-minute Skill Bites on Listening Skills, Questioning Skills and Coaching Skills:

1. When would you plan to assess their levels of:
a) Reaction to the learning (Level 1, Kirkpatrick)
b) Learning itself (Level 2, Kirkpatrick)
c) Self-assessed future development need in these three skill sets

2. How would you choose to creatively assess items a–c, and why?

3. When would you propose to do so, and why?

Capture your suggestions in your E-Portfolio. To expand your thinking, consider previous examples you have experienced, talk to L&D professionals, or explore forums online.
<strong>Level</strong> 3 <strong>Learning</strong> <strong>and</strong> <strong>Development</strong> Practitioner| <strong>Month</strong> 6 <strong>Level</strong> 3 <strong>Learning</strong> <strong>and</strong> <strong>Development</strong> Practitioner| <strong>Month</strong> 6<br />
All rights reserved copyright©2021 Remit Group<br />
Collecting Information on Learning<br />
Transfer & Performance Improvement<br />
Whether the data you are collating is qualitative or quantitative in nature, it is<br />
important that you are familiar with, and respectful of, the relevant data<br />
protection legislation in terms of permissions, usage and storage.<br />
When it comes to actually analysing the data to turn it into useable information,<br />
and from there gather insight, you need to consider how to store and organise<br />
the data in order to “cut it”.<br />
Spreadsheets are a handy way of cutting the data, but again, it is important to<br />
know exactly what you want to analyse. Just because you can analyse data does<br />
not mean that you should; actively avoid scope creep and do not fall into<br />
analysis paralysis. Stick to the core metrics and measures you actually need to<br />
review.<br />
Ensuring your Chosen Tools are Fit for Purpose<br />
Where organisations do undertake evaluation, you can find yourself in a<br />
position where you are simply using pre-existing evaluation tools because they<br />
are in place and accepted. It is sensible to double-check whether those tools<br />
continue to be fit for purpose: sometimes the objectives of a learning<br />
intervention change over time, but the evaluation mechanism or approach<br />
does not.<br />
Other things to consider are:<br />
• Whether virtual, self-directed or blended learning dem<strong>and</strong>s a change in<br />
approach to evaluation<br />
• Whether social learning can encourage greater creativity in evaluation<br />
• What an organisation actually values in development <strong>and</strong> therefore what it<br />
wants to evaluate now, in comparison to historical approaches<br />
• The agility demanded in development interventions, which might call for<br />
greater creativity, speed and diversity in evaluation techniques<br />
• The extent to which the manager should <strong>and</strong> wants to be involved in<br />
evaluation<br />
All of the above, and more, should encourage you to keep challenging yourself<br />
and others as to whether your evaluation practices, and their core intent, need<br />
review and improvement.<br />
The choices you have to measure the data around learning outcomes are many and varied, as<br />
are the ways in which you can cut that data.<br />
In terms of learning outcomes, you can look at changes in learning or knowledge, learning<br />
transfer into the workplace, or impact on performance improvement. You can even consider<br />
simply assessing shifts in levels of confidence in a subject or task.<br />
When deciding how to cut the data, you could assess it in terms of:<br />
• Delivery factors, such as delivery method, cohort, facilitator, or environmental<br />
factors such as time of year or organisational transformation<br />
• Performance improvement by cohort, individuals, functions or manager<br />
• Greatest <strong>and</strong> least knowledge or skill improvements<br />
• Most impactful learning transfer to the workplace assessed at different timelines<br />
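As a simple sketch of what “cutting” the same data in different ways can look like, the short Python example below groups learners' before-and-after self-assessment scores first by cohort and then by facilitator. All names and scores here are hypothetical, invented purely for illustration; they are not from any real evaluation.

```python
from statistics import mean

# Hypothetical evaluation records: each row is one learner's before/after
# skill self-assessment (on a 1-5 scale) for a learning intervention.
records = [
    {"cohort": "North", "facilitator": "A", "before": 2, "after": 4},
    {"cohort": "North", "facilitator": "B", "before": 3, "after": 4},
    {"cohort": "South", "facilitator": "A", "before": 2, "after": 3},
    {"cohort": "South", "facilitator": "B", "before": 3, "after": 4},
]

def cut_by(records, key):
    """Group records by a chosen factor and report the average improvement."""
    groups = {}
    for row in records:
        # Improvement is the after-score minus the before-score.
        groups.setdefault(row[key], []).append(row["after"] - row["before"])
    return {value: mean(gains) for value, gains in groups.items()}

# The same data, cut two different ways:
print(cut_by(records, "cohort"))       # average improvement per cohort
print(cut_by(records, "facilitator"))  # average improvement per facilitator
```

The point of the sketch is that the collection step happens once, while the “cut” is just a choice of grouping factor; in practice a spreadsheet pivot table achieves the same thing.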
What other ways to look at data might you consider using?<br />
What are the factors you would take into account when deciding?<br />
How you choose to cut the data will depend on the original need for change, the extent of the<br />
organisation's interest, and what you agreed with the stakeholders initially.<br />
A top tip is to avoid scope creep <strong>and</strong> consider where the key impacts are on performance<br />
improvement or achievement of the key business objectives.<br />
By having good, relevant data, you will gain advocates amongst your stakeholder groups,<br />
which will smooth future support for well-defined and well-managed learning interventions.<br />
Presenting your Findings and Feedback<br />
Depending on how your stakeholders want the insight delivered, you can<br />
expect to present it in report format, through presentations at meetings,<br />
or in one-to-one sessions.<br />
Again, consider the needs of your stakeholders; ensure you understand<br />
how they like their insight delivered. It is important that it is digestible<br />
and accessible for their needs, so understand the common practices in the<br />
business and the audience. Some prefer the written word; others prefer data<br />
tables, charts or PowerPoint.<br />
Consider your audience, and not only their personal preferences for insight but<br />
also their work-based perspective. For instance:<br />
• Line Managers might be interested in learning transfer to drive performance<br />
improvement, and in their opportunity to coach around the learning to gain<br />
further improvements<br />
• Learners might want to see their own relevant performance improvement or<br />
how it supports their development and career pathways<br />
• Executive Sponsors might want to see impacts on retention, satisfaction <strong>and</strong><br />
insight impacting future resource allocation decisions<br />
Depending on your own objectives for the presentation, you can provide<br />
insight in data format, but remember that creative ways of providing this<br />
insight can create engagement, so long as the content is meaningful: for<br />
example, video montages from participants, word collages or voice-over insight.<br />
What matters is that the content is clear, relevant, accurate and appropriate<br />
for your audience; this way, you can use the evaluation process to gain future<br />
sponsors and advocates.<br />
Typically, any presentation of insight<br />
requires clarity around the scope of<br />
the evaluation <strong>and</strong> why it has been<br />
evaluated this particular way. Then<br />
a summary of how evaluation has<br />
been undertaken positions you to<br />
share your findings. From there,<br />
you can articulate clearly defined<br />
recommendations.<br />
As with all stakeholder management<br />
practice, you need to underst<strong>and</strong> the<br />
differing needs of a diverse audience.<br />
You could consider meeting key<br />
stakeholders off-line ahead of the<br />
formal feedback session in order<br />
to have them briefed <strong>and</strong> engaged<br />
ahead of time. You may choose to<br />
share pre-reading to ensure all needs<br />
are met in terms of the data provision,<br />
allowing the presentation session to<br />
be a deeper conversation <strong>and</strong> debate.<br />
So, you have completed Month 6.<br />
Next stop … Delivery! Be sure you are up to date, and talk with your<br />
Development Coach or Manager about any areas where you are unsure.<br />
Well done on completing this session; we hope you are motivated to keep<br />
Evaluation on your agenda at all times.<br />
Over to You -<br />
Your Tasks to Complete This Month<br />
Discuss your learnings so far with your Development Coach, seeking clarification<br />
as necessary.<br />
You need to see your Workplace Manager or Mentor for two separate sessions<br />
this month:<br />
1. Mentoring Discussion – How can we ensure our learning & development<br />
interventions are positively impacting the Employer Value Proposition?<br />
2. Coaching Conversation - How can we improve the evaluation of our learning<br />
interventions?<br />
Month 6 – Your Work-Based Activity<br />
Critically review and evaluate the evaluation process undertaken within your<br />
organisation for the same specific learning intervention or short programme<br />
previously identified.<br />
Hold meetings to explain your activities <strong>and</strong> gain information.<br />
Create a presentation <strong>and</strong> present your findings <strong>and</strong> recommendations to your<br />
Manager <strong>and</strong> relevant parties. Content to include:<br />
• Explanation of current evaluation practices in use, referencing any evidence, reports or<br />
analysis found<br />
• Assessment of the evaluation processes used<br />
• Explanation of possible evaluation models for use<br />
• Recommendation of improvements or proposal for implementation of a specific<br />
evaluation mechanism, <strong>and</strong> why<br />
Complete your Learner Journal for Month 6, capturing key learnings and<br />
reflections arising from this activity and your interactions and collaborations<br />
with others. Remember, it is critical that you capture all your learnings and<br />
reflections as you progress. Your End Point Assessment (EPA) is based upon the<br />
quality of your Learner Journal.<br />
www.remit.co.uk<br />
0115 975 9550