How to conduct a training evaluation

Written by Robin Hoyle

Why should you evaluate training?

Improvement in an observable skill will have an impact on the outcomes that skill produces. Put simply, when training doesn’t alter what a learner is able to do, it becomes pointless regardless of how you evaluate it.

A few Learning and Development professionals still don’t understand this. They look to source a training provider who can deliver the right number of days per year, to the right number of people, at what seems like a good price. They spend far less time considering what the team’s training needs actually are and how to evaluate the provider on the impact the training is having.


Training providers must take responsibility for this issue too. I’ve been an exhibitor, speaker, chair and attendee at the World of Learning Conference. For the second year running I’ll be chairing the conference, a process of planning and engagement with the team that started just as the seminar rooms were being dismantled a year ago.

As in other years, I expect to see a wide range of offerings from those providing training and learning services. Some will be fads and bandwagons. Over recent years I’ve seen all sorts of technology bandwagons come and go: adopted by some, rejected by most and ultimately overtaken by the ‘next big thing’. I’ve seen trumpeted breakthroughs in learning theories, some with genuine merit and others destined to be overtaken by the next ‘next big thing’.

The approaches which have genuine merit tend to be well grounded. They are based on something tangible. They recognise that some things don’t change, that evolution takes millennia and that we learn pretty much as we did when humans first started walking upright.

We may seem more sophisticated, but certain truths hold true: we learn from each other; we need to practise and we respond to feedback; we want a clear reason to adopt new knowledge and change our behaviours; and above all we need to know that what we are asked to do will work.

Evidence seems to be the biggest differentiator when separating the training approaches that will help from those that won’t.

Knowing what makes one person effective and another person average is fundamental to changing performance. It is the bedrock of effective training and learning.

It will be impossible for me to attend all the presentations at this year’s World of Learning Conference. I’ll be introducing speakers and managing questions from the floor. But as well as being master of ceremonies, I’ll be a learner. And like all learners I’ll be asking questions. Where is the evidence? How do you know this works? How does this approach apply to my challenges? I’ll be asking these questions because, from time to time, someone new arrives in the market with a new sales methodology and attempts to show that human nature, when it comes to persuasion, is somehow different.

Guess what, it isn’t.

This isn’t an excuse for us to stand still. Our clients’ needs still evolve. That’s why we brought together many elements of training, coaching and reinforcement into the SPIN® Suite, a package that helps companies achieve greater consistency.

The content of SPIN® Selling has changed a great deal over the years. Our constant stream of new research, on the role of Procurement professionals in the buying decision to take just one example, means that we continually develop the way we deliver training.

Thankfully, our clients share with us the useful new insights that others offer along the way. However, their core need is deeply rooted in the kind of human behaviours that Huthwaite International focus on.

This insight forms the basis of one of the most important short publications we’ve issued in recent years: a productivity study built from the evidence-based case studies we’ve collected from clients, offered as proof that SPIN® Selling is as effective today as ever.

I’ll be looking for robust research. I’ll be looking for case studies of substance. I’ll be seeking solutions which are built on solid foundations. I’ll be asking for evidence because, in workplace learning, the initiatives that work have a sound evidence base. Without these things, training is pointless.

How do you assess a training design?

What makes one skills programme outstanding while another is mediocre? One thing we all know is that if a skills development project doesn’t reach its potential, it can be a huge drain on resources. Any programme which can meet eight or more of the following criteria has a high chance of bringing about a significant and lasting improvement in skills performance.

Valid success model
Any skill or behaviour which a programme suggests is effective can be defined as a success model. Many models are based on assumptions rather than evidence. Don’t waste your time teaching the wrong things.

Emphasis on the basics
You will often find exercises in training which set out to improve skills by encouraging learners to ‘ask more and better questions’. In contrast, successful programmes develop quantity first and quality afterwards.

Low threat learning environment
Good programmes build a low threat learning environment which, right from the start, makes people comfortable and confident enough to try new and unfamiliar skills.

Appropriate use of technology
Technology cannot replicate every element of successful skills-based learning. The ability to observe performance and provide objective feedback relies on a personal exchange between individuals.

Incremental building of skills
The most common fault of the average skills programme is that it tries to do too many things. Most programmes would be twice as effective if they contained half as much.

Learner centred approach
This is difficult to define but it’s important. Excellent training puts the learner front and centre. Involve your learners in decisions about activities, pace and priorities.

Frequent and objective programme checks
How can learners achieve a sense of progress? Feedback. It provides frequent and objective information on how the skills are being acquired. Without feedback, trying to apply training becomes much more difficult.

Maximum practice opportunity
Practice alone can’t develop a skill. You need a special type of practice, practice combined with objective feedback, to improve performance. Most of the outstanding designs we reviewed contained at least 10% of programme time where an individual could practise and receive feedback.

Specific performance models and examples
Excellent programmes use specific models, cases and examples to help learners apply concepts to their own roles.

Exciting to teach and to learn
Effective designs tell stories: they unveil plots which gradually unfold and they usually have an element of surprise. Unless a programme is fun to teach and to learn, it will suffer.

Work-based reinforcement
Wherever possible, learners should be enabled by the training design to carry on developing back in the workplace. Be specific about how a learner can continue to practise, gain feedback and take control of their own continuous improvement.
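To make the ‘eight or more’ rule concrete, here is a minimal scoring sketch in Python. It is an illustration only, not part of any Huthwaite programme: the criterion names come from the list above, the threshold of eight from the paragraph introducing the criteria, and the example ratings are invented.

# Minimal sketch: count how many of the eleven design criteria a programme meets
# and apply the 'eight or more' threshold described above. Ratings are hypothetical.
CRITERIA = [
    "Valid success model",
    "Emphasis on the basics",
    "Low threat learning environment",
    "Appropriate use of technology",
    "Incremental building of skills",
    "Learner centred approach",
    "Frequent and objective programme checks",
    "Maximum practice opportunity",
    "Specific performance models and examples",
    "Exciting to teach and to learn",
    "Work-based reinforcement",
]

THRESHOLD = 8  # criteria a design must meet for a high chance of lasting improvement

def assess_design(ratings: dict) -> str:
    """Count the criteria met and compare the total against the threshold."""
    met = sum(1 for criterion in CRITERIA if ratings.get(criterion, False))
    verdict = "high chance of lasting improvement" if met >= THRESHOLD else "rework the design"
    return f"{met}/{len(CRITERIA)} criteria met: {verdict}"

# Hypothetical example: a design that meets nine of the eleven criteria.
example = {criterion: True for criterion in CRITERIA}
example["Appropriate use of technology"] = False
example["Work-based reinforcement"] = False
print(assess_design(example))  # 9/11 criteria met: high chance of lasting improvement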


Learning and development insights

Robin Hoyle and Tony Hughes discuss key insights into Learning and Development, highlight the positive effect of its proper integration with technology, explain a simple way to reflect on learning and examine key challenges for Learning and Development professionals evaluating training. Listen to the insights below.
