
Knowing me, knowing you … A-ha! The key to success

What makes an effective learning solution?

I’ve asked the same question many times over the years, and received the following responses:

  1. Relevant
  2. Realistic
  3. Interactive
  4. Goal based
  5. Flexible
  6. Challenging
  7. Structured but not controlling

Although all of these elements are important, they pale into insignificance without one vital consideration, because without it learning misses the mark.

When discussing a hypothetical situation recently, it was suggested that if we were to produce a specific training programme within the given timescales, within the given budget, using the given resources, for a large number of learners, the only way to get it done in time was to forgo the analysis of the audience’s needs, experience and characteristics! The reason given was that there simply would not be the time.

Looking back at the first word in the list above (and this is more often than not the top-most mentioned word), how can you produce a learning solution that is relevant if you are not fully aware of the current situation? Without knowing your audience, how can you design the most appropriate solution for them? What you’d actually end up with is the usual blunderbuss approach, i.e. blast it out and hope you hit the target!

Unfortunately, this seems to be a common decision and, consequently, the reason why a lot of training solutions, ‘e’, classroom or blended, suffer.

Today I attended an eLearning Network event where the theme was ‘truly effective eLearning’. The key ingredient for success, running throughout the discussions, was the need to be more learner-centred. Without knowing your audience, how could eLearning (or indeed any learning) be learner-centred?

Then tonight, by chance, I also read a post by Clive Shepherd on the Onlignment blog, ‘making transformation happen: analysis and design‘, which reinforces how imperative the analysis is.

So as the song goes… “Knowing me, knowing you, it’s the best I can do”!

How do we ensure competency?

Is training really the answer?

I’ve just watched Craig Taylor’s excellent Pecha Kucha ‘Using technology to enhance an assess-train-assess approach’ in which he shares examples of how assessing competency levels before automatically mandating everyone attend the same annual refresher has had a positive impact on business.

https://www.slideshare.net/CraigTaylor/using-technology-to-enhance-and-assesstrainassess-approach-kucha

When I hear people talking about the need to design a course, there may be some reasoning behind it:

a) there is an update
b) compliance – staff are required to attend refresher training every year, whether they need it or not
c) there are some new approaches to working practice

However, before you automatically go through the usual motions and head down the ‘we’ve got to design a new course’ route, why not ask yourself the following questions:

How much do they know already?

How often would they carry out that work?

and the biggie… what REALLY tells you whether they are competent or not?

Why do we insist on putting everyone, no matter how experienced they are in the subject, through a course before establishing whether they actually need it? Even when the instructional design is top notch, including relevant, task-based interactive activities, it’s a waste of resources and staff time if they already know the subject matter and are applying it successfully. Of course we need to maintain quality and adhere to legal requirements, but is herding us all through one-size-fits-all courses the most efficient or, indeed, effective way of doing this?

It seems we often pay more attention to recording ‘bums on seats’ – virtual or otherwise – than to establishing the quality of work performance. So our workforce are all too often taken off their important jobs to attend compulsory training with limited flexibility in what they can choose to do. There is a simple, logical and very effective solution: assessments, not courses.

As I said in a comment on Ryan Tracey’s blog post ‘online courses must die‘: “why force individuals to go through the same mandatory content year after year when all they may need is a yearly, skills-based assessment? If that assessment highlights skills gaps, then a more flexible learning programme will make sure individuals learn only what they need, not what they don’t”.

Now I’m not saying that we’ll never need formal courses again. That would be ridiculous; besides, I’d be talking my way out of a job if I did. There are many reasons why someone will need a formal course. But we do need to be more analytical before deciding how to facilitate our workforce’s learning paths. Yes, it may mean more hard work gathering all the information you’ll need. Yes, it will mean encouraging individuals to take more ownership of their learning and helping them develop their metacognitive skills. And yes, it will mean L&D professionals becoming cultivators of learning.

When reflecting on why this ‘herding’ approach occurs so frequently, I was reminded of a recent conversation about the reluctance to simply assess staff to prove competence before deciding whether anyone needed more formal training. It all boiled down to the quality of the assessment – or rather, the poor quality of the assessment. This meant that everyone had to be forced to attend the same training course to make sure the content was covered (note, I didn’t say learned) and could be tracked for statistical purposes and to prove attendance. Now, correct me if I’m wrong, but the whole point of an assessment is to test whether a person is competent in the subject matter.

If you spend valuable time and effort creating great learning programmes, whether they are formal courses or a collection of on-demand learning nuggets, the only way learning can be confirmed is by completing an assessed activity. If that assessment can easily be ‘guessed’, then the learner doesn’t have to use any problem-solving techniques to analyse and apply. If you honestly have little confidence in the assessment at the end of a learning programme, of course you won’t want to put it out there on its own. It would be about as much use as a chocolate teapot!

We’ve often discussed what makes good learning, ‘e’ or otherwise. This begs the question: “what is good assessment?”