The disruptive technologies we use today — Amazon, Uber, Spotify, Google, Airbnb — all have one thing in common: they’re user-centric.
“Putting the user first means focusing on solving their real problems, rather than pushing a programme, product or service.”
L&D is traditionally skills- and knowledge-centric, looking at different ways of solving these problems with content and programmes.
Tom Asher, SVP of customer experience at Humach, says:
“There is a reason that Amazon is achieving results that others are not. They focus heavily on friction-free customer experiences. Now they have incredible loyalty, and they continue to utilise agility in their model to make adjustments so customers are happier and they are more efficient. I am confident they are not having long meetings while overanalysing tactics. They are making rapid adjustments regularly, learning from mistakes, and willing to take risks. It’s paid off.”
We must put ourselves in our clients’ shoes, experience first-hand what they do, and seek to really understand. Without understanding their friction, there is no starting point for your digital learning strategy. ‘Friction’ is the key. The traditional approach to L&D may have been to develop an end-to-end programme to train people on what needs to happen. When we’re focused on friction, we’re solving real (rather than perceived) problems in the context of the work itself, to deliver the results our clients are accountable for.
As well as going into the field, which can be time-consuming and sometimes misleading, a discovery session can help you uncover points of friction on an employee’s journey and work with them on prioritising and addressing that friction.
Remember that experimentation is the name of the game…
Start with ten new managers, ten new starters, ten front-line sales staff, or whichever distinct employee group requires assistance, and work out their user journey — in relation to their work now, transitioning into their role, or whatever they tell you they need help with. If you have a hypothesis — for example, that new managers need coaching skills — then run that past them to discover how these skills might help them in the context of their work.
You could do all of this with post-it notes and a large wall…
Write any hypothesis on a flipchart and have your target group challenge it to find out if it really would help them, and fully explore its merits. Is it worthwhile? Will it deliver results for your target group? What would those results be?
Then map a user journey and highlight points of friction.
Explore opportunities to reduce friction by asking them what they think they need at each point – not courses and programmes, but information, know-how and insights. You want to empower them to perform with more confidence and competence, not simply to field requests for training. Agree outcomes and milestones so you will all know whether the support and guidance has been worthwhile.
Once you’ve walked in your client’s shoes and you have your user journey, you have solid foundations on which to run an experiment. In your experiment, all you need to do is offer a prototype (a Minimum Viable Product or Solution) to see if it helps with what your users are trying to do.
You should aim to get your prototype ready to test in a matter of days — and you can do this by choosing two methods (to add value and minimise waste):
Digital resources address specific situations, challenges and questions highlighted in the discovery session. They can be created in minutes and made accessible to the target group for testing immediately.
In addition, these can be iterated quickly, based on engagement and feedback, to increase their value to users. Focused on the work itself rather than on ‘learning’, the ultimate outcome is better, more confident ‘working’. Resources address the specific situations, challenges and questions of the employee group, such as:
As long as resources directly address friction points, they can be created using local know-how from inside an organisation or curated from valuable source material that is available on the web. All that is needed in addition is the context: what the content is addressing, and what the user should do with that information or insight in their role.
We all know there is huge value in bringing people together to learn from each other’s experiences but, rather than in courses and workshops, these conversations are focused on specific areas of performance.
These require no more preparation than a room, a host and an invitation to the right people. Conversations are laser-focused on the parts of the user journey where support and/or guidance was highlighted as required. And they are focused on results.
Conversations provide an opportunity to discuss real friction points and gain insight from colleagues in similar situations. Conversation topics may include:
If a series of contextually rich resources and conversations aimed at the target audience is tested — and measured against the desired outcomes — and not deemed sufficient, then additional experiments can be run that require more resource behind them, e.g. workshops that offer the opportunity to practice.
But even these should be tested before fully procured as a finished product. More often than not, resources and conversations will be sufficient to equip users with the tools and insights they need to perform.
If employees record no change in their performance, explore why before spending time, money and effort on traditional L&D solutions. A missing change in performance is more likely to be down to ‘will’ than ‘ability’, so explore this first.
Read more in your free copy of the ‘L&D Disruption Playbook‘.
Experiments can — and should — take into consideration the resources you have available already.
If your target group points out that they need support on their journey with systems training and you already have some elearning, then test it by packaging it up in a format that directly addresses the need (rather than seeking to educate on an entire platform).
If first-line managers are struggling to address poor performance and you have a segment of a training course on that topic, then refine it to address the need and run it as an experiment. Seek feedback on how appropriate and useful it is. If you have a whole library of content, then see how it can be repurposed to address elements of your user journey.
This audit ensures you’re not throwing the baby out with the bathwater: it identifies what could be reused and remains useful in the context of your user journeys.
Audit Checklist:
The number one reason why people want to learn online at work is to do their jobs better and faster. It’s not a surprise really. If they are fortunate enough, people choose a vocation — a professional discipline — and seek to achieve within it.
Many people don’t have this luxury and find themselves in a job rather than a career. But beyond the salary, the things that motivate us are the opportunity to achieve, to feel appreciated, to try things out and to make a difference.
The opportunity for L&D, with campaigns — and rather than creating ‘digital learning spaces’ — is to get out to where people are and help them with what they are trying to do, whether at their desks, on the shop floor, in meetings, or during their commute.
We need to stop believing in the ‘build it and they will come’ delusion of the LMS / Academy approach, because they won’t come. They’re working. So, we need to get to where they are — and this is where campaigns come in.
Like smart digital companies, you can find out how your target audience wants to be engaged (in the discovery session) and sell them ‘value’ — not programmes — to help them achieve what they were hired to do.
Campaigns may be:
Your campaigns should be experiments too: find out how best to reach your target audience and what they respond best to. Borrowed from marketing best practice, A/B testing is “a way to compare two different versions of something to figure out which performs better.”
Elements you will want to A/B test include:
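When you compare two campaign variants, you need a way to tell a genuine difference from noise. The sketch below — plain Python with only the standard library, and with illustrative function and variable names (none of these come from any tool mentioned in this article) — applies a standard two-proportion z-test to, say, two email subject lines:

```python
import math

def ab_significant(conv_a, n_a, conv_b, n_b, z_crit=1.96):
    """Two-proportion z-test: does variant B's conversion rate differ
    significantly from variant A's at roughly 95% confidence?"""
    p_a = conv_a / n_a  # e.g. click-throughs / emails sent, variant A
    p_b = conv_b / n_b  # same metric for variant B
    # Pool the conversions to estimate the standard error under
    # the null hypothesis that both variants perform the same.
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return abs(z) > z_crit, z

# Variant A: 30 clicks from 1,000 sends; variant B: 55 from 1,000.
significant, z = ab_significant(30, 1000, 55, 1000)
print(f"z = {z:.2f}, significant: {significant}")
```

With small pilot groups (the “start with ten” scale suggested above), differences rarely reach significance, which is fine: at that stage you are looking for directional feedback, not statistical proof.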
L&D professionals are quick to see what technology is available to them, but soon find themselves limited by what they can do with it when developing a digital learning strategy. We need to adjust our expectations of technology, use it to support and guide what we’re trying to achieve (solving real problems), and stop relying on a one-size-fits-all solution.
Learn more. Get the free ebook: L&D Disruption Playbook.
Look out for the final article in this series (pt 5) where we’ll dive deeper into how you can scale your digital learning strategy based on the outcomes of your experiments, A/B testing and iterations.
Ready to solve real business problems using digital? Run an experiment with Looop.
Chat to our L&D experts and find out how Looop by 360Learning gives you your days back.