One of the most common phrases I hear from L&D professionals is:
“We want to modernise and use more digital but our clients expect, and just ask for, training.”
So, how do you change their minds so that you can do something else - something more effective - instead?
The short answer is:
You may not be able to do just what you want (and quite rightly so), but you should focus on the outcomes rather than the activities.
When a client asks you to run a training course for them, they are playing their part in a long-established conversation. For years, L&D has been asked for help and the stock response has been:
“We could run a course for that!”
And not because it will guarantee improvement or enhancement but because the course has been our stock-in-trade for decades. For their part, our clients have been solutioneering on our behalf. Aren’t we all told in our formative working years: “Don’t come to me with problems, come to me with solutions...”
But in what other profession is the activity valued over the results? Certainly not in medicine:
“Doctor! These are all the things I have wrong with me and I wish to have an operation...”
Even in Finance: Imagine clients approaching your Finance Director and asking to change the bottom-line numbers on their department’s budget. The FD certainly wouldn’t comply without challenge. If they didn’t laugh the client out of their office, they might call for Security.
For at least 15 years, L&D have wanted to become more than just order-takers. But if we’re going to continue to value activities over outcomes - or expect different outcomes with the same activities - then nothing will change.
Here are three things you can do to help influence your stakeholders and have them focus on doing the right things in pursuit of an outcome rather than simply the activities they are used to.
Readiness need not mean just your ‘early adopters’ - or those who would be up for trying ‘something new’ - but those who really need your help. If you’ve got a stakeholder who wants email training, they don’t have a real problem for you to help solve. They have a small problem, at best, that they could sort out on their own. However, if you have a stakeholder whose team is missing critical KPIs and wants your help, you have something to work with. I’ve spoken before about real problems and perceived problems and how L&D too often get involved in perceived problems, such as:
“We don’t have a course for (x)”
However, there are so many real problems waiting to be solved that you won’t need to go hunting for long. If you understand the goals and pain points of your key stakeholders, then these will be prominent in conversations. But here’s a clue anyway: These problems are core to what the organisation is trying to achieve.
When you have a stakeholder who really needs help in overcoming business performance problems, you often have licence to do what it takes to enhance performance.
The only way you’ll modernise in a way that is meaningful and makes sense to your organisation - and your stakeholders - is by running experiments. Work where there is a real business performance problem to address and an appetite to make things better.
Well, let’s start with what experiments are not. Experiments are not pilots. We’re not talking about piloting a course to see what needs tweaking before we hit the masses. That’s just normal practice. No innovation, no modernisation. Just the status quo.
Experiments begin with understanding the friction being experienced by your target group, and working with them on a ‘Minimum Valuable Product’ that will equip them to perform with more competence and confidence than they could before you got involved. Digital resources and conversations are the tools of an experiment, meaning they are repeatable experiences in pursuit of your target group’s goals. Once you’ve run a successful experiment, you can scale your solution to benefit more employees with the same - or comparable - problem.
Unlike pilots - which one group experiences once, while everybody else, following the pilot, gets a more refined one-and-done experience - experiments are run to solve the performance problem, not the ‘learning problem’. To help make sense of this, let’s look at number two...
When L&D think in terms of programmes and content, there is always a solution that just needs tweaking before being launched upon an audience. Generic, silver-bullet solutions that are well received but miss the mark. You know the ones: you’re still trying to prove ROI on them several months (or years) after they were delivered.
Instead of making the link between ‘need’ and ‘training’, take some more time to understand the actual performance problem that is being experienced.
After years on the fringes of L&D, Performance Consulting is becoming more prominent as a way of both understanding what’s not working and also what needs to be done about it.
At its essence, Performance Consulting is a series of questions and a collaborative intent to solve real business performance problems with, and for, the client. Performance Consulting gets to the heart of what is being experienced as a problem and brings both the practitioner (L&D) and the client a long way in recognising what needs to be done. Performance Consulting helps you to understand four key things: what the business is trying to achieve; what is actually happening now; the gap between the two; and the factors contributing to that gap.
Exploring these questions is likely to highlight, to both you and your client, the gap between the reality of now and the ideal, as well as the factors that need addressing to resolve the issues. You’ll recognise which are skill, will, structural, environmental, ‘political’ and cultural factors, and this will help you lead the conversation towards experiments, i.e. what you can try in order to move the needle. Is there an area that is primed - and truly in need of some assistance? Do they recognise the problem? And are they all experiencing the same friction as they strive to achieve their goals?
Only then can you progress, by understanding the business performance problem from the perspective of the main actors, so work with them to refine the ‘goal’ (what they are trying to achieve in relation to the problem) and the friction they are experiencing in service of that goal. What is uncovered is what needs working on - and it could be as simple as a process map and printed checklist. Or it could be a series of digital resources and regular conversations as you progress through defined milestones.
Whatever happens, your efforts will be addressing what is being experienced by those who are accountable for results, taking them on a journey from ‘not performing to expectations’ to ‘performing to expectations’, which is a compelling case for trying something based on outcomes over activity. But if your stakeholder wants cold, hard data...
L&D departments today are exploring ways in which data can not only be used to ‘prove the effectiveness of a learning intervention’ but also to know whether they need to intervene at all. Stakeholders - and L&D - can be very quick to explore a ‘learning activity’ without truly understanding the extent (or context) of a particular problem.
Data can be used to identify whether there is a problem that is worthy of significant investment (of time, money, energy and attention) or whether it’s an irrelevant detail that is magnified by an aversion held by the stakeholder.
Imagine being told by a stakeholder that meetings at your organisation never start on time and that there should be some training to make people aware of meeting etiquette. On the surface, there will be anecdotal evidence that backs this up. We’ve all been to meetings that start late or that aren’t the most efficient use of company time. But does it really warrant the design and delivery of training - and would that really address the issues?
An article published in Harvard Business Review, titled ‘How to Start Thinking Like a Data Scientist’, is the best articulation I’ve read of the potential application of data in L&D. In essence, it recommends starting with a question that bothers you, collecting a small sample of relevant data yourself, plotting and summarising it, asking ‘so what?’ of the results, and checking that the data is representative before acting on it.
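To make that concrete, here is a minimal sketch of what such an analysis might look like for the meeting-lateness example above. Every number in it - the sample of delays, the headcount, the hourly cost, the meetings per week - is a hypothetical assumption for illustration, not real data:

```python
# A minimal sketch of 'thinking like a data scientist' applied to the
# meeting-lateness example. In practice you would note the scheduled and
# actual start times of your next ten or so meetings yourself.
from statistics import mean, median

# Minutes late for a small, self-collected sample of meetings (hypothetical)
minutes_late = [0, 7, 3, 12, 0, 5, 9, 2, 15, 4]

attendees_per_meeting = 6    # assumption: average headcount per meeting
loaded_cost_per_hour = 50    # assumption: blended hourly cost (GBP)
meetings_per_week = 200      # assumption: org-wide estimate

avg_late = mean(minutes_late)
print(f"Average start delay: {avg_late:.1f} minutes (median {median(minutes_late)})")

# 'So what?' - extrapolate the sample to a weekly cost for the organisation
wasted_hours_per_week = (avg_late / 60) * attendees_per_meeting * meetings_per_week
weekly_cost = wasted_hours_per_week * loaded_cost_per_hour
print(f"Estimated cost of late starts: £{weekly_cost:,.0f} per week")

# Only if that number is material - and the sample is representative - is the
# problem worth a significant investment of time, money and attention.
```

An hour spent gathering and summarising a sample like this tells you far more about whether ‘meeting etiquette training’ is warranted than any amount of anecdote.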
This approach can be applied to most of what L&D is accountable for.
Now this might seem like a lot of work up front, but it certainly takes less time than trying to find ROI that doesn’t exist on programmes delivered by request.
Data analysis is already one of the least developed skills in L&D and yet one of the most sought-after skills in business, so it’s coming. But I believe it is accessible if we apply the methodology above to understand both ‘should we work on this?’ and ‘did our intervention work?’ The alternative is to continue doing what we’ve always done, chasing ROI down another rabbit hole - and making ourselves irrelevant whilst the rest of the business world embraces data.
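For the second of those questions, the same lightweight approach applies: compare the KPI your stakeholder is accountable for before and after the experiment. A minimal sketch, with entirely hypothetical KPI values:

```python
# A minimal sketch of answering 'did our intervention work?'. The values are
# hypothetical; substitute whatever measure the stakeholder is accountable
# for (e.g. weekly conversions per person).
from statistics import mean, stdev

before = [12, 15, 11, 14, 13, 12, 16, 13]   # KPI per person, pre-experiment
after = [16, 18, 15, 17, 19, 16, 20, 17]    # KPI per person, post-experiment

shift = mean(after) - mean(before)
print(f"Mean KPI moved from {mean(before):.1f} to {mean(after):.1f} ({shift:+.1f})")

# A crude check that the movement is bigger than the everyday noise in the
# data; for a real decision you would use a proper test (e.g. a t-test).
noise = (stdev(before) + stdev(after)) / 2
print("Worth scaling the experiment" if shift > noise else "Keep iterating")
```

Note that this measures movement in the business KPI itself, not course completions or smile sheets - which is the whole point of outcomes over activities.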
We needn’t concede power to stakeholders over the activities we undertake, but we must let outcomes dictate activities and not let activities remain the outcome. When stakeholders request training, they are either playing out a long-established routine or they don’t really care about outcomes. Our role is to determine which it is, apply our attention and resources to business outcomes, and spend as little as possible on activities that deliver little (if any) return.
Invest in the upfront conversation to find out what the real problems are; how you may intervene to make a demonstrable difference; what moving the needle really looks like and will mean; and what data you can use at the very outset in order to modernise, add real value and grow as an L&D practitioner in a profession that is crying out for development in these areas.
We just have to stop this silliness of working on something because we’ve been asked to - or because it’s the latest thing in HR - and bolting ‘evaluation’ on the end to justify our existence. Investing up front in understanding what’s really going on, and relying on data, will be a huge leap for L&D.