As conferences, blogs, and forums buzz with talk of big data, analytics, ROI, and extracting better reporting from learning, a common barrier in thinking emerges time and time again: How do we get from where we are today to where the business wants us to be?
Sometimes, in order to see the best way forward, we first need to understand how we arrived where we are today. Cast our minds back to the 1980s and we find the first significant movement outside the education sector toward commercial adoption of technology to enhance training. The focus was on the science behind improving learning efficiency, retention, and ultimately return on investment. This spawned a new industry with incredible potential.
The teams formed to tackle this new evolution in training comprised skilled adult educators, behavioral psychologists, and computer science graduates. The early tools used to build training content were driven by the science behind learning coupled with a motivation for tangible results.
As the industry grew, we settled on the term “eLearning,” and key infrastructure around learning management became central to an organization’s ability to harness this process. eLearning became noted for its potential to track compliance, and reporting standards soon followed. The LMS quickly became the single largest capital expense within corporate training departments, and the standards it dictated drove the requirements of the training it delivered.
As organizations pushed to take advantage of their investments, the need for quicker development and delivery methods followed. LMS reporting focused on pass rates, completions, scores, time, and attempts. Given these targets, justification for compromised learning design in favor of development speed naturally followed, which paved the way for rapid development tools. When learners started to complain of unengaging experiences, we looked toward the gaming and movie industries for clues to successfully engage our audiences. Tool vendors began to respond to the request for enhanced media and interactivity, and we saw a further broadening in the target users of these tools. The role of the instructional designer blurred to embrace web and media developers, graphic designers, and PowerPoint enthusiasts, and we saw a move away from the early emphasis on the science behind learning design.
As other industries now reap the benefits of innovations in the cloud, big data, analytics, machine learning, and artificial intelligence, it can sometimes feel like our industry is well behind—and it is, but we have some difficult history to overcome.
Joining the dots
If we teleport ourselves to imagine a future point where training is truly effective and measurable, and where it represents a solid voice in business strategy, it becomes a little easier to see why our industry is struggling to join the dots. Almost four decades after realizing the potential of pairing technology with learning, the industry has come to rest on its ability to efficiently scale delivery. The key piece of learning management infrastructure used by most organizations of size has shown little sign of innovation. The standards and thinking that dominate our industry still focus on tracking completions when, in order to advance, we really need to measure understanding, behavioral change, our ability to execute knowledge, and return on investment. The tools we’re using to build content lie in the hands of programmers, graphic designers, and scriptwriters while skilled educators are often relegated to a back seat, if they haven’t already been dropped off at a bus stop along the way.
The inability to demonstrate our worth has left corporate training as the only major division within an organization unable to show clear, tangible financial impact. Because so little science goes into the creation of learning content today, we now commonly see learners digesting linear information punctuated by meaningless assessments that lack statistical credibility.
More than ever, breaking the cycle seems out of reach. Today we see an industry, once filled with incredible potential, completely broken by a lack of foresight and understanding of the true function of training in a corporate environment. A training department exists to increase profit. When we deliver sales training, we do it so our teams can sell more products. We train our leaders and managers to increase team morale, productivity, and efficiency, and to reduce employee churn—all of which aim to have a positive financial effect. Even topics like compliance, health and safety, and code of conduct aim to reduce risk that eats into company profits. When we consider this in relation to the type of tracking our LMS delivers, we gain further clues into the problems we need to overcome.
So, while breaking this cycle may be difficult, it’s also inevitable. As technological advances such as the cloud, artificial intelligence, machine learning, big data, and analytics drive unparalleled innovation across industries, training will not go untouched. So the question is: How do we prepare?
Finding the catalyst
Reinventing our industry requires us to revisit our purpose, fully comprehend our problems, and reconsider the way training should be perceived. We need a direction that embodies not only solutions to today’s innovation drivers, but also the ability to catch up. This means identifying the catalyst capable of facilitating rapid and effective change.
As much as we may not like it, the reality of our current position is that we’re unable to truly measure the impact of training on the business. Armed with a checklist of staff completions, post-training survey forms, and smile sheets, we’re still a long way off from any credible insights that could garner the attention of our executive stakeholders.
It’s undeniable that training needs to become accountable. Any business division unable to demonstrate its worth resigns itself to a pole position for cutbacks and faces an uphill battle to attract budget. Without budget, it’s difficult to influence change. This brings us to a realization that the catalyst we’re looking for relates to measurement.
Critical thinking needed for change
It makes sense that if we can demonstrate business impact or even a willingness to become accountable, we possess a business case executive-level managers can’t ignore.
“Great,” you say, “but how does knowing this help us get from here to there?” It doesn’t directly, but it highlights the goal we need to work toward. When I speak with CLOs actively seeking solutions to this problem, a pattern emerges. Our ability to join the dots is clouded by our investment in, reliance on, and view of the world through our existing tools, infrastructure, and skill sets. It’s a little like being heavily invested in a fleet of cars, skilled drivers, and mechanics armed with the specialized tools needed to maintain them, then realizing you want to go places faster—a lot faster. You find yourself trying to figure out how to make your cars travel at the speed of a Boeing 777, rather than thinking about a strategy to transition your method of transport.
Measuring learning is scientifically and technically complex—extremely complex. But that doesn’t mean it’s out of reach or even difficult to implement.
Just for a moment, let’s reflect on the mapping and GPS solutions commonly found on the smartphones in our pockets. It’s easy to forget the complexity that’s occurring under the seemingly simple interface. When we ask for walking directions from point A to point B, algorithms quickly identify the route that best achieves our priorities. Over time, an interesting thing starts to occur. The estimated time to walk from point A to B changes—in actual fact, it becomes more accurate, not just for us but for everyone. This is due to data. As we follow its directions, the GPS continually tracks our position. This data is captured, and over time the software begins to learn not just our walking speed, but obstacles, traffic, and even variables (such as weather) that affect our efficiency. It’s easy to overlook this complexity. The systems that capture, store, and process this data are well hidden under what we see as a simple “estimated walking time” value. This complexity under the hood is a key element we need to recognize and think critically about.
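To make the idea concrete, here is a deliberately simplified sketch, in Python and purely for illustration, of how repeated observations can refine an estimated walking time for a single route segment. The `SegmentEstimator` class and its update rule are hypothetical; real mapping services use far richer models, many more variables, and massive data pipelines.

```python
# Illustrative toy only: how observed traversals can refine a walking-time
# estimate for one route segment. Real mapping services use far more complex
# models (per-user speeds, traffic, weather, time of day) at enormous scale.

class SegmentEstimator:
    """Keeps a running estimate of walking time (seconds) for one segment."""

    def __init__(self, initial_estimate_s: float, learning_rate: float = 0.1):
        self.estimate_s = initial_estimate_s   # start from a default assumption
        self.learning_rate = learning_rate     # how quickly new data shifts the estimate

    def record_traversal(self, observed_s: float) -> None:
        # Exponential moving average: each real traversal nudges the estimate
        # toward what users actually experienced on this segment.
        self.estimate_s += self.learning_rate * (observed_s - self.estimate_s)


# A default guess of 300 seconds drifts toward reality (roughly 360 s here)
# as traversal data accumulates; the "estimated walking time" quietly improves.
segment = SegmentEstimator(initial_estimate_s=300.0)
for observed in [355, 372, 348, 365, 370, 358]:
    segment.record_traversal(observed)
print(f"Refined estimate: {segment.estimate_s:.0f} s")
```

The point is not the arithmetic. It is that the improvement only happens because the underlying system was designed to capture the right data in the first place, a lesson that applies directly to learning measurement.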
Measurement—where do we start?
The LMS is the obvious place to look for better measurement. This infrastructure promotes reporting as a key feature, so it makes sense. In actual fact, this is where most of us are looking for answers. Surveys have identified that nearly 50 percent of participating organizations are looking for a new LMS in order to obtain better reporting.
This raises the question—with such a high number of organizations looking to move, why have LMSs not provided better reporting to date? This simple question reveals an answer that LMS vendors may prefer to avoid discussing: They can’t. Better insights require better data. An LMS’s reporting functionality is based on the data provided to it, and that data comes from the content via the standards the LMS has adopted. This leads us to ask: Can we retrieve better data from our existing content? At first, it would seem the answer could be yes. We’re now seeing several vendors provide solutions built on this line of thinking—wrap the existing content in another layer able to retrieve further insights.
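For context, the data a typical completion-tracking standard hands to the LMS is remarkably thin. The snippet below is a simplified, illustrative record; the field names are modeled loosely on SCORM-style tracking and are not a complete or authoritative data model.

```python
# Illustrative only: roughly the level of detail a completion-tracking
# standard typically reports back to an LMS for one learner and one course.
# Field names are simplified; this is not a complete SCORM data model.
tracking_record = {
    "learner_id": "jsmith",
    "course_id": "compliance-101",
    "lesson_status": "completed",   # e.g., passed / completed / failed / incomplete
    "score_raw": 80,                # a single overall score
    "total_time": "00:42:10",       # time spent in the content
    "attempts": 1,
}

# Notice what is absent: no link between questions and the learning design,
# no evidence of understanding or behavioral change, nothing that could be
# tied back to on-the-job execution or return on investment. No reporting
# layer, however clever, can conjure those insights out of this record.
print(tracking_record["lesson_status"], tracking_record["score_raw"])
```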
Survey form analytics—taking advantage of the LMS conundrum
Wrapping existing content with a post-training survey form is technically easy—it still allows the LMS to manage the content, and it even works alongside our existing tracking standards. On the surface this looks like the easy solution we’ve been looking for. It should be about now that our common-sense filter starts to twitch. Surely deep and meaningful business insights can’t be that easy. If they were, why didn’t anyone come up with this a decade ago?
Return on investment, behavioral change, competency revelations, ability to execute, understanding—these are extremely complex to measure. For the scientifically minded, logic tells us it would be absurd to think we could somehow get this level of insight by asking the trainees 10 or 15 general questions at the end of a lesson. Unfortunately, vendors making these claims are either ignorant of their flawed approach or are knowingly taking advantage of the growing number of companies looking for an easy solution. Solutions that further suggest their methodology could prove ROI from old training content consisting of some slides and a handful of multiple-choice questions should be turning that twitch into an event in need of medical intervention. Achieving these types of insights from existing content is comparable to painting a Model T Ford red and then expecting it to perform like a Ferrari. While we can all live in hope, the well-known saying “If it seems too good to be true, it is” certainly applies here. In actual fact, basing any strategic decisions on data captured in this way is not only counterproductive but also dangerous. However, these solutions are appearing due to the lack of understanding and answers around the big nut we’re all trying to crack—how do we measure learning?
The path to data
Better measurement starts with the tools. First we need sophisticated tools designed for creating a new generation of measurable training content. Next we need to recognize the elephant in the room: if we continue creating the same style of content we’re used to building with these new tools, what will the data show? Unfortunately, it won’t be the magical rainbow of joy we’re hoping for. Efforts like the Serious eLearning Manifesto have been working to educate us on this issue. The reality may be hard to swallow, but if we want positive results when these new tools finally reveal the measurement we’ve been dreaming of, we will need to invest in significant education, and potentially restructure, to put science-driven learning design skills back at the center of our training development teams.

As for the LMS, we need to consider one question: With content able to deliver unprecedented levels of data, how will today’s LMS analyze such complex data without full context of the learning design? Significant and rapid change happens through disruption. My money is on the emergence of a new generation of companies getting ready to overturn the status quo.
Want more?
Glenn Bull will be presenting two sessions on this topic at The eLearning Guild’s Learning Solutions 2017 Conference & Expo, March 22 – 24 in Orlando, Florida:
- “The Essentials of Getting Your Organization Ready for Advanced Analytics” (1:00 PM on Wednesday, March 22)
- “Adaptive Learning: Using Measurement and Analytics to Customize Training” (4:00 PM on Thursday, March 23)