This year (2016) has been a year of reflection for me. It marks 10 (yes, 10!) years since I graduated with my master’s degree. Graduate school was intense, and I loved every minute of it. Of course, I learned about theories and instructional strategies, and I studied the works of some of the greats in our industry. That served as a solid foundation; however, I learned the most from the practical assignments—and they were the crux of the program.
We designed instruction—a lot of instruction. We then had opportunities to present our designs to the class and the professor. This is where we received rich feedback; this is what shaped me as an instructional designer.
There is one day that stands out, and it has influenced the way I approach instructional design and how I teach it to others when I’m mentoring. Allow me to share that day with you.
Understand this! A true story
It was the day that learning objectives took on new meaning. It was the day I realized the importance of learning objectives as they translate to performance outcomes. Of course, we had spent a great deal of time learning the theory and best practices of learning objectives: that they should be measurable and observable, that the verbs matter, how to write them, and so on.
This particular day marked the very last time I wrote a learning objective using the verb “understand.” I had used it many times before, and each time my professor patiently asked me to select a different verb. He would explain why, but it never sank in.
On this day, his patience ran out. When he saw that my objectives contained the verb “understand,” he stopped me. Then he slammed a big book on the table and calmly said, “Understand this.” The sound of the book slamming on the table reverberated around the classroom for what felt like an eternity, and then he calmly began to explain (again).
He knew I could do better. He knew I could be a better instructional designer. He was continually pushing all of us to do more, to be better, and—most importantly—to produce sound instructional strategies and learning solutions. Then, he explained how objectives play a role in getting from learning to performance-based outcomes, how to make objectives measurable, and why this is an integral part of making instructional design actually work in the real world. It finally made sense.
The dump
Often, as instructional designers and developers, we get hung up on the content and forget what the content is supposed to do for the learners: prepare them for something on the job. Too often, I see solutions that are nothing more than surface-level knowledge transfer, also known as a knowledge dump.
We can be so focused on content that we forget the importance of solid performance outcomes as the foundation of a solid instructional strategy. In theory, we are to begin with the end in mind. Too often, we default to thinking that the end is the assessment within the solution; we forget that the end—the ultimate assessment—is performance. In practice, beginning with the end in mind means knowing what the learner is expected to be able to do as a result of the training. That means objectives align with assessments, and assessments align with expected performance.
It’s true that the content is at the heart of learning. Acquisition of knowledge is a part of the formula that leads to the performance outcomes, but knowledge is too often the only outcome. This, my friends, is one of many reasons why training gets a bad rap. Learners too often walk away bored, or sometimes overwhelmed, by the knowledge dump that they will soon forget, saying, “Well, that was a waste of my time,” or, “OK, now what?”
For example, I once had a client with a common problem: Learners were frustrated with training in general because it never applied to their day-to-day activities. There was no guidance on what to apply or how to apply the concepts. The training was off-the-shelf and shallow, and it was turning learners off to the very idea of training; it was doing everything we as L&D practitioners should be preventing. My goals with this client were to destigmatize training, redefine it, and get people learning without realizing it. To achieve this, my team and I created a customized learning solution that taught the concepts through a scenario-based approach relevant to the learners’ day-to-day activities.
In another example, I had a client whose existing learning objectives all started with “identify” or “understand.” The client couldn’t figure out why the training was not delivering results, even though the expected outcome was performance-based.
The expectation for performance, in the client’s view, was captured in the objective that read, “Apply concepts to day-to-day activities.” Can you guess whether the training included any application-based activities? If you guessed not, you’d be correct! Now, many things can influence a lack of behavior change after training, including motivation and environment, but in this case the training simply was not designed with performance as the intended outcome. The objectives and overall course design—the instructional strategy—were missing the mark, and correcting that was the first step toward getting the client’s learners to a performance-based outcome. The new solution, once again, was a scenario-driven program.
Getting to performance-based objectives
Here is an important caution to keep in mind. Before following these steps, you must confirm that there is a performance deficiency, determine the reason(s) for it, and identify the actual steps and actions required to correct it. Those steps and actions may or may not include training. Many factors can contribute to the deficiency, including hiring or assigning the wrong people; poor supervision; weak or missing processes, standards, or procedures; no tools or the wrong tools; lack of motivation; and lack of available guidance or references for employees to consult. Lack of skills and knowledge is far down the list of causes. As Robert Mager famously asked, “Could they do it if their lives depended on it?” If the answer is “no,” then you might have a skill or knowledge deficiency. If a skill or knowledge deficiency is part of the reason for the inadequate performance, here are the steps to correct it:
- Study the content. When and if there is existing material—instructional material, reference material, or job aids—study it. This is when I put on my learner hat and attempt to learn the knowledge or the skill the same way the learner might be expected to learn it. Along the way, I’m assessing whether it’s learnable at all and whether the instructional strategy (or, sometimes, lack thereof) may be affecting desired results.
- Ask a lot of questions. The goal here is to determine what the learners are expected to do as a result of the training. In other words, what is the desired performance on the job? Listen, and determine whether what’s expected is realistic and whether the training, in its current state, helps learners reach those expectations; and, if not, why not.
- Define the current state of performance and the desired state, then identify the gaps—both perceived and actual. When you’re asking questions and learning from your SMEs and stakeholders, take note of how they describe the current performance state and the desired performance state. Then, you’ll be able to determine what the gaps are and how to bridge them. You know the expression “perception is reality”? Ask SMEs and stakeholders what they perceive to be the performance gaps, and then assess whether their perceptions are, in fact, reality.
- Listen for action verbs when you’re talking to the SMEs and stakeholders. If their expectations are performance-based, then take that into consideration when you’re writing your objectives and, ultimately, when you’re designing the learning solution. The goal is to use those verbs (and expectations) and incorporate them into your performance-based objectives.
So what?
Actionable, measurable, outcome-based objectives (or performance-based objectives) result in actionable content, which results in performance outcomes. When framing the objectives, ask yourself and your SMEs to complete the sentence “Upon completion of this course, learners will be able to…” DO what? Then think about what that looks like on the job. From there, you should have a good idea of whether the objective is performance-based.
It’s true that theory and practice are sometimes contradictory or misaligned. In school you learn many things, but it’s in the real world of work that practice forges theory, and you discover which things really matter as you master the craft of instructional design. Without a doubt, a focus on performance objectives has proven to be a key to defining and delivering effective work.
My professor slammed a book on a table and told me to understand it, and in that moment I got it. That verb was not descriptive enough; it was not (is not) measurable—it is observable only when you know what understanding looks like. Today, if he were to tell me to understand that book, I might ask a few questions, such as: What would my understanding of the book look like? What would I do to demonstrate understanding? What do I need to do with or to the book to understand it—do I even have to open it to understand it?
In that moment, theory became practice. It was ingrained in me. And while it might be true that, ever since the book-slamming experience, my eye twitches when someone asks me to “understand” something, I’ll always remember, and be thankful for, my professor slamming that book on the table. For me, it drove home the necessity of performance-based objectives and their related outcomes as the ultimate end to which training is the means. Here’s to beginning with performance in mind.