If the sticker on the box says the product is "Engaging," is it?
The word “engaging” is probably the most overused word in the ever-evolving world of e-Learning. Vendors like to say that their courseware is “engaging.” But is clicking on a button labeled “Next” after reading through an onscreen passage engaging? Is answering a simple multiple-choice quiz engaging? What about listening to a narrator read words on a screen? Is that engaging? Do any of these things capture the attention of our audience and sustain it? Do the typical interactions found in most e-Learning courses pull learners into an experience and entertain them? In most cases, do our learners even remember the experience after they’ve clicked the “Exit” button? Sadly, many probably don’t.
Of course, no one ever sets out to create a “bad” course. But, faced with corporate constraints of time, budget, and the occasional pesky manager, we often fall into our comfort zone — and create what we know will get the project off our plates. That usually means creating a course that follows the traditional, linear instructional model of telling and then testing.
This is too bad, because there’s another model out there that truly is engaging. It’s a model that immerses its participants into experiences that demand purposeful interactions. It’s a model that gives participants concrete, real-world goals to strive towards. And it’s a model that its participants willingly embrace. So much so that many of them will wait in long lines at big electronic gadget stores and pay serious money to embrace it.
Of course, the model I'm referring to here is the modern video-game model. So, are your eyes rolling yet? I hope not, but some people seem to hold the belief that learning can't take place if fun and games are involved.
I’m one who happens to disagree with that position. This isn’t to say that I think our learners should be shooting aliens while learning about Sexual Harassment or chasing cute, multicolored ghosts while learning about something like Workplace Violence. “Fun” in my mind doesn’t mean cute characters or abstract animations. Fun, to me, is the absence of boredom. Adopting a modern video-game model does not mean alienating learners who are turned off by games. It means adopting a model that incorporates some of the most compelling elements of today’s immersive, simulation-based video game experiences.
Playing the policy game
In a lot of today's video games, players do battle with ogres, aliens, or other incarnations of evil. Well, the incarnation of evil that we had to deal with at Allstate was something much worse. It was the dreaded Human Resource Policy Guide!
Our team needed to develop a curriculum of required courses based on these infamous, yet vitally important, HR policies. The courses in the curriculum (eventually called the Allstate Policy and Compliance Curriculum or APCC) include: Information Technology Usage, Ethics and Integrity, Sexual Harassment/Non-Discrimination, and Workplace Violence.
To create the APCC, our team designed and implemented a courseware approach we dubbed "Digital Experiential Learning," or "DEL." Throughout this article, I'll use the "F" word a few more times while explaining the approach we took at Allstate to deliver online human resource policy training. I'll get into the instructional elements we used to help enable retention, the gaming elements we used to sustain motivation, and the simple in-house technology we used to build these courses. I'll also talk about the reactions our audience had to the courses.
Why the new approach?
There were four facts driving the development of this curriculum. The first has to do with the way the policies are written. They're written in the preferred language of most policy documents — "legalese." And for those of you who can read legalese — congratulations! But for the rest of us, me included, legalese is more difficult to wade through than a passage from a high school Latin book. So the courses needed to be easy to understand. The second, and most important, reason we were looking for a new approach to delivering this content was that we wanted to be sure employees "got it." This was especially important in light of the fact that they could be terminated if they didn't "get it." The third driver was that we wanted to be sure our audience retained what they learned.
But there was also a fourth fact, and it had to do with motivation. If someone tunes out of our course after the first few screens because they’re bored, then nothing else really matters. We’ve lost them. We often assume that adult learners can find a way to “self-motivate” when they feel that the content benefits them in some direct way. But in this situation, the content really tested that assumption. In our opinion, traditional e-Learning wasn’t going to cut it. We needed something more.
We needed something “engaging.” More importantly, we needed an approach that kept learners motivated while also enabling content retention. We needed something that was FUN. But it also needed to be something that would be taken seriously, as the subject matter demanded. It needed to be an approach that was grounded in the real world, not just a video-game fantasy.
The recommended approach, and the one agreed to by our internal clients, was the proposed "Digital Experiential Learning" approach. I'll describe some of the key elements of the DEL approach in the sections that follow.
Instructional elements of a DEL approach
The most important outcome of any e-Learning course is for people to remember what they learned. That’s a “no-brainer.” The APCC plays an integral role in ensuring that employees are aware of and in compliance with important company policies. Due to the nature of these policies, and the fact that employees could be terminated if they were not in compliance with the policies, content retention was an imperative. We couldn’t just dump this content into one of our traditional e-Learning templates and assume retention would magically take place. Learners needed to retain corporate policy knowledge, comprehend its meaning and be able to apply it back on the job. To help ensure that all of this takes place, the courses have several instructional elements at their core:
- Simulation-based
- Levels, not lessons
- Cognitive apprenticeship
- Conversational writing style
- Driving learners to the tools
Simulation-based
Many e-Learning courses follow the traditional “tell” and then “test” model, meaning that learners go through a few screens of content and are then tested over it. If they’ve transferred the content to their short-term memories, they’ll do well.
Our approach effectively reversed this, arguing that transferring facts to your short-term memory may be just fine if your goal is to get your learners to pass a multiple choice quiz. But if you want your learners to be able to apply what they’ve learned six months down the line, performance, not rote memorization, is what your goal really ought to be. In the design of the APCC courses, we took an approach that was, essentially, “test” then “tell.” Learners are immersed into situations where they’re forced to perform. This is the “test” part. They’re free to make mistakes, in a safe, online environment. The “tell” part comes later, in the form of immediate feedback from a “guide” character who is the learner’s mentor throughout the course. It is in these situations where “teachable moments” are likely to occur.
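To make the "test" then "tell" sequence concrete, here's a minimal sketch of how such an interaction could be wired up in JavaScript, the scripting the courses already relied on. The scenario text, the function names, and the feedback strings are all hypothetical illustrations of the pattern, not the actual APCC code.

```javascript
// A minimal "test then tell" sketch (hypothetical; not the actual APCC code).
// The learner acts first; the mentor's feedback arrives only after the choice.
var scenario = {
  prompt: "A vendor offers you two tickets to tonight's game. What do you do?",
  choices: [
    { text: "Accept them. It's just a game.", correct: false,
      feedback: "Careful! Accepting gifts from vendors violates policy." },
    { text: "Politely decline and thank the vendor.", correct: true,
      feedback: "Well done. Declining keeps you in compliance." }
  ]
};

function showMentorFeedback(message) {
  // In the real course this would appear in the mentor's speech bubble;
  // a console message stands in for it here.
  console.log("Mentor: " + message);
}

// The "test" happens first: the learner commits to a choice.
// The "tell" comes after, as immediate feedback from the mentor character.
function respond(scenario, choiceIndex) {
  var choice = scenario.choices[choiceIndex];
  showMentorFeedback(choice.feedback);
  return choice.correct;
}
```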
What, exactly, is a teachable moment? Well, to me, a teachable moment is the ultimate attention grabber. It’s what happens to you the moment after you’ve gotten your hands caught in the cookie jar — or opened up a virus that crashes your hard drive. Essentially, a teachable moment is an uncomfortable situation that urgently focuses your energies, and attention, on making sure that the uncomfortable situation is never repeated.
One such teachable moment became clear to us during a user group debrief session we conducted. During that session, we had a group of about twenty people go through our course on Ethics and Integrity. After they completed the course, we talked to them about their experiences. One user went on at length about how she got "Sue," a character in the course, into trouble because she was unsure of Allstate policy on accepting gifts from vendors. But she learned from her experience with "Sue," and later on was able to deal appropriately with another character in the course in a similar situation (see Figure 1). Simulations allow us to set students up in uncomfortable situations that can lead to teachable moments.
Figure 1 An example of a “teachable moment.” A simulation from Level 3 of our course, Ethics and Integrity.
Levels, not lessons
In a traditional e-Learning course, content is often modular, or “lesson-based.” Learners start out in the first lesson, and progress through various topics until all have been completed. Each lesson covers a new piece of content.
Our courses approach learning quite differently. The APCC courses are "level based," not "lesson based." Each course is made up of three levels. In each level, learners experience all of the content; in succeeding levels they are exposed to the same material, but at a slightly higher level of difficulty. This idea of levels, while fitting in nicely with the entire video-game feel, was actually grounded in Bloom's Taxonomy (see Figure 2).
Figure 2 The six major categories in Bloom's Taxonomy. Here they are listed in order, starting from the simplest behavior (Knowledge) to the most complex (Evaluation). Our courses progress through only the first three.
In Level 1 of a course, learners progress through all of the content, but at a knowledge level. For example, Level 1 of the Ethics and Integrity course covers basic facts and terminology (see Figure 3). In Level 2, learners move through the same content, but at a comprehension level; here we're looking to see if they grasp the concepts and can predict consequences. Finally, in Level 3, learners engage in simulations to see if they can apply the content in real-world situations (see Figure 4). The end result of a level-based design is that learners go through the same content, but see it in varying contexts.
Figure 3 An example of a Level 1 interaction from the Ethics and Integrity course.
Figure 4 An example of a Level 3 interaction. In this example, the learner needs to apply what was learned in Levels 1 and 2.
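One way to picture the level-based design is as a single topic list revisited once per Bloom category. The structure below is a hypothetical sketch of that idea, assuming one interaction style per level; the topic names and field names are invented for illustration, not the actual course data.

```javascript
// Hypothetical sketch of a level-based course: every level revisits the
// SAME topics, but at a higher Bloom category and in a new context.
var course = {
  title: "Ethics and Integrity",
  topics: ["Gifts from vendors", "Conflicts of interest", "Reporting concerns"],
  levels: [
    { bloom: "Knowledge",     style: "basic facts and terminology" },
    { bloom: "Comprehension", style: "grasp concepts, predict consequences" },
    { bloom: "Application",   style: "real-world simulations" }
  ]
};

// Levels unlock in order, so a lesson-style "seen it once, done" path
// isn't possible; the content repeats in varying contexts.
function nextLevel(course, levelsCompleted) {
  return levelsCompleted < course.levels.length
    ? course.levels[levelsCompleted]
    : null; // course finished
}
```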
The level-design approach was key to learner retention in the APCC courses because our audience was able to practice their skills in varied and increasingly complex contexts. Brenda Sugrue, in her 2002 article Practice Makes Performance, states that a key element of knowledge transfer is the opportunity to practice skills in varied contexts, with monitoring and feedback that identify and correct misconceptions and faulty reasoning.
Cognitive apprenticeship
So we incorporated Bloom's Taxonomy into the overall instructional design of our courses and let learners apply their new skills in real-world simulations. Another element of our instructional approach was the idea of a cognitive apprenticeship — a term defined by Kevin Oliver in his 1999 article, Situated Cognition and Cognitive Apprenticeships.
As Oliver explains it, a cognitive apprenticeship is designed to build expertise through online, guided discovery. Just as it's often easier to learn a foreign language through immersion, we felt that the best way for our audience to fully grasp mundane policy knowledge was to immerse them in realistic situations. In our courses, learning often took place just beyond what our audience already knew, but never out of reach.
For example, there was one situation in a course where learners had to help a colleague who was constantly being passed over for promotions. This colleague happens to be gay. While learners might not yet know how to resolve this complex issue on their own, they had the chance to ask for a hint, or to search for help in the course library. If they were unable to find the answer, or answered incorrectly, they would either see the scenario resolved properly, modeled by the course mentor character, or proceed down an alternate path where the consequences were shown. In either case, a cognitive apprenticeship helps learners transfer these new skills over into the real world.
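A rough sketch of that help-and-remediation flow follows. The article doesn't spell out how the course chose between the two remediation branches, so the deciding condition below, like the property names, is purely an assumption.

```javascript
// Hypothetical sketch of the guided-discovery flow described above.
function requestHelp(scenario, source) {
  // Two help channels from the article: a hint, or the course Library.
  return source === "hint" ? scenario.hint : scenario.libraryLink;
}

function handleAttempt(scenario, answer) {
  if (answer === scenario.correctAnswer) {
    return "proceed"; // learner resolved the situation on their own
  }
  // ASSUMPTION: this branch condition is invented for illustration.
  // The course either models the proper resolution via the mentor...
  if (scenario.mentorCanModel) {
    return "mentorModelsResolution";
  }
  // ...or proceeds down an alternate path where consequences are shown.
  return "consequencePath";
}
```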
Conversational writing style
While there may be some very sound reasons for using a “legalese” writing style, employees tend to cringe when they have to read through the policies. Working in tandem with our legal and HR compliance departments, we were able to capture the meaning of the policies and translate them into something that everyone could understand. Let me give you an example.
Here are two passages. One is taken from the Human Resource Policy Guide. The other is a passage taken from our course. See if you can guess which one is which.
Passage One: “You should not accept any money, property, gift, benefit, service, loan, credit, special discount, favor, entertainment or other items of value from any person with whom Allstate does business, with whom the company is seeking to do business, or from any person seeking to do business with the company. The term “person” includes, but is not limited to, policyholders, claimants, financial institutions, or any business or professional firm or corporation.”
Passage Two: "The rule of thumb here is simple. Don't accept gifts from a person/supplier Allstate does business with or from someone who wants to do business with Allstate. An example of a situation you'd want to avoid would be to attend a sporting event paid for by a potential supplier."
If you haven't guessed, the second passage is from our course. The tone of these courses needed to be conversational, not only because the content had to be easier to understand, but also because we wanted our learners to feel immersed in the situations. We wanted them to feel as if they were being talked to, as if the characters in the course were their colleagues. Doing this made the experience more compelling, because the learner is directly involved in the action — and the outcomes.
Driving them to the tools
As much as we may not appreciate the style in which corporate policies are written, we felt that, once learners understood their meaning, the policies themselves would be excellent tools. A "Library" section was included in each course that contained links to the actual policies and other resources. Learners were reminded that, as they ran into challenges or scenarios within the course, they should refer to these online guides.
The Library was an important instructional component. First, it helped learners find success within the course — always a good thing when you're doing simulations. Just as a game designer would never throw players into a dangerous situation without a sword (or at least a way to get one), we didn't want to frustrate our learners by throwing them into simulations without some help. The Library also served to reinforce behavior that we hoped would transfer over after the course was completed. Simply put, we wanted employees to know they could go to the HR policies on our Intranet if and when they ran into similar situations in real life.
Gaming elements of a DEL approach
While the instructional elements would lead learners to content retention, it was the gaming elements that would keep learners tuned into the course. Briefly, here are the four gaming elements included in the courses.
Scoring: Perhaps the greatest gaming element of a DEL course is scoring. Learners earn points for handling situations correctly and lose points for handling them incorrectly (see Figure 5).
Figure 5 A learner (Melissa) loses 50 points for picking the wrong “Expert” in Level 2 of our Sexual Harassment/Non-Discrimination course.
Simplicity: Learners should be focused on the content, not the games, so games that seemed familiar and were easy to learn were included in the design. One game, set in the employee cafeteria, was based on the trivia games popular in many bars and restaurants. In those games, players compete against each other to answer timed, trivia-style questions; the longer they take to answer a question, the less the question's point value becomes. Our game is similar. Learners encounter various people in the cafeteria who ask them multiple-choice questions. But instead of a timed response, learners encounter a person on the screen who, in effect, "sells" them hints. For every hint the learner uses, the total possible point value for that question goes down.
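Here's how those "sold" hints might translate into scoring logic; the starting point value and per-hint cost are invented numbers, offered only as a sketch of the mechanic.

```javascript
// Hypothetical sketch of the cafeteria trivia mechanic: every hint the
// learner "buys" lowers the points still available for that question.
var QUESTION_VALUE = 100; // assumed starting value
var HINT_COST = 25;       // assumed cost per hint

function buyHint(question) {
  if (question.hintsUsed < question.hints.length) {
    return question.hints[question.hintsUsed++]; // hand over the next hint
  }
  return null; // nothing left to sell
}

function pointsEarned(question, answeredCorrectly) {
  if (!answeredCorrectly) return 0;
  // Each hint used reduces the question's total possible point value.
  return Math.max(QUESTION_VALUE - question.hintsUsed * HINT_COST, 0);
}

// Example: a learner buys one hint, then answers correctly.
var question = { hints: ["Think about the gifts policy."], hintsUsed: 0 };
buyHint(question);
console.log(pointsEarned(question, true)); // 75
```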
Relevancy: This wasn't "Sexual Harassment Space Invaders" or "Ethics and Integrity Pac-Man." Instead, games and gaming elements were woven into the experience without our learners being overtly aware that they were engaging in a true game. Beyond the trivia-style games that took place in the cafeteria, we also had match-style games that took place in a videoconference room and "To Tell the Truth"-style games that took place in a manager's office. The attempt here was to use games as a tool to keep learners motivated. Again, the focus was on learning, NOT on the game.
Higher Production Values: To engage our audience, a higher standard of visual polish was needed than in our traditional courses. Every scene had to pull learners into the experience and sustain the video-game feel (more on this in the Technology elements section below).
Technology elements
Every effort was made to keep things simple with regard to development. We may have been going for a video game-like experience, but we absolutely did not have a video-game-like budget. Resources were limited and we could not keep the client waiting. There was some upfront development time as the interface and graphics for the first course were created, but everything was templated and designed to be repurposed, so the development time for the other courses in the curriculum was cut substantially. Not counting the reviews and user tests, it took about two months to develop each course, using (or, in one case, deliberately not using) the three technology elements below.
HTML-based: The programmer used Dreamweaver, an HTML editing tool, along with a small amount of JavaScript, to build the courses. No plug-ins were needed to run them.
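As a rough illustration of how little technology this takes, a single screen could be nothing more than a plain HTML page with a short script. The markup below is a hypothetical sketch in that spirit, not the actual Allstate template.

```html
<!-- Hypothetical sketch of a plug-in-free course screen: plain HTML and
     a few lines of JavaScript, in the spirit of the APCC build. -->
<html>
<head>
  <script type="text/javascript">
    function choose(correct) {
      // The mentor's feedback replaces the speech-bubble text, and the
      // Next button unlocks only after the learner commits to a choice.
      document.getElementById("bubble").innerHTML = correct
        ? "Mentor: Good call. Declining the gift keeps you in compliance."
        : "Mentor: Careful! Accepting that gift would violate policy.";
      document.getElementById("next").disabled = false;
    }
  </script>
</head>
<body>
  <!-- The backdrop would be a doctored digital photo of a real location. -->
  <p id="bubble">A supplier offers to pay for your lunch. What do you do?</p>
  <button onclick="choose(false)">Accept. It's only lunch.</button>
  <button onclick="choose(true)">Politely decline.</button>
  <button id="next" disabled>Next</button>
</body>
</html>
```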
Impactful Graphics: Images and backgrounds were selected that pulled the learner into the experience. The images had a modern video-game feel to them, but did not include any animation. Basically, our graphic artist took digital pictures of various locations around Allstate, and then doctored them up in Photoshop to give them a video game "edge." We could just as easily have used the digital images in their original, un-doctored state. The goal, regardless of how we did it, was to give learners the impression that they were part of the scene. As for the characters in the courses, they were just people from a clip-art library. Our graphic artist simply found ones that had a range of expressions and plopped them onto our digital backgrounds.
No Audio: In order to lower the bandwidth requirements, no audio was used in these courses. Audio always seems to be an issue with learners: some miss it when it's not there, while for others it's just a distraction. Surprisingly, as learner focus groups revealed, audio in our courses wasn't missed.
From our user group sessions, we learned that users enjoyed the conversational tone of the courses, and found them easy to follow. What also seems to work in our courses is that per-screen word counts are kept to a minimum. If a block of text can’t fit into an on-screen speech bubble, it means we need to cut it down. By doing these simple things, the experience becomes more enjoyable, and the audio isn’t missed.
Learner reactions to digital experiential learning
Thorough user tests were included as part of the design process since this curriculum contained several new elements. In total, about 60 learners actually took the courses while being observed. Afterwards, the HR Education team met with the participants and had hour-long debrief sessions to collect their feedback.
The general reaction of the learners to the DEL approach was very positive. Here are some of the key things we learned:
- Learners did not require elaborate “video game-quality” 3D animations. Static backgrounds seemed to capture their attention and hold their interest.
- Learners didn't require audio. One of our biggest worries was the trade-off we made in our decision to use high-quality graphics over audio. Nearly all of the people in our tests stated that audio often just "gets in the way" and that they didn't miss it. They enjoyed the conversational tone of the text and liked the fact that reading, for each screen, was kept to a minimum.
- Scoring was a motivator. Just about all of the participants expressed the opinion that it gave the courses a more competitive feel. Many people said that it made them want to re-read material or look at the actual policy guides (which were accessible to learners through a “Library” section in the course) in order to get the answer “right” and receive more points.
- Learners made connections that enabled retention. One of the most interesting things that I observed with our participants was that many of them remembered the content because they associated it with a particular character in the course. One person said that a character that appeared in a cafeteria scene reminded her of a conversation she had recently about a situation involving receiving gifts from vendors. And because of what "Sue" (the character in the course) told her, she'd know how to handle that situation in the future. Another user stated that she thought we "taped her lunchroom conversations" because the content matched up so well to many of the concerns she was familiar with.
Conclusion
The time invested up front to develop the DEL interface, templates, and graphics paid off, cutting the development time for every course that followed. While a DEL approach may not be the right fit for every project, it kept our learners motivated and helped them retain vitally important policy content. For us, "fun" and serious learning turned out to be a winning combination.
References
Janz, Brian & Wetherbe, James (1999). Motivating, Enhancing and Accelerating Organizational Learning: Improved Performance Through User-Engaging Systems. The
Oliver, Kevin (1999). Situated Cognition and Cognitive Apprenticeships. Retrieved September 12, 2002, from Virginia Tech, Educational Technologies Web site: http://www.edtech.vt.edu/edtech/id/models/powerpoint/cog.pdf
Postman, Neil (1995). The End of Education. New York: Alfred A. Knopf.
Sugrue, Brenda (2002). Practice Makes Performance. Retrieved September 13, 2002, from the American Society for Training and Development Web site: http://www.learningcircuits.org/2001/oct2001/sugrue.html (Editor's Note: As of February 8, 2010, this article appears to have been removed from the Web.)