I watched the clock on my laptop screen creep into the final minutes of the hour, a sense of impending dread growing in my stomach. It was about to happen again; I could feel it. I'd spent the entire last hour asking meticulously planned questions, straining my ears for any sign of useful information. I pored over my notes: Half sentences. Inexplicably bolded terms. And a smattering of technical details that might as well have been hieroglyphics. My training analysis meeting was about to become a huge waste of time.

How many times have you left a training analysis conversation feeling less informed than when you started?

Very often in training analysis conversations, a client will come prepared to discuss the situation with confidence, but the information they provide will be conversational, anecdotal, and not necessarily the result of careful analysis. I often describe what I hear in these conversations as an ethereal cloud of information: The client is giving us a lot of relevant information, but it's just floating around; it's not packaged in a way that makes a solution obvious.

Prepare to achieve better outcomes

To help turn that ethereal cloud of information into solid training targets, I approach early client meetings prepared with a questioning strategy and a conversation plan that will help me figure out what the client really needs. Using any information that I have in advance, I prepare questions that will help me understand the real performance gap—what barriers are preventing people from performing correctly, and what solution(s), training or otherwise, can most effectively solve the problem.

I center my conversation strategy around looking for information that supports three specific outcomes of the training analysis:

  • Identifying action statements
  • Identifying barriers to performance
  • Identifying solution-shaping factors

Identifying action statements

As the client discusses their perception of the performance issue, I listen for details that point to the specific actions learners will be expected to take: What specifically do people need to DO with this information?

An effective questioning strategy should help the client give specific details about exactly which actions they need their learners to take that they are not taking now. I always have questions prepared, but often the most helpful information comes from follow-up questions that let me drill further into the details.

Here are some of my favorite questions:

  • Can you describe a scenario where an employee perfectly executes the task or process in question? What exactly are they doing?
  • Can you give examples of common mistakes or oversights made by employees in this area? What should they be doing differently?
  • You've said that your employees need to understand (concept x). How would you know if they did understand it properly? What would you see them doing?

More generally, a habit of asking some version of "What exactly does that look like?" about any description the client provides is usually helpful.

Often, clients won't state actions directly; instead, they will allude to them or say things that hint at specific actions. A helpful skill here is the ability to interpret vague or indirect information from the client and translate it into specific, observable actions. That translation takes careful listening, consideration of everything known about the role, the process, the performance environment, and other characteristics that may be influencing performance, and then some inference or educated guesses about what the specific action statements should be.

By 45 minutes into a training analysis conversation, I've often noted half a dozen potential action statements, which I can focus on as we move forward. It's important to verify that the action statements I've noted are accurate, aligned with the client's perception of the need, and fully describe the anticipated outcome. Once confirmed, these action statements will closely inform the learning objectives and performance assessments that go into the design of the learning solution.

Identifying barriers to performance

Part of my questioning strategy looks for performance barriers: What is working against us to keep people from doing things perfectly? What elements exist that are preventing learners from performing the actions that I've identified? What makes it challenging for people to do these things?

Sometimes, barriers to performance will be purely knowledge- or skills-based; these can hypothetically be solved with a training solution, something that gives the learner information and an opportunity to practice a new skill.

Frequently, though, performance barriers stem from other factors. People may have difficulty finding or using the tools or resources they need to perform a task successfully. Perhaps they get limited or conflicting feedback from their leaders about their performance. Or (rather commonly, actually) people may face contradictory motivations: They may be told, for example, to meet expectations for safety and quality but be rewarded with pay bonuses for increased productivity.

By listening for and documenting performance barriers, I try to understand whether a training solution alone will enable learners to perform the stated actions or if the barriers point to a need for other—non-training—solutions, such as modifications to tools or resources, management coaching strategies, etc.

Red flags

Hearing a version of any of these statements from a client raises a red flag, as they can indicate an unidentified performance barrier:

  • This isn't that hard; I don't understand why employees aren't doing it.
  • The previous training was ineffective; we'll need everyone retrained on this.
  • Everyone is passing the assessments in training, but once on the job, they're just not getting it.
  • Our employees' understanding of processes seems to decay after about a year.
  • Despite our training efforts, we're still seeing resistance to the new system.

If a client recites lines like these, it could point to an issue with incentives, available tools, individual motivation, or even hiring standards. In these cases, I'm going to dig in further to try to understand which factors are in play.

Identifying solution-shaping factors

In a perfect world, a client would come to a training analysis and give me carte blanche to design the perfect learning solution for their situation. The learning event would meet learners' needs exactly; stakeholders would pop champagne bottles as they watched their business goals being exceeded; and learners would write songs with my name in them about how engaging and effective their training was. And yet, to date, not a single statue has been erected to honor my learning experience design.

In our somewhat less-than-perfect world, clients are overwhelmingly likely to come to the table with some specific type of learning solution already in mind: "We need training. It needs to be for this many people, it needs to be an eLearning, and it needs to be ready next month." Duly noted.

The reality is that learning solutions have to be designed for this imperfect world. While many of a client's preconceived ideas about a training solution can be negotiated, there are usually some challenging factors to work around.

The training analysis meeting is a great opportunity to uncover these other solution-shaping factors and to start to understand how much wiggle room they might have: What's influencing the learning solution other than learner needs?

I listen for information about the audience: the size, the variety of roles, and the locations where they work can all have implications for which learning modalities are realistic. Similarly, the project timeline can have a huge impact on what deliverables are feasible. Sometimes, it might be as simple as "the CEO asked for an eLearning, so we're going with eLearning."

It's always fair to provide consultative guidance to try to nudge a client in the right direction. Ultimately, though, it may turn out that some of these factors are fixed. By recognizing what is immovable at the outset, I'm in a better position to design an effective learning solution within the parameters I've been given. If I have to make the wrong kind of training, I'm going to make the best wrong training they've ever seen.

Try it out

The next time you're preparing for a training analysis meeting, think about your preparation, notes, and questions through the lens of these three outcomes: identifying action statements, identifying performance barriers, and identifying solution-shaping factors. Keep these outcomes in mind as you drift through the ether of client information, and see if you can pull down bits that you can condense into solid factors within these key categories.

Join us to learn more!

The Learning Guild has two upcoming events where you can learn how to demolish barriers to improved performance, learn from peers tackling the same challenges, and sharpen your skills. Join us at: