105 Getting Radically Better Data from Your Learners
10:45 AM - 11:45 AM Tuesday, March 26
Salon 3
Bad data creates faulty decision-making. To improve your learning designs, you need good data. Unfortunately, many learning-evaluation methods are seriously flawed.
In this session, you’ll learn how to use a radical new research-inspired approach to getting feedback from your learners, whether in eLearning or in the classroom. This performance-focused approach will be introduced by its inventor, who will share the lessons he’s learned in implementing it at global corporations and not-for-profits. You’ll also examine a new model of learning evaluation, the Learning-Transfer Evaluation Model, which is designed to help your organization build a more effective learning-evaluation strategy.
In this session, you will learn:
- About the research that shows why you should stop using traditional evaluation methods
- Why Likert-like scales and numeric scales are harmful
- About the importance of measurement that reveals information about learning effectiveness
- About the Performance-Focused Smile Sheet methodology
- How to translate the new method for eLearning
- How to use the Learning-Transfer Evaluation Model to develop an effective evaluation strategy
Audience:
Designers, developers, managers, and senior leaders (directors, VP, CLO, executive, etc.)
Will Thalheimer
Founder
Work-Learning Research
Will Thalheimer, PhD, MBA, is a world-renowned speaker, writer, researcher, and consultant focused on research-based best practices for learning design, learning evaluation, and presentation design. Will wrote the award-winning book Performance-Focused Learner Surveys (second edition); created LTEM, the Learning-Transfer Evaluation Model, and the Presentation Science Workshop; and co-created the eLearning Manifesto. Will has the honor of being a Learning Guild Master.
206 Data Before Design: The Inspiration Behind Performance-based Leader Learning
1:00 PM - 2:00 PM Tuesday, March 26
Salon 15
It's not every day L&D professionals get to spend time in the trenches conducting a true needs analysis, learning firsthand about the needs of leaders and the barriers that impact their performance. We did and survived to share what we learned, translating this analysis into customized yet scalable performance-based solutions. The strength of leadership at the mid-level is critical to employee engagement and organizational success.
At Trinity Health approximately 7,000 middle managers support the majority of colleagues, address complex healthcare challenges, execute strategy, and navigate transformational change. Before our design team could hope to successfully meet their development needs, we needed to really learn about our target audience. This session will explore how we collected voice-of-the-customer data and analyzed key research on leadership development and high-performing learning organizations.
In this session, you will learn:
- How to conduct an in-depth needs analysis and determine what type of needs analysis makes sense for your organization
- Tips for generating ideas on how to use primary and secondary research to create performance-based leadership development
- Strategies for scaling and customizing learning to fit the needs of today’s leaders
Audience:
Designers, managers, and senior leaders (directors, VP, CLO, executive, etc.)
Shannon Young
Senior Design Consultant and Learning Strategist
Trinity Health
Shannon Young is a senior design consultant and learning strategist at Trinity Health. She has over 25 years of experience in consulting, instructional design, education research and analysis, curriculum and program development, program management, process improvement, and distance education. Shannon has created custom learning solutions and performance support materials for corporate, nonprofit, and academic clients. She holds a BA in English and an MA in literacy, language, and learning from the University of Michigan. At Trinity Health, Shannon is the learning strategist and architect for the mid-level leader development program. She is responsible for needs analysis design, data collection, analysis, and reporting.
Shelby Dria
Manager, Instructional Design
Trinity Health
Shelby Dria is a manager of instructional design at Trinity Health. A CPLP, Shelby is dedicated to continually uncovering better ways of creating, delivering, and measuring relevant learning and development experiences. With 20 years of combined expertise in team leadership and internal consulting, she has worked for various for-profit and nonprofit companies including Trinity Health, Columbia Sportswear, The Great Indoors, and Target. Shelby has spent much of her career developing expertise in leadership development, training and performance analysis, and virtual training and management. Her career highlights include designing local and global leadership development programs for new leaders, mid-level leaders, and high potentials.
SMM104 CANCELLED - Building the xAPI Ecosystem of Your Dreams
1:00 PM - 1:45 PM Tuesday, March 26
Expo Hall: Management & Measurement Stage
You’re excited about the promise of an xAPI-enabled world, but you’ve got a learning management system, a catalog full of SCORM-based courses that you need, and a handful of learning tool vendors that don’t use xAPI. What if you could get the most out of an LMS and an LRS at the same time as you move to your next-generation learning and performance infrastructure?
This session will start with the learner-facing tools that will capture your xAPI data: eLearning, mobile tools, performance support, social and informal activities, and data sources from the business. You’ll review your options when it comes to learning record stores and how they work (or don’t work) with your LMS. Will you work with a stand-alone LRS? A front-end xAPI solution with a built-in LRS? Or an LRS that is aligned with your LMS and your current learning infrastructure? You’ll hear real-world stories of three different xAPI implementations to help you plot your organization’s course toward your next-generation learning ecosystem.
In this session, you will learn:
- How to synthesize activities from a variety of front-end learning tools into a coherent picture of learning and performance
- How to discover possibilities for your next-generation learning and performance infrastructure
- How to identify key partners in your business to engage all along the way as you migrate from SCORM to xAPI
- How other organizations have implemented xAPI, and lessons from their experience
Audience:
Developers and managers
Technology discussed in this session:
xAPI, learning management systems, and learning record stores
Rob Houck
Head of Technology Innovation
UL Compliance to Performance
Rob Houck is the head of technology innovation at UL Compliance to Performance. He has provided strategic direction for learning and talent management software, managed software development and support of technology products and services, and overseen software implementations for more than 3.2 million users in 73 organizations. Rob has worked in technology for more than 25 years and has consulting experience ranging from small business to Fortune 100 clients.
305 Demonstrating the Value of Training to Your Organization
2:30 PM - 3:30 PM Tuesday, March 26
Salon 16
Demonstrating the value of training and eLearning is a common challenge for instructional designers and eLearning developers. ROIs and cost-benefit analyses are almost never done, and KPIs often aren’t used to measure performance changes. However, using data to understand the results of your projects and approaches, and then sharing those results with others, can go a long way toward helping people see the benefits of your team’s work.
This session will dig into simple techniques you can follow to prove the value of training—from completing an initial training needs analysis, to calculating the cost-benefit of training, to evaluating success using key performance indicators (KPIs). You’ll also learn how to identify when training isn’t the solution to the business problem, and what to do instead.
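To make the cost-benefit idea concrete, here is a minimal sketch in Python. The figures and the ROI formula shown are generic illustrations, not numbers or methods from the session itself:

```python
# Simple training cost-benefit sketch; all figures below are hypothetical.
def training_roi(total_cost: float, total_benefit: float) -> float:
    """Return ROI as a percentage: net benefit relative to cost."""
    return (total_benefit - total_cost) / total_cost * 100

# Hypothetical program: $40,000 to design and deliver,
# $52,000 in measured productivity gains over the evaluation period.
cost = 40_000
benefit = 52_000
print(f"Benefit-cost ratio: {benefit / cost:.2f}")
print(f"ROI: {training_roi(cost, benefit):.0f}%")
```

The harder part in practice, as the session description notes, is defensibly estimating the benefit side from KPIs rather than the arithmetic itself.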
In this session, you will learn:
- How to identify KPIs
- How to do a cost-benefit analysis
- How to do a training needs analysis
- How to measure performance
- How to propose non-training solutions
Audience:
Designers, developers, and managers
Nicole Legault
Community Manager
Articulate
Nicole Legault is a community manager at the software company Articulate. Nicole has a varied skill set that includes expertise in instructional design, eLearning development, and more. She has written hundreds of articles on the topic of eLearning and instructional design. She is a skilled public speaker and has delivered many hours of training and presentations on a variety of topics related to training. Nicole strives to create engaging sessions based on practical skills that can be used immediately on the job.
412 Evidence of Impact: How Metrics Drive a Learning and Performance Ecosystem
4:00 PM - 5:00 PM Tuesday, March 26
Salon 5
Do the executives who fund learning and development care how many courses you have? Or how many students? Or the number of class hours you’ve delivered? Actually, they may react negatively to those numbers. Most of the time, when people are in training, their productivity is zero. The key question is how to get to Level 4 and measure actual impact.
Learning and performance ecosystem solutions tend to be more direct, effective, and instantly available, especially when they include components built into the workflow. These solutions are capable of generating a good deal of data. The trick is to identify what data is most useful in building a chain of evidence that explains the solution’s impact on business productivity. This session will introduce a framework for identifying the right business metrics and targets, deciding what learning and performance solution data to track, and developing evidence of the solution’s impact.
In this session, you will learn:
- How to describe a learning and performance ecosystem
- How to work with a customer or sponsor to articulate a human performance problem
- How to discover the business metrics negatively impacted by the problem
- How to identify solution data that could provide evidence of positive impact
- How to use analytics to monitor the solution’s impact over time
Audience:
Designers, developers, managers, and senior leaders (directors, VP, CLO, executive, etc.).
Technology discussed in this session:
Learning management, talent management, performance support, knowledge management, expertise location and management, social networking and collaboration.
Steve Foreman
President
InfoMedia Designs
Steve Foreman is the author of The LMS Guidebook and president of InfoMedia Designs, a provider of eLearning infrastructure consulting services and technology solutions to large companies, academic institutions, professional associations, government, and military. Steve works with forward-looking organizations to find new and effective ways to apply computer technology to support human performance. His work includes enterprise learning strategy, learning and performance ecosystem solutions, LMS selection and implementation, learning-technology architecture and integration, expert-knowledge harvesting, knowledge management, and innovative performance-centered solutions that blend working and learning.
505 Design with the End in Mind: Getting Measurable Results with xAPI
10:45 AM - 11:45 AM Wednesday, March 27
Salon 17
As xAPI gains traction in the learning space and is incorporated into more authoring tools, apps, and enterprise systems, the specification is transitioning into easy, widespread use. However, if you’re looking to make better use of it at your organization, you may be wondering how to develop a strategy for implementing xAPI meaningfully.
This session will explore how to develop a strategy for the data you want to capture by starting with the end in mind. You will examine why organizations have adopted xAPI, potential sources of xAPI data, and the impact of xAPI on existing resources. Working back from your end goal, you’ll learn how to get useful data from xAPI to support adaptive learning, reports, and visualizations. You will leave with an understanding of xAPI, common pitfalls in implementation, and tools to support good use, as well as a host of free resources to get started today.
In this session, you will learn:
- The definition of xAPI, and its differences from SCORM
- About common sources of xAPI data
- How to outline a data strategy for implementing xAPI
- How to identify resources to get started with xAPI
Audience:
Designers, managers, and senior leaders (directors, VP, CLO, executive, etc.)
Technology discussed in this session:
Learning record stores and force-directed graphs
Art Werkenthin
President
RISC
Art Werkenthin, president of RISC, built his first learning management system (LMS) in 1988 and now has over 25 years' experience working with LMSs in the oil and gas, retail, finance, and other industries. Art is keenly interested in the xAPI specification, and RISC was an early adopter of this technology. Interested in extending xAPI to the LMS, Art has served for the past three years on the ADL cmi5 committee. In 2015, RISC demonstrated the first implementation of a cmi5 runtime engine embedded in its LMS. Art has presented on cmi5 at several conferences, including mLearnCon, DevLearn, and xAPI Camp.
Duncan Welder
Director of Client Services
RISC
Duncan Welder is a director of client services for RISC. He is an educational technology geek, having spent over 20 years implementing learning management systems, domestically and abroad, to manage regulatory compliance. As an xAPI evangelist with a career grounded in instructional design and eLearning, Duncan has provided presentations to professional organizations including the Connections Forum, The Learning Guild, and the Association for Talent Development. Duncan is an active member of the Houston ATD, currently serving as director of special interest groups.
SMM202 Demystifying xAPI with Immediate Strategies for Learning Analytics
11:00 AM - 11:45 AM Wednesday, March 27
Expo Hall: Management & Measurement Stage
You continue to hear about xAPI and how learning analytics can bring new insights about your training, but you aren’t quite sure how to get started. You may also feel like xAPI is only for developers and requires a lot of technical knowledge and coding skills. Because of these misconceptions, you are missing out on valuable learning data that you could be using to improve your training.
In this session, you will explore practical ways to get started with learning analytics. You’ll find out which basic tools you need to start collecting learning data and gain insights on how your learners are interacting with your training. You will also take a look at different learning record providers (LRPs) that send xAPI statements out of the box, and learning record stores (LRSs) that can provide deeper insight into those statements.
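For context, an xAPI statement is just structured JSON recording who did what: an actor, a verb, and an object. Here is a minimal sketch; the names and URLs are illustrative placeholders, not tools covered in the session (the verb ID follows the common ADL vocabulary):

```python
import json

# Minimal xAPI statement: who (actor) did what (verb) to what (object).
# The learner, email, and activity ID below are made-up placeholders.
statement = {
    "actor": {"mbox": "mailto:learner@example.com", "name": "Example Learner"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "http://example.com/activities/safety-module-1",
        "definition": {"name": {"en-US": "Safety Module 1"}},
    },
}

# A learning record provider (LRP) would POST JSON like this to an LRS.
print(json.dumps(statement, indent=2))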
In this session, you will learn:
- How to plan for learning data in the design phase
- About free and low-cost solutions to start collecting learning data
- What to do with the learning data you’ve collected
- Immediate ways to gather learning data in your next project
Audience:
Designers, developers, and managers
Technology discussed in this session:
Learning record stores (LRSs) and learning record providers (LRPs)
Phil Littleton
Lead Manager—Digital Delivery Services
Association of International Certified Professional Accountants
Phil Littleton is a lead manager of digital delivery services for the Association of International Certified Professional Accountants. Phil is primarily a professional learner but also practices as an eLearning developer, web developer, learning analytics explorer, learning experience designer, and learning technologist. He has been in the L&D field, officially, for over 10 years as a trainer, instructional designer, and LMS administrator.
605 Correlation Is Not Proof: Gaining Legitimate Insight from Learning Data
1:00 PM - 2:00 PM Wednesday, March 27
Salon 15
L&D has wrestled with the idea of measuring impact for decades. Unfortunately, most of the methods used as “proof” of impact are simple correlations. And most people know the adage “correlation does not equal causation.” This session will help participants understand different, more practical types of analytics, and how they can provide much more impactful insights than simple correlations.
This session will examine practical examples of data sets and how insights can differ greatly depending on the analysis chosen. Applying different analyses to the same data sets, you will see how simplistic analysis leads to broad—and often dangerously inaccurate—insights, and how properly structured data, examined with simple, everyday tools, yields far more reliable ones. Most importantly, you will see how logically a strategy for performance improvement flows from proper analysis, and how easy it is to keep improving your organization.
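As a toy illustration of the session's premise, sketched in Python with entirely made-up numbers: the same data can show a strong positive correlation when pooled, yet a negative relationship within every group.

```python
from math import sqrt

def pearson(xs: list[float], ys: list[float]) -> float:
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / sqrt(sxx * syy)

# Hypothetical data: training hours vs. monthly sales for two regions.
hours_a, sales_a = [1, 2, 3, 4], [12, 11, 10, 9]      # Region A
hours_b, sales_b = [7, 8, 9, 10], [22, 21, 20, 19]    # Region B

# Pooled analysis: a strong positive correlation (~0.84) invites the
# claim "more training drives more sales."
pooled = pearson(hours_a + hours_b, sales_a + sales_b)

# Grouped analysis: within each region the relationship is negative,
# so the pooled number reflects the regions, not the training.
print(f"pooled: {pooled:.2f}, "
      f"A: {pearson(hours_a, sales_a):.2f}, B: {pearson(hours_b, sales_b):.2f}")
```

This is the kind of trap the session warns about: the correlation is real, but treating it as proof of training impact would be dangerously wrong.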
In this session, you will learn:
- About the risks of using correlation to “prove” impact—and how common that is
- The difference between relative, comparative, and distributive analytics (in plain English)
- How to identify statistically significant differences
- The difference between a statistically significant event and a fluke
- Practical steps to identify L&D overachievers and underachievers
- How to easily craft a prescriptive strategy from those findings
Audience:
Managers and senior leaders (directors, VP, CLO, executive, etc.)
Technology discussed in this session:
Microsoft PowerPoint and Excel
A.D. Detrick
President
MetriVerse Analytics
A.D. Detrick is the president and founder of MetriVerse Analytics, a leading provider of L&D/HR measurement and analytics consulting. He is a recognized expert in the areas of learning measurement, assessment, evaluation, and human capital analytics. In his role, he oversees the design and implementation of measurement and analytics strategies for many of America’s largest and most technically innovative companies. He is a regular speaker at industry events and has contributed to numerous books on learning and analytics. Before founding MetriVerse, A.D. helped design measurement strategies as a consultant for Xerox Global Learning Services, Intrepid Learning, and JPMorgan Chase.
705 Prototype to Implementation: Building Organizational Buy-In for xAPI
2:30 PM - 3:30 PM Wednesday, March 27
Salon 13
You’ve heard about xAPI but wonder what comes next. Getting from this initial position of interest to widespread organizational buy-in can be a huge challenge. How do you justify taking resources away from creating and curating learning experiences to build something new and unproven? It can be hard to identify the first steps toward convincing stakeholders that the investment is worth it. xAPI is a complex solution, and there is no road map an organization can follow. Everyone is looking for the best ways to apply project management strategies to the resources they already have, so they can achieve those first “wins” in implementing xAPI.
This session will tell the story of how a team at LLamasoft went from creating SCORM content to developing a learning ecosystem. Using low-cost resources, they built a proof of concept and carried it through prototype, pilot, and implementation. You’ll learn about the steps that made this a reality—steps ranging from the simple (picking a text editor) to the complex (selecting an LRS) and even the frustrating (working with, and sometimes against, the code). After the organization gained the ability to track video usage in its help system (and more!), xAPI became a no-brainer to stakeholders. This session will explore how real-life victories and setbacks shaped what the team was able to achieve. You’ll walk away prepared to establish a proof-of-concept xAPI project that drives value for your organization.
In this session, you will learn:
- How to find the resources to make xAPI work for your organization
- A process to establish an xAPI proof of concept that drives value
- Product management principles and terms to gain organizational buy-in
- An incremental approach to develop xAPI competency
- How the LLamasoft team got their first “wins”
Audience:
Designers, developers, and managers
Technology discussed in this session:
xAPI and reporting software
Andrew McGuire
Learning Experience Designer
dRofus
Andrew McGuire is a learning experience designer at dRofus, where he specializes in developing engaging content and tracking learner experiences. He has been working in eLearning development for the past five years. Before joining the world of eLearning, Andrew taught English at the college level for seven years. He has an MA in English composition from Northeastern Illinois University.
Ryan Hicks
Director, Learning Design and Education Services
Workforce Software
Ryan Hicks’ unconventional path to becoming a learning professional includes years as a musician and band manager, a BS in industrial engineering, and a decade in supply chain design. His balanced approach of optimism and skepticism has led to the development of multiple learning & development organizations and professional credentials. As a lifelong student, he embraces the adage that “change is the only constant.”
807 Show Me What You Got: Simulation as Assessment
4:00 PM - 5:00 PM Wednesday, March 27
Salon 11
You’re an instructional designer who cares about the efficacy of your course—in other words, you want to make sure your users are actually learning something. You start writing some multiple-choice questions and throw in some true/false questions for good measure. This is just what you do for educational assessment, right? Stop here for a second. Do you understand why you’re doing that? Are you interested in some alternative methods for assessing your learners?
This session will introduce you to the fundamental concepts of educational assessment. You will explore alternative methods of educational assessment in the form of simulations. Additionally, you will learn how different fields are turning to simulation-based assessments, and you’ll learn about some tools used to develop them in the eLearning community. Finally, you will see a live example of a simulation-based assessment developed in Storyline.
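One common pattern behind simulation scoring is weighted-path scoring: each decision point awards points for the choice made, and the learner's path through the scenario determines the total. The sketch below is a generic illustration of that pattern, not the session's Storyline example; the scenario, choices, and weights are all hypothetical.

```python
# Generic weighted-path scoring for a branching simulation.
# Decision points, choices, and point weights are all hypothetical.
DECISION_WEIGHTS = {
    "greet_customer": {"friendly": 2, "neutral": 1, "dismissive": 0},
    "diagnose_issue": {"ask_questions": 3, "guess": 0},
    "resolve":        {"fix_and_confirm": 3, "fix_only": 2, "escalate": 1},
}

def score_path(path: dict[str, str]) -> tuple[int, int]:
    """Return (earned, possible) points for one learner's choices."""
    earned = sum(DECISION_WEIGHTS[step][choice] for step, choice in path.items())
    possible = sum(max(w.values()) for w in DECISION_WEIGHTS.values())
    return earned, possible

learner_path = {"greet_customer": "friendly",
                "diagnose_issue": "ask_questions",
                "resolve": "fix_only"}
earned, possible = score_path(learner_path)
print(f"Score: {earned}/{possible}")
```

Unlike a multiple-choice item, partial credit falls out naturally: a learner who handled most decisions well still earns most of the points.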
In this session, you will learn:
- Fundamental concepts in educational assessment
- Why simulations work as assessments
- How different fields are using simulation-based assessments
- How scoring methods work in simulation-based assessments
Audience:
Designers and developers
Technology discussed in this session:
xAPI, iSpring, BranchTrack, and Storyline.
Jenny Saucerman
Online Learning Instructional Design Manager
Credit Union National Association
Jenny Saucerman is an online learning instructional design manager for Credit Union National Association. She joined CUNA in May 2018. Jenny has over 10 years of experience in the eLearning space, with a focus on simulation and game-based learning, assessment, and learning analytics. She holds a master's degree in educational psychology from the University of Wisconsin-Madison.
905 None of the Above: How Good Intentions Create Bad Assessments
8:30 AM - 9:30 AM Thursday, March 28
Salon 9
As more learning and training is delivered online, instructors and trainers no longer have the opportunity to get to know all their learners by meeting them face-to-face. This disconnect often leads to missing or ineffective assessments as eLearning developers increasingly hope their learners understand the content, but lack the tools to adequately measure the level of understanding. How do you know if your eLearning assessments are effective? Are you testing learners’ content knowledge, or do your assessments measure their deduction skills? In this session, you will explore common test-writing pitfalls and discover how smart test-takers excel without learning the content.
This session will help trainers and eLearning developers recognize high-quality, content-driven assessments and provide an opportunity to experience poorly written test items from the point of view of the test-taker. Learn multiple-choice question-writing “rules,” such as including plausible distractors; learn to recognize common test-item pitfalls, like convergence; and find out how smart test-takers use grammatical cues and distractor length to make educated guesses. Combine all of these best practices, and you will take your eLearning assessments to the next level!
In this session, you will learn:
- Best practices for multiple-choice assessments
- About common assessment-writing pitfalls and how to avoid them
- How to connect assessments to stated learning objectives
- How to identify the depth of knowledge (DOK) level of a learning objective
- How to match the DOK level of an assessment item to its stated learning objective
- About the consequences of relying on poorly written assessment items
Audience:
Designers, developers, trainers, and other training professionals
Sean Hickey
Lead Curriculum Developer
Ohio State University
Sean Hickey is a curriculum developer and instructional designer at Ohio State’s Center on Education and Training for Employment (CETE). As part of his role, he facilitates item-writing workshops for statewide career-tech end-of-course tests and industry credentialing exams, and he develops eLearning materials for teachers and subject matter experts. Sean was previously an instructional designer at McGraw-Hill Education, where he partnered with Apple in the creation of the first generation of interactive iPad textbooks. He has taught educational technology courses and is actively involved in several instructional design groups and associations at both the state and national level.
Cara North
Learning & Development Leader, Speaker, & Author
Medical Mutual
Cara North is an award-winning learning leader who has worked in corporate and higher education settings and as an independent consultant. Cara currently manages the learning and performance function at Medical Mutual. She is the author of Learning Experience Design Essentials and serves as a lecturer in Boise State University’s Organizational Performance and Workplace Learning (OPWL) master’s and certificate program.
1005 Are You xAPI Ready? Best Practices for xAPI and Rapid Development
10:00 AM - 11:00 AM Thursday, March 28
Salon 9
xAPI has been around for a few years. You’ve been hearing about what it is and how it’s used in a conceptual sense. But what about the real world? This session will discuss a company that has made a long-term investment of time and money in existing learning technologies. They were interested in leveraging xAPI but had questions: What could they track with xAPI? Would they need to scrap everything and redesign from scratch? Could they use their current rapid development tools, or did they need to learn a new technology? What benefits could they realize from xAPI?
Rapid development tools, LMSs, and LRSs are easing the transition to a period of widely used xAPI. During this transition, what are the design considerations and best practices for incorporating xAPI into online content? How can you make the most of existing development tools while taking your courses to the next level? This session will go beyond concept and look at how you would apply xAPI using three development tools. You will then look at the results in a couple of LMS/LRS tools. You’ll learn how problems were solved by identifying the right content and enhancing it using rapid development tools. After the content is fed into the LMS/LRS, you will see what reports you can get, with real-world examples that can be used to make business decisions.
In this session, you will learn:
- How to incorporate xAPI into your existing rapid development toolkit
- Why you should incorporate xAPI into your next rapid development project
- Where to start looking for relevant xAPI data
- Strategies to ensure consistent and valuable data beyond completion status and passing score
Audience:
Designers and developers
Technology discussed in this session:
Rapid development authoring tools, learning record stores, and learning management systems
Duncan Welder
Director of Client Services
RISC
Duncan Welder is a director of client services for RISC. He is an educational technology geek, having spent over 20 years implementing learning management systems, domestically and abroad, to manage regulatory compliance. As an xAPI evangelist with a career grounded in instructional design and eLearning, Duncan has provided presentations to professional organizations including the Connections Forum, The Learning Guild, and the Association for Talent Development. Duncan is an active member of the Houston ATD, currently serving as director of special interest groups.
Debbie Richards
President
Creative Interactive Ideas
Debbie Richards, president of Creative Interactive Ideas, is a learning architect, self-proclaimed geek, and early adopter of learning technologies. For over 30 years, she has helped enterprise teams design, develop, and deliver immersive learning programs with measurable impact. Passionate about working with and mentoring other learning professionals, Debbie is a director at L&D Cares. The nonprofit group provides talent development professionals with no-cost coaching, mentoring, and resources to help them thrive and flourish in their careers. She is the past president of the Association for Talent Development, Houston chapter, and a past national advisor for chapters. Debbie has authored two TD at Work guides, Seeing the Possibilities With Augmented Reality and Preparing Your Organization for New Technologies.