SMM101 The Future of Learning Data
10:00 AM - 10:45 AM Tuesday, March 27
With advances in machine learning, multiple data points can now reveal previously hidden connections between what your employees know and the results your business will achieve. This technology makes Kirkpatrick’s Level 4 attainable: organizations can measure behavior change and predict future business outcomes based on learning data. These learning insights give businesses the opportunity to take strategic action before a predicted negative outcome becomes reality.
In this session, you will hear how getting to Kirkpatrick Level 4 is possible. The session will outline how learning data is captured and how it predicts business outcomes. You’ll also find out how best-in-class organizations like FedEx, At Home, and Walmart are using learning data to predict and change business outcomes, and you’ll hear about the results they’ve achieved.
In this session, you will learn:
- How to capture learning data that you can use to predict business outcomes
- Why the L&D field is undergoing a fundamental shift
- How organizations are changing predicted negative business outcomes before they happen
- How organizations can measure behavior change and predict business outcomes based on learning data
- How getting to, and moving beyond, Kirkpatrick Level 4 is possible
Audience:
Novice to advanced managers, directors, and senior leaders (VP, CLO, executive, etc.).
Technology discussed in this session:
Axonify.
Carol Leaman
CEO
Axonify
Carol Leaman is the CEO of Axonify, a disruptor in the corporate learning space and innovator behind the world’s first employee knowledge platform. Previously, she was CEO of several other tech companies, including PostRank, a social engagement analytics company she sold to Google. Carol is a thought leader whose articles appear in various publications; she also sits on the boards of many organizations and advises high-tech firms. Carol’s awards include the Waterloo Region Entrepreneur Hall of Fame Intrepid Award (2011) and the Sarah Kirke Award (2010) for Canada’s leading female entrepreneur. She was also a finalist for the 2017 Techvibes Entrepreneur of the Year Award.
SMM102 Using Learning Data to Predict Workplace Success
11:00 AM - 11:45 AM Tuesday, March 27
Expo Hall: Management & Measurement Stage
You already know the value of tying your people development activities back to the “holy grail” question: “Did anybody even use the thing?” But, at times, that can be a daunting task fraught with pitfalls and challenges that often make people resort to a “happy sheet” or “tick in the box.” There must be a better way!
In this case study session, you’ll learn about a process that designers undertook to identify the correlation, if any, between the way people behaved during an online development program and how they applied what they had learned back in their workplace. While the specific outcomes from this research highlight a specific L&D topic, the process is of broader interest, as it will give you insight into how you can make that all-important connection between learning and application.
In this session, you will learn:
- How to apply the Brinkerhoff Success Case Method in your own context
- How to bridge the gap between input (learning) and output (performance)
- How xAPI can provide valuable insights into audience behavior
- How you can use these insights to enhance and enrich your design and facilitation, as well as provide evidence of the impact of your people development activity
Audience:
Novice to advanced designers, managers, directors, and senior leaders (VP, CLO, executive, etc.).
Technology discussed in this session:
Curatr, Learning Locker, xAPI, Google Forms, and Skype.
Craig Taylor
Customer Success Manager
HT2 Labs
Craig Taylor, a customer success manager for HT2 Labs, has been involved in the training/L&D field since 1993, when he cut his teeth in the training-delivery world while serving in the British Army. His subsequent learning and development roles have been in the rail, nuclear, healthcare, and financial sectors, where he has worked to help organizations understand the value that current and emerging technologies can bring.
204 xAPI: An Introduction for Instructional Designers
1:00 PM - 2:00 PM Tuesday, March 27
Salon 9
As adoption of xAPI takes hold, the convergence of working and learning offers instructional designers the opportunity to do more than ever before. xAPI allows for more robust tracking of the learning process, including learning that happens outside the LMS and on the job. As actual data is integrated with learning metrics, you can tailor the process to individual needs and draw useful conclusions about the learning as a whole.
xAPI lets you offer and track learning experiences that are outside the LMS box. As an instructional designer, are you ready to step up to this challenge? This session will cover three key areas that impact instructional design: (1) identifying learning data needs, data sources, and meaningful visualizations that answer organizational and L&D questions; (2) making choices about infrastructure—how and when to work with your LMS, your LRS, or both; and (3) models for taking advantage of xAPI across a variety of learning vectors: formal and informal, social and private, formative and summative, predictable and variable.
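To make the tracking concrete, here is a minimal sketch of the kind of xAPI statement an instructional designer would send to an LRS. The learner, verb display, and activity URL are invented for illustration; the actor/verb/object structure follows the xAPI specification, and real statements are POSTed to the LRS as JSON.

```python
import json

# A minimal xAPI statement as a Python dict. The mbox, activity ID,
# and names below are hypothetical placeholders.
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Example Learner",
        "mbox": "mailto:learner@example.com",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "objectType": "Activity",
        "id": "http://example.com/activities/safety-module-1",
        "definition": {"name": {"en-US": "Safety Module 1"}},
    },
    # Optional result block: completion, success, and a scaled score (-1 to 1).
    "result": {"completion": True, "success": True, "score": {"scaled": 0.9}},
}

# Serialize to the JSON payload an LRS would receive.
payload = json.dumps(statement)
print(sorted(statement.keys()))
```

Because the same actor/verb/object shape can describe a mobile lookup, a coaching conversation, or a SCORM-style course completion, one format can capture learning wherever it happens.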
In this session, you will learn:
- About new challenges in your work as an instructional designer
- About the impact that xAPI can have on your organization’s learning and performance strategies
- How to identify data needs and likely sources within the organization to meet them
- How to choose one or more initial projects that leverage xAPI’s capabilities beyond what’s available in SCORM today
Audience:
Novice to intermediate designers, developers, managers, directors, and senior leaders (VP, CLO, executive, etc.).
Technology discussed in this session:
xAPI, SCORM, LMS, and LRS.
Megan Torrance
CEO
TorranceLearning
Megan Torrance is CEO and founder of TorranceLearning, which helps organizations connect learning strategy to design, development, data, and ultimately performance. She has more than 25 years of experience in learning design, deployment, and consulting. Megan and the TorranceLearning team are passionate about sharing what works in learning, so they devote considerable time to teaching and sharing about Agile project management for learning experience design and the xAPI. She is the author of Agile for Instructional Designers, The Quick Guide to LLAMA, and Making Sense of xAPI. Megan is also an eCornell Facilitator in the Women's Executive Leadership curriculum.
302 Evaluation for the Real World
2:30 PM - 3:30 PM Tuesday, March 27
Junior Ballroom F
You’ve spent lots of time and money on your most recent L&D project. Now executives want to know: “Was it effective? What’s the ROI?” Do you know how to answer? If the answer is “no,” you’re not alone. One of the things many instructional designers struggle with is the dreaded evaluation. Even those who know the theory inside and out often struggle to put it into place.
Kirkpatrick’s model is great from an academic perspective, but the reality is that it’s challenging to implement in today’s fast-paced, deadline- and results-driven world. In this session, you will learn practical strategies for integrating evaluation into your projects from the very beginning. Evaluation is not a “one size fits all” process. You use different approaches to create different learning experiences, whether they’re instructor-led training, eLearning, social learning, VR, or blended curricula. In this session, you will learn to tailor your evaluation approach to your training project.
In this session, you will learn:
- Best practices for evaluation in a corporate environment
- How to set expectations with stakeholders
- What executives care about and what they don’t
- Tips for evaluating different types of training
- How to customize your evaluation plan to your project
Audience:
Novice to advanced designers, developers, and managers.
Dan Myers
Senior Manager of Staff Training
The Cheesecake Factory
Dan Myers is a senior manager of staff training for The Cheesecake Factory. Dan has more than 15 years of experience in all phases of learning and development, including managing the learning function, instructional design, multimedia development, and LMS administration. He brings many years of experience working with executive stakeholders to develop training programs that get results.
307 Evaluative Inquiry as a Catalyst for Learning and Change
2:30 PM - 3:30 PM Tuesday, March 27
Salon 14
Many learning and development professionals are shifting away from an overreliance on outcome-based evaluation models, such as the Kirkpatrick evaluation model, in favor of advanced evaluation models that drive actual use of findings and organizational learning. If you’re in the beginning stages of this transition, though, you may be wondering what other approaches are out there and how best to translate them from theory to practical use.
This session will share results from a recent case study that explored the use of one advanced evaluation model, evaluative inquiry (Preskill and Torres, 1999), within one department in a corporation. You’ll learn about the guiding principles, procedural guidelines, lessons learned, and recommendations throughout the planning, implementation, and evaluation process. This session will provide you with the resources and practical advice you’ll need to communicate, justify the use of, and apply the evaluative inquiry model within your organization.
In this session, you will learn:
- About the guiding principles of evaluative inquiry
- About the procedural guidelines of evaluative inquiry
- Lessons from applying evaluative inquiry in the field
- Tips on how to apply evaluative inquiry in your organization
- How to plan and implement evaluative inquiry
- About the high-level processes of evaluative inquiry
Audience:
Intermediate to advanced designers, developers, and managers.
Marie Paydon
Clinical Training Manager
AbbVie
Marie Paydon is a clinical training manager with AbbVie. She is a learning and development professional with extensive experience and passion around creating and elevating methods for organizational learning.
Marlies De Kluyver
Sr. Learning Excellence Manager, Global Learning
AbbVie
Marlies De Kluyver is a senior learning excellence manager of global learning for AbbVie, and a passionate learning professional who is inspired to improve the learner experience. She started her career as a digital developer and joined the Illinois Institute of Art to teach a new certificate program for adult learners. She collaborated with the school to redesign the program and develop four additional programs that launched across all sister schools in the US. Marlies also worked for Motorola as a media developer. At AbbVie, she has grown from the development side into supporting brands, global learning managers, and partners in driving learning excellence.
407 Building the xAPI Learning Ecosystem of Your Dreams
4:00 PM - 5:00 PM Tuesday, March 27
Salon 2
You’re excited about the promise of an xAPI-enabled world, but you’ve got a learning management system, a catalog full of SCORM-based courses that you need, and a handful of learning tool vendors that don’t use xAPI. What if you could get the most out of an LMS and an LRS at the same time as you move to your next-generation learning and performance infrastructure?
This session will start with the learner-facing tools that will capture your xAPI data: eLearning, mobile tools, performance support, social and informal activities, and data sources from the business. You’ll review your options when it comes to LRSs and how they work (or don’t work) with your LMS. Will you work with a standalone LRS? A front-end xAPI solution with a built-in LRS? Or an LRS that is aligned with your LMS and your current learning infrastructure? You’ll hear real-world stories of three different xAPI implementations to help you plot your organization’s course toward your next-generation learning ecosystem.
In this session, you will learn:
- How to combine activities from a variety of front-end learning tools into a coherent picture of learning and performance
- About the possibilities for your next-generation learning and performance infrastructure
- How to identify key partners in your business to engage as you migrate from SCORM to xAPI and all along the way
- From the experience of others who have implemented xAPI in their organizations
Audience:
Intermediate to advanced designers, developers, project managers, managers, directors, and senior leaders (VP, CLO, executive, etc.). A basic understanding of xAPI and learning management systems will be useful.
Technology discussed in this session:
xAPI, SCORM, LMS, and LRS.
Megan Torrance
CEO
TorranceLearning
Megan Torrance is CEO and founder of TorranceLearning, which helps organizations connect learning strategy to design, development, data, and ultimately performance. She has more than 25 years of experience in learning design, deployment, and consulting. Megan and the TorranceLearning team are passionate about sharing what works in learning, so they devote considerable time to teaching and sharing about Agile project management for learning experience design and the xAPI. She is the author of Agile for Instructional Designers, The Quick Guide to LLAMA, and Making Sense of xAPI. Megan is also an eCornell Facilitator in the Women's Executive Leadership curriculum.
SMM107 Putting Data to Work: Insights for Business and ID
4:00 PM - 4:45 PM Tuesday, March 27
Expo Hall: Management & Measurement Stage
Thanks to xAPI, L&D professionals have an ever-growing pool of data; but, in order for that data to give you more value than the traditional LMS-based data sets, you need to rethink what measures and analyses you want to employ.
This session will look at real-world applications of data (from xAPI and other sources) as a means to explore a variety of L&D and business-related questions in areas including performance impacts and design insights. You will learn more about the limitations of decoupling learning data from other business metrics, and how to create a stronger analysis by leveraging qualitative as well as quantitative data.
In this session, you will learn:
- How to ask the right questions of your data
- Why learning data on its own is not enough
- How to build a feedback loop for instructional design
- How to incorporate qualitative and quantitative data to build business insights
Audience:
Novice to intermediate designers, developers, managers, and project managers.
Technology discussed in this session:
xAPI.
Janet Laane Effron
Managing Principal
Four Rivers Group
Janet Laane Effron is a data scientist who focuses on the creation of effective learning experiences through iterative processes, data-driven feedback loops, and the application of best practices in instructional design. She has worked on xAPI design projects related to designing for performance outcomes and designing both for and in response to data and analytics. Janet’s areas of interest include text analytics, machine learning, and process improvement. She is also the co-author of Investigating Performance: Design and Outcomes with xAPI.
507 Is It Working? Correlating Usage with xAPI
10:45 AM - 11:45 AM Wednesday, March 28
Salon 14
Good courseware will use multiple elements such as video, audio, interaction, and good old-fashioned reading. You struggle to balance all of it until it’s a finely harmonized symphony of information, waiting for a person to take it all in. But are any of those activities or videos you’ve worked on actually helping anyone learn? How can you show the relationship between the activities and performance?
This session will show how you can use xAPI to capture data from different activities into a single uniform format in the learning record store (LRS). Then, by looking at a real-world example page with video and test questions, you can start analyzing results to see which activities contribute most to success and which test questions need work. By knowing how to leverage your data, you can begin to see how to design your content to make sure that data is where you need it, when you need it!
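As a taste of the kind of analysis described above, here is a hypothetical sketch: given simplified result records pulled from an LRS, compute per-question success rates to spot which test questions may need rework. The field names mirror xAPI concepts, but the data and activity names are invented.

```python
from collections import defaultdict

# Simplified records, as might be extracted from LRS statements.
records = [
    {"activity": "question-1", "success": True},
    {"activity": "question-1", "success": True},
    {"activity": "question-2", "success": False},
    {"activity": "question-2", "success": True},
    {"activity": "question-2", "success": False},
]

# activity -> [successes, attempts]
totals = defaultdict(lambda: [0, 0])
for r in records:
    totals[r["activity"]][0] += r["success"]
    totals[r["activity"]][1] += 1

# Success rate per activity; unusually low rates flag questions to review.
rates = {activity: s / n for activity, (s, n) in totals.items()}
```

The same aggregation pattern works for video views, interactions, or reading time, which is what lets you compare consumption against performance.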
In this session, you will learn:
- How xAPI can help you collect usage data
- How to combine activity data to see if your test questions are effective
- How to combine activity data to make sure your activities are accomplishing their goals
- How to use xAPI to compare consumption to performance
- How to make sure your content does what you need it to do
Audience:
Novice to intermediate designers, developers, and managers.
Technology discussed in this session:
HTML5, JavaScript, xAPI (statements and queries), and LRSs.
Anthony Altieri
IDIoT in Chief/xAPI Evangelist
Omnes Solutions
Anthony Altieri is the IDIoT in Chief (instructional developer for the Internet of Things) and founder of Omnes Solutions, as well as an xAPI evangelist who authored a course on xAPI Foundations for LinkedIn Learning. Anthony has worked on multiple global LMS implementations. He is a maker who focuses on user analytics and on bringing the virtual learning world and the real world together through Bluetooth beacons and other IoT devices using xAPI. Anthony has lectured to audiences on topics ranging from the spread of HIV to network security, content development, why it’s important to learn to code, and, of course, xAPI.
603 Adding xAPI to Your RFPs: Rethinking Your Process
1:00 PM - 2:00 PM Wednesday, March 28
Salon 18
The Experience API is a marvelous enabling technology that can give you a rich picture of an individual’s learning path. Since it captures experiences in a consistent format, it opens the door for mobile, social, or offline learning. But xAPI is a terrible place to start when writing an RFP. It should not be the reason you’re buying software. xAPI is a feature—you should be buying a solution.
Ideally, the process of selecting a new technology should look like this: Identify your learning goals, uncover what your staff needs to achieve those goals, and select a technology that meets your business and technical requirements. While it’s not recommended to include xAPI in your RFP for the sake of it, this session will provide clear recommendations for how to write an RFP if you decide you need xAPI. That way, you’re well set up to select a technology provider that suits your goals and allows you to get started with xAPI.
In this session, you will learn:
- Why you shouldn’t include xAPI in your RFP just for the sake of it
- How to go about writing an RFP so that you are well set up to find a provider that offers a solution that supports your goals
- How to include xAPI in your RFP if you decide to do it anyway
- What capabilities to require of a provider
Audience:
Novice to advanced project managers, managers, directors, and senior leaders (VP, CLO, executive, etc.).
Technology discussed in this session:
LMSs, authoring tools, LRSs, and content management systems.
Chris Tompkins
Vice President of Business Development
Rustici Software
Chris Tompkins is the VP of business development for Rustici Software, a company that provides software to improve compatibility across the L&D ecosystem. He is an expert in eLearning standards such as SCORM and xAPI, and uses his technical expertise to support eLearning RFP and procurement processes. Chris has an MBA from Belmont University, focused on entrepreneurship and negotiations. Building on previous work at HP/Compaq and XM Satellite Radio, he has over 15 years of experience matching the right technical solution to a client's needs.
SMM204 A Proven Checklist Tool for Objective Program Evaluation
1:00 PM - 1:45 PM Wednesday, March 28
Expo Hall: Management & Measurement Stage
Consistent program evaluation can be a challenge. Organizations must invest time and manpower wisely in order to generate feedback that will improve training outcomes and produce a positive ROI. Asking a department to judge its own work compromises objectivity, whereas hiring outside consultants generates tension. So what’s a better option? A substantive rubric aligned with a standards-based evaluative tool. This approach can help you achieve results that a mere survey, or departmental self-evaluation, cannot.
In this session, you will learn how to implement a product evaluation checklist to help you more effectively evaluate and critique your own learning programs. You’ll discover how evidence-based best practices are reflected in this evaluation tool’s 12 broad criteria, and how you can best use them to generate an objective score that is readily comparable between departments, units, and businesses as well as across training types. Once you’ve mastered the tool, you’ll then brainstorm ways to leverage evaluation scores within your own businesses to help meet training goals, drive better outcomes, and further department and asset development.
In this session, you will learn:
- About the advantages of employing a proven rubric to evaluate a program’s alignment with learning science
- About the meaning and importance of this rubric’s 12 essential categories
- How to conduct a program review
- How to conduct a post-review session with stakeholders that will minimize friction and facilitate next steps and improvements
- How to leverage the checklist’s findings to drive an organization to positive change
Audience:
Novice to advanced designers, developers, managers, directors, and senior leaders (VP, CLO, executive, etc.).
Technology discussed in this session:
Kaplan’s Educational Product Evaluation Checklist.
Kristin Murner
Learning Design Lead
CreatorUp
Kristin Murner, the learning design lead at CreatorUp, has worked in traditional, online, and for-profit education for over 20 years. Her favorite projects have been teaching undergraduate marketing on an Army base, working with NYC public school teachers on SHSAT prep-course improvement, and quickly creating and co-hosting free educator-facing webinars to help teachers and faculty pivot to online learning during the COVID-19 shutdown. At CreatorUp, Kristin works cross-functionally to ensure all learning content is measurable, sticky, and awesome. She holds a BS in physics, an MBA in marketing, and an MSEd in instructional design and technology.
Bob Verini
Director, Academic Quality
Kaplan Test Prep
Bob Verini is a director of academic quality in Kaplan Test Prep’s learning science department, with 35 years of experience in creating and delivering instruction in on-site and online classroom environments. For the last decade he has been teaching online courses exclusively, and he created the company’s first training program to help veteran teachers understand and absorb the best practices of the new medium. An accomplished journalist for Daily Variety and other publications, Bob is often recognized for his appearances on Jeopardy! as the winner of 1987’s Tournament of Champions and a veteran of numerous invitational tournaments.
709 Lean Learning: Cutting the Fat to Demonstrate Sustainable Learning Value
2:30 PM - 3:30 PM Wednesday, March 28
Salon 16
Business leaders expect learning efforts to deliver impactful results while minimizing disruptions to key processes. They expect performance to improve—not simply for learning to become more effective—and for learning practitioners to demonstrate measurable results. They also expect L&D to seamlessly integrate learning into business activities while making efficient use of available resources. Because of this, learning practitioners are entering a brave new eLearning world full of creativity and opportunities to drive integrative business value.
In this session, you’ll explore how L&D is rapidly moving toward Lean learning. Lean is about creating processes that reduce human effort, process time, costs, and defects, compared to traditional systems. This approach offers L&D a leading role and opportunities to drive change and improve operational performance. This session will share strategies you can use yourself and with your team to become Lean, bringing together results, efficiencies, and effectiveness. The strategies you’ll learn will make your business leaders take notice and see you as a valued business partner.
In this session, you will learn:
- How to develop seamless, integrative learning initiatives by leveraging existing learning interactions and technologies
- Ways to reduce learning cycle time by minimizing process disruption, utilizing real-time eLearning approaches
- Strategies for creating valued business relationships through tangible results, and for connecting and partnering with related business activities to deliver targeted learning results
- How to identify and leverage existing learning and business processes to affect precise performance objectives
Audience:
Intermediate to advanced designers, developers, managers, directors, and senior leaders (VP, CLO, executive, etc.).
Ajay Pangarkar
Performance Strategist, Author, Managing Partner
CentralKnowledge
Ajay Pangarkar is a Certified Professional Accountant, a Certified Training and Development Professional, and a published author. His third book, The Trainer's Balanced Scorecard: A Complete Resource for Linking Learning and Growth to Organizational Strategy, follows The Trainer's Portable Mentor and Building Business Acumen for Trainers. CentralKnowledge was recognized by TrainingMag in 2008 with its Project of the Year award for work with Apple. Ajay is also an award-winning writer, receiving the prestigious TrainingIndustry.com Readership and Editors' Awards in 2014 and 2015, and was named Elearning Magazine's 2016 Learning Champion. He is a regular on the #1 Montreal talk radio morning show.
1004 Measuring Learners’ Confidence in Their Abilities
10:00 AM - 11:00 AM Thursday, March 29
Salon 3
After the blood, sweat, and tears you put into designing a learning experience, how do you know it made a difference? Ideally, you can measure Kirkpatrick Levels 1–4 following your education and be confident of your impact. More often, though, measuring behavior change in the real world is tricky due to issues, such as cost or access, inherent in many performance environments (e.g., healthcare). What tools can bridge that gap?
In this session, you’ll learn how measuring learners’ confidence in their abilities, called self-efficacy, can give you insight into eventual changes in their behavior and performance. Learn about the underlying theory and evidence in support of self-efficacy measures. Learn tips and best practices for creating the individual assessment items and an overall self-efficacy tool tailored to the learning experiences you want to evaluate. You’ll leave the session with a new tool in your measurement toolbox that will get you one step closer to assessing the impact of your education on your learners.
In this session, you will learn:
- Why you can use self-efficacy measures as an index of potential changes in behavior and performance resulting from your education
- What makes a good self-efficacy measure
- How to identify the behaviors or abilities that you should assess with a self-efficacy measure
- How to create a self-efficacy measurement tool that is tailored to the specific learning experience you want to evaluate
Audience:
Novice to intermediate designers, developers, and managers.
Alexander Walker
Senior Director, Learning Research and Design
MedStar Health Simulation Training and Education Lab
Alexander Walker is a senior director of learning research and design at MedStar Health Simulation Training and Education Lab, the educational development organization of one of the largest healthcare systems in the mid-Atlantic. He holds a PhD in human factors psychology from Clemson University. Early in his career, Alex conducted research examining the effects of different simulation environments on learning performance and the development of motion sickness. His other research experience includes the investigation of team performance and the psychophysiological assessment of the workload and performance of individuals and teams.
1009 Everything You Wanted to Know About APIs but Were Afraid to Ask
10:00 AM - 11:00 AM Thursday, March 29
Salon 5
As learning systems mature, there is an even stronger need to personalize and integrate them. This is possible thanks, in part, to APIs (application programming interfaces). Many learning professionals hear about APIs in their daily workflows but don’t know much about them, which prevents them from grasping the true potential of these important tools.
In this session, you will learn what APIs are and what they are not. The session will explore the technical background of APIs and make them understandable to a general audience by using easy-to-grasp metaphors. You will learn of several use cases for APIs that you can take back to your team. You’ll then see a live demo of how to configure APIs to integrate various search elements of multiple content repositories, in order to give a much more contextualized experience to the learner.
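As a taste of the request/response pattern such integrations rest on, here is a toy illustration of REST plus JSON: the client builds a URL with query parameters, the server answers with JSON, and the client parses that into native data structures. The endpoint, parameters, and fields below are hypothetical, not from any specific LMS.

```python
import json
from urllib.parse import urlencode

# Build the request URL a client would send to a hypothetical course-search API.
base = "https://lms.example.com/api/v1/courses"
params = {"search": "onboarding", "limit": 2}
url = f"{base}?{urlencode(params)}"

# A JSON response body such an API might return (normally fetched over HTTP).
body = '{"results": [{"id": 101, "title": "Onboarding Basics"}], "total": 1}'
data = json.loads(body)

# Once parsed, the response is ordinary Python data the caller can work with.
titles = [course["title"] for course in data["results"]]
```

Integrating "various search elements of multiple content repositories," as the demo promises, amounts to issuing several requests like this one and merging the parsed results.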
In this session, you will learn:
- What makes up APIs
- How to use APIs to integrate systems
- How APIs are evolving toward xAPI
- How to navigate IT to leverage APIs
- About the security challenges of APIs
- How to do enhanced reporting using APIs
- How to integrate workflows using APIs
- How to personalize content using APIs
Audience:
Novice to intermediate designers, developers, and managers.
Technology discussed in this session:
APIs, xAPI, REST, SOAP calls, Cornerstone OnDemand custom page and CSOD (cloud-based) back end, development servers, JSON, and knowledge repositories.
Duncan Larkin
Digital Learning Innovation Manager
McKinsey & Company
Duncan Larkin is the head of the digital learning innovation team at McKinsey & Company. He is a passionate advocate for simple, elegant, and transformative solutions that push the boundaries of innovation and put the learner first. Duncan is a graduate of the US Military Academy at West Point and the author of two books.