Sharing What Works

March 16 – 18, 2016 Orlando, FL


LS201 Accountant, Strategist, or Sherlock: Using Learning Data in Context

1:00 PM - 2:00 PM Wednesday, March 16

Poinsettia/Quince

Most people in the learning and development field didn’t choose the profession because of a deep love for statistics, and few have ready access to data scientists for consultation. But the availability of an ever-growing body of data highlights the value of a fundamental understanding of data collection and analysis. Learning data will only provide valuable, actionable information if curated and evaluated strategically.

In this session, you will learn how to develop clearly defined, context-dependent information goals, which will then serve to delineate the data required to meet those goals. Lessons and heuristics from other analytic fields, including business intelligence, competitive intelligence, and web analytics, will help you gain insight into the data requirements for a variety of information goals. You will also learn how to effectively communicate analytic results to stakeholders at various organizational levels.

In this session, you will learn:

  • How to define goals and context for data acquisition
  • How to take a strategic approach to data
  • The value of quantitative and qualitative approaches to improve insights
  • How to deliver context-specific results in terms relevant to stakeholders

Audience:
Novice and intermediate designers, project managers, managers, and directors.

Technology discussed in this session:
Common tools for data collection and analysis such as LMS, LRS, spreadsheets, and databases.

Janet Laane Effron

Managing Principal

Four Rivers Group

Janet Laane Effron is a data scientist who focuses on the creation of effective learning experiences through iterative processes, data-driven feedback loops, and the application of best practices in instructional design. She has worked on xAPI design projects related to designing for performance outcomes and designing both for and in response to data and analytics. Janet’s areas of interest include text analytics, machine learning, and process improvement. She is also the co-author of Investigating Performance: Design and Outcomes with xAPI.


LS302 Measurement Matters: The How and Why of eLearning Metrics

2:30 PM - 3:30 PM Wednesday, March 16

Palm 5

Too often, learning is evaluated on one of two measures: whether learners like it, or how much it costs to produce. Unfortunately, the former isn’t useful, and the latter isn’t what matters most. Yet, increasingly, learning outcomes will have to be documented, and measurement is key. There are meaningful metrics for learning, but they should focus on whether learning is helping the organization. Kirkpatrick, ROI, impact: to make sense of these, you need to know some core concepts.

In this session, you’ll learn the basic models of measurement and some of the common flaws observed in practice. You will learn what makes sense for measuring learning and the steps that are required. You’ll better understand the major terms and measures, both formal and informal, and get guidance on how to get started and on the usefulness of the xAPI. You’ll also look at some new initiatives that can provide valuable strategic input.

In this session, you will learn:

  • Why smile sheets aren’t useful
  • What makes a meaningful measure
  • How to get concrete about measuring informal learning
  • Where xAPI comes into play

Audience:
Intermediate and advanced designers, project managers, managers, and directors.

Technology discussed in this session:
The xAPI.

Clark Quinn

Chief Learning Strategist

Upside Learning

Clark Quinn, PhD, is the executive director of Quinnovation, co-director of the Learning Development Accelerator, and chief learning strategist for Upside Learning. With more than four decades of experience at the cutting edge of learning, Dr. Quinn is an internationally known speaker, consultant, and author of seven books. He combines a deep knowledge of cognitive science and broad experience with technology into strategic design solutions that achieve innovative yet practical outcomes for corporations, higher education institutions, not-for-profits, and government organizations.


LS401 How to Assess Interactions with Customers

4:00 PM - 5:00 PM Wednesday, March 16

Palm 5

Learning professionals all struggle to measure the transfer of skill from the classroom to the job, especially with geographically dispersed learners. Designers need advice on how to solve this measurement challenge and involve managers in the solution. How do you observe and assess your remote employees performing critical job competencies?

In this session, you will learn through a case study how a remote sales team was evaluated on presenting key information to customers, how their managers were involved, and what results were achieved. You will learn ways to leverage technology to implement skill assessment, and you will see a framework that helps evaluators know how and what to assess. You will also gain an understanding of the planning steps that support the knowledge learners need to be successful in the assessment.

In this session, you will learn:

  • Creative ways to leverage technology in order to implement remote skill assessment
  • Considerations for how to involve managers in the observation and evaluation
  • A framework to give to the evaluators so they are confident with how and what to assess
  • Planning steps for supporting learner success in the assessment
  • How to measure success

Audience:
Intermediate designers and developers.

Technology discussed in this session:
SurveyGizmo, video conferencing (Lync), and website resources.

Kythrie Silva

Sr. Consultant, eLearning Development

Cardinal Health

Kythrie Silva, senior consultant of instructional design and eLearning developer for Cardinal Health, has been designing, developing, and advocating for the innovative use of technology in teaching and learning for the past 15 years. Kythrie has particular expertise in all levels of training assessment and evaluation. Currently she is responsible for building training and eLearning that produces measurable business impact. Previously, Kythrie worked in an academic setting at Ohio State University, supporting the mathematics and statistics faculty in researching and evaluating new teaching and learning technologies and in designing learning environments.

Barbara Davis

Consultant, eLearning Development

Cardinal Health

Barb, a consultant for eLearning development for Cardinal Health, was educated as a wildlife biologist but fell in love with teaching during graduate school. She now has more than 25 years’ experience leading teams and bringing projects to successful completion. She is an expert in producing training courses and learning solutions for delivery to internal and external users, and is dedicated to assisting and supporting others with their project deliverables. Barb is talented at researching, analyzing, and developing subject material using a variety of media. She is passionate about instructional design because it keeps you focused on the “need to know” and identifies the “nice to know.”


LS501 Badges of Honor: Ensuring Badges Are Meaningful

10:45 AM - 11:45 AM Thursday, March 17

Palm 5

Digital badging is gaining traction in varied corners of the learning and development space, from traditional academic environments to organizations and beyond. However, due to its relative novelty, there are some significant adoption barriers for those incorporating badging strategies.

In this session, you will examine how to align strategies with evidence-based techniques and multi-tiered assessment approaches to produce badges that clearly exhibit their worth and applicability. You will discuss the common questions and concerns related to the credibility and general worth of badges. You will leave this session understanding the critical components required to make badges meaningful to those who earn them.

In this session, you will learn:

  • Common badging adoption barriers and how to address them
  • The importance of understanding badge metadata and the role it plays in establishing value
  • How to incorporate diagnostic, formative, and evidence-based assessment techniques into the learning architecture for badge offerings
  • How the role of the mentor or assessor can be implemented to maximize the effectiveness and value of badge offerings
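As a purely hypothetical illustration of the badge metadata discussed above, the sketch below models a single earned badge loosely on the Open Badges 1.x assertion format (a badge class describing the credential, plus an assertion linking a recipient to it with evidence); every URL, identity, and field value here is a placeholder, not material from the session.

```python
# Hypothetical sketch of Open Badges 1.x-style metadata.
# All URLs and identities are placeholders.

# The badge class describes the credential itself: what it means
# and who issues it.
badge_class = {
    "name": "Data Analysis Fundamentals",
    "description": "Demonstrated ability to collect and analyze learning data.",
    "criteria": "http://example.com/badges/data-analysis/criteria",
    "issuer": "http://example.com/issuer.json",
}

# The assertion ties one recipient to that badge class, with
# evidence and a verification pointer -- the metadata that lets a
# third party judge the badge's worth.
assertion = {
    "uid": "abc123",
    "recipient": {"type": "email", "hashed": False,
                  "identity": "learner@example.com"},
    "badge": "http://example.com/badges/data-analysis.json",  # -> badge_class
    "issuedOn": "2016-03-17",
    "evidence": "http://example.com/portfolio/learner/project-1",
    "verify": {"type": "hosted",
               "url": "http://example.com/assertions/abc123.json"},
}
```

The `criteria` and `evidence` fields are what distinguish a meaningful badge from a decorative one: they let an employer or school trace the badge back to what the earner actually did.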

Audience:
Novice and intermediate designers, developers, and managers.

Technology discussed in this session:
Digital badges.

Participant technology requirements:
N/A

Bryan Eldridge

North American Director of Professional Services

eXact learning solutions

Bryan Eldridge, North American director of professional services for eXact learning solutions, is responsible for assisting clients in developing new strategies and skill sets for digital and learning transformation across every phase of the employee life cycle. Bryan, an MEd, has more than 25 years of experience in the design, development, implementation, evaluation, and management of educational and training solutions across a broad spectrum of cultural and contextual environments. In addition to his nearly 10-year relationship with eXact, Bryan has worked for several of the major players in learning technology in a variety of roles, ranging from product development to sales enablement.


LS605 A Curated Learning Journey: ePortfolios and Open Digital Badges

1:00 PM - 2:00 PM Thursday, March 17

Magnolia

Designing open digital badges for evidence presented in curated learning ePortfolios endorses and verifies the claims that a learner makes in this digital narrative; these claims are made against badge criteria and standards that have been co-designed by key stakeholders in the learning journey. Dartmouth College designed and developed badging to track the portion of a course that focused on digital scholarship skills: students could receive both a grade and a badge for each assignment, each assignment badge counted toward a progress badge, and completion of an entire training or practice sequence earned a completion badge.

In this session, you will learn how the University of Notre Dame and Thompson Rivers University are now using ePortfolios and digital badges to allow learners to chart their own pathway through their learning career. You will learn about three case studies that explore badges and ePortfolios, which show evidence for a range of competencies and capabilities through artifacts.

In this session, you will learn:

  • To examine, through case studies, how open and digital badge practice can build and leverage off ePortfolio research
  • To investigate what badge claims look like in evidence-based ePortfolios
  • To share and present the opportunities for open and digital badge researchers and practitioners

Audience:
Novice and intermediate designers, developers, and managers.

Technology discussed in this session:
N/A

Michael Goudzwaard

Lead Instructional Designer

Dartmouth College

Michael Goudzwaard, the lead instructional designer at Dartmouth College in Hanover, New Hampshire, works with learning design teams to build and offer DartmouthX courses. Michael holds a bachelor of arts degree in history from Calvin College and a master of science degree in environmental studies from Antioch University New England. His research interests include evidence-based learning, micro-credentials, and learning pathways. Michael has taught courses in environmental science and statistics and has been involved with offering MOOCs for several years, including as co-instructor for Introduction to Psychology at Keene State College and Introduction to Environmental Science at Dartmouth College.


LS607 Analytics: What You Want to Know

1:00 PM - 2:00 PM Thursday, March 17

Azalea/Begonia

How do you know if the money spent on training and development is worth it? One of the ways to find out is by using analytics to assess who is using content: where, when, and how. Deciphering the ways to analyze a program’s effectiveness can be confusing. There is a lot of talk about big data, but what does it all mean? And just because there is a lot of data, does that really make any of it valuable?

In this session, you’ll compare the xAPI and Google Analytics as ways to learn about your users. Using real-world examples, you’ll see what data is available and how to find it. You’ll learn why some data is more valuable than the rest, and why big data isn’t always good data. Lastly, you’ll look at how all the data points come together to bring into view a clear image of who your users are.
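To make the comparison concrete, here is a hedged sketch (not material from the session) of how the same learner event might be shaped as a Google Analytics Measurement Protocol hit versus a minimal xAPI statement. The tracking ID, email address, course URLs, and helper names are hypothetical placeholders; only the parameter names and the actor-verb-object structure come from the respective specifications.

```python
# Sketch: one learner event, two analytics representations.
# Tracking ID, email, and URLs are placeholders.

def ga_event_hit(category, action, label):
    """Build a Google Analytics Measurement Protocol event payload."""
    return {
        "v": "1",              # protocol version
        "tid": "UA-XXXXX-Y",   # placeholder tracking ID
        "cid": "555",          # anonymous client ID
        "t": "event",          # hit type
        "ec": category,        # event category
        "ea": action,          # event action
        "el": label,           # event label
    }

def xapi_statement(email, verb_id, verb_name, activity_id, activity_name):
    """Build a minimal xAPI statement (actor-verb-object)."""
    return {
        "actor": {"mbox": f"mailto:{email}", "objectType": "Agent"},
        "verb": {"id": verb_id, "display": {"en-US": verb_name}},
        "object": {
            "id": activity_id,
            "definition": {"name": {"en-US": activity_name}},
        },
    }

ga = ga_event_hit("elearning", "completed", "Safety Module 1")
stmt = xapi_statement(
    "learner@example.com",
    "http://adlnet.gov/expapi/verbs/completed",
    "completed",
    "http://example.com/courses/safety-module-1",
    "Safety Module 1",
)
```

Note the trade-off this exposes: the GA hit is anonymous and flat (good for aggregate usage patterns), while the xAPI statement names a specific actor and activity (good for tracing an individual’s learning record in an LRS).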

In this session, you will learn:

  • The basics of Google Analytics and xAPI data reporting
  • The analytic process
  • To evaluate what you really need to learn about your users
  • The pros and cons of using xAPI or Google Analytics to track user events

Audience:
Novice and intermediate designers, developers, project managers, managers, and directors.

Technology discussed in this session:
Google Analytics, the xAPI, JavaScript, and HTML5.

Anthony Altieri

IDIoT in Chief/xAPI Evangelist

Omnes Solutions

Anthony Altieri is the IDIoT in Chief (instructional developer for the Internet of Things) and founder of Omnes Solutions, as well as an xAPI evangelist and the author of a course on xAPI Foundations for LinkedIn Learning. Anthony has worked on multiple global LMS implementation projects. He is a maker, focusing on user analytics and bringing the virtual learning world and the real world together through the use of Bluetooth beacons and other IoT devices using the xAPI. Anthony has lectured to audiences on topics ranging from the spread of HIV to network security, content development, why it’s important to learn to code, and, of course, the xAPI.


LS703 The xAPI: What Does an Instructional Designer Need to Know?

2:30 PM - 3:30 PM Thursday, March 17

Palm 3

As adoption of the xAPI begins to take hold, the convergence of working and learning offers instructional designers the opportunity and the challenge to do more than ever before. The xAPI allows for more robust and interesting tracking of the learning process, including learning that happens outside the LMS and on the job. As an instructional designer, are you ready to step up to this challenge?

In this session, you’ll get a brief introduction to the xAPI and what’s new about it from the instructional design side. You’ll also learn about three key areas that impact instructional design: identifying learning data needs, data sources, and meaningful visualizations that answer organizational and L&D questions; making choices about infrastructure, including how and when to work with your LMS, your LRS, or both; and models for taking advantage of the xAPI across a variety of learning vectors: formal and informal, social and private, formative and summative, and predictable and variable.

In this session, you will learn:

  • How to identify new challenges in work as an instructional designer
  • How to describe the impact that xAPI can have on an organization’s learning and performance strategies
  • How to identify data needs and likely sources within an organization to meet them
  • How to choose one or more first projects that leverage the xAPI’s capabilities beyond what’s available in SCORM today

Audience:
Intermediate and advanced designers, developers, project managers, and managers.

Technology discussed in this session:
The xAPI.

Megan Torrance

CEO

TorranceLearning

Megan Torrance is CEO and founder of TorranceLearning, which helps organizations connect learning strategy to design, development, data, and ultimately performance. She has more than 25 years of experience in learning design, deployment, and consulting. Megan and the TorranceLearning team are passionate about sharing what works in learning, so they devote considerable time to teaching and sharing about Agile project management for learning experience design and the xAPI. She is the author of Agile for Instructional Designers, The Quick Guide to LLAMA, and Making Sense of xAPI. Megan is also an eCornell facilitator in the Women's Executive Leadership curriculum.


LS806 Evaluating Your Assessments: Are You Testing the Right Thing?

4:00 PM - 5:00 PM Thursday, March 17

Palm 5

Learning in an eLearning module is generally assessed through multiple-choice questions rather than through demonstrations of target behaviors. When you design eLearning, you build in knowledge checks. All too often, these quizzes are reading comprehension tests rather than authentic assessments of skills. You need to test the target objectives to ensure you meet the goals of the program.

In this session, you will look at aligning assessments with intended outcomes and designing activities that measure skill gains. You will learn about alternatives such as scenario-based activities that let users practice the decision-making skills they will need to apply their new learning on the job. You’ll also learn about self-check assessments and rubrics that learners and reviewers can use to evaluate work objectively. Finally, you will learn how to create assessments that reflect measurable gains, help designers like you demonstrate ROI on projects, and help learners better master the subject at hand.

In this session, you will learn:

  • How to align assessments to outcomes
  • How to build authentic assessments
  • How to build a rubric
  • How to create online scenario-based assessment activities
  • Why multiple-choice questions don’t effectively assess skills

Audience:
Novice to advanced designers, managers, and directors.

Technology discussed in this session:
N/A

Jean Marrapodi

VP/Senior Instructional Designer

UMB Bank

Jean Marrapodi, Ph.D., CPTD, has designed and developed eLearning for over 20 years in various industries and higher education. Named a Guild Master in 2016 by the eLearning Guild, she is considered an industry thought leader. Over the last 10 years, Marrapodi has presented more than 75 workshops and webinars for industry organizations and has taught over 40 graduate and undergraduate courses at New England College of Business, where she served as director of eLearning. Her expertise lies in her ability to make the complex simple and to pinpoint client needs in order to drive business outcomes. She is a soup-to-nuts eLearning designer, able to single-handedly build a project from idea to rollout or to work in a specific role on a project team. She is the chief learning architect at Applestar Productions, providing targeted eLearning and custom workshops for her clients.

Kara Witt

Senior Instructional Designer

Citizens Bank

Kara Witt, a senior instructional designer for Citizens Bank, serves as a project management, quality assurance, and evaluation expert. She has 13 years in corporate training in managed health care and two years designing online courses in higher education.


LS903 The Missing Link: Data Interoperability from Learning Systems to Operations

8:30 AM - 9:30 AM Friday, March 18

Kahili/Lily

SCORM, the xAPI, cmi5, and a host of other learning data standards exist and have widespread acceptance in the learning community. How can these standards extend beyond the learning world into the realm of enterprise technology? What the industry needs now is a distinct and real conversation on how to align learning technology with the technology used by the rest of the enterprise.

In this session, you will learn how Float, working with a number of industry and government stakeholders, has charted a path to bring enterprise technology infrastructure into sync with learning ecosystems. The learning industry has seen considerable transformation lately, with the xAPI, cmi5, and a variety of cloud LMS vendors coming online. Yet in most organizations, these two technology realms don’t talk to each other. You’ll explore why this is, what problems it causes, and what business advances could be gained if this issue were solved.

In this session, you will learn:

  • About the challenges of data interoperability
  • How to handle the coming data avalanche and its implications for your organization
  • Where learning and enterprise technology often diverge or disconnect
  • What the next steps in bringing enterprise technology and learning technology integration are

Audience:
Managers, directors, and senior leaders (VPs, CLOs, executives, etc.).

Technology discussed in this session:
Enterprise SaaS platforms, enterprise learning systems, data schema and specifications, and various relevant application programming interfaces (APIs).

Chad Udell

Chief Strategy Officer

Float and SparkLearn

Chad Udell is the award-winning managing partner, strategy and new product development, at Float and SparkLearn. He has worked with Fortune 500 companies and government agencies to create experiences for 20 years. Chad is an expert in mobile design and development, and speaks at events on related topics. He is author of Learning Everywhere: How Mobile Content Strategies Are Transforming Training and co-editor/author, with Gary Woodill, of Mastering Mobile Learning: Tips and Techniques for Success and Shock of the New.


LS904 Understanding the Open Badge Ecosystem

8:30 AM - 9:30 AM Friday, March 18

Palm 3

A badge is a symbol or indicator of an accomplishment, skill, competency, or interest. Badges provide evidence of learning that happens in and beyond formal learning settings. Unlike transcripts or resumes, badges give prospective employers, schools, collaborators, and other learners a more complete picture of knowledge, skills, and abilities of the badgeholder. As with degrees, certificates, and credentials, a comprehensive ecosystem surrounds and supports badges.

In this session, you will examine the stakeholders/actors within the ecosystem, the processes that impact various stakeholders, and the data generated by or accessed by stakeholders. You will learn about frameworks supporting the competencies behind open badges, as well as strategies for assessing in a badge ecosystem.

In this session, you will learn:

  • How to define a current ecosystem
  • How to define the currency of an ecosystem
  • How to identify competency frameworks for a badge ecosystem
  • How to identify accreditation and validation frameworks for a badge ecosystem
  • How to identify appropriate assessment strategies for a badge ecosystem

Audience:
Novice and intermediate designers and developers.

Technology discussed in this session:
N/A

Anne Derryberry

Market Analyst

Sage Road Solutions

Anne Derryberry is a learning architect for serious games, simulations, and virtual worlds. She works with learning organizations, game developers, tools developers, and analysts as a learning architect, advisor, consultant, and industry observer. She is particularly fascinated with group experience and how groups learn in virtual environments, especially through games; user-generated content; assessment, especially how it relates to LMSs; analysis; and how to turn learning and meaningful play into profitable and sustainable businesses.


LS1010 Assessment and Evaluation in an Evolving Landscape

10:00 AM - 11:00 AM Friday, March 18

International South

As the financial and operational benefits of big data become more apparent to business leaders, the demand for accountable and impactful learning is growing. Learning portals, MOOCs, the xAPI, social media, and collaboration tools provide new channels for learning delivery, yet most learning departments are stuck measuring the traditional four levels.

In this session, you will explore the intent behind the evolving delivery modalities and uncover the ways in which learning transfer can be measured beyond a standard assessment. You will learn how basic psychometric principles can be applied to new metrics, and how to incorporate new metrics in an impact framework. You will leave this session with a strong understanding of the types of metrics available in both traditional and new learning methodologies. You will also get helpful examples and job aids to assist you in implementing new metrics in your own environment.

In this session, you will learn:

  • How evolving learning modalities can be measured outside of the traditional level 1 and 2
  • How to identify, capture, and validate measurement data
  • How to integrate disparate data streams into a measurement strategy
  • How to apply psychometric principles to new learning metrics
  • Which tools can be used in this process
  • Strategies for designing evolving measurement strategies

Audience:
Intermediate and advanced project managers, managers, and directors.

Technology discussed in this session:
LMS/LCMS, the Experience API, MOOC platform, Vestrics, and analytics dashboards.

A.D. Detrick

President

MetriVerse Analytics

A.D. Detrick is the president and founder of MetriVerse Analytics, a leading provider of L&D/HR measurement and analytics consulting. He is a recognized expert in the areas of learning measurement, assessment, evaluation, and human capital analytics. In his role, he oversees the design and implementation of measurement and analytics strategies for many of America’s largest and most technically innovative companies. He is a regular speaker at industry events and has contributed to numerous books on learning and analytics. Before founding MetriVerse, A.D. helped design measurement strategies as a consultant for Xerox Global Learning Services, Intrepid Learning, and JPMorgan Chase.
