107 Readying Your Organization for the eLearning Revolution: Analytics

10:45 AM - 11:45 AM Wednesday, October 29

Raphael 1

With the financial advantages of big data gaining global attention, C-level managers and senior business leaders are beginning to demand better reporting and improved accountability across all organizational divisions. While the goal of measuring learning seems out of reach to many in the training division, these technological advances make disruption of the training industry inevitable. They will enable unprecedented reporting of training effectiveness, business impact, and return on training investment.

In this session you will learn about the looming industry disruption, including its advantages and benefits. You will explore tools for developing an overarching strategy to prepare your organization for the change and for communicating with your training team, peers, and other business leaders. You will discuss strategies for planning long-term change within your organization. This session examines what’s happening with analytics in a logical, easy-to-understand way. You will leave understanding what you need to know about analytics, what you need to do now, and how to do it.

In this session, you will learn:

  • True learning measurement and how to separate useful and meaningful data from the noise
  • The effects of big data on the way training is currently done
  • Strategy for communicating to other business leaders, peers, and teams
  • Tools for developing a roadmap for change within your organization

Audience:
Intermediate and advanced managers, directors, and executives with an interest in the future of eLearning, decision-making authority, and/or the responsibility to manage change.

Technology discussed in this session:
Enterprise infrastructure, learning management systems, HRIS, learning record stores, the cloud, and analytics dashboards.

Glenn Bull

CEO & Founder

Skilitics

Glenn Bull is the founder and CEO of Skilitics, creator of an enterprise training development platform designed for integrated learning measurement. The Skilitics platform is fast gaining attention globally for its disruptive and innovative approach to training design and measurement. Glenn is the visionary behind this cloud-based solution and spearheads the company’s global strategy. He is also the editor of the TheNewID.com training comic, to which many of the industry’s key thought leaders contribute. Glenn is one of six members of The eLearning Guild Academy’s Advisory Council.


306 Big Data: Training Needs You Don’t Know About But Have Already Captured

4:15 PM - 5:15 PM Wednesday, October 29

Raphael 1

Many training professionals find it challenging to know whether training was effective on the job or whether part of the organizational system is failing to support the training delivered. While many training departments collect data at Kirkpatrick’s levels one and two, few are able to track data beyond that. Training departments need a methodology for identifying training needs without on-the-job observation or having trainees return for a test.

In this session participants will explore problem-based inquiry (PBI), a method for systematically mining data that already exists at the organizational level to reveal what may need to be addressed with training. You will learn how this information is mined from the organizational help desk’s database of actions workers struggle with on the job. You will explore the tools and methodologies needed to use the PBI method. You will discover how inexpensive the PBI method is to employ and how it utilizes data the organization’s help desk has already captured.
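The core idea of mining existing help desk data for training needs can be sketched in a few lines of code. The example below simply counts the most common problem categories in a ticket export; the file name and column names are assumptions for illustration only, and the snippet is not the PBI method itself.

```python
# Hypothetical illustration only: count the most frequent help desk problem
# categories to surface candidate training needs. The file name and the
# "application" / "category" columns are assumptions, not part of the PBI
# method presented in this session.
import csv
from collections import Counter

def top_problem_areas(ticket_file: str, n: int = 10) -> list[tuple[tuple[str, str], int]]:
    """Return the n most common (application, category) pairs in a help desk export."""
    counts: Counter = Counter()
    with open(ticket_file, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            counts[(row["application"], row["category"])] += 1
    return counts.most_common(n)

if __name__ == "__main__":
    for (app, category), ticket_count in top_problem_areas("helpdesk_tickets.csv"):
        print(f"{app} / {category}: {ticket_count} tickets")
```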

In this session, you will learn:

  • How to indirectly evaluate training programs
  • How to use the problem-based inquiry method
  • The benefits of indirect training evaluation
  • How to use data already captured to improve training

Audience:
Novice to advanced designers, developers, project managers, managers, and directors with a basic appreciation for the need to assess training effectiveness.

Technology discussed in this session:
The problem-based inquiry methodology.

Matthew Casey

VP of Content and Accreditations

VectorLearning

Dr. Matthew Casey has over 10 years' experience in training solution design, implementation, and testing in a variety of business settings. Matt’s background includes training evaluation methods, training program management in both centralized and decentralized environments, technical communications management, instructional design program management, quality assurance, and help desk management.


402 Eight Smooth Steps to eLearning Evaluation

10:30 AM - 11:30 AM Thursday, October 30

Monet 2

In the world of Big Data, organizations are increasingly using data and technology to better understand the value of their investments and efforts. As training professionals, it is imperative that we also use data to communicate the impact of our work. Many learning professionals avoid data and measurement because it seems like a daunting task, but it doesn’t have to be.

In this session, you will examine eight smooth steps for involving stakeholders in developing an evaluation framework for your eLearning initiatives. You will discover how this framework can be foundational for creating a common language for stakeholders. You will learn how the use of this framework can minimize the number of implementation issues and provide a structure for the ongoing assessment of eLearning programs.

In this session, you will learn:

  • The importance of evaluation for eLearning
  • About an evaluation framework for implementing eLearning
  • Ways to foster eLearning evaluation
  • The importance of setting a common language with stakeholders

Audience:
Novice to advanced designers, developers, project managers, and managers.

Andy Whitaker

Sales Manager

Rustici Software

Andy Whitaker is a sales manager and xAPI strategist for Rustici Software, a company that helps vendors and organizations conform to eLearning standards. He's been involved with xAPI since it was in its early beta stages. Andy has a degree from Middle Tennessee State University and has over 15 years of experience in helping customers understand and reach their desired business goals.

Margie Johnson

Training and Facilitation Solutions Director

Metro Nashville Public Schools

Dr. Margie Johnson is the training and facilitation solutions director for the Metropolitan Nashville Public Schools, where she is leveraging various technology tools and adult-learning theory to empower approximately 10,000 employees. Margie has extensive experience in training and development. Starting out as a middle-school classroom teacher, she has spent the last 12 years providing adult training and development.


502 More Than Numbers: Data, Analytics, and Design

1:15 PM - 2:15 PM Thursday, October 30

Cézanne 1 & 2

The availability of (and demand for) data around learning has grown dramatically. Tools like the Experience API (xAPI) make it increasingly easy to acquire data about learners’ activities, but this will provide little benefit to instructional designers or learners if we do not design to acquire meaningful data, know how to interpret that data, or know how to improve our learning designs based on it.

In this session participants will explore the use of data in the context of learning systems design. You will examine some basic principles surrounding the effective use of data and how to design to provide meaningful feedback. You will discuss concepts including comparative mapping of novice and expert practices, and learner experiences with formal courses and other activities. You will explore the application of principles from user experience design (UXD), web analytics, and business intelligence to design for meaningful (and actionable) contextual data. The session will also discuss examples of how data generated via the xAPI can be used for predictive analytics to make interventions seamless for the end user.
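For readers new to the xAPI, the sketch below shows roughly what a single statement looks like and how it might be sent to a learning record store over the xAPI’s REST interface. The endpoint URL, credentials, activity ID, and learner details are placeholders; only the actor/verb/object structure and the version header come from the xAPI specification.

```python
# A minimal sketch of sending one xAPI statement to a learning record store (LRS).
# Endpoint, credentials, and IDs are placeholders for illustration.
import requests

LRS_ENDPOINT = "https://lrs.example.com/xapi"   # placeholder LRS endpoint
LRS_AUTH = ("api_key", "api_secret")            # placeholder credentials

statement = {
    "actor": {"name": "Pat Learner", "mbox": "mailto:pat.learner@example.com"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "https://example.com/activities/safety-module-3",
        "definition": {"name": {"en-US": "Safety Module 3"}},
    },
    "result": {"score": {"scaled": 0.85}, "completion": True},
}

response = requests.post(
    f"{LRS_ENDPOINT}/statements",
    json=statement,
    auth=LRS_AUTH,
    headers={"X-Experience-API-Version": "1.0.3"},
)
response.raise_for_status()  # on success the LRS returns the stored statement ID(s)
```

Once statements like this accumulate in a learning record store, they become the raw material for the comparative mapping and predictive analytics discussed in the session.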

In this session, you will learn:

  • How to apply principles from fields including business intelligence and web analytics to learning design
  • How to design to gather meaningful data within the context of course goals and overall performance objectives
  • How to use data analytics to improve course design
  • Potential pitfalls of data interpretation

Audience:
Novice, intermediate, and advanced designers, developers, project managers, and managers. A conceptual knowledge of the xAPI is useful, but not required.

Technology discussed in this session:
The xAPI, learning record stores.

Sean Putman

Vice President of Learning Development

Altair Engineering

Sean Putman, a partner in Learning Ninjas, has been an instructor, instructional designer, and developer for over 15 years. He has spent his career designing and developing training programs, both instructor-led and online, for many different industries, but he has had a strong focus on creating material for software companies. Sean has spent the last few years focusing on the use and deployment of the Experience API (xAPI) and its effect on learning interventions. He has spoken at industry conferences on the subject and is co-author of Investigating Performance, a book on using the Experience API and analytics to improve performance.

Janet Laane Effron

Managing Principal

Four Rivers Group

Janet Laane Effron is a data scientist who focuses on the creation of effective learning experiences through iterative processes, data-driven feedback loops, and the application of best practices in instructional design. She has worked on xAPI design projects related to designing for performance outcomes and designing both for and in response to data and analytics. Janet’s areas of interest include text analytics, machine learning, and process improvement. She is also the co-author of Investigating Performance: Design and Outcomes with xAPI.


606 Guerrilla Evaluation: Closing the Feedback Loop

3:00 PM - 4:00 PM Thursday, October 30

Raphael 1

eLearning has a broken feedback loop, and it’s holding us back as a field. Because we usually can’t see our products being used, we lack the most basic information needed to improve what we do. Traditional evaluation is at best costly and difficult to carry out, and at worst either ignored altogether or implemented so superficially that it’s meaningless. Even good evaluation measures are not granular enough to inform future design decisions.

In this session you will explore the need for designers and developers to see the impact of our creations in a meaningful way so we can grow as a field. You will discuss the importance of addressing this need in a way that is both inexpensive and easy to implement. You will examine the concept of “guerrilla evaluation” methods and what we can learn from the field of software usability. You will leave this session with a list of practices to adopt that will help ensure successful eLearning design.

In this session, you will learn:

  • How to recognize when you have a broken feedback loop
  • How to use user-experience best practices
  • How to conduct a guerrilla evaluation
  • How to ensure you are moving forward as a practitioner

Audience:
Novice to advanced designers, developers, project managers, managers, and directors.

Technology discussed in this session:
User-experience practices and guerrilla evaluation methods.

Julie Dirksen

Learning Strategist

Usable Learning

Julie Dirksen, a learning strategist with Usable Learning, is a consultant and instructional designer with more than 15 years' experience creating highly interactive eLearning experiences for clients ranging from Fortune 500 companies to technology startups to grant-funded research initiatives. She's interested in using neuroscience, change management, and persuasive technology to promote sustainable long-term learning and behavior change. Her MS degree in instructional systems technology is from Indiana University, and she's been an adjunct faculty member at the Minneapolis College of Art and Design. She is the author of Design For How People Learn.


707 Big Bad Data: How to Clean Up the Mess

8:30 AM - 9:30 AM Friday, October 31

Degas 1 & 2

Everyone’s talking about Big Data, but learning systems already generate piles of data and it’s a mess. How can we move on to a more in-depth analysis of learner behavior if we can’t even straighten out the basics? Our technicians roll their eyes and complain about poor data quality in the upstream feeds. The report designers complain that the stakeholders keep changing their requirements, and our teams are burning precious hours managing unwieldy Excel spreadsheets. We don’t need a new technology. We need a new approach.

In this session participants will examine a framework for understanding how to work efficiently with learning data. You will explore an overarching approach that you can use to communicate with both technical staff and business stakeholders. Using humor and visualization, this session will introduce you to techniques, tools, and concepts from database professionals outside the learning industry that are rarely considered within it.
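As a concrete (and deliberately tiny) example of the kind of repeatable process the session has in mind, the sketch below normalizes inconsistent completion-status values and removes duplicate rows from an LMS export. The file name, column names, and status vocabulary are assumptions for illustration, not a prescribed framework.

```python
# Hypothetical illustration: one small, repeatable cleaning step for messy
# learning records. File name, column names, and status values are assumptions.
import pandas as pd

STATUS_MAP = {
    "complete": "completed",
    "completed": "completed",
    "passed": "completed",
    "in progress": "in_progress",
    "incomplete": "in_progress",
    "failed": "not_completed",
}

def clean_learning_records(path: str) -> pd.DataFrame:
    records = pd.read_csv(path)
    # Normalize the many spellings of "completed" into a small, known vocabulary.
    records["status"] = (
        records["status"].str.strip().str.lower().map(STATUS_MAP).fillna("unknown")
    )
    # Keep only the most recent record per learner/course pair.
    return (
        records.sort_values("completed_on")
        .drop_duplicates(subset=["learner_id", "course_id"], keep="last")
    )

if __name__ == "__main__":
    print(clean_learning_records("lms_export.csv")["status"].value_counts())
```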

In this session, you will learn:

  • How to clarify the business drivers for gathering data and use them to inform your strategy
  • How to look at learning data holistically so that you can find solutions to data problems
  • Techniques for using existing data to generate the data needed for reporting
  • How to put data limitations into a perspective that business stakeholders understand
  • How to create low-effort, repeatable, and flexible processes for collecting, cleaning, and reporting on learning data

Audience:
Novice to advanced designers, developers, project managers, and managers.

Adam Weisblatt

Owner

Blank Page Learning

Adam Weisblatt is a learning strategist with a passion for creating learner-centered experiences and business-centered learning systems and processes. He is the founder of Blank Page Learning, which helps companies develop strategies integrating learning technologies to open the doors of new ideas and break down the barriers to learning. Adam has 20 years of experience in all aspects of workplace learning and implementing global enterprise-wide projects. He has been an instructor, eLearning designer, and programmer, as well as a performance artist, puppeteer, and cartoonist.


713 Using Practical Technology for a 360-degree Practicum

8:30 AM - 9:30 AM Friday, October 31

Van Gogh 1

When learners are geographically dispersed, it is difficult to know, once training is complete, whether the skills transferred back to the job. It is also difficult to observe learners performing a new task and to provide feedback for improvement and/or reinforcement of what was done well. We need to find ways to use technology to effectively evaluate learners’ ability to apply what they learn in training.

In this session participants will explore two real-world case studies demonstrating the design and technology considerations applied to implement 360-degree practicums. You will discuss key considerations for getting buy-in from evaluators. You will examine the communications used, review example templates and how-to aids for the software applications involved, and see samples of the resources provided to both learners and evaluators.

In this session, you will learn:

  • Creative ways to leverage technology in order to implement remote skill assessment
  • Considerations for getting buy-in and involvement from the evaluators
  • A framework to give to the evaluators so they are confident with how and what to assess
  • Planning steps for how to support the knowledge needed by the learners for them to be successful in the assessment
  • How to measure success

Audience:
Intermediate designers and developers with some experience with online survey tools and HTML coding, and an awareness of Kirkpatrick’s levels of evaluation.

Technology discussed in this session:
SurveyGizmo, video conferencing (Lync), mobile phones, LMS.

Kythrie Silva

Sr. Consultant, eLearning Development

Cardinal Health

Kythrie Silva, senior consultant of instructional design and eLearning developer for Cardinal Health, has been designing, developing, and advocating for the innovative use of technology in teaching and learning for the past 15 years. Kythrie has particular expertise in all levels of training assessment and evaluation. Currently she is responsible for building training and eLearning that produces measurable business impact. Previously Kythrie worked in an academic setting at Ohio State University, helping mathematics and statistics faculty research and evaluate new teaching and learning technologies and design learning environments.

Barbara Davis

Consultant, eLearning Development

Cardinal Health

Barb, a consultant for eLearning development for Cardinal Health, was educated as a wildlife biologist but fell in love with teaching during graduate school. She now has more than 25 years’ experience leading teams and bringing projects to successful completion. She is an expert in producing training courses and learning solutions for delivery to internal and external users, and is dedicated to assisting and supporting others with their project deliverables. Barb is talented at researching, analyzing, and developing subject material using a variety of media. She is passionate about instructional design because it keeps training focused on the “need to know” and identifies the “nice to know.”


802 Increasing and Measuring Helpful Expertise in Retail

9:45 AM - 10:45 AM Friday, October 31

Van Gogh 2

Instructional designers are seeking more than just knowledge retention from learners; we are seeking positive, quantifiable, and repeatable changes in behavior—the gold standard of eLearning results in the business world. However, these behavior changes are challenging to facilitate, and even more challenging to track.

In this session you will learn how to credibly answer the age-old measurement question: Do retail salespeople who complete online training on specific brands sell more than those who don’t complete the training? You will examine the creation and results of a study comparing point-of-sale data to sales associates’ engagement with eLearning courses in the categories and brands they sell. You will discuss the factors that were used to isolate the effects of the eLearning courses from other factors that affect sales. You will explore how you can adapt this case study to your own organization.
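At its simplest, the comparison behind such a study can be expressed as a join between point-of-sale totals and training-completion records, as in the sketch below. File and column names are placeholders, and the real study described in this session controlled for additional factors that are not shown here.

```python
# Hypothetical sketch: compare average units sold by associates who did and
# did not complete a brand's online course. File and column names are
# placeholders; the actual study isolated other factors affecting sales.
import pandas as pd

sales = pd.read_csv("pos_sales.csv")               # associate_id, brand, units_sold
training = pd.read_csv("course_completions.csv")   # associate_id, brand, completed (0/1)

merged = sales.merge(training, on=["associate_id", "brand"], how="left")
merged["completed"] = merged["completed"].fillna(0).astype(int)

comparison = (
    merged.groupby(["brand", "completed"])["units_sold"]
    .mean()
    .unstack("completed")
    .rename(columns={0: "avg_units_untrained", 1: "avg_units_trained"})
)
print(comparison)
```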

In this session, you will learn:

  • How eLearning really can make a difference in business
  • How leveraging common goals with your partners can advance your business
  • How creative problem-solving can help you find data that will help drive your business
  • Why successful people thrive on training that can help them do their job more effectively

Audience:
Novice to advanced designers, developers, and managers.

Technology discussed in this session:
HTML5.

Chris Barker

Content Services Director

Experticity

Chris Barker is the content services director of Experticity. After more than 10 years working as a newspaper reporter, he has spent the last seven years helping retail experts become more helpful. Chris runs a team of writers and instructional designers who each year design and launch hundreds of online eLearning sites for brand clients across a variety of industries, including some of the largest consumer electronics, apparel, and accessories companies in the world.

