Learning is a combination of logical and emotional processes, though one side of that equation might be getting neglected in much modern eLearning. Some researchers argue that “technological acceleration” has driven an emphasis on aspects of learning that deal with decision-making, modeling logical processes, and other cognitive activities. An equally essential element in learning and retaining information, though, is emotional impact or affective learning.
Researchers in diverse fields—learning, biology, psychology, and more—find indelible connections between emotional processes and logical or knowledge-based processes. A compelling argument can be made for considering the role of affect—emotion—in eLearning success. And emerging technologies could enable new approaches to eLearning that balance automation, cognitive elements of learning, and affective eLearning, improving both eLearning effectiveness and learner engagement.
“The aspects of cognition that we recruit most heavily in schools, namely learning, attention, memory, decision making, and social function, are both profoundly affected by and subsumed within the processes of emotion,” Mary Helen Immordino-Yang and Antonio Damasio wrote in We Feel, Therefore We Learn: The Relevance of Affective and Social Neuroscience to Education.
Their paper describes a fundamental link between emotion and learning. Referring to studies of individuals with damage to parts of the brain that control social behavior and emotional processing, they hypothesize that “emotional processes are required for the skills and knowledge acquired in school to transfer to novel situations and to real life.” A similar hypothesis for skills and knowledge acquired via other means, such as eLearning, is reasonable.
Calling for further research on affective learning, they wrote, “Knowledge and reasoning divorced from emotional implications and learning lack meaning and motivation and are of little use in the real world.”
An affective learning manifesto
A manifesto published in 2004 calls for balancing the benefits of artificial intelligence and automated or online learning with recognition of the roles that emotion, motivation, and attention play in learning.
“While it has always been understood that too much emotion is bad for rational thinking, recent findings suggest that so too is too little emotion—when basic mechanisms of emotion are missing in the brain, then intelligent functioning is hindered. These findings point to new advances in understanding the human brain not as a purely cognitive information processing system, but as a system in which affective functions and cognitive ones are inextricably integrated with one another,” the authors wrote.
The manifesto envisions an “intelligent tutoring system” that can sense and respond appropriately to learners’ emotions. “With skills of affect perception, a computer that detects the learner making a mistake while appearing curious and engaged could leave the learner alone since mistakes can be important for facilitating learning and exploration; however, if the learner is frowning, fidgeting, and looking around while making the same mistake, then the computer might use this affective feedback to encourage a different strategy.”
Fifteen years ago, this level of emotional recognition was in very early development; today, emerging technologies could put this intelligent tutor within reach.
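To make the manifesto’s description concrete, here is a minimal sketch, in Python, of the kind of affect-aware branching it envisions. The affect labels and tutor actions below are illustrative assumptions, not features of any actual intelligent tutoring system.

```python
# A minimal sketch of the affect-aware branching the manifesto describes.
# The affect labels and tutor actions are illustrative assumptions only.

def choose_tutor_response(made_mistake: bool, affect: str) -> str:
    """Pick a tutoring action from a coarse read of the learner's state.

    `affect` is assumed to come from an upstream emotion-recognition step
    (facial expression, posture, language tone), e.g. "engaged" or "frustrated".
    """
    if not made_mistake:
        return "continue"                      # nothing to intervene on
    if affect == "engaged":
        return "observe"                       # mistakes can aid exploration
    if affect == "frustrated":
        return "suggest_alternative_strategy"  # redirect before disengagement
    return "offer_hint"                        # default gentle nudge


if __name__ == "__main__":
    print(choose_tutor_response(made_mistake=True, affect="engaged"))     # observe
    print(choose_tutor_response(made_mistake=True, affect="frustrated"))  # suggest_alternative_strategy
```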
AI technologies that could support affective eLearning
The Future Today Institute calls artificial intelligence (AI) the “third era” of computing. Its 2019 Tech Trends Report is careful to point out that AI itself is not a trend; it is “the most important tech development of our lifetimes.” In addition to the myriad ways that AI touches daily life, work, government, and business, emerging AI technologies can interact with humans in ways that could transform eLearning. Some of these technologies, described in the 2019 Tech Trends Report, are:
- Facial and image recognition and completion, computational photography: AI algorithms’ ability to recognize items and individuals from images or in real time continues to advance. Some technologies can even complete images based on their “training” and access to millions of related images. Other technologies enable AI engines to create entire graphical environments from short segments of video.
- Natural language recognition, processing, understanding, and generation: Natural language recognition and generation technologies are in use in journalism, marketing, retail, and finance and form the basis of chatbot applications and other user- or customer-facing interactions. Natural language understanding allows extraction of concepts, mapping of relationships, and analysis of the writer’s or speaker’s emotions. These technologies can make interactions between an intelligent tutor and a learner feel more real to the learner, which can boost engagement and the learner’s emotional investment in the training.
- Gesture recognition: These technologies can interpret motions to identify individuals and even anticipate what a person might do next. “Gesture recognition unlocks the interplay between our physical and digital realms,” the Trends Report states.
- Emotional recognition: Putting together language, gestures, facial expression, and data about an individual, an AI engine can detect and identify the emotions of a person it is “seeing” or “hearing.” According to the Trends Report, automaker Kia has developed a system that uses facial expressions, heart rate, and electrodermal activity to detect a passenger’s emotional state. These capabilities could be the foundation of eLearning that detects and responds appropriately to learners based on far more than whether they select the correct answer to a question.
- Reinforcement learning: Going beyond basic machine learning, an AI algorithm can “learn” to identify the most desirable outcomes and choose the actions or responses most likely to produce them. AI engines can use context, along with other information such as learner responses to questions, to inform their real-time learning; in turn, this input can shape the AI’s interactions with learners, as in the sketch that follows this list.
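To illustrate how reinforcement-style learning might drive those interactions, here is a minimal Python sketch of an epsilon-greedy policy that “learns” which tutoring intervention tends to work best. The actions, reward signal, and simulated learner feedback are illustrative assumptions, not a description of any real eLearning engine.

```python
import random

# A minimal sketch of a tutor "learning" which intervention works best,
# using an epsilon-greedy bandit as a simple stand-in for reinforcement
# learning. Actions, rewards, and the simulated learner are assumptions.

ACTIONS = ["offer_hint", "show_worked_example", "encourage_retry"]

def simulated_learner_feedback(action: str) -> float:
    """Stand-in for real feedback (correct answers, engagement signals)."""
    success_rates = {"offer_hint": 0.4, "show_worked_example": 0.7, "encourage_retry": 0.3}
    return 1.0 if random.random() < success_rates[action] else 0.0

def run_tutor(episodes: int = 500, epsilon: float = 0.1) -> dict:
    value = {a: 0.0 for a in ACTIONS}   # estimated reward per action
    count = {a: 0 for a in ACTIONS}
    for _ in range(episodes):
        # Explore occasionally; otherwise exploit the best-known action.
        if random.random() < epsilon:
            action = random.choice(ACTIONS)
        else:
            action = max(ACTIONS, key=lambda a: value[a])
        reward = simulated_learner_feedback(action)
        count[action] += 1
        value[action] += (reward - value[action]) / count[action]  # running mean
    return value

if __name__ == "__main__":
    print(run_tutor())  # estimates should come to favor "show_worked_example"
```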
Explore affective learning and more
Register for The eLearning Guild’s Science of Learning Summit, May 15–16, 2019, to explore affective neuroscience more deeply. Nick Shackleton-Jones will present “How Affective Neuroscience is Upsetting Educational Convention,” and Clark Quinn will explore “The Cognitive Foundations of Learning: From Neural to Useful.” Join these and other expert presenters for a deep dive into the science underlying eLearning, memory, and employee performance.