Many people, including myself, love looking at reports and analytics related to their training. Graphs. Charts. Tables. Big data. While these create the illusion that training is happening and learners are learning, what are your metrics actually saying? Can you interpret the learning analytics data and leverage it in a meaningful way?
While it’s important to provide your stakeholders with analytical recaps, it’s even more important to be able to explain what it all means.
Before we get ahead of ourselves, let’s go back to basics.
What is learning analytics?
We define learning analytics as any data related to your L&D (learning and development) program.
For example, popular L&D metrics include:
- Number of enrollments
- Number of completions
- Average grade
- Average time to complete
Most training platforms measure all of these things and a whole lot more. But at the end of the day, it’s all just a bunch of numbers if you don’t know what they mean.
Let’s look at a common scenario. Fred is a training manager at Clear Day Windows, a medium-sized company that manufactures and installs windows. He is responsible for rolling out a new, company-wide training program and needs to share his progress with stakeholders regularly.
While Fred is skilled at creating and delivering training content, he isn’t great with numbers. His learning analytics look very impressive, but the data is overwhelming.
Why do learning analytics matter?
Learning analytics are the objective means by which you can measure your training’s success or failure. Without them, how do you know whether your training program is working?
Within the L&D world, training success is typically determined by looking at a training initiative's return on investment (ROI). ROI is usually measured by collecting up to three key performance indicators (KPIs), then identifying the correlation between training progress and improved performance.
By not digging into his learning analytics, Fred is missing out on a huge opportunity to use them to guide and evolve his training. In addition, he will have a hard time showing the success of his training program. He won’t be able to provide the concrete information stakeholders want and need to justify their continued investment in training.
What learning analytics can you see in OttoLearn?
As mentioned earlier, standard learning analytics focus on enrollments, engagement, and completions. While these numbers are essential, they don’t show if learners are actually retaining any information.
OttoLearn tries to take things further by measuring knowledge and retention. Instead of looking at completions, OttoLearn measures mastery strength from the Module level down to individual Concepts.
GRAPHIC SHOWING MASTERY LEVELS
Metrics that show how his learners interact with their training are exactly what Fred needs: granular reporting that will help guide his content creation AND demonstrate training ROI.
Let’s take a closer look at the seven panels on OttoLearn’s Analytics page and how Fred might use them.
Overview Analytics

The Overview analytics panel summarizes all the activity in your OttoLearn account, including the number of learners and Module assignments. It also provides information on learning progress and progress towards mastery, including your learners’ average mastery strength.
Fred can use this information to show stakeholders how much learners have learned and retained (on average). Over time, he wants to see the Module Assignments in Mastery percentage increase and remain high, knowing that it can drop when new content or learners are added.
Fred can also use the Activity History graph to show the precise number of Activities completed each month and how engagement is trending over time. Assuming learners have things to learn, these values — Activities completed and engagement — should increase from month to month.
Check out Analytics Deep Dive: Overview Analytics Panel to learn more about this learning analytics panel.
Learner Engagement Analytics
The Learner Engagement analytics panel summarizes learner performance and engagement — two key metrics for Fred.
More specifically, this panel shows the number of engaged learners in the past seven days (short-term) and the past 30 days (long-term). Together, these numbers help Fred see which direction learner engagement is trending.
For added context, Fred can look at the table at the bottom of the panel to see how many learners are:
- Not engaging and not in mastery
- Engaging and not in mastery
- In mastery
These details help explain why the number of engaged learners may have dropped. For example, when learners have nothing to learn (are in mastery), they will naturally engage less; this is expected.
The engagement table also provides the names of learners who are not engaging, making it easy for Fred to follow up with employees who may need some extra prompting to do their training.
Check out Analytics Deep Dive: Learner Engagement Analytics Panel to learn more about this learning analytics panel.
Assignment Engagement Analytics
The Assignment Engagement analytics panel provides information on Module assignments and compares engagement between learners who have reached their target mastery strength and those who have not.
Over time, Fred wants to see 100% of Assignments in Mastery. An “assignment” is an association between a learner and some content. It is considered “in mastery” when the learner has reached their target mastery strength. If Fred adds new content, his metrics will drop, so he needs to keep that in mind as his training evolves.
The two charts featured in this analytics panel provide a great visual representation of engagement and mastery over time. Fred can leverage these easy-to-understand charts to demonstrate to stakeholders how learner knowledge increased from one month to the next.
Check out Analytics Deep Dive: Assignment Engagement Analytics Panel to learn more about this learning analytics panel.
Knowledge Gaps & Lifts Analytics
The Knowledge Gaps & Lifts analytics panel is about proficiency. Proficiency is calculated whenever a learner completes an Activity. It is based on several factors, including accuracy, confidence, duration, and past performance. Using this information, OttoLearn determines if the learner was proficient (knew the answer) or not proficient (didn’t know the answer or likely guessed).
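As a rough illustration of how such a proficiency judgment could work, here is a toy heuristic in Python. This is our own sketch, not OttoLearn’s actual (proprietary) calculation, and it ignores past performance for simplicity:

```python
def is_proficient(correct: bool, confident: bool, seconds: float,
                  typical_seconds: float = 10.0) -> bool:
    """Toy proficiency heuristic (not OttoLearn's real algorithm).

    A correct, confident answer given in a reasonable time suggests the
    learner knew it; a wrong, unconfident, or very slow answer suggests
    a knowledge gap or a guess.
    """
    answered_quickly = seconds <= 2 * typical_seconds
    return correct and confident and answered_quickly

# A fast, confident, correct answer counts as proficient...
print(is_proficient(correct=True, confident=True, seconds=8.0))   # True
# ...while a correct but hesitant one reads as a likely guess.
print(is_proficient(correct=True, confident=False, seconds=8.0))  # False
```

The point of the sketch is simply that several signals are combined, not just whether the answer was right.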
The Knowledge Gaps tab in the panel allows Fred to see a visual representation of learners closing their knowledge gaps as they increase their mastery strength. This progression can be seen both from a broad perspective — in the Knowledge Gap by Mastery chart — and at a more granular level in the Knowledge Gap by Module chart.
Fred can also use the information in the Knowledge Gaps & Lifts analytics panel to guide his content development. For example, if a Concept has a large knowledge gap, Fred may want to expand upon that content or add additional Activities for practice. Alternatively, if Fred sees that his learners have a large knowledge lift on a Concept early on (when at mastery levels 0 and 1), he can deduce that the content is already familiar to them.
Check out Analytics Deep Dive: Knowledge Gaps & Lifts Analytics Panel to learn more about this learning analytics panel.
Knowledge Card Analytics
The Knowledge Card analytics panel illustrates how learners interact with your Knowledge Cards. Learners can read Knowledge Cards in a linear format — such as during a Learning Session — or ad-hoc — when exploring their training content or reviewing specific information.
Looking at this panel, Fred can see what content his learners most often view. This information can help him determine what content learners find most (or least) helpful. If he wants to go further, he can compare this information to the Knowledge Gaps & Lifts analytics panel data. Are his learners reviewing the content associated with their knowledge gaps? Ideally, they are.
At the bottom of this panel, Fred can see how learners view Knowledge Cards based on their mastery strength. As learners increase their mastery strength (mastery level), they typically view fewer Knowledge Cards.
Check out Analytics Deep Dive: Knowledge Card Analytics Panel to learn more about this learning analytics panel.
Mastery Points Analytics
The Mastery Points and Rewards analytics panels only display when points and rewards are enabled in your account.
Learners earn points when they successfully complete Activities and increase their mastery strength, up to and including their target mastery strength. For example, if a Concept has a mastery goal of level 3, learners can earn up to three mastery points (one for each level) for that Concept. Learners may also be awarded points manually.
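In other words, the points available for a Concept scale with its mastery goal. A minimal sketch of that rule (the function name and signature are ours, purely illustrative):

```python
def points_earned(levels_reached: int, mastery_goal: int,
                  manual_points: int = 0) -> int:
    """One mastery point per level gained, capped at the Concept's
    mastery goal, plus any manually awarded points (illustrative rule)."""
    return min(max(levels_reached, 0), mastery_goal) + manual_points

# A Concept with a mastery goal of level 3:
print(points_earned(2, 3))                   # 2 points so far
print(points_earned(5, 3))                   # capped at 3
print(points_earned(3, 3, manual_points=1))  # 4, with a manual award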
If you have a Rewards Store set up, learners can redeem their points for rewards (such as company swag, gift cards, or food vouchers). Tangible rewards are a great way to incentivize learners to engage with their training and are a form of external motivation. Fred has points and rewards turned on in his account and has already set up a Rewards Store with many exciting items for purchase.
The Mastery Points analytics panel shows how many points have been earned and redeemed. This information provides an at-a-glance view of the current points economy. Using this information, Fred can determine the value of each point, choose appropriate prices for his rewards, and estimate how much budget he needs to maintain his Rewards Store.
He can justify the rewards investment by showing stakeholders the Points chart, which shows how many points have been earned and redeemed each month. This information is even more powerful when displayed alongside engagement analytics from the previous panels we discussed. For example, if Fred sees that many points were earned and redeemed in December, he should also see an increase in engagement during that time.
Check out Analytics Deep Dive: Mastery Points Analytics Panel to learn more about this learning analytics panel.
Rewards Analytics

The Rewards analytics panel allows you to track your Rewards Store inventory and learners’ purchase history.
Fred uses the metrics in this panel to determine which rewards are most desirable to his learners, so he can add more of those to his store. If certain rewards are not being purchased, they may be too expensive or simply not motivating for learners.
While some learners are intrinsically motivated — by a personal drive or desire to succeed — others are extrinsically motivated by tangible rewards — such as gift cards or merchandise. Extrinsic motivators are only effective when they are desirable.
Fred can also use the Rewards analytics panel to monitor inventory in his Rewards Store. He needs to ensure that he restocks his inventory regularly, so points don’t become meaningless to learners. Using the Top Rewards table, he can quickly check his stock and see how learners redeem their points. Do they save up for big-ticket items or spend them quickly on low-value gift cards? These details can help Fred determine what items to add.
Check out Analytics Deep Dive: Rewards Analytics Panel to learn more about this learning analytics panel.
Learning analytics help guide your training journey
In the past, Fred never paid attention to his organization’s learning analytics because he didn’t understand their value. He was overwhelmed by all the numbers, and while he found some metrics interesting, he didn’t know what they meant. Fred didn’t understand how they could help guide him toward more successful training. He didn’t know how to use them to illustrate training ROI (return on investment) to stakeholders.
Your name may not be Fred, and your company may have nothing to do with manufacturing or windows. However, you can still leverage OttoLearn’s learning analytics to discover more about your training. Are learners engaging? If they are, is their knowledge improving over time? How much? Where are learners getting stuck? What eLearning content is viewed the most, and what content is viewed the least? Where are learners’ knowledge gaps? Are these gaps being closed? These are just some of the many questions you can use OttoLearn’s analytics to answer.
Have questions about your OttoLearn analytics? Contact us at email@example.com.
ProTip: By default, the languages you add will be inactive until you finish inputting all your translations. Having each language set to inactive until it’s ready to use prevents learners from being presented with a Module in multiple languages. For example, you don’t want learners to do a Learning Session and read one Concept in English and the next in French.
Once you have added all your translations, you can set a language to active using the toggle. You’ll also be able to change the Module’s primary language (the language in which it is presented to learners by default).
In our next post, we’ll look at how learners can set up the language(s) in which they receive content. Stay tuned!
- For learning content to enter and remain in a learner’s long-term memory, the learner needs multiple exposures to the content. Long-term encoding “needs opportunities for rehearsal and repetition,” Jan Breckwoldt et al. wrote in a study on mass vs. spaced learning.
- Repeated exposures alone are not as helpful as spaced repetitions that ask learners to recall and apply information, especially when learners have to use that information in different ways, as many studies have found (for example, Rohrer; Lin et al.; and Bjork and Bjork).
- The ability to remember information depends on the number of times a learner encounters it and the interval between repetitions, according to Tabibian et al.
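One common way to formalize these findings is an exponential forgetting curve whose decay slows as “memory stability” grows with each well-spaced review. The sketch below is a generic illustrative model, not the exact formulation from any of the studies cited above:

```python
import math

def recall_probability(days_since_review: float, stability: float) -> float:
    """Exponential forgetting: p = exp(-t / s).

    t is the time since the last review; s (stability) grows with each
    successful, well-spaced retrieval, flattening the curve."""
    return math.exp(-days_since_review / stability)

# One week after a single exposure (low stability), recall is poor...
print(round(recall_probability(7, stability=2.0), 2))
# ...but after several spaced reviews (higher stability), it holds up.
print(round(recall_probability(7, stability=20.0), 2))
```

This is why spacing matters: each successful retrieval near the edge of forgetting increases stability, so the same content needs less and less frequent review.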
Access to knowledge or performance support tools
Achieving a worthwhile or meaningful goal
Achieving a reward — a grade, a badge, points, a prize
Receiving an unexpected reward
Contributing to improving a project or a product
Wanting to be perceived as a team player, wanting to be liked
Improving performance or effectiveness relative to own past performance
Improving performance or effectiveness relative to coworkers; “winning” or being the best
Knowing enough to avoid making mistakes and do better work
Losing status or levels within a gamified framework as the result of making a mistake
Feeling of completing a task, accomplishing a goal, finishing a project
Doing the “right” thing — following rules or norms, being ethical
Is the corporation’s compliance training program well designed?
Prosecutors will look at whether the training is designed to prevent and detect wrongdoing and whether management is enforcing the program by means of training, incentives and discipline.
Is the program being applied earnestly and in good faith? In other words, is the program being implemented effectively?
Prosecutors are expected to directly investigate whether a program is merely a “paper program” or a sincere effort. Evidence of a company-wide commitment to ethics and compliance, promoted by senior and middle management, is needed.
Does the corporation’s compliance training program work in practice?
Good intentions and training don’t count if they don’t work; in assessing whether the program “works in practice,” prosecutors will look at how the suspected misconduct was detected, what the company’s investigation process is and how the company is trying to correct the problem.
Microlearning delivers small, narrowly focused bits of information.
Adaptive microlearning tailors that content to each learner’s knowledge gaps and learning goals, ensuring the training is relevant.
Continuous adaptive microlearning conditions each learner to engage with relevant training every day — for just a few minutes.
Read More Burning Questions
Learning experience platforms
Virtual and augmented reality
Consulting more deeply with the business
Developing the L&D function
When people have a question or don’t know how to do something, what do they do?
They whip out a smartphone and look for information. What they don’t do is sign up for a one-hour seminar.
Microlearning brings corporate eLearning into the modern paradigm. Microlearning describes eLearning content that is:
- Narrowly focused
- Available on demand
- Mobile-first or mobile-friendly
It must answer a question, meet an immediate need, or help the learner solve a problem.
In the City of BigTown, there was held a conference,
One of training professionals — those making a difference.
A difference to company ROI by delivering training,
From many perspectives — like Manufacturing.
And, too, there were call centers, colleges, corporate sectors,
Each chiming in about outcomes and metrics.
All shipped their training through an LMS platform,
But were desperately seeking true training reform.
One was Antonio, who hated the manuals —
For his product revisions and updates, they were annual.
Plus his printing costs? Oh, they were crazy!
And he truly believed that franchisors were hazy.
None knew how to train in an effective way,
"There’s too much to read, to do!” they’d all say.
For there were many levels of training to assign,
From the top at head office, down to those on the front-line.
Trainers Helen and Abinash nodded, “We agree!”
Said Feng, "Paper and handbooks? Just another dead tree.
On the job, not everyone will have the info they need,
Because the content changes and updates they never did read.
They never learned the content added along the way
That may apply to their region or division today.
Plus, in the field with team members in many locations,
Mobile-first training would make a stronger foundation!”
Said Sales trainer Jane of her PDFs stored online,
“They’re rarely revisited after onboarding time.
I need content delivered in snack-sized bites,
And the ability to test them until they get it right.”
Ursula chimed in, "Onboarding’s a pain for new hires,
With most feeling like their hair is on fire!
Plus, promoted reps must refresh what they know
To be properly prepared to perform their new role."
"I deal with compliance," sighed Manal the Banker.
Abinash nodded, Frank turned to thank her,
For she’d raised the ugliest concern of them all —
That certifications aren’t based on year-long recall.
“To maintain the standards and follow each rule,
We need more than one test that comes out of the blue.
When it comes to things like health & safety, it's a game-changer
Because if their training is lacking, they could be in danger.”
Continuing he asked, “Could training be location-specific?
As learners move through the plant, alerts would be terrific!”
Helen asked who used traditional classroom training
Combined with online to keep interest from waning.
Did they have workshops, seminars, or events,
The kind that take workers away from their desk?
"They learn at that moment, then likely forget —
is there a way to get long-term retainment?”
Rachel had been quiet, she’d said not a word,
When suddenly she leaned in so her voice would be heard.
"We solved these concerns after ditching binders and books —
We use daily drip training and our learners are hooked!
When we update our content, it gets to them faster,
And metrics and KPIs reveal the content 'masters.'
We use OttoLearn for microlearning and we’ve been thrilled,
for all of our training needs — and more — are fulfilled."
So ends our tale of the nine trainers complaining
about the problems they had delivering training.
Training that mattered, with metrics and firm ROI,
Based on data analysis of prime KPIs.
Many problems they shared, with no clear resolution,
Found Agile Microlearning with Otto was the solution!
Microlearning both adaptive and agile saved them from disaster,
Making trainers and trainees learn happily ever after!
- Persian (Farsi)
- Combining the question and activity tabs
- New WYSIWYG editor which is “inline” with the text
- Ability to include media (images, video, audio) within activities (question, answers and feedback)
- Icons to indicate correct answer, position locking, whether or not the answer is visible to learners (active), and override feedback
- Learner password reset
- Streamlined data entry into the content studio, by being able to quickly add
- Numerous small updates and bug fixes
- Check out our most recent updates and add yourself to be automatically notified when we push updates
Seat-based licensing:

- Super easy to understand
- Very predictable cost, if you have a fixed number of users (e.g., employees)
- Doesn’t differentiate between users that have different usage volumes
- You have to purchase seats for your maximum number of users

Active-user licensing (typically based on the number of users that log in during a month):

- You don’t need a license for every user; you can often license only half of your users (since perhaps only half ever log in during a month)
- There is typically a large cost for going over your licensed number of users, which can be incredibly expensive (e.g., 5-10x more than your licensed cost)
- As an administrator, you often have to “play games,” such as avoiding a mass course enrollment if only some of your users are licensed that month
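To see why the overage penalty matters, here is a back-of-the-envelope comparison. All the prices, user counts, and the 7x overage multiplier below are made up for illustration; they are not real vendor pricing:

```python
def seat_license_cost(max_users: int, price_per_seat: float) -> float:
    """Seat model: you pay for your maximum number of users."""
    return max_users * price_per_seat

def active_user_cost(active_users: int, licensed_users: int,
                     price_per_user: float,
                     overage_multiplier: float = 7.0) -> float:
    """Active-user model: pay for licensed users, plus a steep per-user
    penalty (the 5-10x overage described above) beyond that."""
    overage = max(0, active_users - licensed_users)
    return (licensed_users * price_per_user
            + overage * price_per_user * overage_multiplier)

# 1,000 employees, of whom only ~500 log in during a typical month:
print(seat_license_cost(1000, 5.0))     # 5000.0: pay for everyone
print(active_user_cost(500, 500, 5.0))  # 2500.0: cheaper within the license
print(active_user_cost(700, 500, 5.0))  # 9500.0: a mass enrollment gets costly
```

Notice that a single burst of activity (700 active users against 500 licenses) can make the “cheaper” model the most expensive one that month.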
What it means: An algorithm determines each learner’s knowledge gaps and feeds them practice activities to close those gaps.

Why it matters: Efficiency. Learners learn the material faster because they spend less time on what they already know.

What it means: Learners can follow a scaffolded learning path or self-direct their learning.

Why it matters: Learners are inquisitive. We all Google for information when we need it, so why lock learners into a particular learning path? Learners engage the most when they are allowed to deviate from a set path and explore available content. At the end of the day, as long as each learner reaches their mastery goal, the particular path they took to get there is unimportant.

What it means: Delivering content to the learner in smaller chunks.

Why it matters: Chunking content is important only if it is paired with the ability to search for and find specific content chunks “on demand” and the ability to consume just the chunks a learner needs. With these features, training doubles as performance support.
Learning Experience (LX) Design
Using science and art to create experiences that help learners fulfill the learning outcomes they desire, in a user-centered and goal-directed way.
Have you used Google? If so, then you have benefitted from Experience Design (XD): When you search for something, you rarely have to go past the first result.
With good XD, you don’t think about the design; it “just works.”
With poor XD, your learners will disengage. They’ll say they “don’t have time.” What they are really saying is that they “don’t have time for the poor experience.”
Typically, when used in relation to L&D, AI actually means “machine learning.”
Machine learning algorithms learn from data and “get smarter” over time.
Have you used Netflix or Amazon recommendations? They are based on machine learning.
The algorithms look at a ton of data, including your past choices and choices made by others who are similar to you, to make predictions as to what you will want to watch or buy.
In L&D, machine learning principles are being integrated in much the same way: to provide recommended content for a learner to consume.
This reduces the burden on training administrators to try to predict or guess what is relevant for each learner. It also provides a more personalized experience for each learner.
Imagine that you are a salesperson, and your training mix subtly and automatically shifts, based on the nature of opportunities in your sales pipeline. You are offered training only on available products that you have not already mastered. That would be a training program that is driven by machine learning.
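The idea in that scenario boils down to a simple filter: recommend training only for relevant, unmastered topics. A toy sketch (the names and data are invented for illustration; a real machine-learning system would infer these associations from data rather than use a fixed lookup):

```python
def recommend_training(pipeline_products, mastered_modules, catalog):
    """Suggest modules for products in the sales pipeline that the
    learner has not yet mastered (toy version of the idea above)."""
    return [catalog[product] for product in pipeline_products
            if product in catalog
            and catalog[product] not in mastered_modules]

# Hypothetical product-to-module catalog and a learner's mastered set:
catalog = {"widgets": "Widget Sales 101", "gadgets": "Gadget Sales 101"}
recs = recommend_training(["widgets", "gadgets"],
                          {"Widget Sales 101"}, catalog)
print(recs)  # ['Gadget Sales 101']
```

Even this crude version captures the payoff: the training mix shifts automatically as the pipeline and the learner’s mastery change, with no administrator guesswork.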
Use learning analytics to make better decisions by converting data into insights.
The true value is not just in providing more data, more charts, and more graphs. The value is in leveraging AI to search for and surface insights that you’d never think to look for.
Combine learners’ performance analytics with the KPIs for the outcomes you desire, and have the analytics engine generate predictions such as, “Learners who reach mastery in the Objection Handling module will close 3.4 percent more deals.”
Now that’s actionable intel.
About a week before I began getting my OttoLearn Mastery Moments, I had a popup window from Adobe appear on my screen as I was working on another project, prompting me to update my version of Flash. We do use Flash, so like an idiot, I clicked on the popup and asked it to start the update, and only then noticed that the URL was not an Adobe address. Of course, I closed the popup window using the X in the upper corner, which didn’t solve anything. Our IT guys did the best they could for me, but my computer is still compromised and is being replaced.
Fast forward to OttoLearn and your Online Security for Employees course. After completing several Mastery Moments, I have now learned what to do with popups like that. This morning, as I restarted my computer again, that same Adobe popup appeared, and this time I was ready! I opened Task Manager and killed that little $%^&^ dead in its tracks.
I know the point of letting us try out OttoLearn as participants was for us to experience the power of this platform from the learner’s point of view. I can tell you that I personally am very grateful for the training you provided to me, and the fact that I was able to let others in my company know how to kill off those nasty virus-carrying popups. Yes, it works. Yes, it’s fun! And yes, I have a true feeling of accomplishment.
I can’t wait for the point at which we can talk more about developing courses for our clients.
New accounting rules
Workplace violence & harassment prevention
Framing a basement
Changing a tire
Retrieval practice is the key to retention.
Your brain wants to be as efficient as possible. Why would it encode information for long-term storage if it thinks you don’t need it? You need to actually practice retrieving memories (information) in order for your brain to store them in long-term memory.
Spaced retrieval radically improves learning efficiency.
You not only need to practice retrieving information from memory, but you need to wait until you’re on the edge of forgetting it. This is why cramming is so ineffective at generating long-term retention.
Interleaved learning feels strange at first, but dramatically improves retention and skill.
Interleaved learning (mixing up material while learning and practicing, such as alternating practice activities between WHMIS and supervisory skills) will improve your retention of both.
Seat license:

- Cost predictability: each seat costs you $x/month
- Typically more expensive than a usage-based license

Usage license:

- Typically less expensive than a seat license
- Cost variability can be tempered by pre-purchasing usage credits that never expire and consuming them over time
High motivation, high experience:

- Best possible quadrant for engagement
- Will overcome learning obstacles
- Will find a way to learn, even if materials are poor
- Won’t need nudging or incentives
- Text is great

High experience, lower motivation:

- Can easily learn something
- May need to work up the energy to engage with low-quality materials
- May procrastinate, so incentives can help motivate
- Text is great

High motivation, low experience:

- Wants to learn
- Has little experience, so can benefit from more instructional quality
- Greatest benefit from video and other rich media

Low motivation, low experience:

- Worst possible quadrant
- May not have experience in the topic
- May not really care about it
- Will require a lot of motivation to see engagement
- Video can help