ELearning isn’t the only industry that has struggled with determining ROI; those of us in marketing have also seen a radical shift in how our work is measured.
John Wanamaker (1838-1922), who revolutionized the retail experience, once said that “Half the money I spend on advertising is wasted; the trouble is I don’t know which half.”
Today, that’s only true if you are foolish enough not to master and manage your metrics. A successful 21st-century marketer aspires to measure the ROI for every dollar spent in every channel. True, some channels are easier than others to measure, but we try.
Training Managers, the keepers of the KPIs, used to be happy with simple course completions. Historically, the most important thing was getting your team to complete their assigned training with an acceptable grade. Then came compliance requirements: learners had to revisit material every year to update their knowledge and ensure they were worthy of their compliance certifications.
Training Has an ROI? ROTI?
But what is the real Return on Training Investment? We can measure changes in worker behavior where training has had a measurable impact on:
- Employee engagement and retention
- Error or accident reduction
- Increased effectiveness and productivity
Ultimately, these three tangible metrics contribute to a better bottom line — no matter what industry sector you may be in. Fewer accidents, more widgets produced, calls taken, sales closed, lower staff turnover, and higher employee satisfaction are the cornerstones of profitability. We are either looking at an improvement in sales or productivity or a reduction of costs incurred through downtime, employee churn, or injury compensation payout as examples. Every industry has its own specific metrics. Do you know yours?
Marketing used to work with gross measurements, such as impressions and clicks on banner ads managed by third-party agencies. Google pioneered self-managed advertising, putting marketers in charge of their daily ad spend, channel delivery, and ad rotation. With the tracking available through Google Analytics, marketers could determine the ROI of their spend — tracking a click on an ad right through the sales funnel to happy customer purchase.
Calculating ROI SEEMS Easy
A rough ROI calculation for online marketing is easy.
Imagine you have an ecommerce site:
If you pay $250 for PPC (pay per click) advertising, and receive $1,000 in sales, then your Campaign ROI is
- ROI = (Sales − Cost) / Cost, or
- ROI = ($1,000 − $250) / $250
- ROI = $750 / $250
- ROI = 300%
A 300% ROI means you gain $3 in value for every $1 invested. A pretty good return!
We are not calculating net profitability, just the gross ROI for that campaign.
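Using the figures from the example above, the gross campaign ROI calculation can be sketched in a few lines of Python (the function name is mine, for illustration):

```python
def campaign_roi(sales, cost):
    """Gross campaign ROI as a fraction: (sales - cost) / cost."""
    if cost <= 0:
        raise ValueError("cost must be positive")
    return (sales - cost) / cost

# The PPC example from the text: $250 spend, $1,000 in sales.
roi = campaign_roi(sales=1000, cost=250)
print(f"{roi:.0%}")  # 300%
```

Note that this is the gross campaign return, not net profitability: cost of goods, shipping, and overhead are deliberately left out, just as in the example.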
But, eLearning is Different!
ELearning is on the cusp of a transition from gross measurements (courses completed, passing grades attained) to a more sophisticated way to calculate ROTI: measuring the business impact of changed behavior. For the most part, we provide eLearning to our learners and hope to receive value. But do we really know the ROI of our training efforts?
Even industry experts say that our often rudimentary evaluation methods are rife with problems that can mislead us on the effectiveness of our training. Post-training evaluations rarely correlate with knowledge transfer or performance results.
Google Trends shows that searches for topics related to the ROI of marketing have steadily increased, whereas searches for the ROI of training have stayed flat.
Why is that? Because calculating your training ROI is much more difficult than our ecommerce example.
Breaking Down the ROTI Calculation
Learn to ignore learner surveys (smile sheets) as a way to calculate ROI. A well-crafted learner survey will tell you about learner perceptions and feelings, but it will do little to indicate actual results.
Identify the results you want and the training outcome you want to achieve. For example:
- Health & Safety training for manufacturing staff = reduced accidents and absences
- Product knowledge for retail staff = higher sales and customer satisfaction
Let’s look at two real-world examples.
A Manufacturing Example: How Much does Reducing Injury Save?
You run a manufacturing plant with 200 employees. Your costs per injury (direct and indirect) average $27,000. The injury rate is 16% per year, which means 32 injured employees or $864,000 as the total injury cost every year.
You can’t reduce the number of employees, but you can reduce your incidents of injury by training each employee to work more safely on the job.
If you reduce your injury rate by 1 percent (one percentage point, from 16% to 15% per year, or 30 injuries), your annual cost of injury drops to $810,000. That 1 percent unit of training saves you $54,000.
Calculate the cost of rolling out a Pilot Project to retrain 25 percent of your workforce. The same 1 percent improvement will deliver $13,500 in reduced costs based on your Pilot Project investment alone. Well worth the test to roll out training to your entire workforce.
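The arithmetic in this example can be double-checked with a short Python sketch (the function name and structure are mine, for illustration only):

```python
def annual_injury_cost(employees, injury_rate, cost_per_injury):
    """Total yearly injury cost: headcount x injury rate x average cost per injury."""
    return employees * injury_rate * cost_per_injury

baseline = annual_injury_cost(200, 0.16, 27_000)  # 32 injuries -> $864,000
improved = annual_injury_cost(200, 0.15, 27_000)  # 30 injuries -> $810,000
savings = baseline - improved                     # $54,000 per 1-point rate reduction

# A Pilot Project covering 25% of the workforce captures 25% of that saving.
pilot_savings = savings * 0.25                    # $13,500
print(baseline, improved, savings, pilot_savings)
```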
A Retail Example: The Impact of Training on Sales
Customer service and product knowledge training delivers immediate positive results. The higher the quality of your team’s interactions with your customers, and the more knowledgeable your sales force, the more customers are served and the larger your individual sale or “basket” size.
Assuming labor hours, wages, and traffic remain constant, you can measure the impact of training for both factors.
Once you determine the value of a unit of training — a 1 percent improvement in one variable in your ROI equation — you can determine how much to invest in training.
- A 1 percent improvement in browser-to-buyer conversion due to better customer service: If your median is 10 sales per labor-hour and you add one more conversion per hour, you are adding 8 new sales over an 8-hour day. At $45 per basket, that adds up to $360 in gross sales improvement per day, per salesperson. Multiply that by the number of members on your team.
- If conversion stays the same at 10 but basket size goes up 1 percent due to better product knowledge, adding 45 cents to each of 80 transactions per day adds $36 to your daily sales. That is 80 percent of an additional basket, just by improving existing sales 1%.
Keep in mind that effective sales and product training cannot help but move both metrics simultaneously. How much would you spend to see these kinds of boosts in sales? What would you have to spend on marketing to achieve the same results? (All props to the Marketing Dept. Don’t cut their budget.)
Putting it All Together
Here’s a simple action plan to get you started, no matter what industry you are in.
- Find the Problem. Identify one metric that you want to improve upon; that will determine the focus of your training materials. Also, identify the team members you want to use in your Pilot Project.
- Uncover the Gap. Perform a Knowledge Assessment to find the gaps that need to be filled to set a baseline against which to measure improvement. Our OttoLearn RapidScan™ process is a proven formula that delivers actionable results for improved knowledge retention.
- Train for Improvement. Once your baseline has been established, deliver the training you need to improve performance, always focused on the desired result. OttoLearn delivers that training for your team in 2-minute Mastery Moments™ several times a day.
- Analyze and Measure ROTI. Watch your metrics from Day One. When I was running retail stores, we watched our sales-per-labor-hour on a daily basis. Look at team averages for the duration of the Pilot training period. Calculate the improvement of your unit value of training.
- Roll Out a Complete Training Program. Extrapolate how much training you would need to add to achieve that result across your entire team, plant, franchise empire, etc. Use OttoLearn to deliver your training and keep reporting your improved ROTI.
Here’s the sweetener: Pilot Project numbers don’t lie. An OttoLearn RapidScan Pilot Project, using our microlearning platform, is a low-risk way to prove that better training will deliver on your ROTI goals. It removes the emotion from the discussion, because you can determine the value of a unit of training. It gives you numbers. And numbers are power.
Identify - Test - Improve - Invest
It’s nearly impossible to figure out what your team has forgotten from their previous training, or what they have learned to do incorrectly over time. Now you can measure that. Use that data to deliver improved training, and measure the results.
These results WILL surprise you. Knowledge changes behavior. From there, continue to invest in ongoing, daily training to ensure training knowledge retention and ongoing changed behavior.
You can’t improve what you don’t measure. Once you start calculating your ROTI, you will never look back.
- Access to knowledge or performance support tools
- Achieving a worthwhile or meaningful goal
- Achieving a reward — a grade, a badge, points, a prize
- Receiving an unexpected reward
- Contributing to improving a project or a product
- Wanting to be perceived as a team player, wanting to be liked
- Improving performance or effectiveness relative to own past performance
- Improving performance or effectiveness relative to coworkers; “winning” or being the best
- Knowing enough to avoid making mistakes and do better work
- Losing status or levels within a gamified framework as the result of making a mistake
- Feeling of completing a task, accomplishing a goal, finishing a project
- Doing the “right” thing — following rules or norms, being ethical
Is the corporation’s compliance training program well designed?
Prosecutors will look at whether the training is designed to prevent and detect wrongdoing and whether management is enforcing the program by means of training, incentives and discipline.
Is the program being applied earnestly and in good faith? In other words, is the program being implemented effectively?
Prosecutors are expected to directly investigate whether a program is merely a “paper program” or a sincere effort. Evidence of a company-wide commitment to ethics and compliance, promoted by senior and middle management, is needed.
Does the corporation’s compliance training program work in practice?
Good intentions and training don’t count if they don’t work; in assessing whether the program “works in practice,” prosecutors will look at how the suspected misconduct was detected, what the company’s investigation process is and how the company is trying to correct the problem.
Microlearning delivers small, narrowly focused bits of information.
Adaptive microlearning tailors that content to each learner’s knowledge gaps and learning goals, ensuring the training is relevant.
Continuous adaptive microlearning conditions each learner to engage with relevant training every day — for just a few minutes.
Read More Burning Questions
Learning experience platforms
Virtual and augmented reality
Consulting more deeply with the business
Developing the L&D function
When people have a question or don’t know how to do something, what do they do?
Whip out a smartphone and look for information. What they don’t do is sign up for a 1-hour seminar.
Microlearning brings corporate eLearning into the modern paradigm. Microlearning describes eLearning content that is:
- Narrowly focused
- Available on demand
- Mobile-first or mobile-friendly
It must answer a question, meet an immediate need, or help the learner solve a problem.
In the City of BigTown, there was held a conference,
One of training professionals — those making a difference.
A difference to company ROI by delivering training,
From many perspectives — like Manufacturing.
And, too, there were call centers, colleges, corporate sectors,
Each chiming in about outcomes and metrics.
All shipped their training through an LMS platform,
But were desperately seeking true training reform.
One was Antonio, who hated the manuals —
For his product revisions and updates, they were annual.
Plus his printing costs? Oh, they were crazy!
And he truly believed that franchisors were hazy.
None knew how to train in an effective way,
"There’s too much to read, to do!” they’d all say.
For there were many levels of training to assign,
From the top at head office, down to those on the front-line.
Trainers Helen and Abinash nodded, “We agree!”
Said Feng, "Paper and handbooks? Just another dead tree.
On the job, not everyone will have the info they need,
Because the content changes and updates they never did read.
They never learned the content added along the way
That may apply to their region or division today.
Plus, in the field with team members in many locations,
Mobile-first training would make a stronger foundation!”
Said Sales trainer Jane of her PDFs stored online,
“They’re rarely revisited after onboarding time.
I need content delivered in snack-sized bites,
And the ability to test them until they get it right.”
Ursula chimed in, "Onboarding’s a pain for new hires,
With most feeling like their hair is on fire!
Plus, promoted reps must refresh what they know
To be properly prepared to perform their new role."
"I deal with compliance," sighed Manal the Banker.
Abinash nodded, Frank turned to thank her,
For she’d raised the ugliest concern of them all —
That certifications aren’t based on year-long recall.
“To maintain the standards and follow each rule,
We need more than one test that comes out of the blue.
When it comes to things like health & safety, it's a game-changer
Because if their training is lacking, they could be in danger.”
Continuing he asked, “Could training be location-specific?
As learners move through the plant, alerts would be terrific!”
Helen asked who used traditional classroom training
Combined with online to keep interest from waning.
Did they have workshops, seminars, or events,
The kind that take workers away from their desk?
"They learn at that moment, then likely forget —
is there a way to get long-term retainment?”
Rachel had been quiet, she’d said not a word,
When suddenly she leaned in so her voice would be heard.
"We solved these concerns after ditching binders and books —
We use daily drip training and our learners are hooked!
When we update our content, it gets to them faster,
And metrics and KPIs reveal the content 'masters.'
We use OttoLearn for microlearning and we’ve been thrilled,
for all of our training needs — and more — are fulfilled."
So ends our tale of the nine trainers complaining
about the problems they had delivering training.
Training that mattered, with metrics and firm ROI,
Based on data analysis of prime KPIs.
Many problems they shared, with no clear resolution,
Found Agile Microlearning with Otto was the solution!
Microlearning both adaptive and agile saved them from disaster,
Making trainers and trainees learn happily ever after!
- Combining the question and activity tabs
- New WYSIWYG editor which is “inline” with the text
- Ability to include media (images, video, audio) within activities (question, answers and feedback)
- Icons to indicate correct answer, position locking, whether or not the answer is visible to learners (active), and override feedback
- Learner password reset
- Streamlined data entry into the content studio, by being able to quickly add
- Numerous small updates and bug fixes
- Check out our most recent updates and add yourself to be automatically notified when we push updates
- Super easy to understand
- Very predictable cost if you have a specific number of users (e.g., employees)
- Doesn’t differentiate between users with different usage volumes
- You have to purchase seats for your maximum number of users
(Typically the number of users that log in during a month)
- You don’t need a license for every user; you can often license only half of your users (since perhaps only half ever log in during a month)
- There is typically a large cost for exceeding your licensed number of users, which can be incredibly expensive (e.g., 5-10x your licensed cost)
- You often have to “play games” as an administrator, avoiding a mass course enrollment if only half of your users are licensed in a given month
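To make the trade-off concrete, here is a hedged sketch comparing a per-seat license against an active-user license with an overage penalty. All prices and the 5x overage multiplier are illustrative assumptions, not vendor figures:

```python
def seat_license_cost(total_users, price_per_seat):
    """Per-seat model: pay for every potential user, active or not."""
    return total_users * price_per_seat

def active_user_cost(active_users, licensed, price_per_active, overage_multiplier=5.0):
    """Active-user model: pay for licensed users; overages cost a steep multiple."""
    overage = max(0, active_users - licensed)
    return licensed * price_per_active + overage * price_per_active * overage_multiplier

# Illustrative: 1,000 employees, about 500 of whom log in each month.
seats = seat_license_cost(1000, 2.00)      # predictable: $2,000/month
active = active_user_cost(500, 500, 3.00)  # cheaper in a normal month: $1,500
spike = active_user_cost(700, 500, 3.00)   # mass-enrollment month: overage stings
print(seats, active, spike)
```

The spike case shows why administrators end up “playing games” with enrollment timing: 200 unlicensed active users at a 5x penalty triple the monthly bill.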
| What It Means | Why It Matters |
| --- | --- |
| An algorithm determines each learner’s knowledge gaps and feeds them practice activities to close those gaps. | Efficiency. Learners learn the material faster because they spend less time on what they already know. |
| Learners can follow a scaffolded learner path or self-direct their learning. | Learners are inquisitive. We all Google for information when we need it, so why lock learners into a particular learning path? Learners engage the most when they are allowed to deviate from a set path and explore available content. As long as each learner reaches their mastery goal, the particular path they took is unimportant. |
| Delivering content to the learner in smaller chunks. | Chunking content is important only if it is paired with the ability to search for and find specific content chunks “on demand” and the ability to consume just the chunks a learner needs. With these features, training doubles as performance support. |
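The adaptive approach described above (an algorithm that finds each learner’s knowledge gaps and feeds them activities to close those gaps) can be sketched as follows. This is a toy illustration with made-up mastery scores and topic names, not OttoLearn’s actual algorithm:

```python
def next_activities(mastery, goal=0.9, limit=3):
    """Return the concepts furthest below the mastery goal, weakest first."""
    gaps = {concept: goal - score for concept, score in mastery.items() if score < goal}
    return sorted(gaps, key=gaps.get, reverse=True)[:limit]

# Hypothetical mastery scores for one learner (0.0 = no mastery, 1.0 = full mastery).
learner = {"WHMIS labels": 0.95, "Lockout/tagout": 0.40,
           "PPE selection": 0.70, "Spill response": 0.55}
print(next_activities(learner))  # ['Lockout/tagout', 'Spill response', 'PPE selection']
```

Concepts already at or above the goal (here, “WHMIS labels”) are skipped entirely, which is exactly where the efficiency gain comes from.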
Learning Experience (LX) Design
Using science and art to create experiences that help learners fulfill the learning outcomes they desire, in a user-centered and goal-directed way.1
Have you used Google? If so, then you have benefitted from Experience Design (XD): When you search for something, you rarely have to go past the first result.
With good XD, you don’t think about the design; it “just works.”
With poor XD, your learners will disengage. They’ll say they “don’t have time.” What they are really saying is that they “don’t have time for the poor experience.”
Typically, when used in relation to L&D, AI actually means “machine learning.”
Machine learning algorithms learn from data and “get smarter” over time.
Have you used Netflix or Amazon recommendations? They are based on machine learning.
The algorithms look at a ton of data, including your past choices and choices made by others who are similar to you, to make predictions as to what you will want to watch or buy.
In L&D, machine learning principles are being integrated in much the same way: to provide recommended content for a learner to consume.
This reduces the burden on training administrators to try to predict or guess what is relevant for each learner. It also provides a more personalized experience for each learner.
Imagine that you are a salesperson, and your training mix subtly and automatically shifts, based on the nature of opportunities in your sales pipeline. You are offered training only on available products that you have not already mastered. That would be a training program that is driven by machine learning.
Use learning analytics to make better decisions by converting data into insights.
The true value is not just in providing more data, more charts, and more graphs. The value is in leveraging AI to search for and surface insights that you’d never think to look for.
Combine the analytics from learners’ performance with key KPIs for the outcomes you desire, and have the analytics engine generate predictions such as, “Learners who reach mastery in the Objection Handling module will close 3.4 percent more deals.”
Now that’s actionable intel.
About a week before I began getting my OttoLearn Mastery Moments, a popup window from Adobe appeared on my screen as I was working on another project, prompting me to update my version of Flash. We do use Flash, so like an idiot, I clicked on the popup and asked it to start the update, and only then noticed that the URL was not an Adobe address. Of course, I closed the popup window using the X in the upper corner, which didn’t solve anything. Our IT guys did the best they could for me, but my computer is still compromised, and is being replaced.
Fast forward to OttoLearn and your Online Security for Employees course. After completing several Mastery Moments, I have now learned what to do with popups like that. This morning, as I restarted my computer again, that same Adobe popup appeared, and this time I was ready! I opened Task Manager and killed that little $%^&^ dead in its tracks.
I know the point of letting us try out OttoLearn as participants was for us to experience the power of this platform from the learner’s point of view. I can tell you that I personally am very grateful for the training you provided to me, and the fact that I was able to let others in my company know how to kill off those nasty virus-carrying popups. Yes, it works. Yes, it’s fun! And yes, I have a true feeling of accomplishment.
I can’t wait for the point at which we can talk more about developing courses for our clients.
New accounting rules
Workplace violence & harassment prevention
Framing a basement
Changing a tire
Retrieval practice is the key to retention.
Your brain wants to be as efficient as possible. Why would it try to encode information for long term storage if it thinks you don’t need it? You need to actually practice retrieving memories (information) in order to have your brain store it in long-term memory.
Spaced retrieval radically improves learning efficiency.
You not only need to practice retrieving information from memory, but you need to wait until you’re on the edge of forgetting it. This is why cramming is so ineffective at generating long-term retention.
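As a toy illustration of spaced retrieval, here is a minimal scheduler that doubles the review interval after each successful recall and resets it after a miss. This is a simplification in the spirit of Leitner-style systems, not a validated memory model:

```python
def next_interval(current_days, recalled):
    """Double the review gap after a successful recall; restart at 1 day after a miss."""
    return current_days * 2 if recalled else 1

# Simulate one learner's recall history for a single concept.
interval = 1
schedule = []
for recalled in [True, True, True, False, True]:
    interval = next_interval(interval, recalled)
    schedule.append(interval)
print(schedule)  # [2, 4, 8, 1, 2]
```

The growing gaps are the point: each successful retrieval happens closer to the edge of forgetting, which is what cramming never provides.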
Interleaved learning feels strange at first, but dramatically improves retention and skill.
Interleaved learning (mixing up material while learning and practicing, such as alternating practice activities between WHMIS and supervisory skills) will improve your retention of both.
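Interleaving can be illustrated with a small helper that alternates practice items across topics instead of blocking them by topic. The topic names are examples from the text; the function itself is my sketch, not a feature of any platform:

```python
from itertools import chain, zip_longest

def interleave(*topic_queues):
    """Alternate practice items across topics instead of blocking by topic."""
    mixed = chain.from_iterable(zip_longest(*topic_queues))
    return [item for item in mixed if item is not None]

whmis = ["WHMIS pictograms", "SDS sections"]
supervision = ["Giving feedback", "Scheduling shifts"]
print(interleave(whmis, supervision))
# ['WHMIS pictograms', 'Giving feedback', 'SDS sections', 'Scheduling shifts']
```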
- Cost predictability. Each seat costs you $x/month
- Typically more expensive than a usage-based license
- Typically less expensive than a seats license
- Cost variability tempered by pre-purchasing usage credits that never expire and consume them over time
- Best possible quadrant for engagement
- Will overcome learning obstacles
- Will find a way to learn, even if materials are poor
- Won’t need nudging or incentives
- Text is great
- Can easily learn something
- May need to work up the energy to engage in low quality materials
- May procrastinate, so incentives can help motivate.
- Text is great
- Wants to learn
- Has little experience so can benefit from more instructional quality
- Greatest benefit of video and other rich media
- Worst possible quadrant
- May not have experience in the topic
- May not really care about it
- Will require a lot of motivation to see engagement
- Video can help