eLearning is a powerful and revolutionary tool. It allows you to reach thousands
of people across the world—at a fraction of the cost of traditional
training—and can transform your company.
A year ago, elearning initiatives were exploding everywhere. Now,
however, executives are starting to ask the tough questions. “What
are we really getting for all this money spent on elearning?”
“Now that we’ve bought all this content and software, how do we
know that our elearning programs are really effective?”
In implementing elearning at many corporations, I find, repeatedly,
that one of the biggest differences between elearning and other
forms of training is that elearning is completely trackable. You
know everything that every learner did, unlike classroom training.
You have the opportunity to measure precisely the impact of your elearning programs.
This article reviews a proven methodology for measuring elearning:
how widely it is being used, how effective it is in transferring
knowledge, and, most importantly, how much impact it has on the business.
Why measure? As Peter Drucker stated, “If you can’t measure it, you
can’t manage it.” Some of the huge economic benefits of measuring elearning:
you can know how well your sales force understands a new product;
you can decide whether your vendor content library is worth what you’re
paying; you can compare the effectiveness of various internally developed
programs; and you can measure the value of collaboration and synchronous elearning,
which takes valuable time away from employees and customers.
Remember that the ultimate purpose of elearning is not to reduce the cost
of training, but to drive business results. If you cannot identify
the business goal of an elearning program, you should ask why
you are doing it in the first place. eLearning is a business performance
improvement tool, not a training tool.
Good elearning drives you toward measurable business goals: increasing
product revenue ramp rate, reducing turnover, reducing rework,
or increasing customer satisfaction ratings. Before you measure
this business result, however, you do have to know if the elearning
program itself is being used and if it is changing people’s knowledge and behavior.
A Five Step Program for Measuring Effectiveness
I group elearning effectiveness measurement into five steps (the
“Bersin Five Step Program”). In order to make this measurement, you will have
to get data from your Learning Management System (LMS) or your
elearning content provider. Demand that they provide you this
information. If they can’t, you probably have the wrong supplier.
Step 1: Enrollments. “Is the audience showing up?”
The first and most obvious thing to measure is enrollments. Is the
audience actually enrolling in the course or courses? In most
systems, courses are stored in your dynamic course catalog, and
students have to enroll proactively in order to start a course.
In some cases, they are “pre-enrolled” or “automatically enrolled.”
You need to monitor enrollments over time. If a course is launched
on June 1, and the audience size is 1000—what percentage of the
people have enrolled by June 10? You should try to monitor enrollments
week by week.
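As a rough sketch of that week-by-week monitoring (the enrollment dates, launch date, and audience size below are invented values; your LMS export format will differ):

```python
from datetime import date

# Hypothetical LMS export: one enrollment date per enrolled learner.
enrollments = [date(2004, 6, 2), date(2004, 6, 3), date(2004, 6, 9),
               date(2004, 6, 16), date(2004, 6, 17)]

launch = date(2004, 6, 1)
audience_size = 1000  # total size of the target audience

def enrollment_rate(enrollments, launch, audience_size, as_of):
    """Percent of the target audience enrolled between launch and as_of."""
    enrolled = sum(1 for d in enrollments if launch <= d <= as_of)
    return 100.0 * enrolled / audience_size

# Week-by-week snapshot of the enrollment curve.
for week in (1, 2, 3):
    cutoff = date(2004, 6, 1 + 7 * week)
    print(f"week {week}: {enrollment_rate(enrollments, launch, audience_size, cutoff):.2f}%")
```

Plotting that percentage over the weeks gives you the enrollment curves discussed below.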
If learners are not enrolling, then you probably have a marketing
problem. Either people can’t find the course, they don’t know
how to enroll, or they do not understand why it is important!
You may have to establish a more active marketing program.
If the course is one mandated by management, enrollments should pick
up fast. If it is elective, then perhaps the course is named poorly,
not well positioned in the catalog, or people do not even know
it exists. You have to take on the responsibility of marketing
your elearning programs. People will ignore the program completely
if they do not understand why they need it.
Figure 1: Measuring Enrollment
Figure 1 shows that the pre-enrolled course (Intro to Java) is fully
enrolled from day 1. Java 102 was well marketed, so enrollments
picked up in 30 days. Something is wrong with Java 203, however.
People do not know about the course, they cannot find it, or perhaps
the target audience is small.
Step 2: Activity. “Are they eating the dog food?”
The next important issue to address is, “Are people moving through
the course?” Have they started? What percent have they completed?
Your content provider should be able to give you this information.
Figure 2: Measuring Activity
You should monitor activity correlated to enrollment date. For example,
if you take a group of people who enrolled the first week of February,
how far have they progressed by the end of February? There should
be a natural activity level which continues throughout the course.
If you find a large number of these people started but are not
now completing or using the course, you have a content problem.
The content may be inappropriate, too difficult, hard to operate,
or simply uninteresting and hard to work through.
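A minimal sketch of that cohort view, assuming you can export each learner's enrollment week and percent-complete (the records below are invented):

```python
# Hypothetical LMS records: (learner, enrollment week, percent of course completed).
records = [
    ("ann",   "2004-W06", 80),
    ("bob",   "2004-W06", 10),
    ("carol", "2004-W06", 60),
    ("dave",  "2004-W07", 5),
    ("erin",  "2004-W07", 0),
]

def cohort_progress(records):
    """Average percent-complete for each enrollment-week cohort."""
    by_week = {}
    for _, week, pct in records:
        by_week.setdefault(week, []).append(pct)
    return {week: sum(p) / len(p) for week, p in by_week.items()}

print(cohort_progress(records))
```

Tracking each cohort over successive weeks shows whether activity continues naturally or stalls at a particular point in the course.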
I have found that the most important factor governing how much
activity you have in a course is the incentive to complete. If
the course is truly “optional,” then the content itself must entice
the learner to finish. If the course is “directed” (meaning that
a manager or supervisor mandates the course as part of an employee’s
job) then activity will typically continue.
How long should it take to complete a course? What is a reasonable
rate of activity per week? That depends. For a course a few hours
long, you will find that people progress at an hour a week or
so. People usually go quickly and then stop at a particular point.
This valuable information can help you assess the usability, relevance,
and performance of the content.
Step 3: Completion. “Did they finish?”
Completion is just a special case of Activity—but a very special case. People
who truly complete a course deserve special recognition. They
will give you the best feedback on content quality and effectiveness
toward the business goal. They wanted to finish.
Figure 3: Completion Measures
An interesting point—you can’t “average” completion percentages.
A group of students who achieved 30% completion is not the same
as 1/3 of the students who achieved 100% completion. The former
means that you have a poorly performing program. The latter means
that you may have a great program, but it’s not targeted at the
right audience. Targeting your program is important—it’s like marketing. If you target the
right content toward the right audience, you will achieve the best results.
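The averaging pitfall is easy to demonstrate with made-up numbers: a group where everyone stalls at 30% and a group where one third finishes have nearly identical average completion, so track the finish rate (or the full distribution), not the mean:

```python
group_a = [30] * 9             # everyone stalled at 30% complete: a poorly performing program
group_b = [100] * 3 + [0] * 6  # one third finished, the rest never started: wrong audience?

def average(pcts):
    return sum(pcts) / len(pcts)

def finish_rate(pcts):
    """Fraction of learners who actually reached 100%."""
    return sum(1 for p in pcts if p == 100) / len(pcts)

print(average(group_a), finish_rate(group_a))  # similar averages...
print(average(group_b), finish_rate(group_b))  # ...very different finish rates
```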
Step 4: Scores. “How well did they score?”
Many people think scoring is the one and only way to measure effectiveness.
I tend to disagree. Sure, if people score highly they have learned
something. But in elearning you don’t always know why they
scored high. Did they really learn the material? Did they copy
from someone else? Did they already know the material? Did they
just try the test 15 times until they got it right?
There is another technology issue here. Does the course count the number
of times a student attempts a question? If not, the score data
may be useless—because people will keep guessing until they get it right.
Are scores only taken at the end, or are there assessments along
the way that you can use to measure learning? You need multiple
assessments, so you can measure progress toward the final learning
goal. And the best content actually will categorize assessments
by learning objective, so you can measure exactly what someone
has scored well on and where they have fallen short.
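As a sketch of what attempt-aware, per-objective score data makes possible (the log format, names, and objectives here are invented; real LMS exports vary):

```python
# Hypothetical assessment log: (learner, learning objective, attempt number, score).
attempts = [
    ("ann", "syntax", 1, 40), ("ann", "syntax", 2, 95),
    ("ann", "loops",  1, 90),
    ("bob", "syntax", 1, 85),
    ("bob", "loops",  1, 30), ("bob", "loops", 2, 55), ("bob", "loops", 3, 100),
]

def first_attempt_scores(attempts):
    """Keep only attempt #1 per (learner, objective); later retries may just be guessing."""
    return {(who, obj): score for who, obj, n, score in attempts if n == 1}

def objective_averages(attempts):
    """Average first-attempt score per learning objective."""
    by_obj = {}
    for (who, obj), score in first_attempt_scores(attempts).items():
        by_obj.setdefault(obj, []).append(score)
    return {obj: sum(s) / len(s) for obj, s in by_obj.items()}

print(objective_averages(attempts))
```

Breaking scores out this way shows exactly which objectives the audience is missing, rather than one opaque final number.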
Scores and completion percentage together tell you a lot. You will see
that people fall into different segments based on completion percentage and score.
Figure 4: Scores vs. Completion
As Figure 4 shows, users clump into different segments. Most likely,
you will find a group of users in one of the outlying corners
(upper left or lower right). If you find a huge number of people
there, you know there is a problem with the content or with the audience.
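A toy version of that segmentation (the 50% cut-off and the sample learners are arbitrary):

```python
def segment(completion_pct, score_pct, cut=50):
    """Place a learner in one quadrant of the completion-vs-score plot."""
    c = "high" if completion_pct >= cut else "low"
    s = "high" if score_pct >= cut else "low"
    return f"{c} completion / {s} score"

learners = {
    "ann":   (90, 20),  # worked through the course but scored poorly
    "bob":   (10, 95),  # scored well without finishing: already knew the material?
    "carol": (85, 88),  # the segment you want
}
for name, (completion, score) in learners.items():
    print(name, "->", segment(completion, score))
```

Counting how many learners land in each of the four segments reveals the outlying corners described above.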
Step 5: Feedback or Surveys. “How did they like it?”
Feedback is a critically important part of elearning. Unlike traditional
training, you have little or no face-to-face contact with learners.
You need regular feedback in order to understand what people like
and why they may not like something. And you need that in both
numeric form and written form.
Surveys will tell you very important things, such as: Did the content
play? Did the assessments work? Did the video, audio, and other
media work? Were the material and interactivities engaging? Was
the material useful? Were the graphics interesting?
You will find that the more personality you put into elearning, the
more effective it will be. For example, if you have a graded assignment
that goes to a real mentor or tutor, you will find that course
completion goes up by orders of magnitude. If you make live synchronous
sessions mandatory (with attendance monitored), you will find that
people get their pre-work done. There are many ways to incorporate
real people into elearning, and doing so has a huge influence on effectiveness.
Making the Correlation to Business Results
Ultimately, the correlation to measurable business results matters. How do
you know if a given elearning course or program really produces
results? It is hard to correlate precisely, but here is a methodology
that works well. And I strongly urge you to go through this exercise—or
hire a consultant to help. The results are critical to your success
in the long term.
Look at the business results in four quadrants. Each is important in
its own way.
Figure 5: Correlating Business Results.
Qualitative information gives you valuable insights. If your elearning program
was a sales effectiveness initiative, you might want to interview
five sales managers to ask them if they have seen a difference
in behavior. You will find that people definitely want to talk
about the results of elearning at an individual and manager level.
Quantitative information gives you real ROI measurement. If you can hold a
target group constant, you can often see real improvements in
business metrics after rolling out an elearning program. You can
correlate this information to the elearning program by looking
at the timing of the rollout, timing of completion, and the business
change during that period. I have done such studies and found that
amazing correlations are possible when you look at the information carefully.
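A back-of-the-envelope version of that before/after comparison, with invented numbers for one business metric in a held-constant target group (a real study would also want a control group and attention to seasonality):

```python
# Hypothetical monthly business metric (e.g., units sold per rep) for one target group.
before = [40, 42, 41]  # three months before the elearning rollout
after  = [44, 47, 49]  # three months after most learners completed the course

def average(xs):
    return sum(xs) / len(xs)

# Percentage lift in the metric across the rollout period.
lift_pct = 100.0 * (average(after) - average(before)) / average(before)
print(f"lift: {lift_pct:.1f}%")
```

Lining this lift up against the rollout and completion dates is what lets you argue the elearning program drove the change.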
Is all this worth the effort?
Over the course of your elearning experience, you will spend a
lot of time and money on content. If you do not measure results,
you will never know if you are getting your money’s worth. Just
as a Marketing executive would not launch a multi-million dollar
campaign without some form of measurement, you should not launch
an elearning program without measurement. I have worked with companies
that have multi-million dollar content agreements and little or no
idea what effect this is having on their business.
Once you start measuring results, you can start refining and improving
your program—generating higher ROI and saving money in the process.
eLearning is not a training tool. It is a business performance improvement
tool. If you use it that way, your entire elearning program will
be cost-effective, powerful, and aligned with the business, and
it will have the potential of driving competitive advantage. This
is the promise, and today it is possible.
Josh Bersin is a former elearning executive still wrestling with nightmares
of learning management, content development, and measurement,
and secretly likes it. He has helped more than 30 companies deploy
more than 7 million elearning activities. He consults with enterprises
and vendors to help make elearning drive business results.
He can be reached at firstname.lastname@example.org.
Copyright (c) 2000-2004 LiNE Zine (www.linezine.com)