
The learning industry must move quickly to figure out how to use data to drive its efforts forward.
by Andie Burjek
March 24, 2016
The learning industry had a long history of inconsistent and intermittent data use, but that’s changed thanks to the surge of interest in, and the increasing availability of, big data.
Businesses have been overwhelmed by this dramatic change, and many don’t analyze results at all, said A.D. Detrick, learning measurement consultant at Xerox Learning Solutions, in an email interview. The solution? Businesses need to define learning success via a measurable set of outcomes. “If we know whether or not there was a change in performance after the training, it opens up options about what to do with the data,” he wrote. “Without those outcomes, millions of data points just start to feel like noise.”
Detrick talked to Chief Learning Officer about how learning leaders can effectively use data to analyze results and make meaningful changes in their learning strategy and in their business.
What are the benefits and challenges to using big data for learning?
This huge volume of available data is both a blessing and a curse. The biggest challenge is managing expectations around large volumes of data. The perception of data analytics is that it is so big and so powerful that it will either steamroll the industry before practitioners even begin to understand its purpose, or solve major problems and deliver massive insights overnight. Neither expectation is true. The reality is that data analytics is a process of consistent insight. It provides regular, actionable data to help us continually refine our learning offerings, to ensure that we’re meeting our outcomes and positively impacting the organization.
Continual analysis is key for learning leaders when it comes to learning content. How does it work?

When I talk about continual analysis, I’m referring to a disciplined process of strategic measurement where we accurately measure training outcomes, the performance behaviors we’re trying to change and — ideally — the financial impact of those changes. This is a very front-heavy process, but when done correctly, it scales almost effortlessly.
The heavy lifting of this process is identifying the intended outcomes and impacts of the training, and ensuring that we will have valid and reliable metrics. The continual analysis monitors those incoming data streams for outliers — over-performers and under-performers — within each individual data stream, and compares each data stream against the others to find strong correlations between training outcomes and impact. This gives us some good insight into what is working to improve performance and what isn’t. We can also use that data to make frequent, regular and, most importantly, minor corrections to drive that impact.
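To make that concrete, here is a minimal sketch of the kind of monitoring described above: flagging over- and under-performers within a single stream, then checking the correlation between an outcome stream and an impact stream. The metric names, values, and threshold are hypothetical illustrations, not Detrick’s actual tooling:

```python
import statistics

def flag_outliers(stream, threshold=2.0):
    """Flag over- and under-performers: values more than `threshold`
    sample standard deviations from the stream's mean."""
    mean = statistics.mean(stream)
    stdev = statistics.stdev(stream)
    return [(i, x) for i, x in enumerate(stream)
            if abs(x - mean) > threshold * stdev]

def pearson(xs, ys):
    """Pearson correlation between two equal-length metric streams."""
    n = len(xs)
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
    return cov / (statistics.stdev(xs) * statistics.stdev(ys))

# Hypothetical streams, one value per learner: post-training assessment
# scores and an on-the-job performance metric (e.g., weekly sales).
assessment  = [72, 85, 90, 65, 88, 79, 95, 70]
performance = [11, 14, 16,  9, 15, 12, 30, 10]

print(flag_outliers(performance))                  # [(6, 30)]: an over-performer
print(round(pearson(assessment, performance), 2))  # strength of the outcome-impact link
```

A strong positive correlation suggests the training outcome is tracking real impact; a flagged outlier is a cue to investigate, not an automatic verdict.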
In your experience, what programmatic learning experience improvements yield positive results?
Two things I’ve noticed are microcontent and badging — especially when used together wisely. The ability to analyze microcontent — how it is browsed, viewed, downloaded, or shared — provides a very detailed picture of our learners. It’s not just a picture of what they learn; it’s a window into how they learn.
Instead of hoping that all aspects of training are applicable, engaging and effective for the learner, we can now see each aspect individually. Instead of correlating a knowledge assessment with performance behaviors, we can now drill down to correlate the individual learning objective of each piece of microcontent with performance behaviors. It’s a remarkably effective way to inform our content curation and design, and it tells us specifically which information correlates with improved performance.
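As a rough illustration of that drill-down (the item names, engagement flags, and performance numbers below are hypothetical), each piece of microcontent can be checked individually for its relationship to on-the-job performance:

```python
import statistics

# Hypothetical data: a 0/1 flag per learner for whether they completed
# each microcontent item, plus each learner's performance metric.
engagement = {
    "objection-handling": [1, 1, 0, 1, 0, 1, 1, 0],
    "pricing-basics":     [0, 1, 1, 0, 1, 0, 1, 1],
    "demo-walkthrough":   [1, 0, 1, 1, 1, 0, 0, 1],
}
performance = [11, 14, 9, 15, 12, 16, 18, 10]

# Per-item drill-down: compare the mean performance of learners who
# completed an item against those who skipped it.
for item, flags in engagement.items():
    completed = [p for f, p in zip(flags, performance) if f]
    skipped   = [p for f, p in zip(flags, performance) if not f]
    lift = statistics.mean(completed) - statistics.mean(skipped)
    print(f"{item}: {lift:+.1f} lift in the performance metric")
```

Items showing consistently positive lift are candidates to keep and promote in the curation; items showing none are candidates to rework or retire.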
I’ve also seen badging used really well by a number of clients. There’s no golden rule for how to use badging. The clients with the greatest successes all really understand what motivates the people in their organizations. In some cases, it’s competition; badges are given to winners. In other cases, it’s recognition; badges are given to those who go above and beyond. Some are a mixture. But the best examples have all tapped into the strong motivations within their culture, rather than just giving out badges for progress or participation.
These two areas are good examples of elements in the learning landscape that only recently became popular, because we can now capture the data behind them. We’re able to use that data to get really clear insight into the best ways to keep serving the learner.