You CAN measure the return on training investment: a case study

ROI: it’s an acronym that causes gastric distress for many performance leaders in industry and government. When everyone has to do more with less, measuring how (and if) our training investments are actually working seems like the right thing to do.

And of course, it is the right thing to do. So why does it seem to be so doggone hard?

Well, it doesn’t have to be.

Using a simple case study as an example, we’ll walk through how to measure the value added to an organization following a large software training initiative. You can use this model as a template for a variety of measurement needs.

And while most ‘creative training types’ (me included) cringe at higher math, not to worry. This model is very much paint-by-numbers.

There are lots of ways to measure training effectiveness. That’s mostly because we can define “measure” and “effectiveness” in different ways. But that’s a different conversation for a later time.

Here are two good articles on how to measure training ROI:

  • Keep your training program funded

  • How to measure return on training investment

So why do you want to evaluate training effectiveness, anyway? Because it can and does:

  • Improve training quality: evaluation forces us to think about improvement, and we are more likely to act on it as a result

  • Increase learning: when we determine how much was learned, we can make improvements for more and better learning

  • Identify roadblocks to learning and job-skills transfer, and eliminate those roadblocks, which improves proficiency and productivity

  • Help the organization achieve its mission – and improve the bottom line

Also, it’s the right thing to do. It demonstrates the value of learning in the organization.

Here’s the model in the form of a case study:

Sue is the training manager of an organization with approximately 4,000 employees. She and her team of three conducted a series of software classes, both instructor-led and online. The software included word processing and spreadsheet programs, both new to all employees.

Before the classes were rolled out, Sue decided to measure the return on this training investment and report the results back to leadership. This was her golden opportunity to demonstrate the value of her training unit. Leadership didn’t ask for this, and they kinda shrugged when she told them she wanted to do it. The shrug was all the permission Sue needed. She was a corporate badass warrior who knew what needed to be done.

Her first step was to define how and what she would measure. She wanted to see what kind of productivity increases resulted from the computer skills training, and defined this as “value added” to the organization.

Her definition of “value added”: The dollar value of performance improvement associated with the tasks performed on the job using the skills learned in training.

For the mathematically oriented, it looks like this:

Value added = (S x T)(P2 – P1)

S = Annual salary

T = % of time spent on specific tasks

P2 = Predicted productivity AFTER training

P1 = Productivity BEFORE training

For the mathematically-challenged, it looks like this:

Salary, times the percent of time spent on those specific tasks, times the change in productivity (productivity after minus productivity before).
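And if you’d rather see it as code, here’s the same formula as a small Python sketch. The function and parameter names are mine, purely for illustration; they aren’t part of the case study:

```python
def value_added(salary, time_pct, p_after, p_before):
    """Dollar value of performance improvement: (S x T)(P2 - P1).

    salary    -- annual salary in dollars (S)
    time_pct  -- fraction of work time spent on the trained tasks (T), e.g. 0.10
    p_after   -- predicted productivity AFTER training (P2), e.g. 0.80
    p_before  -- productivity BEFORE training (P1), e.g. 0.40
    """
    return (salary * time_pct) * (p_after - p_before)
```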

Here’s how Sue’s team collected the data:

These questions were added to the post-training ‘smiles sheet’ that participants completed after class (the Level 1 evaluation):

1: What percent of your total work time will you spend on tasks that require the knowledge/skills provided in this course?

2: Rate your productivity before training on the tasks that require the knowledge/skills provided in this course (on a scale of 0% to 100%)

3: Predict your productivity rate after training on the job tasks that require the knowledge/skills provided in this course (on a scale of 0% to 100%)

4: Annual salary: several salary ranges were offered to choose from. (The evaluations were anonymous, of course.)

Let’s plug in the numbers for a typical participant:

The formula, using a salary of $40,000/year as an example, where this employee estimated he spends 10% of his work day on tasks using the skills learned:

(Salary x Time) x (Productivity AFTER – Productivity BEFORE)

$40,000 x 10% = $4,000

Estimated 80% productive AFTER

Estimated 40% productive BEFORE

80% – 40% = 40% increase. So…

$4,000 x 40% = $1,600 value added
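Here’s that same arithmetic as a quick, self-contained Python snippet, using the numbers from this participant:

```python
# The worked example, spelled out step by step.
salary = 40_000
time_pct = 0.10                  # 10% of the workday on these tasks
p_before, p_after = 0.40, 0.80   # productivity before and after training

value_added = (salary * time_pct) * (p_after - p_before)
print(f"Value added: ${value_added:,.0f}")   # Value added: $1,600
```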

Next, Sue knew that she needed to validate this data with some form of reality. How did the training help people on the job?

She asked class participants for their productivity estimates approximately 60 days LATER, after the enthusiasm of the class had worn off. This gave employees time to actually use the skills they learned in the class.

The team sent an email with the same four questions, along with a brief explanation about the evaluation follow-up. The team randomly chose 4 participants from each of the 10 classes they taught, and the same number for the online courses.

  1. On a scale of 0% to 100%, how much time do you spend working on tasks that require the skills you learned in the such and such training?

  2. On a scale of 0% to 100%, what was your productivity rate on those tasks before you attended the training?

  3. On a scale of 0% to 100%, what is your productivity rate on those tasks now, after the training?

  4. Annual salary: the same ranges offered on the original evaluation.

Now when they plugged in the numbers, they had a more accurate assessment of how the skills were being used and could put this data into quantitative terms: dollars and productivity increases.
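If you want to do that tallying in code, here’s a rough Python sketch of how a batch of follow-up responses could be rolled up with the same formula. The figures below are made-up placeholders for illustration, not the team’s actual survey data:

```python
# Placeholder follow-up responses, one tuple per sampled participant:
# (annual salary, fraction of time on these tasks, productivity before, productivity after).
responses = [
    (40_000, 0.10, 0.40, 0.80),
    (52_000, 0.25, 0.50, 0.75),
    (38_000, 0.15, 0.30, 0.60),
]

# Value added per respondent: (S x T)(P2 - P1)
per_person = [(s * t) * (p2 - p1) for (s, t, p1, p2) in responses]

total = sum(per_person)
average = total / len(per_person)
print(f"Total value added across the sample: ${total:,.0f}")
print(f"Average value added per respondent:  ${average:,.0f}")
```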

The punch line to this case study is that it really happened. In state government!

The results were surprising. We used a control group of 20 software classes and determined that the entire series of 192 classes delivered that year returned over 5 million dollars in added value! The average value added per employee was $3,300.

Our annual training budget was increased significantly the following year. (Our team leader was Roy, not Sue, and he was more mentor than badass.)

Most important of all, our leadership and employees recognized that we were a learning organization, and they became more invested in, and enthusiastic about, its success. Did I mention this happened in a state government agency, one that had just undergone a disruptive merger with another agency?

If it can happen there, it can happen with your organization, too. And you, an evaluation warrior, can lead the way. Good luck!