Measure What Matters in Your Talent Development Programs With A/B Testing

Here’s a thorough breakdown of how you can conduct A/B testing (or split-testing) to measure metrics that matter in your talent development programs.


One of the biggest mistakes I made in learning and development early in my career was not paying attention to the business impact of my team’s work.

Too often, a learning team’s hard work is under-appreciated and undervalued. We’re usually the last to get headcount to grow our team or the first to lose headcount and shrink our team.

I remember creating reports for our key stakeholders at Google and Yahoo that showed metrics like "287 employees completed XYZ Training in Q3."

These are vanity metrics. They sound great and appear to be successful, but…

Nobody cares.

If you’re reporting like this, it’s not helping you or your business.

If you’re not reporting team or individual metrics at all, I encourage you to start…but make sure you don’t use vanity metrics.

Vanity Metrics

Vanity metrics don’t tell us anything about the impact on the business. The number one way to show your team’s value or your individual value is to directly tie your work to a business metric, or key performance indicator (KPI).

You must first understand what really matters to your business. Generally speaking, there are two main drivers of any business: increasing revenue and decreasing cost.

The challenge is that L&D teams rarely have a direct tie to the business. We're usually seen as a cost center instead of a partner to the business.

Sales Enablement is one area where it’s most obvious how training and education is tied to a KPI or main value driver: Revenue.

Even with Sales Enablement, it's practically impossible to prove that training was the cause of increased revenue or of a shorter time to a new sales rep's first deal.

The best we can do is correlate the data.


Why don’t we?

It’s time for a mindset shift. Our industry needs it. You need it. Your business needs it.

Talent, employees, people are the most essential element of any business.

A business's success can ride only so long on its product, technology, innovation, or marketplace position. Ultimately, the employee experience and opportunities for personal and professional growth make or break a business.

Losing top talent due to bad managers and few opportunities for personal development is on us as learning and talent development professionals.

Finding the connection that ties your team's work to the business's KPIs is your most essential task in fostering a culture of learning, growth, and loyalty to a company.

What if you’re not in Sales Enablement…or what if you are and you still don’t know how to find the correlations? If you’re the learning leader in your organization, all learning and its impact on the business must matter to you.

I have some ideas for how you can turn vanity metrics into metrics that matter.

First, a quick story…

When I was consulting for Yahoo's IT Communications & Training team in 2009, I was also studying the technology and market changes happening in marketing.

Businesses were using marketing technology to track customer metrics like email open rates and clickthrough rates.

My personal favorite was and still is A/B testing website landing pages to see what converts better (e.g. Headline A yielded 124% more signups than Headline B).

At that time, Yahoo struggled internally to get employees to register for IT webinars and trainings, not to mention actually attend the trainings if they did sign up.

Initially, the value of my work and consulting was measured by how many lessons or modules I produced, the number of webinars I hosted, and other–you guessed it–vanity metrics.

I decided to try something different. I pitched to my team and stakeholders that we take a radical approach: We act like a marketing team.

What did I mean?

Imagine what 100% of our email communications about a new training looked like. Do you have it in mind? I bet you're thinking of something like this:

Email Subject: [Required] IT Training Registration for Operating System Upgrade

Wow! I can’t wait to open that email! How about you?

But that’s exactly the kind of communications we used to send.
I asked the team what they thought about doing this instead:

  1. Step up our copywriting game and write stuff people actually want to read
  2. A/B test, or split-test, subject lines to see which ones get more opens
  3. Track clickthrough rates of the call-to-action links in our emails to see which links get more clicks (to understand which copywriting was more effective or where to best place the links in the email. Tip: Adding links to postscripts, P.S., is a powerful way to motivate last-chance clicking for someone who might be passively reading.)
  4. Track throughput, meaning we track the previous metrics all the way through to whether or not an employee registered for and ultimately attended a training session.

Here was the vision: Use all the tracking data to measure attendance and correlate all the data to the KPI that mattered most at that time.
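
The throughput idea in steps 2–4 reduces to simple funnel arithmetic: each stage's rate is measured against the stage before it, so you can see exactly where people drop off. Here's a minimal sketch in Python; the function name and all counts are hypothetical:

```python
def funnel_rates(sent, opened, clicked, registered, attended):
    """Conversion rate at each stage, relative to the previous stage."""
    stages = [
        ("open rate", opened, sent),
        ("clickthrough rate", clicked, opened),
        ("registration rate", registered, clicked),
        ("attendance rate", attended, registered),
    ]
    # Guard against empty stages so a dead campaign doesn't divide by zero
    return {name: round(num / den, 3) if den else 0.0
            for name, num, den in stages}

# Hypothetical numbers for one subject-line variant:
print(funnel_rates(sent=500, opened=210, clicked=95, registered=60, attended=41))
```

Measuring each stage against the previous one (rather than against the original send) is what makes a weak link visible: a great subject line with a buried call-to-action shows up as a high open rate and a poor clickthrough rate.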

The likely KPIs then would have been:

  • Reduction in internal IT support ticket volume, and
  • Reduced risk of security breaches on employee computers

Guess what the team's and stakeholders' reactions were…

“Nah, that’d be a lot of effort and we don’t have the tools to do that.”

I couldn’t believe it, but in a way I could.

I believed it because that was, and still is, the state of our industry.

But we’re changing. The industry is changing. And I have to believe that you wouldn’t have joined this adventure with me if you didn’t want to have real impact on learning, on your employees, customers, and partners, and ultimately on your organization’s business health.

At Yahoo, I decided to do it anyway.


Here’s how

Since we didn’t have any internal email marketing tools (and I clearly wasn’t going to get buy-in to invest in any), I hacked my own indirect way of measuring and tracking.

  • First, I separated a single training into two identical trainings in the learning management system. (I subtly denoted one as version A and the other as version B without it being obvious or confusing to the learner when they signed up.)
  • Then I manually split half of the recipient list into two equal groups.
  • When I drafted the email comms to go out to the entire group of employees, I also drafted two different subject lines. The email with subject line A went to one of the two groups I created, and the email with subject line B went to the other. Subject A was our typical robotic copywriting, and Subject B was written to capture attention and compel the recipient to open the email.
  • Once the registration deadline passed, I deduced from the registration list who must have opened the email and clicked the registration link inside (they would have had no other way of receiving the registration link except from that email). It was a rough way to calculate the open rate and clickthrough rate of each subject line.
  • Based on the data I was able to piece together, I then sent the remaining half of the entire recipient list the email with the winning subject line.
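
The manual split above can be reproduced with a few lines of stdlib Python. This is a sketch, not what we ran at Yahoo: hashing each address gives a stable A/B assignment, half the list is held out to receive the winner, and registrations stand in for opens and clicks, since the registration link existed nowhere else:

```python
import hashlib

def assign_variant(email):
    """Deterministically bucket an address into group A or B."""
    digest = hashlib.sha256(email.strip().lower().encode()).digest()
    return "A" if digest[0] % 2 == 0 else "B"

def split_test_pool(recipients):
    """Split half the list into A/B test groups; hold out the rest
    to receive whichever subject line wins."""
    midpoint = len(recipients) // 2
    test_pool, holdout = recipients[:midpoint], recipients[midpoint:]
    groups = {"A": [], "B": []}
    for email in test_pool:
        groups[assign_variant(email)].append(email)
    return groups, holdout

def rough_registration_rate(group, registrants):
    """Registrations as a proxy for open + click, per the logic above."""
    hits = sum(1 for email in group if email in registrants)
    return hits / len(group) if group else 0.0
```

With no marketing tooling, registration rate per group is the only number you can compute directly; open and clickthrough rates stay rough estimates, exactly as in the story above.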

Which subject line won?

To make all this more compelling, I had previously calculated our average registration and attendance rates for our IT webinars.

Although the result lacked statistical significance, my little experiment revealed that Subject B, the one with the more dynamic, marketing-minded copywriting…

DOUBLED the registration and the attendance rate.
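
For context on that statistical-significance caveat: a doubled rate from a small sample can still be noise. A two-proportion z-test is one standard way to check; this sketch uses only the stdlib, and the counts are invented for illustration, not Yahoo's actual numbers:

```python
from math import erf, sqrt

def two_proportion_z(hits_a, n_a, hits_b, n_b):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a, p_b = hits_a / n_a, hits_b / n_b
    pooled = (hits_a + hits_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Convert |z| to a two-sided p-value via the normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Invented example: 6/100 registered under subject A vs. 12/100 under B.
# The rate doubled, yet the p-value stays above the usual 0.05 threshold.
z, p = two_proportion_z(6, 100, 12, 100)
print(f"z = {z:.2f}, p = {p:.3f}")
```

Run the same rates on a list a few times larger and the identical doubling does clear the 0.05 bar, which is why sample size, not just the headline multiple, decides whether a result is trustworthy.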

You can imagine how excited I was…but…

In my mind this was only one step above vanity metrics. The next step would have been to correlate this new data with the learning data and ultimately the key business metrics.

Unfortunately, even after the success of the email communications A/B test, the team didn't fully see the merit and value in using data to connect more deeply to the business, so the experiment ended.

And this is the lesson

Don’t ignore data. Don’t be afraid of data even if you’re not sure how to collect it or what to collect. Don’t separate yourself from key business metrics.

Here’s how you can embrace a data-driven approach to learning and development. Before you hear from me next, do the following:

  • List all the learning programs you and your team directly manage and/or influence
  • Note the programs that have direct connection to business metrics or org goals
  • Identify the exact business metric, KPI, and goal your learning programs connect to
  • Finally, identify all the related data you can capture with existing tools, systems, and processes you have in place (communications tools like MailChimp, event platforms like EventBrite, video hosting like Wistia, etc.)

Once you’ve done all that, you can do a gap analysis to identify what tools you need to capture missing data.

This is exactly what I helped Medallia do to track email open rates, clickthrough rates, and attendance rates in order to more than triple employee engagement with professional development and learning opportunities.

When you’re done with the checklist above, ask me questions or share feedback. Happy to help you take a major leap forward in your approach to data-driven, business-impact learning.

Thanks for reading. After scaling talent development at Google, LinkedIn, Yahoo, Medallia, and more, I created Sprintwell’s Talent Innovation Toolkit to share all the frameworks and strategies that worked and the mistakes to avoid.

Up Next:

The Quick, Essential Framework to Measure Any Business Initiative
