When I learned about Wistia, a video platform for business, I was blown away.
Wistia’s videos are creative, artistic, and entertaining. I’m referring to the videos in their Learning Library on their website. I learned at least one thing I could put into practice from every video I watched.
Then it hit me.
They made marketing videos educational, and they made educational videos marketable.
That is, the videos were entertaining, engaging, and added value.
I remember the days when e-learning took months of development that followed a formula like this:
Several Slides + Voice-Over Audio + Processing in Adobe Captivate = e-learning
And software screencasts:
Captivate or Camtasia + Voice-over audio track = e-learning
Unfortunately, these formulas became so pervasive — and ultimately even replaced most instructor-led training in our industry — that learners equated online learning with “Training sucks.”
They sucked the humanity out of teaching, coaching, and learning.
Your employees, customers, and partners deserve better. Your business deserves better.
Imagine the branding and business impact you can bring to your company by elevating the online learning video standard. What if you made videos personal that resonated with the human element?
You might not even have online learning at the moment, but you can bring it to life and start elevating your company’s learning experiences in days.
Days. Not months. Days.
And you don’t need a big budget or a big production team.
What impact could you have by elevating your customer, partner, and employee education with human-centered videos and for a fraction of the cost of what you’d pay a video production firm?
Meet Brian from SoundHound
Brian Bautista, a SoundHound product marketer and customer educator, used the strategies and techniques from this toolkit and Wistia’s DIY kit.
He created the first Hound How-To, a series of customer education videos that put a face and personality to Hound’s customer support.
Internally, a new excitement was in the air as SoundHound’s marketing team realized how engaging and humanizing this made customer education.
And here’s the best part of all: Brian spent about $100 and only a single day to write, film, edit, and produce the first video prototype so he could show a proof of concept to his colleagues.
The prototype was a huge hit at SoundHound. The head of marketing immediately approved the video to launch publicly and green-lit more.
Check out Brian’s first video prototype on YouTube:
One More Story
Brian launched Hound How-To videos with a simple prototype and a tiny budget.
He believed it was possible because I showed him step-by-step how my team and I did it at Medallia.
The learning team at Medallia was in an early stage of formation. The team was relatively young and inexperienced.
Once I shared how it’s possible to make engaging videos like Wistia, one of my favorite teammates and I realized one thing:
We won’t convince anyone to adopt this approach and build a video studio by just talking about it.
We simply did it.
We used our iPhones, an empty conference room, a solid colored wall for background, scoop lamps from the hardware store, and a makeshift tripod.
It was the scrappiest prototype I’ve ever made. From start to finish, it took us about 3 full days.
We shared our production at the next HR all-hands meeting. People couldn’t believe it.
Our head of HR was so excited, he approved a $7,000 budget to build our DIY video studio with higher quality equipment. We spent just under $6,000 total and used a spare storage room in the back corner of the building.
My favorite part: my teammate, who was just starting her career, has since become known at Medallia for being the go-to person to make great learning videos.
It’s unlocked creativity, scalable training videos, and most importantly, a personal approach to learning.
Medallia’s internal communications team and marketing team use the studio too, bringing together cross-functional teams that otherwise wouldn’t interact much at most companies.
Why this matters
You can do this too. Your team can do this.
And I’m 100% confident it will be a game-changer for your company. Just like it has been for SoundHound and Medallia.
Before next time, I encourage you to do two things…
Here’s a thorough breakdown of how you can conduct A/B testing (or split-testing) to measure metrics that matter in your talent development programs.
One of the biggest mistakes I made in learning and development early in my career was not paying attention to the business impact of my team’s work.
Too often, a learning team’s hard work is under-appreciated and undervalued. We’re usually the last to get headcount to grow our team or the first to lose headcount and shrink our team.
I remember creating reports for our key stakeholders at Google and Yahoo that showed metrics like 287 employees completed XYZ Training in Q3.
These are vanity metrics. They sound great and appear to be successful, but…
If you’re reporting like this, it’s not helping you or your business.
If you’re not reporting team or individual metrics at all, I encourage you to start…but make sure you don’t use vanity metrics.
Vanity metrics don’t tell us anything about the impact on the business. The number one way to show your team’s value or your individual value is to directly tie your work to a business metric, or key performance indicator (KPI).
You must first understand what really matters to your business. Generally speaking, there are two main drivers of any business: increasing revenue and decreasing cost.
The challenge is that, for the most part, L&D teams rarely have a direct tie to the business. We're usually seen as a cost center instead of a partner to the business.
Sales Enablement is one area where it’s most obvious how training and education is tied to a KPI or main value driver: Revenue.
Even with Sales Enablement, it's practically impossible to prove that training caused increased revenue or reduced a new sales rep's time to first deal.
The best we can do is correlate the data.
Why don’t we?
It’s time for a mindset shift. Our industry needs it. You need it. Your business needs it.
Talent, employees, people are the most essential element of any business.
Every business's success lasts only so long riding on the product, technology, innovation, or marketplace position. Ultimately, the employee experience and opportunities for personal and professional growth make or break a business.
Losing top talent due to bad managers and few opportunities for personal development is on us as learning and talent development professionals.
Finding the connection that ties your team's work to the business's KPIs is your most essential task if you want to foster a culture of learning, growth, and loyalty to a company.
What if you’re not in Sales Enablement…or what if you are and you still don’t know how to find the correlations? If you’re the learning leader in your organization, all learning and its impact on the business must matter to you.
I have some ideas for how you can turn vanity metrics into metrics that matter.
First, a quick story…
When I was consulting for Yahoo's IT Communications & Training team in 2009, I was also studying the technology and market changes happening in marketing.
Businesses were using marketing technology to track metrics like email open rates and clickthrough rates by their customers.
My personal favorite was and still is A/B testing website landing pages to see what converts better (e.g. Headline A yielded 124% more signups than Headline B).
At that time, Yahoo struggled internally to get employees to register for IT webinars and trainings, not to mention actually attend the trainings if they did sign up.
Initially, the value of my work and consulting was measured by how many lessons or modules I produced, the number of webinars I hosted, and other–you guessed it–vanity metrics.
I decided to try something different. I pitched to my team and stakeholders that we take a radical approach: We act like a marketing team.
What did I mean?
Imagine what 100% of our email communications about a new training looked like. Do you have it in mind? I bet you're thinking of something like this:
Email Subject: [Required] IT Training Registration for Operating System Upgrade
Wow! I can’t wait to open that email! How about you?
But that’s exactly the kind of communications we used to send.
I asked the team what they thought about doing this instead:
Step up our copywriting game and write stuff people actually want to read
A/B test, or split-test, subject lines to see which ones get more opens
Track clickthrough rates of the call-to-action links in our emails to see which links get more clicks. (This reveals which copywriting was more effective and where to best place links in the email. Tip: adding a link in a postscript, P.S., is a powerful way to motivate last-chance clicking from someone who's only skimming.)
Track throughput, meaning we track the previous metrics all the way through to whether or not an employee registered for and ultimately attended a training session.
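The throughput idea is just funnel arithmetic. Here's a minimal sketch in Python; every number below is hypothetical, purely to show how the metrics chain together from email sent to seat filled:

```python
# Hypothetical funnel numbers for one training announcement email.
sent = 500          # emails delivered
opened = 210        # unique opens
clicked = 85        # clicks on the registration link
registered = 60     # completed registrations
attended = 42       # actually showed up

open_rate = opened / sent                  # opens per email sent
click_through_rate = clicked / opened      # clicks per open
registration_rate = registered / clicked   # registrations per click
attendance_rate = attended / registered    # attendees per registration
throughput = attended / sent               # end to end: sent -> attended

print(f"Open rate:         {open_rate:.1%}")
print(f"Clickthrough rate: {click_through_rate:.1%}")
print(f"Registration rate: {registration_rate:.1%}")
print(f"Attendance rate:   {attendance_rate:.1%}")
print(f"Throughput:        {throughput:.1%}")
```

Each ratio tells you where the funnel leaks, and throughput is the single number that ties communications all the way to attendance.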
Here was the vision: Use all the tracking data to measure attendance and correlate all the data to the KPI that mattered most at that time.
The likely KPIs then would have been:
Reduction in internal IT support ticket volume, and
Reduced risk of security breaches on employee computers
Guess what the team's and stakeholders' reactions were…
“Nah, that’d be a lot of effort and we don’t have the tools to do that.”
I couldn’t believe it, but in a way I could.
I believed it because that was, and still is, the state of our industry.
But we’re changing. The industry is changing. And I have to believe that you wouldn’t have joined this adventure with me if you didn’t want to have real impact on learning, on your employees, customers, and partners, and ultimately on your organization’s business health.
At Yahoo, I decided to do it anyway.
Since we didn’t have any internal email marketing tools (and I clearly wasn’t going to get buy-in to invest in any), I hacked my own indirect way of measuring and tracking.
First, I duplicated a single training into two identical trainings in the learning management system. (I subtly labeled them version A and version B without making it obvious or confusing to learners when they signed up.)
Then I manually split half of the recipient list into two groups.
When I drafted the email comms, I wrote two different subject lines. The email with subject line A went to one of the two groups, and the email with subject line B went to the other. Subject A used our typical robotic copywriting, while Subject B was written to capture attention and compel the reader to open it.
Once the registration deadline passed, I deduced from the registration list who must have opened the email and clicked the registration link inside (they would have had no other way of receiving the registration link except from that email). It was a rough way to calculate the open rate and clickthrough rate of each subject line.
Based on the data I was able to piece together, I then sent the remaining half of the entire recipient list the email with the winning subject line.
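Here's a minimal sketch of that manual split in Python. The recipient addresses and registration counts are invented for illustration; only the mechanics mirror the experiment:

```python
import random

# Invented recipient list; stand-in for the real employee roster.
recipients = [f"employee{i}@example.com" for i in range(200)]

random.seed(7)
random.shuffle(recipients)

# Half the list is the test pool; the rest wait for the winning subject.
test_pool, holdout = recipients[:100], recipients[100:]
group_a, group_b = set(test_pool[:50]), set(test_pool[50:])

# Pretend these are the LMS registration exports for the two duplicated
# course versions after the deadline (randomly faked here).
registered_a = set(random.sample(sorted(group_a), 6))
registered_b = set(random.sample(sorted(group_b), 12))

# The registration link only existed in the email, so a registration
# implies the recipient opened the email and clicked through.
inferred_ctr_a = len(registered_a) / len(group_a)
inferred_ctr_b = len(registered_b) / len(group_b)

print(f"Inferred clickthrough, subject A: {inferred_ctr_a:.0%}")
print(f"Inferred clickthrough, subject B: {inferred_ctr_b:.0%}")
```

It's crude, but it turns an ordinary LMS registration export into rough open-rate and clickthrough data without any email marketing tooling.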
Which subject line won?
To make all this more compelling, I had previously calculated our average registration rate and attendance rate for our IT webinars.
Although the result lacked statistical significance, my little experiment revealed that Subject B, the one with the more dynamic, marketing-minded copywriting…
DOUBLED the registration and the attendance rate.
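If you want to sanity-check whether a result like this clears statistical significance, a two-proportion z-test is enough. The counts below are hypothetical (not my actual Yahoo numbers) and deliberately small, so even a doubled rate fails the conventional 0.05 threshold:

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-sided z-test: is rate B really different from rate A?"""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical counts: subject A got 6 of 50 registrations, B got 12 of 50.
z, p = two_proportion_z(6, 50, 12, 50)
print(f"z = {z:.2f}, p = {p:.3f}")  # p > 0.05: not significant at this size
```

A doubled rate on fifty recipients per group isn't proof, which is exactly why you keep running the test on bigger audiences before declaring a winner.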
You can imagine how excited I was…but…
In my mind this was only one step above vanity metrics. The next step would have been to correlate this new data with the learning data and ultimately the key business metrics.
Unfortunately, even after the success of the email communications A/B test, the team didn't fully see the merit and value in using data to connect more deeply to the business, so the experiment ended.
And this is the lesson
Don’t ignore data. Don’t be afraid of data even if you’re not sure how to collect it or what to collect. Don’t separate yourself from key business metrics.
Here’s how you can embrace a data-driven approach to learning and development. Before you hear from me next, do the following:
List all the learning programs you and your team directly manage and/or influence
Note the programs that have direct connection to business metrics or org goals
Identify the exact business metric, KPI, and goal your learning programs connect to
Finally, identify all the related data you can capture with existing tools, systems, and processes you have in place (communications tools like MailChimp, event attendance like EventBrite, video hosting like Wistia, etc.)
Once you’ve done all that, you can do a gap analysis to identify what tools you need to capture missing data.
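If it helps, the inventory and gap analysis can be as simple as a small script. The program names, KPIs, and tools below are placeholders; swap in your own:

```python
# Hypothetical inventory mapping each learning program to the business
# metric it supports and the tools that already capture related data.
programs = {
    "New-hire onboarding": {
        "kpi": "time to full productivity",
        "data_captured_by": ["LMS completions"],
        "data_needed": ["LMS completions", "manager ramp-up ratings"],
    },
    "Customer webinars": {
        "kpi": "support ticket volume",
        "data_captured_by": ["MailChimp opens/clicks", "EventBrite attendance"],
        "data_needed": ["MailChimp opens/clicks", "EventBrite attendance",
                        "ticket volume by attendee"],
    },
}

# Gap analysis: which data still has no tool or process capturing it?
for name, program in programs.items():
    gaps = [d for d in program["data_needed"]
            if d not in program["data_captured_by"]]
    print(f"{name} -> KPI: {program['kpi']}; missing data: {gaps or 'none'}")
```

The output is your shopping list: every "missing data" entry is a tool, export, or process you need before you can connect that program to its KPI.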
This is exactly what I helped Medallia do to track email open rates, clickthrough rates, and attendance rates in order to more than triple employee engagement with professional development and learning opportunities.
When you’re done with the checklist above, ask me questions or share feedback. Happy to help you take a major leap forward in your approach to data-driven, business-impact learning.
Thanks for reading. After scaling talent development at Google, LinkedIn, Yahoo, Medallia, and more, I created Sprintwell’s Talent Innovation Toolkit to share all the frameworks and strategies that worked and the mistakes to avoid.
It's built on a model marketers use to connect initiatives to business results before they launch campaigns.
The gist of the framework is a cascade from Objectives to Targets:
Objectives > Goals > KPIs > Targets
What does this have to do with talent development and learning?
You can connect talent initiatives to business results before you start a training project.
Use the Talent Development Measurement Model
Let’s take a common scenario and apply it to talent development.
Dana leads your customer success team. She asks you to help scale her team’s training.
Dana has a dozen customer success reps who respond to email, chat, and phone support tickets.
1. Objective: Start by asking Dana why she needs to scale her team’s training.
The answer may seem obvious, but it’s better to understand and unpack than to assume. You’re seeking insight here.
You want to discover the reason why scale (or build, optimize, change, fix, etc.) is so important to Dana and why now.
You want to answer the question “Why does this project exist?”
Dana explains that customer satisfaction is declining for chat support (her top support channel). The team is doubling its size next quarter to meet the demands.
Let’s unpack what’s going on.
Business Problem: Customer satisfaction is at an all-time low. Customer support turnaround time is too slow.
Business Response: The customer success team will double next quarter to meet demand.
Desired Change: Quick improvement of customer satisfaction scores and faster rep onboarding.
Why Talent Development Matters: Your team needs to help onboard new reps faster. You also need to level-up existing reps’ productivity.
2. Goal: Define the goal together.
Get alignment and commitment on goals you define together.
One goal might be to speed up new-hire onboarding. Another might be to increase customer satisfaction scores.
The goal is often the desired change based on the reason for the initiative. In this case, the reason is declining customer satisfaction. The desired change is to stop the decline and improve it.
3. KPI: Choose your Key Performance Indicator (KPI). Again, together.
Although I just mentioned one goal is to ramp up new reps faster, it’s easy to think that faster rep onboarding is a KPI.
I believe it’s a vanity metric.
It’s what you might think of at the surface, but you can dig deeper.
Faster onboarding isn’t close enough to the business goal of increasing customer satisfaction.
Go one layer deeper.
What happens when customer success reps achieve full productivity faster?
In my experience, ticket turnaround times (TAT) decrease. Customers hear from you faster. Credibility and trust increase. Brand loyalty increases. Net new sales increase. And so on.
In this case, a solid KPI could be ticket TAT. You’re focused on live chat TAT since it’s your top support channel.
Note the KPI's direct tie to the customer success team. Most learning and development teams focus on learning objectives without connecting them to the performance of the people they serve.
4. Target: Put a number on it.
Now it’s time to choose a numerical value.
In your empathy interviews with Dana, you made sure to define her current state and where Dana wants to be.
You learned that Dana’s team currently averages a 72-hour TAT with Live Chat. Scary, I know. I’m exaggerating to illustrate the point.
Dana wants to get to 24 hours.
You now have your target: 24. You can also reframe it as a percentage decrease in TAT.
Putting It All Together
Dana asks you to help scale her team's training because her team is about to double next quarter. The reason is a decline in customer satisfaction.
The goal is to increase customer satisfaction and decrease new-hire onboarding time.
Live Chat ticket TAT is the main way you'll track the success of the program you design.
The company wants to improve TAT from a 72-hour average to a 24-hour average.
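If you like to keep things structured, the completed cascade fits in a tiny data structure. This is just one way to sketch it, with field values taken from the Dana scenario:

```python
from dataclasses import dataclass

@dataclass
class Initiative:
    objective: str   # why the project exists
    goal: str        # the desired change
    kpi: str         # the measurable indicator tied to the business
    target: float    # the number that defines success

chat_training = Initiative(
    objective="Customer satisfaction is declining on chat support",
    goal="Improve satisfaction and onboard new reps faster",
    kpi="Live chat ticket turnaround time (hours)",
    target=24.0,
)
print(chat_training)
```

Filling in one of these per initiative forces the conversation with your stakeholder before any training gets built.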
You should see by now that the framework empowers you. You now have:
1. a model with which to fill in data before you get started.
2. a clear, concise story your executive leaders can buy into.
3. a clear map that you jointly created to measure success.
Put this framework to use with new and existing talent development initiatives. Let me know how it goes.