Of all the things a sales organization can do to ensure its success, identifying, tracking and understanding how reps are performing is paramount.
Although analytics and reporting are steadily becoming a central part of sales, increased access to data and sales metrics doesn’t always translate into better sales performance.
It’s common for sales managers and VPs to meticulously track overall rep performance on activity, pipeline and results, but is that enough?
Let’s dive even deeper into activities.
When it comes to the sales metrics within the activities — such as open rates, clicks, positive replies, etc. — everyone knows to pay attention to these numbers. What many people don’t know is how to methodically test and improve them.
For example: If we change an email subject line and it gets a subpar open rate, we try a different subject line. But how do we decide on the most appropriate subject line to test next? At best, we use our intuition or instinct. At worst, we pick a new subject line at random.
However, if we really want to become masters at selling, we need to borrow a page from marketing and become more analytical by using data to test and improve our outbound campaigns.
The 5-Step Process to Test, Track and Improve Your Outbound Campaigns
If you’re not familiar with an A/B test (also called a split test), the concept is simple: you compare two versions of the same thing that differ by a single variable to determine which one performs better.
Here are five steps to A/B testing an outbound sales campaign:
1) Define what you’re testing
When A/B testing, we want to be as methodical and scientific as possible.
Rather than simply thinking up two different subject lines and testing them, put a specific element of the subject line under the microscope; that is going to be much more useful.
For example, is using a variable, such as {{first_name}}, better than not having a variable? Is asking a question better than making a statement? Are shorter subject lines better than long ones?
This will help you develop a better strategy for determining what works best.
A word of caution: This has all been tested before, and the winners are widely known in the sales community. However, fatigue and learned blindness set in. As soon as sales reps catch on to what is working for others, everyone starts doing it until the tactic is overused and loses its effect.
For example, “best practices” tell us that having a question in the subject line is better than a statement because it piques the curiosity of the recipient. However, Campaign Monitor recently found that questions produced lower open rates in every split test they ran. And there are many more results out there like it.
Elements to test in a subject line:
- Custom variable vs. no variable (such as name or company)
- Question vs. statement
- Long vs. short
- Feature vs. benefit
Elements to test in email body:
- Value proposition A vs. value proposition B
- Call-to-action A vs. call-to-action B
- P.S. vs. no P.S.
- Social proof vs. no social proof
- Whitepaper vs. no whitepaper
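To make the isolation concrete, here is a minimal sketch in Python of a control/variant pair that differs by exactly one element, the {{first_name}} variable. The subject lines are hypothetical examples, not tested winners:

```python
# A control/variant pair that isolates a single element: the custom variable.
# Subject lines are illustrative placeholders, not recommendations.
control = {
    "name": "control",
    "subject": "Quick question about your outbound process",
    "element_under_test": "no custom variable",
}
variant = {
    "name": "variant",
    "subject": "{{first_name}}, quick question about your outbound process",
    "element_under_test": "custom variable",
}
# Everything else (body, send time, audience) stays identical between the two.
```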
In the spirit of taking a scientific approach, write down your hypothesis and reasoning for that hypothesis. Now go out and try to prove yourself wrong! This is the true scientific method.
It’s hard because we all have egos and want to be seen as smart. But as a mentor of mine once said, “Would you rather be right or rich?”
The main point is to avoid biasing the results as much as possible. We all want to be right and prove that we’re smart, but not at the expense of figuring out what will ultimately close more deals.
2) Define your performance metrics
Simply put, what are you measuring the performance of? We covered this in the previous blog post, so you can use those same numbers.
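As a quick refresher, the common rates are simple ratios of raw per-variant counts. Here is a minimal sketch in Python (the field names and numbers are hypothetical, and your platform may report these directly):

```python
# Derive the standard outbound metrics from raw per-variant counts.
# Field names are illustrative; use whatever your platform exports.
def metrics(sent, opened, replied, positive_replies):
    return {
        "open_rate": opened / sent,            # main metric for subject-line tests
        "reply_rate": replied / sent,          # main metric for body-copy tests
        "positive_reply_rate": positive_replies / sent,
    }

print(metrics(sent=200, opened=88, replied=14, positive_replies=6))
# {'open_rate': 0.44, 'reply_rate': 0.07, 'positive_reply_rate': 0.03}
```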
3) Run your test
Start with one of your best-performing email templates, or use Brendan Hartt’s 5 Steps To Writing Personalized Cold Emails. Once you have your original template, called the “control,” copy it and make the one change you defined in step one.
Ideally, you’ll have a platform that lets you easily set up an A/B test.
If you’re hacking it together manually, the most important thing to remember is to keep everything between the two test groups as similar as possible: your audience, the time of day your emails are sent, the day of the week they’re sent, the body of the email, etc. If you change any of these other factors, you’re introducing variables that will influence the outcome.
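One piece worth getting right even in a manual setup is the split itself: assigning prospects to groups by hand invites bias. Here is a minimal sketch of a random split in Python (the email addresses are made up):

```python
import random

def split_ab(prospects, seed=42):
    """Randomly assign prospects to a control (A) and variant (B) group."""
    shuffled = prospects[:]              # copy so the original list is untouched
    random.Random(seed).shuffle(shuffled)
    mid = len(shuffled) // 2
    return shuffled[:mid], shuffled[mid:]

group_a, group_b = split_ab(["ana@acme.com", "bo@initech.com",
                             "cy@globex.com", "dee@umbrella.com"])
# Send the control template to group_a and the variant to group_b,
# at the same time of day and on the same day of the week.
```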
4) Collect, analyze and make sense of the results
Here’s where you’ll need to use some discretion, depending on what you’re testing. For example, when testing subject lines, the most relevant measure is open rate because that’s the main job of the subject line. If you’re testing body copy, response rate is the most appropriate.
The big questions when running A/B tests are “How long should I run it for?” and “How many emails is enough?” If you have a large list and can afford to test generously, then test until you can reach statistical significance.
Here’s an A/B testing statistical significance calculator to help. Generally speaking, if you can reach 90% statistical significance, then you can be confident that the results are meaningful.
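If you’d rather compute it yourself, the math behind most of those calculators is a two-proportion z-test. Here is a minimal sketch using only Python’s standard library (the counts are made up for illustration):

```python
import math

def ab_confidence(opens_a, sends_a, opens_b, sends_b):
    """Two-proportion z-test: confidence that A and B truly differ."""
    p_a, p_b = opens_a / sends_a, opens_b / sends_b
    pooled = (opens_a + opens_b) / (sends_a + sends_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b))
    z = (p_a - p_b) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))   # two-sided p-value
    return 1 - p_value

# Hypothetical results: variant B opens more, but is the difference meaningful?
conf = ab_confidence(opens_a=40, sends_a=100, opens_b=55, sends_b=100)
print(f"{conf:.0%} confident the difference is real")  # ~97%, above the 90% bar
```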
However, if you don’t have the luxury of a big list and ample time to reach statistical significance, you’ll have to make do with what you have and use your best judgment. The more tests you run, the quicker you’ll get a feel for what is working and what is not.
5) Iterate and Improve
Keep testing and refining. If you’ve conclusively proven that short sales email subject lines beat long ones, you can start testing different short subject lines against each other. Once you think you’ve found the best subject line, move on to testing an element in the body of the email, like the CTA. Then take it to the next level by testing the other critical factors for outbound sales follow-up success.
A big mistake is finding one subject line that works and sticking to it exclusively. The B2B space evolves quickly, and people catch on to what works. Once everyone is using the once-powerful subject line, its effectiveness declines. That’s why we recommend going back to the beginning every 6-9 months and trying to beat your original findings.
Improving the performance of your outbound campaigns takes a methodical approach. Once you learn to test and tweak your campaigns, you’ll begin to see your other major sales metrics move in the right direction too.
About the author:
Brandon Redlinger runs Growth at PersistIQ. PersistIQ helps reps deliver truly personalized outbound sales at scale. Automate the tedious tasks so you can get more replies and meetings with qualified leads. For more information on how you can be more effective at outbound sales, visit PersistIQ.com. You can also follow Brandon at @Brandon_Lee_09 for the latest information on outbound sales.