As customer attention spans are shrinking and online competition is expanding, Conversion Rate Optimization (CRO) is a must-have skill for marketers.
CRO is one of the most important aspects of your digital marketing strategy because conversion rate is one of the few measurable metrics that correlates directly with ROI.
Even if a customer “conversion” on your website is something other than a purchase (such as a newsletter signup), the rules of CRO still apply.
Unfortunately, when it comes to implementing a CRO plan, you can get completely lost in a sea of online resources that tell you to do things like change the colors of your buttons, add social proof, shorten your web copy, include gamification… Stop the madness!
Before jumping into tactical fixes, there is only one thing you need to do to optimize conversion rates on your website, and that is what today’s blog post is all about – A/B testing.
While a leap of faith worked for Bruce Springsteen in 1992, it won’t bring you success in the future. So, rather than take a leap of faith on a set of tactics, use web analytics to get a ton of insight based on real-time user feedback. The data can be used to optimize any area of your website based on the real-life behaviors of real-life customers. What could be better?
Of course, you may already have a hunch about what your users prefer and how they consume content, and that brings us to…
A/B Testing Rule #1: Forget everything you think you know about your customers.
It is tempting to make assumptions about your audience based on things like age, gender, location, or income. Resist the temptation when possible! There was a time when customer profiling was the best way (the only way) to target customers; and, yes, it still has its place in marketing.
However, in the digital era we have many more options! No longer must we rely on broad demographic segmentation to personalize experiences. We now have the ability to leverage every digital touchpoint as an opportunity to learn about our customers’ preferences on a one-to-one basis.
A/B Testing Rule #2: Always establish a baseline.
Increasing conversion rates is your immediate goal, and if you’re like me, you’re in a hurry. But, before jumping into a high stakes A/B test (or even a low stakes A/B test), it is important to budget time up front to establish a current baseline to measure against. If you don’t know what your current conversion rate is, how will you know if your future tests are successful? (More on that in Rule #5.)
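As a minimal sketch of what establishing a baseline can look like (the visitor and conversion counts below are hypothetical), you can compute your current conversion rate along with a rough confidence interval, so later test results have something concrete to beat:

```python
import math

def baseline_rate(conversions, visitors, z=1.96):
    """Return the baseline conversion rate and a 95% confidence
    interval, using the normal approximation for a proportion."""
    p = conversions / visitors
    margin = z * math.sqrt(p * (1 - p) / visitors)
    return p, (p - margin, p + margin)

rate, (low, high) = baseline_rate(conversions=120, visitors=6000)
print(f"Baseline: {rate:.2%} (95% CI: {low:.2%} to {high:.2%})")
# → Baseline: 2.00% (95% CI: 1.65% to 2.35%)
```

The interval matters as much as the rate itself: a variation that lands inside your baseline's confidence interval hasn't clearly beaten anything yet.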
A/B Testing Rule #3: Just because it worked for someone else does not mean it will work for you.
If CRO were a repeatable process that worked the same way for every website every time, there would be no need for testing at all. Marketers would know the way all e-commerce websites perform, and everyone would follow the same rules.
Unfortunately, this is not the case (and a world full of sameness would be rather boring anyway), which is why you must perform A/B testing on your own unique content with your own unique audience. Sure, you can borrow ideas from other CRO-ers, but don’t expect the same results.
For example, let’s say Company ABC sells shoelaces and Company XYZ sells enterprise software applications. Clearly, the buying cycle looks completely different for these two companies, even if they have customers in common. Company ABC may find that changing its primary call-to-action (CTA) button to green instead of red increases sales by 75%. But, it is not likely that Company XYZ would experience similar results.
A/B Testing Rule #4: Test one thing at a time.
This one is pretty self-explanatory but worth mentioning because it’s important. When performing A/B testing on your website, test one variable at a time so that results are readable at the end. If you change your headline at the same time you change your navigation, how will you know which one of the variables contributed to the most conversions?
Pro tip: If you run a headline test, be sure your test headline works with the rest of your digital touchpoints throughout the sales funnel. Consistency builds credibility.
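Testing one variable at a time also means each visitor should see the same variant every time they return, or your results get muddied. As a minimal sketch (the user ID and experiment name are hypothetical), deterministic hashing is one common way to handle assignment:

```python
import hashlib

def assign_variant(user_id, experiment="headline_test", variants=("A", "B")):
    """Deterministically bucket a user into a variant, so the same
    user always sees the same version across sessions."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

print(assign_variant("user-42"))  # same user ID always yields the same variant
```

Because the bucket is derived from the user ID rather than stored, this works even before a visitor has a session cookie, and changing the experiment name reshuffles the buckets for a fresh test.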
A/B Testing Rule #5: Do not call a “winner” until statistical confidence is reached.
In A/B testing, statistical confidence tells you how likely it is that the difference you observed reflects a real preference rather than random chance. In other words, it tells you how confident you can be that the same result would hold if you ran the same test again.
For example, let’s say you perform an A/B test on your shopping cart page where “A” is the use of radio buttons and “B” is the use of dropdown menus. Let’s also say that “B” produces a 75% lift in conversion rate. Obviously, B is the winner, right?
Not necessarily. There are three more facts to consider:
- Sample size: Using the example above, if your sample size is 4 people, a 75% result means only 3 of those 4 people preferred dropdown menus. Sure, it’s a good start, but the likelihood of that result holding up in a sample size of 1,000 is extremely low; therefore, this test result has a low confidence level.
- Margin of error: The accuracy of your A/B test results also depends on your margin of error. If, in a sample size of 500, 99% of customers convert when shown dropdown menus, you can be fairly certain that your margin of error is low. If, on the other hand, only 51% of customers convert when shown dropdown menus vs. 49% who are shown radio buttons, random chance leaves a much larger margin of error, and you should continue running the test until a higher confidence level is reached.
- Population size: If the size of your entire audience is 250,000 and your sample size is 25, again, this will yield a test result with a low confidence level. To calculate your recommended sample size, check out Raosoft’s Sample Size Calculator.
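To make “statistical confidence” concrete, here is a minimal Python sketch of a two-proportion z-test (the conversion counts are hypothetical); it folds sample size and margin of error into a single confidence figure:

```python
import math

def confidence_level(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test. Returns 1 minus the p-value:
    the confidence that the difference between A and B is not chance."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Standard normal CDF via the error function; two-sided p-value
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return 1 - p_value

# A big enough sample makes a modest lift trustworthy...
print(confidence_level(20, 1000, 35, 1000))  # above the common 95% threshold
# ...while the same lift in a tiny sample proves nothing
print(confidence_level(2, 100, 3, 100))      # well below 95%
```

Most testing tools compute something like this for you; the point is simply that “B converted better” is never enough on its own, because the same percentage difference can be highly significant or pure noise depending on how many people saw each variation.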
A/B Testing Rule #6: Walk before you run.
This proverb is true in many aspects of business, and A/B testing is no exception. As customer perceptions and expectations evolve, CRO has always been and will always be a moving target. You will make mistakes. You will learn from your mistakes. With practice, you will become an A/B testing master.
A/B Testing Rule #7: Get a second opinion. Or a third. Or a fourth.
User testing has never been more important, nor has it ever been easier! Even if you do not have the luxury of a User Experience (UX) Department on hand, there are a number of free and low-cost services that offer usability testing on the fly, such as:
Peek User Testing: Peek is a super easy and quick way to gather qualitative feedback on your website.
- The pros: Feedback is generally unbiased, detailed, and free!
- The cons: It doesn’t always make sense to test an interface outside of its intended audience. Also, it is difficult to gather a large quantity of feedback using this method due to its time-consuming nature.
Amazon Mechanical Turk: Amazon Mechanical Turk allows you to gather feedback from thousands of real, live people in a short period of time through the use of quantitative research methods such as surveys.
- The pros: Generally inexpensive, scalable, and quantitative, and you can pre-select qualifying criteria for your testers.
- The cons: This is generally performed via a survey engine, which can introduce response bias and other artificial filters.
The bottom line: any feedback is better than no feedback!
A/B Testing Rule #8: User behavior data and customer survey data may conflict.
Surveys certainly have their place in marketing, but realize they may not always provide honest feedback the way behavioral feedback captured via your web analytics can. This is because surveys introduce human biases in a way that raw behavioral data does not.
For example, imagine that you are in a hurry to print out important documents on your way to a meeting and, 3 pages into your print job, you find that the ink cartridge needs to be changed. Now, what if I ask you how you would handle this particular situation?
Before reading any further, please pause and think about your honest answer.
You probably said you would change the ink cartridge and continue printing your documents. If this were a survey, I would accept that as your answer.
In a user-testing environment, though, I would note that you kicked the printer 4 times, cleared a paper jam, and hit the cancel button 7 times; and then you changed the ink cartridge. While sorting your documents, you spilled coffee all over your shirt, got frustrated, and had to re-schedule your meeting.
In the survey setting, you didn’t outright lie about what you would do in the situation. You did change the ink cartridge, after all. But the survey alone would have missed all the extra behavioral data that happened before and after.
A/B Testing Rule #9: Clearly define your success metric.
Never lose sight of your ultimate success metric. CRO is about conversions. It is not about open rates, click-through rates, tweets, shares, or pins. Unless, of course, tweeting and pinning are the “conversion” on your website. In that case, go crazy with it.
The bottom line: have a goal in mind and optimize your content around that goal. Everything else is a key performance indicator (KPI).
A/B Testing Rule #10: Don’t test whispers.
This saying dates back to the days of direct mail, and it still holds true for online marketing. Avoid testing minuscule elements that have little chance of driving significant change. Use your common sense, trust your intuition, and focus on high-impact tests. For a list of 485 real-life test ideas, check out Which Test Won.
CRO is not just about getting more people to click your buttons. It’s about delivering the right content to the right audience and encouraging them to click the right buttons at the right time. If you’ve A/B tested your entire website, optimized based on the data, and your conversion rates are still lower than you’d like them to be, perhaps you are measuring the wrong set of metrics.
For example, let’s say you own a gourmet cupcakery and your website has a 2% conversion rate. In this example, a customer placing an order for cupcakes is the “conversion.” Here are a few questions to ask yourself:
- Is the 2% conversion rate based on all web traffic, or is the 2% conversion rate based on only those who click through to the “How to Order” page?
- What are the traffic sources of those who click through to the “How to Order” page?
- What are the traffic sources of those with the highest bounce rates?
- What are the behavior patterns of those who ultimately convert? Did they watch a video? Browse your photo gallery? Read customer testimonials?
- Last and most important question: how can I use this data to better qualify prospects?
By slicing and dicing your web analytics in this fashion, a couple of things may happen:
- You may uncover areas of your website and/or your search strategy that need improvement.
- You may find that your conversion rate is better than you had originally estimated.
Either way, this exercise will help you prioritize your A/B testing calendar.
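As a minimal sketch of this kind of slicing (the session records and traffic sources are hypothetical), here is how conversion rate by traffic source could be tallied from raw analytics data:

```python
from collections import defaultdict

# Hypothetical session records exported from analytics: (traffic_source, converted)
sessions = [
    ("organic", True), ("organic", False), ("organic", False),
    ("paid", True), ("paid", True), ("paid", False),
    ("social", False), ("social", False), ("social", True),
    ("email", True), ("email", False),
]

totals = defaultdict(lambda: [0, 0])  # source -> [conversions, visits]
for source, converted in sessions:
    totals[source][1] += 1
    if converted:
        totals[source][0] += 1

for source, (conv, visits) in sorted(totals.items()):
    print(f"{source:>8}: {conv}/{visits} converted = {conv / visits:.1%}")
```

Even a toy breakdown like this can reveal that a disappointing overall conversion rate is really one underperforming channel dragging down otherwise healthy ones.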
A Final Word
Outside of basic functionality like site speed and mobile optimization, there is no single truth or secret sauce to CRO. The only way to know for sure what works with your audience is to run a set of A/B tests and then be willing to implement changes based on the data.
About the Author: Nicki Powers is a Digital Marketing Strategist located in Saint Louis, Missouri, who loves to engage customers and drive sales through the use of emerging technologies. You can follow her on Twitter here: @nicki_powers.