Kissmetrics Blog

Why Conversion Optimization Strategy Trumps Tactics Every Time

I’m willing to bet you’ve come across a lot of opinions about how to do conversion optimization.

You’ve probably formed some of your own ideas, too.

Unfortunately, in our work testing the so-called “best practices” over the past five years, my testing team and I have disproven many common recommendations for how to maximize website conversions and revenue.

That’s what this post is about: one of the most important distinctions between successful and mediocre conversion optimization testing programs. It’s about strategy.

There are two kinds of people doing conversion optimization: tacticians and strategists. Tacticians focus on following “best practices” and improving metrics like conversion rate. Conversion strategists, on the other hand, focus on building a repeatable strategic process that creates powerful hypotheses and insights in order to fulfill business goals.

The business results can be astounding.

For example, in today’s case study, Iron Mountain, the conversion optimization strategy led to a 45% lift in the first test, then a 404% boost (!), then another 44%, then an additional 38%, followed by a 49% conversion rate increase. And that was just on a few of the landing pages.

I’ll give you details on one of those A/B/n tests in a moment, but first, let’s review conversion optimization at a high level.

A Quick Primer on Conversion Optimization

Your website has two kinds of conversions:

  1. On-page actions (a type of micro-conversion)
  2. Revenue-driving conversions (the ones that support your business goals)

On-page actions are things like add-to-carts and form submissions. Revenue-driving conversions are things like e-commerce sales and quote-request leads for your sales team.

For both types of conversions, your conversion rate hinges on six factors:

  1. Value proposition – This is the sum of all the costs and benefits of taking action. What is the overall perceived benefit in your customer’s mind? Those perceived costs and benefits make up your value proposition.
  2. Relevance – How closely does the content on your page match what your visitors are expecting to see? How closely does your value proposition match their needs?
  3. Clarity – How clear is your value proposition, main message, and call-to-action?
  4. Anxiety – Are there elements on your page (or missing from your page) that create uncertainty in your customer’s mind?
  5. Distraction – What is the first thing you see on the page? Does it help or hurt your main purpose? What does the page offer that is conflicting or off-target?
  6. Urgency – Why should your visitors take action now? What incentives, offers, tone, and presentation will move them to action immediately?

These are the six elements that both tacticians and strategists should understand to be able to make conversion rate and revenue improvements.

The main difference between tacticians and strategists is how they plan and interpret their tests.

Where Tacticians Shine

Conversion tacticians shine in the details. They think about button color and size. These are the people who argue over whether a Big Orange Button (I call him BOB) will solve the problem. Tacticians have a tool kit that includes a wide array of tested elements that can be applied to a problem quickly.

Tacticians are great for getting started with testing, but they rapidly hit a limit on the benefit they achieve from tweaking elements, rather than enhancing customer satisfaction and business goals.

Tactical conversion optimizers rarely create hypotheses that describe customer behavior. They are focused on elemental, on-page concerns like form fields, pop-up windows, or maximizing your testing tool’s capabilities. Elements like these are the quick and easy areas to attack first, but they rarely result in big gains for your business, and they don’t generate marketing insights that lead to the next great hypothesis.

Why Strategy Is Better

A strategic approach aims for more fundamental and ongoing improvements: a mix of big wins and incremental gains that generate marketing insights.

Conversion strategists know three important things:

  1. The only conversion rates that matter are relative.
  2. Conversion rate improvement is a means to an end—and that end is profit.
  3. Learning from hypotheses is more important than winning with every test.
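
Point 1 above can be made concrete with a little arithmetic. Here’s a minimal sketch of comparing a variation to its control in *relative* terms, with a standard two-proportion z-test for significance. The traffic and conversion figures are made up for illustration, not taken from any real test.

```python
from math import sqrt, erf

def relative_lift(control_rate, variation_rate):
    """Lift of a variation expressed relative to the control rate."""
    return (variation_rate - control_rate) / control_rate

def one_sided_p_value(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: p-value that variation B beats control A."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 1 - 0.5 * (1 + erf(z / sqrt(2)))  # 1 - normal CDF of z

# Hypothetical test: control converts 200 of 10,000 visitors (2.0%),
# the variation converts 260 of 10,000 (2.6%).
lift = relative_lift(200 / 10_000, 260 / 10_000)
p = one_sided_p_value(200, 10_000, 260, 10_000)
print(f"Relative lift: {lift:.0%}")  # 30% relative lift
print(f"p-value: {p:.4f}")
```

A 2.6% absolute conversion rate tells you little on its own; the 30% relative lift, and whether it is statistically trustworthy, is what a strategist acts on.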

That is why good strategists create documentation about why they are running tests and what customer needs they are trying to address. Making hypothesis-based testing part of your organizational culture is far more important than making a button the right color (hint: there is no right color). Strategists know that an ongoing, structured process of continuous learning and improvement delivers the best results over time.

The strategist knows from experience to look for meaning behind the numbers rather than simply taking test results at face value.

Take a look at this test result report:

[Image: internet web traffic source and conversion rate]

At the point when a new traffic source was added to the test page, the conversion rates for all of the test variations dropped.

A tactician would see this conversion rate trend and say, “Oh no! Our conversion rate plummeted. We have to fix this!” They would run to the design department to order new buttons and images or turn off the new source of traffic.

A conversion strategist, however, would look deeper at sales and profit numbers to find out about order volume and order value. They might create a hypothesis for whether there’s a different customer need specific to the new traffic source and plan a new round of testing based on that hypothesis. The strategist may discover that our green combination is the best for the new traffic source, even though it isn’t the winner for the other traffic sources.

There could be many other insights and hypotheses the strategist would gain from the results analysis, too.

That’s why, in my 7-step testing process, the 7th step is often the most important. That’s where the strategist solidifies the insights that can be fed into the next tests.

[Image: 7 steps to conversion strategy]

A Conversion Optimization Strategy Looks at the Big Picture

With a conversion optimization strategy, every test leads to insights that lead to more hypotheses to test. The learning from each test leads to greater lift in the following tests.
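
If each test’s winner becomes the next test’s baseline, the sequence of lifts quoted earlier stacks multiplicatively rather than additively. The percentages below are the ones from this article; the compounding assumption is mine, for illustration only.

```python
# Lifts quoted in the article: 45%, 404%, 44%, 38%, 49%.
# Assuming each test's winner becomes the next test's baseline,
# the overall improvement is the product of the multipliers.
lifts = [0.45, 4.04, 0.44, 0.38, 0.49]

cumulative = 1.0
for lift in lifts:
    cumulative *= 1 + lift

print(f"Cumulative multiplier: {cumulative:.1f}x")  # about 21.6x
print(f"Total lift: {(cumulative - 1):.0%}")
```

That multiplicative stacking is why capturing insights for the next test matters so much: each win raises the baseline that the next hypothesis builds on.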

Let’s look at the Iron Mountain example again. For the past three years, my team has been testing and optimizing the conversion rates for Iron Mountain’s most important marketing touch points. In that time, we’ve run many A/B/n tests on various areas of the website, including landing pages and site-wide elements.

This partnership had unique challenges that emphasized the need for a strategy:

  • How do you implement a conversion optimization strategy for a website that has over 17,000 pages, plus landing pages and campaign microsites?
  • How do you make sure your website experiences are maximizing conversion rates and that the learning is applied across the organization?

Your conversion optimization system needs good planning and great execution. Individual tests should feed learning back into an evolving understanding of your customers.

Here’s an example of one of the landing page analyses we performed as part of a testing strategy for Iron Mountain:

[Image: landing page analyses]

Notice that a strategist is not looking at specific tactical changes when planning the test. Each of the called-out points is a conversion problem that can be turned into a hypothesis. Overall themes also emerge: readability and distraction seem to be the major issues to address. This analysis contains 10 points of action that can be tested.

There is a rich pool of possible changes for each point of action. Any test that is performed in this structure, even the losing tests (because not every test will win), can offer marketing insight that can be taken to every portion of your site and business.

Strategy Turns Weakness into Strength

Take the worst thing on your page and hypothesize its opposite. This doesn’t mean changing everything that is orange to blue. It means making everything that is vague, clear. It means making everything that is distracting or frightening disappear.

[Image: weaknesses vs. strengths in conversion strategy]

The Results of Strategy: Big Wins plus Insights

The winning page design for this particular test gave Iron Mountain a 404% lead generation conversion rate lift!

I won’t promise that every test gives results like that. And they don’t have to.

This result was built on the backs of previous tests because of the strategic approach we took.

Plus, the learning from this test gave insights that led to wins in other areas. The marketing team learned the type of offers that work best. That’s an insight that continues to pay dividends over and over again.

Strategy Wins Because It Offers New Ideas

eConsultancy reports that companies that take a structured approach to conversion optimization are 2x as likely to report large increases in sales as those that don’t. Conversion strategy leads to structure and a continuous stream of new ideas, while conversion tactics alone lead to guessing and a constant hunger for more advice.

Conversion tactics by themselves are dead-end streets. If you’re overly attached to one technique or answer, you will eventually find the place where it fails. When tactical tests fail, they are simply over. When strategic, hypothesis-based tests fail, you gain insights that lead to a new understanding of your conversion problems.

Because conversion strategy puts emphasis on customer needs and creating a vision for solving problems related to the business action, not the page action, it leaves fewer moments where you have to ask “What’s next?” This means that if you keep testing, you keep getting new insights. With new insights, you get new ideas. And then you have a continuous cycle of improvement.

Within a strategic framework, each time you start a new round of hypotheses, you will find new success for your business.

About the Author: Chris Goward is Co-Founder and CEO of WiderFunnel, the conversion optimization agency, and author of “You Should Test That!” He developed conversion optimization strategies for clients like Electronic Arts, Google, SAP, Shutterfly, and

  1. Brandon Parsons Jan 09, 2013 at 1:39 pm

    I’m assuming that you would have individually tested the changes you listed above that resulted in the 404% increase in conversion rate?

    I think it is pretty easy to imagine that at least one of those changes could have hurt conversions, depending on your customers.
    – Headline doesn’t say backup? Maybe your customers use different terminology….
    – Reducing size of header graphic to push more text above the fold? Maybe your customers are worse off by some confusion/distraction produced by too much content….

    Although it’s entirely possible that all of these changes were a plus, I really hope you tested them individually.

    • Don’t understand why you think they should all be tested individually. I think if you asked the CEO of Iron Mountain whether they’d prefer an immediate 404% uplift in one shot or over a series of 10 tests and a number of months, they’d plump for the former every time. The point of CRO is to grow the bottom line as quickly as possible, not to help the tester learn every minute detail of why.

      Always slightly sceptical about big conversion results though without context. 404% uplift on a conversion rate that was already sitting at 10% is great (although it still depends on how much traffic the page was getting). A 404% uplift on a rate of 0.2% isn’t so hot.

      Now if you’d quoted how much you grew their bottom line by, that would be a different story.

      • Totally agree with your perspective on the goal of conversion optimization, John.

        And I hear your point about conversion rate lift being more impressive on already well-performing pages. This one was, but I actually disagree about absolute conversion rates. For some businesses, a 0.2% conversion rate would be very successful, and for others, 10% would be a disaster! One client of ours would be disappointed with anything less than a 50% conversion rate, now that we’ve got them up to that point.

        There are many reasons that I believe conversion rates are actually meaningless if they’re not evaluated from a *relative* perspective. I explain why I think that here:

    • As John pointed out, the goal is conversion rate and revenue lift over learning.

      But, we also *always* plan tests to include isolations to learn as well.

      The tradeoff for more variations is longer tests or more tests needed to achieve the same result. A good optimization strategy strikes a balance and aims for learning that creates customer insights, which lead to greater lift in the future or on different pages.

      It’s a good point, Brandon, and a question we often get about how to plan tests. Thanks.
      As I said in the article, “This result was built on the backs of previous tests…”

      • So basically you are saying you were able to do these broader tests because previously you had done individual tests that supported your theory. I too would prefer a 400% increase in one test vs. 40 tests for a 10% increase each time. My only concern would be that I might believe there was a correlation that isn’t true because my test wasn’t isolated enough. You know the old saying: “correlation does not imply causation.”

  2. Why wouldn’t you include the revised copy of the landing page? What kind of traffic was that landing page getting? It’s hard to say your 404% increase was much of anything without some traffic stats behind it. I do like the six factors you mention, this is something newbies overlook when they’re trigger happy on which color to change and which button shape to adjust.

    • Thanks, Gene. The test did have enough traffic and conversions to achieve statistical significance at the 95% confidence level.

      This article wasn’t meant to be an in-depth case study with all the details. The full case study with the winning screenshot is posted on the website under “What You Get”. Or, just click on my name, above.

  3. I’m not questioning the numbers, we all know CRO works.

    My take? The point of the article was to provide an introduction to how real strategists think (or should be thinking). I’ve been doing CRO for a while now, but I really appreciate how you’ve described it in this blog.

    I like how you look at pages first and only intend to identify problems, instead of looking at each problem and coming up with a solution in the moment. Then you stand back and decide which problem(s) are the biggest. Only after that do you start to list the many ways to fix them. I think this is a crucial step for many strategists, and it provides an opportunity to work with their designers, or better yet, a good developer and a designer. Sometimes as a strategist you can find yourself quickly stepping on toes if you try to come up with all of the solutions for the problems you see instead of relying on the people who are experts at those things.

    I’m a fan.

  4. Hello Chris,
    thanks a lot for this post. Great work. While one can disagree on the details (tactics) of the implementation in the examples, the real value is in the process. You can do it over and over again, and teach it to other people.

    Great work!

    My process is similar but more tailored to bigger teams.

    1. Measure
    2. Understand
    3. Create a vision or hypotheses
    4. Convince (the stakeholders)
    5. Act!
    6. start at 1.

    Those who speak German might find it on
    If you are interested, I will translate it for you.

    best regards

  5. I like your point – “conversion rate hinges on six factors.” These six tips are simple yet important enough that complacency in any one of them will affect your site in the wrong way. Way to go.

  6. Darren Fisher Jan 19, 2013 at 2:27 am

    This is such a great article on so many levels. First of all, the combination of creativity and process is something I believe is necessary for long-term success and quality over time. Secondly, this is so clearly written that our team can apply it to our website redesign and conversion strategy. Thanks for the information.

    Darren Fisher
    Business Strategist
    Darren Fisher Consulting

  7. Jenifer Taylor Jan 20, 2013 at 3:30 am


    Your IDEA is realistic.
    Thanks for the valuable tips and links.

    Jenifer Taylor

  8. As you point out, tactics are there to get people off the ground. They help the little guy make a start, and in many cases tactics are probably all that is required for successful CRO.

    Tactics are the first-aid kits before you can see the doctor. Sometimes all you need is a band-aid.

    The difference between tactics and strategy is probably more blurry than you present here. Strategy in my mind is merely tactics applied by someone with experience and reason.

    I don’t particularly agree that there isn’t a ‘right colour’ for a CTA button (though I think your statement was just to emphasize your point). The right colour for a button depends on a variety of factors, not least the surrounding colours.

  9. Great stuff. I think every CRO guy is a tactician at first, and strategy is learned. The best way to put it is – when have you ever decided NOT to buy a product or fill out a form because of the color of the button? Focusing too much on these kinds of changes makes your testing plateau. CRO is about testing hypotheses.

