Recently, an unmanned NASA rocket burst into flames just seconds after liftoff. The disaster perfectly illustrates how easily things can go wrong when you’re dealing with a complex system.
The same holds true at my own job, where I often test website changes with Kissmetrics.
In many ways, a website test is like launching a probe into deep space. There are many points where human error can enter the equation, and each test is expensive to implement when you consider the opportunity cost of developers, project managers, analysts, and designers.
It’s worth taking the time to ensure each test is carefully vetted because, as with deep space exploration, once you launch a test, there is no way to get it back.
My team has spent the last quarter prepping for a major homepage refresh. Before we release it to 100% of our visitors, I’ll need to show that the new homepage outperforms our current version. In the meantime, drawing on my experience with this project, I’ve put together a brief guide to the pitfalls to avoid and the steps to take to keep your tests from blowing up in your face.
Everyone makes mistakes. The trick is to not make the same mistake twice and, where possible, to learn from the mistakes of others. Below, I’ve listed some common missteps in descending order of severity:
Your approach lacks a cohesive strategy.
Are you running tests without a clear question in mind? Why? If you haven’t written down the questions you are trying to answer and involved at least one other person in the process for a sanity check, you lack a cohesive strategy. You will undoubtedly fail.
Your website implementation has errors.
If you put your event ID tag on the parent div of a button you want to track instead of on the link itself, that’s an implementation error. If you forget to put tags on both versions of a page, that’s an implementation error. Be careful when making coding changes. Check them twice, and then check them again.
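To make the first error concrete, here is a hypothetical markup example (the `track-signup` id and the link itself are illustrative, not taken from our actual homepage):

```html
<!-- Wrong: the tracking id sits on the wrapper div, so a click
     handler keyed to this id may never see clicks on the link. -->
<div id="track-signup">
  <a href="/signup">Sign up free</a>
</div>

<!-- Right: the id goes on the element you actually want to track. -->
<div>
  <a id="track-signup" href="/signup">Sign up free</a>
</div>
```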
You failed to consider scale.
This is an error of omission. Are there additional metrics worth capturing that would benefit stakeholders? Minor additions to your test design can take only marginal effort and have a big impact.
You implemented your events incorrectly.
Kissmetrics is the most forgiving part of the implementation. If your event name conflicts with something you are already measuring, you can always change it on the fly. If you use regex logic in your “Visit the page” condition, you will have only heartache, because Kissmetrics has its own proprietary way of matching multiple URLs; never fear, though, you can change that, too. (I have been told regex support is forthcoming.) The point is that changing Kissmetrics settings on the fly is like pushing a software update to a probe already in deep space: it works, but be sure to make a note of when you started collecting good data.
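As a rough sketch of what event recording looks like in code: Kissmetrics reads commands off a `_kmq` queue on the page. The event names and property below are hypothetical examples, and here the queue is just a plain array rather than one consumed by the real Kissmetrics library:

```javascript
// Simulated Kissmetrics command queue. In production, the Kissmetrics
// JS snippet defines and drains window._kmq; this sketch only queues.
var _kmq = _kmq || [];

// Record a named event with a property. If this name later conflicts
// with something you already measure, it can be renamed on the fly.
_kmq.push(['record', 'Viewed New Homepage', { variant: 'B' }]);

// Tie a click on a specific element to an event. Note the id targets
// the link you want tracked, not its parent div.
_kmq.push(['trackClick', 'track-signup', 'Clicked Signup']);

console.log(_kmq.length); // → 2 commands queued for the library
```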
Identify Mission Goals
Imagine a NASA scientist sitting in front of the House Subcommittee on Space:
Representative: “So, before we approve this budget, we would like to know what impact this deep space probe will have on the body of scientific knowledge. What is your mission?”
NASA scientist: “To explore strange new worlds, to seek out new life and new civilizations, to boldly go where no one has gone before.”
NASA scientist: “Sorry, I’ve always wanted to say that in an official capacity. Seriously, though, we aren’t really sure. I guess our mission is to collect space dust and stuff.”
If you don’t plan out your strategy with clear goals, you are no better than a NASA scientist who wastes public money and a good Star Trek reference. Fortunately, these four easy steps can help you do better:
1. Collect a wish list from key stakeholders.
Most website testing initiatives grow out of a desire to better understand user behavior and optimize the number of users taking a desired action. Make sure you understand the goals of your tests and how they may differ from department to department. For example, for our homepage tests, marketing wanted to improve registrations, while UX wanted to know which parts of the design were most engaging.
2. Ask if the scope is feasible.
You might not be able to answer every question with the same test. Prioritize the questions that have the biggest impact on the company and limit event measurement to those that make sense together.
3. Work your way backward from the desired results.
Start your test with the end goal in mind. Once I had created the slides of what I wanted to show, I worked my way back to the reports I would need to run. With the report structures firmly established, I could think about what I should be measuring and how I would do so.
4. Visualize your approach.
Mapping out your testing approach will help you communicate with stakeholders. I use Lucidchart to create flowcharts and collaborate with team members on tweaking the details of implementation before we make our first code change. A map is especially helpful with complex flows. Our homepage test introduced three new persona-specific landing pages, so it was hard to keep track of all the moving parts. Having a map of the flow helped us keep track of everything and made any holes in our logic immediately apparent.
Get Ready for Launch
You have your mission goals firmly established, so let’s execute on the details of pre-launch preparation.
1. Establish clear protocols.
For my tests, I put together a spreadsheet that laid out every event, where to place it, and which reporting goal it helped us accomplish. This spreadsheet wasn’t made just for developers and QA. It will also help me remember the purpose of each event when I start crunching the numbers in about three months, which is when we’ll have enough data to make a final decision.
2. Identify the launch vehicle.
Kissmetrics isn’t going to serve up the A/B experience, so figure out what you are going to use and make sure you know how to use it. We use either Google Analytics Experiments or an in-house solution, depending on whether we want different URLs for the A and B versions.
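For the in-house route, the core of a splitter can be as small as a deterministic bucketing function. This is a minimal sketch under my own assumptions, not Lucidchart’s actual implementation; it hashes a visitor ID so the same visitor always lands in the same variant, which works when both variants share one URL:

```javascript
// Deterministically assign a visitor to variant "A" or "B".
// Same input always yields the same variant, so returning visitors
// see a consistent experience without server-side state.
function assignVariant(visitorId) {
  let hash = 0;
  for (let i = 0; i < visitorId.length; i++) {
    // Simple 31-based rolling hash, kept unsigned with >>> 0.
    hash = (hash * 31 + visitorId.charCodeAt(i)) >>> 0;
  }
  return hash % 2 === 0 ? 'A' : 'B';
}

// The visitor ID would typically come from a first-party cookie.
const variant = assignVariant('visitor-42'); // stable for this ID
```

In practice you would store the ID in a cookie on first visit and fire a Kissmetrics event recording which variant was served, so conversions can be segmented by variant later.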
3. Diagnostics check.
As part of our release schedule, we push a sprint’s worth of code to staging for a week of QA testing. When the pages with the Kissmetrics tags are pushed to staging, I can implement the A/B test and have QA go through the test experience to make sure events are firing properly.
4. Countdown to launch.
Assuming everything looks good, I then wait for DevOps to push staging to the production servers. Your own deployment may be different.
5. Launch test.
Congratulations, you did it! Now wait for that data to roll in. You’ll be increasing your company’s profitability at warp speed.
The Lucidchart homepage test is currently live, so feel free to check it out. Bonus points for anyone who finds the Easter egg! If you’d like a deep dive into our specific implementation, you can read about that on our tech blog.