A/B Testing Blunders: 5 Things You May Be Doing Wrong
Remember when science was cool, when Bill Nye the Science Guy was your idol?
For many people, somewhere around high school, experiments stopped being fun and turned into a complicated burden. For Inbound Marketers, this meant poor ol' A/B testing got a bad rap.
A/B Testing, or "Split Testing," is as simple as black and white.
Its little insights can make a huge impact on your campaign, but unfortunately, many Inbound Marketers are still falling victim to the age-old scientific stigma and steering clear!
5 Things You May Be Doing Wrong With Your A/B Testing
1. Jumping to Conclusions
Whether you are an impatient 11th grader in Chem lab or an anxious Inbound Marketer under a deadline, one mistake (or "scientific error") everyone makes is jumping to conclusions. It is important to let your experiment run its course.
If you make a decision too early, based on results from only a day or two, it's unlikely that you've reached a large enough representative sample. Most A/B testing tools report results with statistical significance in mind, but you may also want to use an online calculator to determine how long your test should run.
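To make this concrete, here is a rough sketch of the kind of math those online calculators run, using only Python's standard library. The function name, defaults, and the two-proportion normal-approximation formula are illustrative assumptions, not anything prescribed by a particular testing tool:

```python
import math
from statistics import NormalDist

def sample_size_per_variant(baseline, mde, alpha=0.05, power=0.8):
    """Approximate visitors needed PER VARIANT in a two-proportion A/B test.

    baseline: current conversion rate (e.g. 0.05 for 5%)
    mde:      minimum detectable effect, absolute (e.g. 0.01 for +1 point)
    alpha:    two-sided false-positive rate (95% confidence by default)
    power:    chance of detecting a real effect of size `mde` (80% default)
    """
    p1, p2 = baseline, baseline + mde
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided critical value
    z_beta = NormalDist().inv_cdf(power)
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p2 - p1) ** 2)

# Detecting a lift from 5% to 6% needs roughly 8,000+ visitors per variant,
# which is why a "day or two" of traffic is rarely enough.
needed = sample_size_per_variant(baseline=0.05, mde=0.01)
```

Divide that per-variant number by your daily traffic and you get a sensible minimum run time for the test.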
2. Surprising Your Regulars
Heed my warning: never "experiment" on your regular visitors. Testing on your regulars risks scaring them away with a random variation you may not even end up using.
Instead, include only new visitors. Their reactions will be unbiased, and you won't risk losing visitors you have already won over.
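One common way to implement "new visitors only" is deterministic hash-based bucketing keyed on a visitor ID. This is a sketch of the general technique, not the article's prescription; `visitor_id` (say, a first-party cookie value) and the `is_new_visitor` flag are hypothetical inputs your analytics layer would supply:

```python
import hashlib

def assign_variant(visitor_id, is_new_visitor, variants=("A", "B")):
    """Bucket only NEW visitors into a test variant.

    Returning visitors always get the control ("A"), so regulars never
    see an experimental page. Hashing the ID (rather than picking at
    random per pageview) means the same visitor always lands in the
    same bucket, which also keeps the experience consistent site-wide.
    """
    if not is_new_visitor:
        return "A"  # regulars see the current page, untouched
    digest = hashlib.md5(visitor_id.encode("utf-8")).hexdigest()
    return variants[int(digest, 16) % len(variants)]
```

Because assignment is a pure function of the ID, this same trick also covers tip #5 below: a visitor bucketed into "B" sees variation B on every page, every visit.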
3. Not Testing Simultaneously
Always test your variations at the same time. The goal of an A/B test is to determine which alternative performs better, but when you don't test simultaneously, you introduce outside variables (e.g. varying site traffic) that may skew your data.
4. Choosing "Instincts" Over Test Results
Just because the results don't meet your expectations doesn't mean they're wrong! Even the most experienced Inbound Marketers have their off days, so if a result surprises you, goes against your intuition, or just isn't easy on the eyes, don't immediately reject it. Your goal is to achieve a higher conversion rate; no one said it was going to be pretty!
5. Being Inconsistent
Always make sure that your tests are consistent across your website. If you are testing a Call To Action that will appear in a sidebar across all of your pages, make sure that your visitor sees the same variation everywhere. This way you won't confuse your data by accidentally showing one person conflicting offers, colors, etc.
At the end of the day, A/B testing need not be feared like the graded experiments of yesteryear. According to HubSpot, A/B testing has been shown to generate 30-40% more leads for B2B sites and 20-25% more leads for eCommerce sites, so take advantage of it! Put aside all of the misconceptions and try testing your:
- Calls To Action (wording, size, color, placement, etc.),
- Form Length & Fields,
- Website layouts and styles,
- Product pricing and promotional offers,
- Images, and
- Content Length.
After running several successful A/B tests, the best results will rise to the surface to help convert your leads and optimize your sales.
About Ramona Sukhraj
As Content Marketing Manager, Ramona approaches marketing not only as a profession, but as a creative outlet. She has a passion for all things artistic and strives to create content that is educational, yet quirky and entertaining as well. With a B.S. in Marketing from the UCONN School of Business, Ramona is a frequent contributor to the HubSpot blog and a nonprofit consultant. Outside of IMPACT, she is a design, movie, and pop culture buff, and a fierce advocate of free hugs.