How A/B Testing REALLY Works (Edmonton SEO Experts Reveal It All)

by Aaron Janes, Founder

A typewriter and a Macbook back to back in a comparison

Something we hear about all the time in the business, marketing, and SEO worlds is to A/B test absolutely everything. 

 

This means creating at least two versions of something, with slight variations. Maybe you use white-on-black text and then black-on-white.

 

Then you run them for a predetermined amount of time, say a week, and see which does best. You take the best one, make another slight change, and repeat. Then, you keep going until you find what works best.

 

Then, when you come to create another piece of content, you take everything that was good and get to a winning design straight away.

 

The name literally means testing version A against version B.

 

However, while there's a lot of talk about why this is important, there's less talk on how to actually do it and what it looks like for your business. 

 

Do you really need to keep creating different versions of the same piece of content? What do you change? How do you find time for this process? How do you analyze it? Is it actually worthwhile, especially as a small business?

 

That's what we're covering today and everything in between. Consider this your ultimate guide. Let's get into it.

What are the basics of A/B testing success?

A web design team working together to A/B test components of their project

These days, you can A/B test practically anything: headlines, images, call-to-action buttons, website copy, you name it. By making small, strategic tweaks and analyzing the results, you'll unlock a treasure trove of insights about your audience's preferences.

 

You can use this data again and again when choosing language, colours, designs, perspectives, formatting, imagery, and so much more.

 

But first, let's take a moment to grasp the core principles of A/B testing.

Formulating a Hypothesis: Where It All Begins

First things first, we need a clear direction. What are you hoping to achieve with your A/B test? This is where your hypothesis comes in – it's essentially an educated guess about what might improve your results.

 

Many YouTube channels do this, and for good reason.

 

When you post a YouTube video, there are two things a viewer will see that will determine whether or not they click to watch the video. That's the thumbnail and the title.

 

Viewers make their decisions based solely on these two pieces of content. If they don't find either interesting, they won't click, so YouTubers spend a lot of time A/B testing.

 

Just today, I saw a new podcast posted by Rich Roll. Over the next few hours on YouTube, the video was recommended each time with a new thumbnail (the team reportedly changes the thumbnail every hour for the first few hours after uploading to see which one gets the most clicks).

 

While I didn't get screenshots of the thumbnail, you can see on the ViewStats Chrome extension that the video's title has been changed six times over the course of just today (the publish date).

A screenshot of ViewStats Chrome Extension showing how many times a YouTube video title has been changed

The goal here is to get more clicks.
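As a rough illustration of what's being measured here, comparing click-through rates between two thumbnails is simple arithmetic. The impression and click numbers below are entirely made up:

```python
# Hypothetical YouTube analytics for two thumbnail variants
impressions = {"thumbnail_a": 12000, "thumbnail_b": 11800}
clicks = {"thumbnail_a": 540, "thumbnail_b": 720}

# CTR = clicks / impressions, expressed as a percentage
for variant in impressions:
    ctr = clicks[variant] / impressions[variant] * 100
    print(f"{variant}: {ctr:.2f}% CTR")
```

Whichever variant holds the higher CTR once enough impressions have accumulated is the one to keep.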

 

So, start by pinpointing areas on your website or in your marketing campaigns that could use a little boost. Are your conversion rates lower than expected? Is your bounce rate giving you the chills? 

 

Once you've identified a problem area, it's time to set a SMART goal – Specific, Measurable, Achievable, Relevant, and Time-bound.

 

For instance, instead of vaguely aiming to "get more sign-ups," a SMART goal would be to "increase email sign-ups by 15% in the next month." See the difference?

 

Now, transform that goal into a testable hypothesis. 

 

For example, "Changing the call-to-action button colour from blue to red will increase email sign-ups." 

 

This gives you a clear prediction to test and measure.

Selecting Variables to Test: Choosing Your Battles Wisely

Your website or marketing campaign is a symphony of different elements working together. But when it comes to A/B testing, it's crucial to isolate individual variables to understand their impact.

 

Think about the elements that could influence your desired outcome. Is it the headline that's not grabbing attention? Is the call-to-action button blending in too much? Or maybe the image isn't resonating with your audience.

 

Prioritize testing variables that have the potential for the biggest impact and are relatively easy to change. And remember, focus on testing one variable at a time. 

 

Why? 

 

Because if you change multiple things simultaneously, you won't know which change actually caused the improvement (or decline). 

 

It's like trying to bake a cake and changing the flour, sugar, and baking time all at once – you'll never know which adjustment made the difference.

 

By mastering these fundamentals, you'll set the stage for A/B testing success.

How to set up your A/B test: From tools to targets

Someone sat down at their computer writing and A/B testing their content

Alright, now that we've got the fundamentals down, let's roll up our sleeves and get this A/B testing party started.

How to choose the right A/B testing tools

First things first, you need the right tools for the job. Thankfully, there's a whole suite of A/B testing platforms out there ready for just this. Some of the heavy hitters in the A/B testing world include:

  • Omniconvert: A comprehensive conversion rate optimization platform that goes beyond A/B testing. Omniconvert provides tools for surveys, popups, and personalization, making it a versatile option for businesses looking to improve their website's performance.
  • Optimizely: A more robust platform with advanced features like personalization and multivariate testing. It's a solid choice for businesses looking to scale their A/B testing efforts.
  • VWO (Visual Website Optimizer): Another comprehensive platform offering many features, including heatmaps and session recordings. It's known for its user-friendly interface and strong customer support.

 

When choosing a tool, consider your budget, technical expertise, and specific needs. Do you need a simple tool to get started, or do you require advanced features like personalization? 

 

Don't be afraid to try a few different platforms to find the one that fits you like a glove.

 

That said, you can do it manually. If you're posting on YouTube, updating your CTA buttons, or something like that, you can easily just switch it manually. Just make sure you note the analytics and metrics you're using so you can see the difference.

Creating variations: Time to get creative (but not too crazy!)

With your tool in hand, it's time to create the variations you'll be testing. Remember that hypothesis we formulated earlier? This is where you bring it to life.

 

Let's say you're testing the impact of your call-to-action button colour on click-through rates. You might create two variations: one with a red button and another with a green button. 

 

Keep the changes distinct enough to measure the impact, but don't go overboard. You want to isolate the variable you're testing, not redesign your entire website!

 

And remember, consistency is key. 

 

Ensure your variations align with your overall website design and branding. You don't want your test to create a jarring visitor experience.

Defining your target audience: Not all visitors are created equal

Now, let's talk about who you'll be testing on. 

 

Your website visitors aren't a homogenous blob – they have different demographics, behaviours, and preferences. To get the most accurate results, you'll want to segment your audience and target specific groups.

 

For instance, if you're an e-commerce store selling clothing, you might want to test different variations on visitors who have previously purchased items in a specific category. Or, you could target new visitors with a different experience than returning customers.

 

The key is to ensure your test groups are representative of your overall audience. You also need a sufficient sample size to achieve statistically significant results. Don't worry, we'll dive deeper into statistical significance later on.

 

By carefully setting up your A/B test with the right tools, variations, and target audience, you'll be well on your way to gathering valuable data and making informed decisions. 

How to run and analyze your A/B test

A team sits at a desk on their laptops running some A/B tests

So, we're set up, and you're now ready to run the test. This is where the magic happens, so they say. Let's explore how to run your A/B test effectively and, most importantly, how to decipher those juicy results.

Determining the test duration

One of the biggest questions you might have is: "How long should I run my A/B test?" Well, there's no one-size-fits-all answer. 

 

The ideal test duration depends on several factors, including your website traffic and the level of statistical significance you're aiming for.

 

Again, think of it like baking a cake. 

 

You wouldn't take it out of the oven after five minutes, right? You need to give it enough time to bake fully. Similarly, you must allow your A/B test to run long enough to gather sufficient data for reliable results.

 

Generally, it's recommended to run your test for at least a week or two, but it could be longer, depending on your traffic volume. The key is to be patient and avoid drawing premature conclusions. We want those insights and the data from it to be rock solid.
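If you want something firmer than "a week or two," you can estimate the duration from your traffic. The sketch below uses the standard two-proportion sample-size approximation (95% confidence, 80% power); the conversion rates and daily traffic are hypothetical, so swap in your own:

```python
import math

def sample_size_per_variation(baseline_rate, target_rate):
    """Rough visitors needed per variation to reliably detect a lift
    from baseline_rate to target_rate (95% confidence, 80% power,
    two-proportion z-test approximation)."""
    z_alpha, z_power = 1.96, 0.84  # 95% confidence, 80% power
    p_bar = (baseline_rate + target_rate) / 2
    n = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
         + z_power * math.sqrt(baseline_rate * (1 - baseline_rate)
                               + target_rate * (1 - target_rate))) ** 2
    return math.ceil(n / (baseline_rate - target_rate) ** 2)

# e.g. a 4% baseline conversion rate, hoping to detect a lift to 5%
n = sample_size_per_variation(0.04, 0.05)

# Both variations share your traffic, so double the total
daily_visitors = 800  # hypothetical
days = math.ceil(2 * n / daily_visitors)
print(f"{n} visitors per variation, about {days} days")
```

Notice how quickly the requirement grows when the lift you're chasing is small. That's exactly why low-traffic sites need to run tests longer (or test bolder changes).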

Monitoring test performance

While your test is running, it's essential to keep a close eye on its performance. 

 

Your A/B testing tool will provide you with real-time data on key metrics, such as conversion rates, bounce rates, and click-through rates.

 

Think of these metrics as your test subjects' vital signs. They'll give you valuable clues about how each variation is performing. Most tools offer nifty dashboards with charts and graphs, making it easy to visualize the data and identify trends.

 

For example, if you see a consistent upward trend in conversions for one variation, it's a good sign that it's outperforming the other. However, don't get too excited just yet. We need to ensure those results are statistically significant before popping the champagne.

Analyzing results and drawing conclusions

Once your test has run its course, it's time to analyze the results and draw some conclusions. This is where statistical significance comes into play. In a nutshell, statistical significance tells us whether the observed difference between variations is likely due to a real effect or just random chance.

 

Most A/B testing tools will calculate statistical significance for you, often expressed as a p-value.

 

A p-value of 0.05 or lower is generally considered statistically significant. It means that if there were truly no difference between the variations, a result at least this extreme would show up less than 5% of the time – strong evidence that the difference you're seeing isn't just random chance.

 

Interpreting the results involves understanding confidence levels and effect size. Confidence levels indicate how certain you can be that the results are accurate, while effect size tells you the magnitude of the difference between variations.
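If your tool doesn't report a p-value, or you're running a test manually, you can approximate one yourself with a two-proportion z-test. This is a minimal sketch using only Python's standard library; the sign-up numbers are hypothetical:

```python
import math

def ab_test_p_value(conversions_a, visitors_a, conversions_b, visitors_b):
    """Two-sided p-value for the difference between two conversion
    rates (two-proportion z-test, normal approximation)."""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    # Pooled rate under the assumption the variations perform the same
    p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    return math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value

# Variation A: 200 sign-ups from 5,000 visitors (4.0%)
# Variation B: 260 sign-ups from 5,000 visitors (5.2%)
p = ab_test_p_value(200, 5000, 260, 5000)
print(f"p = {p:.4f}")  # well under 0.05, so the lift looks real
```

The normal approximation is fine once each variation has a few hundred conversions; below that, trust your testing tool's calculation instead.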

 

Once you've analyzed the data, it's time to document your findings and draw conclusions. Did one variation outperform the other? 

 

If so, why do you think that was the case? 

 

These insights will be invaluable for future A/B testing and optimization.

 

By running your A/B test with patience, monitoring its performance closely, and analyzing the results with a keen eye, you'll unlock a treasure trove of data-driven insights.

Advanced A/B testing strategies

A top-down view of a marketing team working on a project

That should be all the basics you need to know about A/B testing, and it will be enough to get you some good results. These are the fundamentals, though, and if you're after better results, here are some ways to take your A/B testing further.

Multivariate testing

While A/B testing focuses on testing one variable at a time, multivariate testing (MVT) allows you to test multiple variables simultaneously.

 

Imagine you want to test different combinations of headlines, images, and call-to-action buttons. With MVT, you can create multiple variations with different combinations of these elements and see which combination performs best.
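To see how quickly MVT multiplies the workload, here's a small Python sketch enumerating every combination of a few hypothetical elements (the headlines, images, and button labels are purely illustrative):

```python
from itertools import product

# Hypothetical elements to combine
headlines = ["Save 20% today", "Free shipping on every order"]
images = ["hero_red.jpg", "hero_green.jpg"]
cta_buttons = ["Buy now", "Get started"]

# Every combination becomes its own variation to test
variations = list(product(headlines, images, cta_buttons))
print(len(variations))  # 2 x 2 x 2 = 8 combinations

for headline, image, cta in variations:
    print(headline, "|", image, "|", cta)
```

Two options per element already means eight variations, each needing its own share of traffic – which is why MVT suits high-traffic sites.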

 

However, MVT comes with a caveat. 

 

It requires more traffic and a longer testing duration to achieve statistically significant results. It's also more complex to analyze, as you need to consider the interactions between different variables.

 

That's great for larger businesses with thousands of impressions a day – not so much for a small business.

 

That said, if you're up for the challenge, MVT can unlock a deeper understanding of how different elements work together to influence user behaviour.

Personalization and tailoring the experience

Personalization is all the rage nowadays. It's about creating tailored experiences for your visitors based on their individual preferences and behaviours. A/B testing is a great way to test content among all the different audiences.

 

For example, let's say you're selling running trainers.

 

Your audience could include young 30-somethings as well as seniors. The content you give each group will differ, and you can A/B test between the two to see what works. This is straightforward if you've already segmented your audience, such as within your email marketing campaigns.

 

But this is just one way of doing things.

 

For example, you could personalize website content based on user demographics, past browsing behaviour, or purchase history. You could also personalize email campaigns based on subscriber interests or engagement levels.

 

By using A/B testing to fine-tune your personalization efforts, you can create a more engaging and relevant experience for your visitors, leading to increased conversions and customer loyalty.

Split URL testing

Split URL testing, sometimes called redirect testing, takes things a step further by testing completely different versions of a web page, each hosted at its own URL.

 

This approach is useful when testing different design approaches, content strategies, or even entirely different layouts. For example, you could test a long-form landing page against a shorter, more concise version.

 

Then, you can gather new insights on which approach resonates best with your audience. There are even plugins and ways to set up your website so visitors are randomly shown one page or the other (a 50/50 split), allowing you to test both at the same time.
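If you're wiring this up yourself rather than relying on a plugin, a common approach is to hash a visitor ID so each visitor is consistently bucketed into the same version on every visit. A minimal Python sketch (the visitor IDs are hypothetical):

```python
import hashlib

def assign_variant(visitor_id, split=0.5):
    """Deterministically bucket a visitor into 'A' or 'B' so they
    always see the same page on repeat visits."""
    digest = hashlib.md5(visitor_id.encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF  # maps the hash to [0, 1]
    return "A" if bucket < split else "B"

print(assign_variant("visitor-12345"))
```

Hashing beats a coin flip because the assignment is sticky: the same visitor ID always lands in the same bucket, so returning visitors don't flip between pages and contaminate your results.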

 

It's a powerful tool for making significant changes to your website and measuring their impact.

 

By embracing these advanced A/B testing strategies, you'll gain a deeper understanding of your audience and unlock new levels of optimization.

A/B testing pitfalls and traps you don't want to fall into

It's crucial to be aware of some common pitfalls that can sabotage your A/B testing efforts. Consider this your friendly warning sign, helping you traverse the A/B testing landscape safely.

The "Too Many Cooks" syndrome

Remember the old adage, "Too many cooks spoil the broth"? The same applies to A/B testing. 

 

Testing too many variables simultaneously can muddy the waters and make it impossible to pinpoint which change actually influenced the results. It's like trying to solve a mystery with too many suspects – you'll end up chasing your tail.

 

Stick to testing one variable at a time, or if you're feeling adventurous, use multivariate testing with caution. Remember, focus is key.

The "Knee-Jerk Reaction" trap

Patience is non-negotiable. Don't fall into the trap of ending your A/B test prematurely. 

 

It's tempting to jump to conclusions based on early results, but those initial findings might not represent the bigger picture. It's like judging a book by its cover – you might miss out on a hidden gem.

 

Give your test enough time to gather sufficient data and achieve statistical significance. 

 

Remember, we're aiming for reliable insights, not hasty decisions.

The "Ignoring the Elephant in the Room" blunder

A/B testing isn't conducted in a vacuum. External factors can influence your results, so it's crucial to consider the context in which your test is running. 

 

Is there a holiday season affecting your traffic? Are there any major news events impacting user behaviour? 

 

Be mindful of external influences and take them into account when analyzing your results. This will help you draw more accurate conclusions.

The "Victory Lap Without the Finish Line" mistake

So, you've found a winning variation. Congratulations, but don't celebrate just yet. 

 

The final step, and arguably the most important one, is to implement those winning changes on your website or in your marketing campaigns. Failing to do so is like winning a race and not collecting the trophy.

 

Make sure to implement your winning variations promptly and track their performance over time. This will ensure your A/B testing efforts translate into real-world results.

 

By being mindful of these common pitfalls, you'll navigate the A/B testing landscape with confidence and avoid costly mistakes.

Wrapping up

Give yourself a pat on the back – you've gained a valuable skill set that can transform your online presence. But let's be honest: diving deep into A/B testing requires time, expertise, and a whole lot of data analysis.

What if you could skip the heavy lifting and have a team of experts handle all your A/B testing and online optimization needs?

That's where Ignite Web Design in Edmonton comes in. 

We're passionate about helping businesses like yours thrive in the digital world. Our team of skilled professionals will take the reins of your A/B testing, crafting compelling variations, analyzing data, and implementing winning strategies to skyrocket your conversions.

From website design and development to SEO and content marketing, we'll handle your entire online presence, allowing you to focus on what you do best – running your business.

Ready to ignite your online success? 

Contact Ignite Web Design in Edmonton today for a free consultation and let's get your online presence working precisely as it should be.
