Beginner's guide to smart A/B testing

Tweaking your website may improve your results. Or it may not.

That's why A/B testing was created. 

A/B testing tells you whether your marketing actions have a positive impact, and by how much.

What is A/B testing?

Version A is the original version, known in the experiment as the "control".

Version B is the same page with some modification applied, known as the "treatment".

So basically, an A/B test compares the performance of version A against version B.

To do that, A/B testing software sends 50% of your traffic to version A, and 50% to version B.
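If you're curious how that split can work under the hood, here is a minimal sketch of deterministic bucketing in Python. This is a hypothetical helper, not any vendor's actual implementation: hashing the visitor id means a returning visitor always sees the same version.

```python
import hashlib

def assign_version(visitor_id: str, split: float = 0.5) -> str:
    """Deterministically assign a visitor to version A or B."""
    # Hash the id so the same visitor always lands in the same bucket
    digest = hashlib.md5(visitor_id.encode("utf-8")).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF  # roughly uniform in [0, 1]
    return "A" if bucket < split else "B"

print(assign_version("visitor-1234"))  # always the same answer for this id
```

In practice, testing tools also persist the assignment in a cookie, so the experience stays consistent across visits and devices where the cookie survives.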

Why A/B testing?

The days of easy traffic are gone

When you first launched your website, your obsession was probably to acquire traffic, to get visitors coming in.

Years ago, building traffic was quite easy, or at least affordable. Since those days, many people still seem to believe that success on the Internet equals traffic.

Unfortunately, that's only one part of the equation. Visitors coming to your site must also convert. That's when you get what you want: revenue, transactions, sales, leads, signups, etc.

For years, marketers focused mainly on traffic acquisition. Now SEO competition is fierce in most industries, and pay-per-click ads deliver less value for the money. Yet the gold rush for more traffic went on, to such an extent that around 92% of online marketing budgets were spent on acquisition in 2014.

Now that acquisition is getting harder and more expensive, site owners are taking more interest in their existing visitors.

Conversion rate optimisation is on everyone's mind these days: testing software, heatmap tools, session recordings, on-site surveys, on-site chat... Let's face it, CRO has the wind in its sails!

Obviously any site still needs traffic. But a serious online business can't scale these days without optimising for conversions.

What you will love about A/B testing

With SEO and most traffic acquisition channels, it's difficult to get clear numbers about your return on investment (ROI). But with A/B testing, it's a breeze!

If, like most digital marketers, you've been through many acquisition channels...

...you will love A/B testing, because you get the ROI of each of your actions with scientific precision.

You still have to be very careful with numbers and statistics (more on this below). But in the end, you know what you're doing, remember: whether you're doing better, and by how much.


Getting started

Analysis: Dive into the data

People are usually quite excited before launching their first A/B test. I still feel that way every time... Is the line green or red? Am I going to get a 300% uplift in conversions?

But before launching a successful test, there are some necessary steps.

One of the most important ingredients of a great A/B test is proper analysis. This is a crucial phase, during which you will gather a lot of testing ideas based on your site's data.

Other sites may get great results testing this or that in their context. But ideas drawn from your own data are much stronger than ideas copied from tests run on other websites. Context is key to success, so take the time to focus on yours.

After analysing, you will be able to answer questions like:

  • Does my website convert better on my homepage, or on a specific product/service page?
  • Does my website convert better on mobile devices, tablets, or desktop computers?
  • What is the conversion rate for returning visitors compared to new visitors?

And so on.

When you notice the biggest gaps, you'll have found the greatest opportunities, and that's where you'll want to focus first.

"What should I test" is usually the first question, and shouldn't be 

You will find plenty of blog posts on the Internet giving you ideas of what to test. That's fine, as long as they're not your only source of inspiration.

Your analysis is not the time to decide which elements to change on your website. That comes later.

At this stage, I prefer to think in terms of conversion opportunities.

The first question is: "where" are the opportunities?

Or even better: where do the biggest opportunities lie?

What I personally like to do when I begin working with a new client is **not** to look at the client's website, but only at their web analytics.

Why? Because if I look at the website, I will tend to take a UX approach:

"This is okay", or "ouch, this is ugly, what am I going to test? This site is full of best practices…"

Instead, I dive into Google Analytics and import my own custom reports:

  1. What pages generate the most traffic? The most conversions? The most revenue?
  2. What are the demographics of the audience: gender, age?
  3. Which landing pages make people bounce, and which make them stay?
  4. Which important pages make people leave the site? And which convert worst?
  5. Is the site working properly on every browser? On every device? On every OS?
  6. How many visits before a conversion, on average?
  7. Do people use the site search? What words do they type in?
  8. What is the common path visitors take (goal flow report)?

If you have these custom reports prepared in a spreadsheet, you will get a broad overview of the website's activity and audience in less than 30 minutes.

For example, you might notice that a particular product gets a lot of pageviews but has a very weak conversion rate compared to the average product page. This is the kind of thing you're looking for.
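As a sketch of that kind of analysis, here is a minimal pure-Python example that flags high-traffic, low-converting pages. The page names and numbers are invented for illustration:

```python
# (page, pageviews, conversions): hypothetical export from your analytics
pages = [
    ("/product/red-shoes",   12000,  60),
    ("/product/blue-shoes",   3000,  90),
    ("/product/green-shoes",  8000, 200),
]

# Average conversion rate across these pages
avg_rate = sum(c for _, _, c in pages) / sum(v for _, v, _ in pages)

# Opportunities: lots of eyeballs, below-average conversion
opportunities = [
    (page, views, round(conv / views, 4))
    for page, views, conv in pages
    if views > 5000 and conv / views < avg_rate
]

print(opportunities)  # the red-shoes page stands out in this made-up data
```

The same filter is easy to apply in Excel or directly in a Google Analytics custom report; the point is to let the numbers, not your taste, pick the pages.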

And at this stage, you still haven't looked at the website.

Optimising for what?

Different kinds of conversions

In the end, you optimize to increase revenue. This is your macro conversion.

But despite our wishes, visitors don't come in and buy everything after 10 seconds. Visit after visit, we as marketers guide them deeper into our conversion funnels.

So every step of the conversion funnel should be tested.

This also helps when you don't have enough traffic. For an e-commerce business, there are far more clicks on the add-to-cart button than there are transactions, so it's easier to start by testing for add-to-cart clicks.
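To see why higher-volume goals are faster to test, here is a rough sample-size sketch using a common rule of thumb (roughly 95% confidence and 80% power). The 2% transaction rate and 10% add-to-cart rate below are hypothetical:

```python
from math import ceil

def visitors_per_variation(base_rate: float, relative_lift: float) -> int:
    """Rule-of-thumb sample size: n ~ 16 * p * (1 - p) / delta^2."""
    delta = base_rate * relative_lift  # absolute difference we want to detect
    return ceil(16 * base_rate * (1 - base_rate) / delta ** 2)

# Detecting a 10% relative lift on each goal:
print(visitors_per_variation(0.02, 0.10))  # transactions at 2%: ~78,000 visitors
print(visitors_per_variation(0.10, 0.10))  # add-to-cart at 10%: ~14,000 visitors
```

With the same relative lift, the add-to-cart goal needs roughly five times fewer visitors per variation, which is why it makes a good first goal on a low-traffic site.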

Most A/B tests will focus on smaller goals like:

  • Number of transactions
  • Average basket value
  • Number of leads
  • User signups
  • Newsletter opt-ins
  • Key page visits (top of the funnel, checkout process...)
  • Clicks on a call-to-action button

There are a number of different goals you may want to reach through A/B testing. You may wonder: how, then, could I optimise for revenue?

To go further: integration with Google Analytics

Whatever your conversion goal within the A/B test, dig deeper into Google Analytics to see what happens to your revenue.

Thanks to GA integration, available in all serious A/B testing software, you can check not just one but all the metrics you want. Compare version A vs version B: bounce rates, engagement metrics, progress along the funnel... This is where it really gets awesome.

This way, you make sure that a great uplift in transactions also affects revenue positively.

Beware of nasty cause-and-effect traps: your treatment could push more visitors deeper into the funnel, yet negatively affect overall revenue!

Take this example: imagine you claim that all your products are free (when they're not). You will probably get lots of clicks on your products, but you won't make many transactions.

OK now, 5 ways to get A/B test ideas

Let's begin with these data sources of choice: 

  1. Heatmaps
  2. User testing: browsing the site in a customer's shoes
  3. Surveys: when you don't know, ask
  4. Case studies: what others do
  5. Previous test results: the most powerful source of inspiration (obviously, this source comes into play only after a few tests)

1) Using heatmaps to uncover opportunities

Now you know which pages should be optimised first. You may want to look at these pages with a heatmap tool like Crazy Egg or my favourite, Beampulse.

Within two weeks, I will blog about all the data you can extract from heatmaps, so I'll keep it short here. Heatmaps help you understand what visitors are doing on your site, and where they are clicking, or not.

You may identify that your navigation could really be optimised,

or that a banner is ineffective and takes up a lot of room on the page,

or that another element in the footer of the page is clicked a lot and should be moved up.

2) Now diving into UX: put yourself in a customer's shoes

Now, let's dive into what the website has to offer. What does it sell? What looks nice? Let's try to shop, or fill in that form. Is the experience smooth? Is everything clear, or would one need some explanation at some point in the customer's journey?

Usually, once you've done your analysis, you have an Excel sheet filled with ideas. You may also have a lot of questions to ask your clients, or your audience segments, through a survey.

3) Let visitors speak with a survey

If you suspect a page has conversion potential, but you don't really know why it doesn't convert well, then it's good to ask.


Whether your visitors' responses confirm what you think or give you new ideas, surveys are a powerful way to feed A/B testing ideas. And you will feel more confident about what your visitors think.

4) Case studies, knowing what successful optimizers do 

Be cautious here, because you can find lots of crap on the web. Prefer case studies from renowned optimizers over the first blog post about A/B testing you find. Bloggers aren't necessarily optimizers.

But that doesn't mean you shouldn't find inspiration in what others do. You should! Once you've done your analysis, reading case studies, lots of them, is a great source of data.

Case studies are real, not just pieces of advice that come from the dark void. 

To begin with, here are 3 reliable sources of Case studies:

  • WhichTestWon
  • Unbounce blog
  • VWO Blog

Refining your ideas with clear hypotheses: articulating your potential solution

I'll keep this section short, as it's the subject of an upcoming post. Criticising an element or a web page is easy, but there comes a time to be constructive.

If you think you have found the problem, focus on it, spell it out loud, and build your solution around it. The more laser-focused your statement, the better.

Why? Because in a well-designed test, you can confirm or discard your hypothesis based on the results.

Designing clear hypotheses is what allows you to do better tests month after month. This is crucial to progress by iteration.

In short... your hypothesis should be written in your spreadsheet before launching the test.

From "What to test" to "what to test first"

Now, it's time to choose your first A/B test.

Most likely, you have too many ideas. (It's strange to find so many "A/B testing ideas" articles, as ideas aren't really the hardest part.)

So you will have to make a choice: what changes have the greatest conversion potential?

Here are my personal criteria for scoring each idea:

  • What pages is the treatment applied to? All pages, a few, high- or low-traffic pages?
  • Analytics: do the numbers say the potential is huge?
  • Is the treatment easy, medium, or difficult to build?
  • Is the conversion goal quite important or very important to the client?
  • Should I wait for anything before launching the A/B test? (For instance, a design, or a promotional code from the client)
  • What is my gut feeling about it?

Then it's all about writing a score next to each hypothesis/treatment, and prioritising them.
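The prioritisation itself can be as simple as a weighted sum. Here is a toy sketch in Python; the weights, criteria names, and example ideas are all made up and should be adapted to your own spreadsheet:

```python
# Hypothetical weights: how much each criterion matters to you
WEIGHTS = {"traffic": 3, "data_potential": 3, "ease": 2, "goal_value": 2, "gut": 1}

def score(idea: dict) -> int:
    """Weighted sum of 1-5 ratings, one rating per criterion."""
    return sum(WEIGHTS[k] * idea[k] for k in WEIGHTS)

ideas = [
    {"name": "New tagline", "traffic": 5, "data_potential": 4,
     "ease": 5, "goal_value": 3, "gut": 4},
    {"name": "Checkout redesign", "traffic": 3, "data_potential": 5,
     "ease": 1, "goal_value": 5, "gut": 5},
]

# Highest score first: that's your next test
ranked = sorted(ideas, key=score, reverse=True)
print([(i["name"], score(i)) for i in ranked])
```

In this made-up example, the easy, high-traffic tagline change outranks the heavy checkout redesign, even though the redesign targets a more valuable goal.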

If you're interested in my personal Excel spreadsheet, please leave a comment below.

A word about testing software and setup

A lot of A/B testing software is great these days, so make your own choice. The leader is Optimizely, followed by VWO and Convert, and my favourite is Beampulse (check the full review of it here).

Regarding setup, all the tools I know are created equal:

If you have access to FTP, your backend, or a tag manager (like GTM), and your tags aren't messed up, it takes no more than 5 minutes.

Implementing your first A/B Test

With most A/B testing software, you will follow the same pattern.

  1. Who are you presenting the experiment to? On what pages?
  2. What is the treatment? What changes do you want for version B?
  3. For what goal?
  4. When? Weekdays, weekends, days or nights...

To prevent you from lingering in front of the software, it's good to have your testing scenario fresh and detailed in your mind beforehand. Or even better in an Excel spreadsheet.

1) Segmentation: who are you presenting the A/B test to?

Here is a list of common segmentation criteria that you will find in most A/B testing software:

  • Visitor is on page X
  • Returning or new visitor
  • Visitor has spent X minutes on the page, or has been active for X minutes
  • Visitor has seen at least X pages
  • Basket value is > X and < Y
  • An element is visible on screen (say, a promotional banner)
  • Any cookie value, or JavaScript variable
  • Custom dimensions: you can import CRM data to match specific customers, or sets of customers

2) Action: how are we going to create version B?

What to test?

That's basically marketing. If you could change anything on your site, what would you do?

With good A/B testing software you can do whatever you want. You can either:

i) Create a copy of your original page, then modify it as you like (kind of old school, but still useful).

You then have 2 URLs:

In some software, this type of experiment is called a "split test".

Modifying the original version often requires coding, which was limiting for many people. That's why modern A/B testing software developed features that let marketers change things without coding:

ii) Create a variation within the testing software, using a visual editor

Marketers hate waiting for the IT team to develop their great ideas. They describe this bottleneck as the most frustrating part of optimising a website.

If you've faced this bottleneck, give a testing tool like VWO or Beampulse a try. Here is what can be tested without any technical knowledge:

  • Changing texts, like your tagline in the header: perhaps the simplest yet most effective A/B test
  • Removing or adding a block
  • Changing the order in which blocks appear
  • Adding a layer for marketing messages: pop-ups, pop-ins, footer bars, header bars, tooltips...

And again, you don't need FTP or backend access to your platform. All changes are made in the testing software.

3) Conversion: what is our goal with this experiment?

Tracking abilities are more powerful than ever. We can learn the most effective marketing message for each segment of your audience within weeks, or even days!

Believe me: the more A/B tests you run, the more addictive it gets.

The question here is: what do we track?

This is the last step: you tell the software when to trigger a conversion. For example, when:

  • A key page has been visited (transaction receipt, thank-you page, top-of-the-funnel page, registration page...)
  • A call-to-action has been clicked (sometimes easier to set up than a page URL goal, when the site is dynamic)

These 2 goals are the most common, but more advanced options (cookie values, JavaScript variables...) are available if needed.

Delivering the promise

Now you get this nice curve with a positive trend… what we all want. One last thing of importance:

Are the results reliable (i.e. statistically valid)?

I bet that after launching your first A/B test, you will keep coming back and forth to check how your treatment is doing versus the control.

Even if your website has enough traffic, you should wait.

To keep it simple, here are 4 criteria you should validate before declaring a winner:

  1. There is no variation in the trend during the last third of the experiment; curves A and B should have been parallel for a while.
  2. The test should last at least two weeks, preferably three.
  3. The statistical confidence of the test should be at least 95%.
  4. And finally, each variation should have at least 100 conversions. This number highly depends on the performance of the variation: if you only have, say, a 5% uplift, then you need more than 100 conversions per variation; with a 300% uplift, the required sample size is much smaller.

If all 4 boxes are ticked... pop the Champagne corks!
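If your tool doesn't show it, the statistical confidence in criterion 3 can be approximated with a standard two-proportion z-test. This sketch is a textbook approximation, not any specific vendor's calculation, and the conversion counts below are hypothetical:

```python
from math import erf, sqrt

def confidence(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided confidence that A and B truly differ (pooled z-test)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = abs(p_a - p_b) / se
    return erf(z / sqrt(2))  # P(|Z| < z) under a standard normal

# 2.0% vs 2.7% conversion on 5,000 visitors each: above the 95% bar
print(confidence(100, 5000, 135, 5000))
```

Note that criterion 4 still applies: with very few conversions per variation, the normal approximation behind this test becomes unreliable.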

What if your tests get negative results?

That will happen to everyone, for sure. Even the renowned optimiser Peep Laja estimates that only 60 to 70% of his tests get positive results.

A key to success is what you do with negative results. CRO isn't the new Santa, after all. Every failed test should be analysed in depth. It may lead you to run the test again with a stronger hypothesis, or a narrower segment.


Do you have a better overview of what A/B testing is?

If you began this article thinking that A/B testing was like throwing pieces of your website at the wall to see what sticks, I hope you've changed your mind.

Testing the colour of a button and a few other random things just leads nowhere.

If you follow the process described above, you'll build successful A/B tests sooner. Results come test after test. CRO is all about iteration.

Don't be put off if you don't have all the necessary skills yet.

Testing requires lots of skills, and successful optimisers have very different backgrounds. Focus on your strengths, and the rest will come with time.

Did you begin A/B testing recently? Do you enjoy data-driven marketing and want to share a story? Or maybe you're struggling to take the plunge? Leave a comment below, and I'll do my best to help.
