A/B Testing

Most Frequently Asked Questions

What is A/B testing?

A/B testing (also known as split testing) is a method of comparing two or more versions of a website or app to determine which version performs better against a defined goal.

Why A/B testing?

Quantitative data speaks for itself. Until now, you and your team could only guess how visitors would respond to a particular design of your website or app. A/B testing gives you the opportunity to show visitors two versions of the same page and, based on the results, determine which variant works better. Continuously testing and optimising your website based on the test results helps you improve website goals such as sales, revenue, donations, leads, registrations, downloads or the amount of user-generated content, while providing valuable insight into visitor behaviour.

A/B testing also allows you to compare the effects of different prices, promotions and similar factors. The goal is always to optimise a product or service so that it appeals to as many users as possible and leads them to a (purchase) action. A/B testing is designed to improve website performance and typically increases conversion rates.

How does A/B testing work?

In A/B testing, you use live traffic to test two versions of a website – version A (the original) and version B (the variant) – and measure the effect each version has on the conversion rate. Once enough visitors have passed through the test and the results are statistically significant, the test is stopped and a winning variant is determined. Subsequent A/B tests build on the insights gained in previous tests.
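The significance check described above is typically a comparison of two conversion rates. As a minimal, illustrative sketch (the article does not prescribe a particular statistical method), a two-proportion z-test with hypothetical conversion numbers might look like this:

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)           # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))         # two-sided p-value
    return z, p_value

# Hypothetical numbers: 200/10,000 conversions for A vs. 250/10,000 for B
z, p = two_proportion_z_test(200, 10_000, 250, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these made-up numbers the p-value falls below the conventional 5% threshold, so the test could be stopped and variant B declared the winner; with smaller samples the same 2.0% vs. 2.5% difference would not yet be significant.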

If you want to run an A/B test, you must first define a goal for your website or app, against which the conversion rate in the test is measured. You can also measure the effect of each variant on multiple website goals. The variant with the best goal achievement is then permanently implemented.

Targeted testing puts an end to guesswork when optimising your site and lets you make data-driven decisions instead. “We believe …” becomes a thing of the past, replaced by “We know …”.

Which A/B testing tool should I use?

When it comes to A/B testing, every optimiser sooner or later faces the question of which tool to use. The number of optimisation solutions on the German market has multiplied in recent years. You should therefore take time to select a solution that suits your situation, or appoint an independent service provider.

A/B tests vs. multivariate testing

A/B testing, also called split testing, divides the users of the website or app into two subgroups during the test phase: group A and group B. There are likewise two variants of the website or app under test: the original and the variant. In a 50% / 50% split test, for example, 50% of users (group A) are directed to the original page and 50% (group B) to the variant, and the conversion rates achieved in each group are compared with respect to the relevant website goal.
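In practice, the split into group A and group B is usually made deterministic, so that a returning visitor always sees the same variant. One common approach (a sketch, not tied to any particular tool) is to hash a visitor ID into a bucket:

```python
import hashlib

def assign_group(user_id: str, split: float = 0.5) -> str:
    """Deterministically assign a visitor to group A or group B.

    Hashing the user ID (instead of picking randomly per request)
    guarantees the same visitor always sees the same variant.
    """
    digest = hashlib.md5(user_id.encode()).hexdigest()
    bucket = int(digest, 16) % 10_000 / 10_000   # value in [0, 1)
    return "A" if bucket < split else "B"

print(assign_group("visitor-42"))  # stable: same group on every call
```

Changing the `split` argument to, say, 0.9 would send 90% of traffic to the original and only 10% to the variant, which is sometimes done to limit risk.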

Multivariate testing is another method of optimising a website to maximise the conversion rate, alongside A/B testing. While A/B testing compares only two different website variants, multivariate testing changes several variables on one page at the same time and compares the resulting combinations. The goal of multivariate testing is to find out which combination of variables produces the most positive impact on conversion rates.
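Because every combination of variable values becomes its own variant, the number of variants grows multiplicatively. A small sketch with hypothetical page elements makes this concrete:

```python
from itertools import product

# Hypothetical page elements and the values under test
variables = {
    "headline": ["Buy now", "Start free trial"],
    "button_color": ["green", "orange", "blue"],
    "image": ["product", "lifestyle"],
}

# Every combination becomes one page variant: 2 * 3 * 2 = 12 here
variants = [dict(zip(variables, combo)) for combo in product(*variables.values())]
print(len(variants))  # 12
```

This combinatorial growth is why multivariate tests need considerably more traffic than a simple A/B test: each of the 12 variants above receives only one twelfth of the visitors.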

Iterative vs. innovative A/B testing

Iterative testing means testing minor changes such as:

  • Changing the color of a button
  • Changing the placement of a “call to action” button
  • Changing the wording of a heading
  • Changing a particular image or its size

For iterative tests, the first question is whether sufficient traffic and conversions are available to achieve statistically significant results from small site changes. Can I simply change the color, headline or image of a button and watch conversions increase dramatically? For many smaller companies, the answer is unfortunately “no”.

Iterative testing is most appropriate if an asset receives a significant number of conversions per month; otherwise it would take too long to reach a statistically significant result. If you need six months or more to generate enough conversions, other factors in your traffic may change so much that they distort the test.
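Whether your traffic is sufficient can be estimated up front with a standard sample-size approximation. The following sketch uses the common z-values for a 5% significance level and 80% power (the article itself names no formula, so treat the numbers as illustrative):

```python
import math

def sample_size_per_variant(base_rate, min_detectable_lift,
                            alpha_z=1.96, power_z=0.84):
    """Rough visitors needed per variant to detect an absolute lift.

    base_rate:           current conversion rate, e.g. 0.02 for 2%
    min_detectable_lift: smallest absolute change worth detecting
    """
    n = (2 * (alpha_z + power_z) ** 2
         * base_rate * (1 - base_rate)
         / min_detectable_lift ** 2)
    return math.ceil(n)

# 2% baseline; we want to detect a lift to 2.5% (absolute delta 0.005)
n = sample_size_per_variant(0.02, 0.005)
print(n)
```

Dividing the result (per variant, so roughly double it for the whole test) by your monthly conversions-relevant traffic gives the test duration the paragraph above warns about: if that comes out at six months or more, an iterative test of this size is probably not worthwhile.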

Iterative tests generate quick wins. They are easy to implement, and with the right amount of traffic you can test numerous conversion points across the entire website. In addition, once these tests identify winning variations, it is generally easy to apply the recommendations to similar pages.

For example, if changing the color of the “Add to Cart” button results in a 3% increase in e-commerce sales, the change should be implemented on other pages on your site that also use that button type.

At the other end of the A/B testing spectrum are innovative tests. Rather than making a simple change to a single element on the page, innovative tests modify the entire page with a completely different design concept. The purpose of an innovative test is to see whether a radical shift can be responsible for greater conversion gains. Each of the two versions of the page is shown to 50% of the audience, and conversions are measured across both types of pages.

Which type of testing is the best for me?

Choosing the test that best suits you depends on your traffic and your goals. With frequent iterative tests, you typically see smaller gains of 1-4%, occasionally higher; sometimes there is no noticeable improvement at all.

With innovative tests, the results can be significantly higher than with small iterative tests, but the effort to get there is greater. Since you need to completely redesign an entire page or section of your site, each test takes longer to set up. It is not uncommon for innovative tests to generate lifts of 10-25% or more, so they can have a truly ground-breaking impact on lead capture or conversion rates. The biggest problem with innovative A/B testing is that it can be difficult to know exactly which change, or combination of changes, triggered the increase in conversions.
