You can hire my services

I am Ben Lang, an independent web conversion specialist with over 20 years of experience in IT and digital and 12 years of Conversion Rate Optimization (CRO) know-how. I provide a full analysis of your website's conversion performance and execute tried-and-tested CRO exercises through A/B testing, split testing or MVT (multivariate testing) to fix your online conversion issues. Contact me at https://www.benlang.co.uk/ for a day rate, or catch up with me on LinkedIn

Culling your test variants


As an Optimization team, we are new to the whole multivariate testing business. We are more than aware that we don't always do things by the book, so when we undertake a multivariate test we don't always adhere to the basic principles of testing.

The biggest rule we tend to ignore is 'Do not tamper with your test'. The trouble is we always have to keep an eye on the bottom line, so we are constantly asking ourselves whether our testing is impacting sales in a negative way. Are we actually reducing the conversion rate on the website?
My colleague generally monitors what's going on downstream during a test and looks at the basic application submission rates for our products. If he notices a downturn in conversion rate during the course of a test, we get a bit nervous.

Thankfully our multivariate testing tool, Maxymiser, allows us to look at how individual variants are performing. If after a period (usually around one week into a test in our case) we start to see a downturn, we'll closely examine which variants we can 'cull' from the test.
Once we highlight the under-performers, we then downweight* them out of the test entirely. This is beneficial for two reasons:

1. You minimise negative impact on conversion and sales.

2. You reduce the number of page combinations in the test.

The lower the number of page combinations, the quicker your test period. This is great for us because of the second rule of testing that we frequently ignore: 'Allocate enough time for testing'. Basically speaking, we run tests for a far shorter period than is recommended.
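To see why culling helps so much, it's worth doing the arithmetic. In a full-factorial multivariate test the number of page combinations is the product of the variant counts in each test area, so removing even one variant per area shrinks the test considerably. A minimal sketch in Python, using made-up variant counts rather than our actual test setup:

```python
from math import prod

# Hypothetical setup: two test areas ("MaxyBoxes"), one with 4 pieces of
# content (default + 3 variants) and one with 5 (default + 4 variants).
variants_per_area = [4, 5]

# A full-factorial test serves every combination of area contents.
total_combinations = prod(variants_per_area)
print(total_combinations)  # 20 combinations to gather data for

# Culling one under-performer from each area shrinks the test.
after_culling = prod(n - 1 for n in variants_per_area)
print(after_culling)  # 12 combinations, so each remaining one collects data faster
```

With a fixed amount of daily traffic, fewer combinations means more visitors per combination per day, which is what shortens the test.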
Most tests, given enough visitor traffic to your site, run anywhere from 4 to 10 weeks, or even longer. We tend to run tests for 2 to 6 weeks. Our excuse for this is that there is so much other activity going on on the website at any given time by other people that we have a very narrow window in which to test and get a result.

Another key thing when planning your MVT test is knowing how much traffic you get to your site, and whether you've got enough traffic to run all your page combinations and see an outright winner at the end of your test. So far we've been reasonably lucky in that we've had enough traffic to run the tests for a relatively short period and still achieve a winner.
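As a rough guide to the 'enough traffic' question, the standard two-proportion normal approximation gives a ballpark for how many visitors each page combination needs before a given uplift becomes detectable. This is a hedged sketch, not how Maxymiser itself calls a winner, and the base rate and uplift figures below are purely illustrative:

```python
from math import ceil

def visitors_per_combination(base_rate, rel_uplift, z_alpha=1.96, z_power=0.84):
    """Approximate visitors needed per combination to detect a relative
    uplift at ~95% significance and ~80% power (normal approximation)."""
    p1 = base_rate
    p2 = base_rate * (1 + rel_uplift)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((z_alpha + z_power) ** 2 * variance / (p2 - p1) ** 2)

# Illustrative figures: a 60% page pass rate and a 5% relative uplift.
n = visitors_per_combination(0.60, 0.05)
print(n)  # roughly 4,000 visitors per combination
```

Multiply that per-combination figure by the number of combinations in the test, divide by your daily traffic, and you get a rough test duration in days.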

Obviously, ignoring key testing rules and principles is not recommended. But if, like our Optimization team, you're stuck between a rock and a hard place and there's a pressing need to get some kind of testing done, our early experiences have shown that you can bend the rules to get some kind of learning or outcome in a short space of time.
* In multivariate testing, each variant is usually allocated a weighting. For example, if you give a variant a 50 weighting in the test console, it will be served 50% of the time, while the default content is served the other 50% of the time.
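To make the weighting idea concrete, here is a minimal sketch that treats each weight as a relative share of traffic (an assumption on my part; the exact semantics depend on your testing tool). A culled variant is simply downweighted to 0 so it is never served:

```python
# Hypothetical weightings for one test area: default content plus three
# variants, with variant_c culled by downweighting it to 0.
weights = {"default": 50, "variant_a": 25, "variant_b": 25, "variant_c": 0}

# Each entry's serve probability is its weight over the sum of all weights.
total = sum(weights.values())
serve_share = {name: w / total for name, w in weights.items()}
print(serve_share)
# {'default': 0.5, 'variant_a': 0.25, 'variant_b': 0.25, 'variant_c': 0.0}
```

The upshot is that culling doesn't require removing the variant's content from the tool at all; setting its weight to zero takes it out of the rotation.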

Our 1st multivariate test - Killer Questions




We have an application form that is used for various products, in this case we focused on the one used for an online current account.
The first page of this app form is what we call the 'Killer Questions' page. This is where you set the scene for the applicant in terms of the criteria they have to meet to continue with their application and what bits of information they're going to need to have to hand to complete the application process.
Typically we could expect around a 35-40% drop-off rate for this page. This could of course be because people simply don't meet the criteria and depart the application process. Alternatively, it could be down to the page design, which was by nature quite weighty given the volume of information that needed to be conveyed to the user at that point. Based upon the hypothesis that the content may be at fault, we decided this was the best candidate for hosting our first multivariate test. Here we would test different content on the killer questions page to see if an uplift in the number of people commencing the application could be achieved. We looked to gain a 5% uplift overall.
The Killer Questions page looked rather like this.
The 2 boxes are 'MaxyBoxes'.
To carry out multivariate testing effectively on your website you're going to need a tool to host and serve the alternative page content you want to test, and additionally a means of reporting the progress and outcome of your test. We opted for Maxymiser to achieve all this. The MaxyBoxes you see there represent the areas of the page that will have variant content served to the visitor.
So basically in this particular test we wanted to see what would happen when we served different security images in the top-right of the page and different page content or layout in the second MaxyBox on the page (the larger box).
We split the traffic 50-50: half being served the default content, the other half a combination of new test variants we'd come up with. Coming up with alternative content is a whole new learning experience in itself, as no one in our Optimization team is a copywriter or creative type, but I'll go into this in more detail in a later post.
After several weeks of multivariate testing we ended up with a winner...

You may notice that this has a different security image to the original and simplified page copy, and that all the heavy, intimidating legal wording was moved to a pop-up link.
This page design resulted in 8% more people passing through the killer questions page compared to the default design. It also resulted in 5% more people going on to submit the application. When accept rate* was taken into account the new page design resulted in a 20% increase in productivity over the default design.
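The jump from a 5% submission uplift to a 20% productivity gain makes sense once you treat the journey as a funnel: stage-level gains multiply together. The absolute rates below are invented for illustration; only the uplift ratios echo the figures above:

```python
# Hypothetical funnel: pass killer questions -> submit -> get accepted.
def accepted_per_visitor(pass_rate, submit_rate, accept_rate):
    return pass_rate * submit_rate * accept_rate

# Made-up default-page rates for each stage.
default = accepted_per_visitor(0.60, 0.50, 0.70)

# 8% more visitors pass the page and 5% more submit overall, so the
# per-passer submit rate moves by 1.05/1.08; a 20% overall gain then
# implies the accept rate moved by 1.20/1.05.
variant = accepted_per_visitor(0.60 * 1.08,
                               0.50 * (1.05 / 1.08),
                               0.70 * (1.20 / 1.05))

uplift = variant / default - 1
print(round(uplift * 100, 1))  # 20.0
```

The multipliers cancel down to 1.08 × (1.05/1.08) × (1.20/1.05) = 1.20, which is why the end-to-end gain is larger than any single stage's uplift suggests.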
This test, although it's our first ever, will probably turn out to be our most important. This is because it was at the very last point in the visitor's journey that we could influence whether they buy into the product or not. It's make-or-break time when they're on the doorstep of the application form, the point at which they choose to purchase or run away.
* Note: you can apply for a current account, but you might not necessarily be accepted as a customer, hence an accept rate.