You can hire my services

I am Ben Lang, an independent web conversion specialist with over 20 years of experience in IT and digital and 12 years of Conversion Rate Optimisation (CRO) know-how. I provide a full analysis of your website's conversion performance and the execution of tried and tested CRO exercises, using A/B testing, split testing or multivariate testing (MVT) to fix your online conversion issues. Contact me at https://www.benlang.co.uk/ for a day rate, or catch up with me on LinkedIn.

The optimisation of a landing page test

We have an A/B test running on a product page and a landing page simultaneously to see which location converts better. The background to this is that we once used landing pages a lot, and then the industry as a whole (banking) moved over to dropping people straight onto a product page. It became the established wisdom that product pages were better than landing pages for optimising your traffic, and this test sought to revisit that hypothesis. We decided to use a specific traffic source, identify those visitors on page load and split the traffic 50/50: half remaining on the product page, half redirected to a bespoke landing page.
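
Maxymiser handled the split and redirect for us, but for anyone curious, the assignment logic amounts to something like the minimal sketch below (Python for illustration only; the visitor id and URLs are hypothetical). A hash-based bucket keeps the assignment sticky, so a returning visitor always lands on the same variant.

```python
import hashlib

# Hypothetical URLs for illustration; the real test pages aren't named in this post.
PRODUCT_PAGE = "https://example.com/product"
LANDING_PAGE = "https://example.com/landing-offer"

def assign_variant(visitor_id: str, landing_share: float = 0.5) -> str:
    """Deterministically bucket a visitor so the same id always
    sees the same page on repeat visits."""
    digest = hashlib.md5(visitor_id.encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF  # roughly uniform value in [0, 1]
    return LANDING_PAGE if bucket < landing_share else PRODUCT_PAGE

print(assign_variant("visitor-42"))  # the redirect target for this visitor
```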

As well as testing the two locations, we wanted to test a few other optimisation concepts we'd heard about:

1. Do bespoke welcome messages help convert users to apply for the product?
2. Does the presence of primary navigation on a landing page actually help visitor conversion?
3. Does a 'bandwagon' message provide uplift, e.g. "join the thousands of other visitors in choosing this product"?

When we started the test we soon found that the traffic source chosen as the test audience was a lot smaller than anticipated. Within a few days we had to abandon the 50/50 weighting and send 100% of the traffic to the landing page, with a view to retesting product pages against landing pages at a future date. Had we left it as it was, we would still have been running the test two years later, so low was the traffic to the page from that specific source.
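
To see why, it's worth running the numbers. This isn't the calculation we used at the time, but the standard two-proportion sample-size formula gives a feel for how much traffic a test like this needs; the 7.5% base rate and 10% relative uplift below are illustrative figures, not the test's actual ones.

```python
from math import ceil
from statistics import NormalDist

def sample_size_per_variant(base_rate, min_detectable_uplift,
                            alpha=0.05, power=0.80):
    """Approximate visitors needed per variant to detect a given
    relative uplift with a two-proportion test."""
    p1 = base_rate
    p2 = base_rate * (1 + min_detectable_uplift)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided 95% -> 1.96
    z_beta = NormalDist().inv_cdf(power)           # 80% power -> 0.84
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2)

# e.g. a 7.5% base conversion rate and a 10% relative uplift
print(sample_size_per_variant(0.075, 0.10))  # roughly 20,000 visitors per variant
```

At a few hundred visits a week from a single source, twenty-odd thousand visitors per variant is a multi-year wait, which is exactly the trap we'd walked into.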

As the test is still running, we're unable to answer question 1, 'Do bespoke welcome messages help convert users to apply for the product?'. However, within the first two weeks we were able to answer questions 2 and 3. The primary navigation in the landing page performed very badly, allowing users to leak out of the test page without converting. The welcome message using the bandwagon theme also failed to convince any users to apply and likewise performed very badly. Both the primary navigation and the bandwagon message are shown in the test page combination below (as taken from our Maxymiser test console). Upon conclusion of this test we will re-run the winning variants on the product page to see which page gets the biggest uplift in conversion; this will be covered in part two of this post.

What is Statistical Significance?

I've rather overlooked this topic since establishing this blog, but for completeness, shall we say, I think I should now cover the role of statistical significance in optimisation testing.

One of the biggest headaches in running an A/B test or multivariate test on your website is knowing when your test is complete, or at least heading towards a conclusion. Essentially, how do you separate signal from noise?

Many third-party tools give you metrics to determine a test's conclusiveness; for example, the Maxymiser testing tool displays a 'Chance to beat all' metric for each page combination or test variant within your test. More importantly, what underpins these metrics is the concept of statistical significance. Essentially, a test result is deemed significant if it is unlikely to have occurred through pure chance; a statistically significant difference means there is statistical evidence that a difference really exists.
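
Maxymiser doesn't publish the exact maths behind 'Chance to beat all', but a 'chance to beat' style figure can be approximated with a simple Bayesian simulation: sample plausible conversion rates for each variant from its observed data and count how often one beats the other. A minimal sketch, using the figures from the worked example below:

```python
import random

def chance_to_beat(conv_a, n_a, conv_b, n_b, draws=100_000):
    """Estimate P(variant A's true conversion rate beats variant B's)
    by sampling each rate from a Beta posterior (uniform prior)."""
    wins = 0
    for _ in range(draws):
        rate_a = random.betavariate(conv_a + 1, n_a - conv_a + 1)
        rate_b = random.betavariate(conv_b + 1, n_b - conv_b + 1)
        if rate_a > rate_b:
            wins += 1
    return wins / draws

# The worked example below: 118/1,529 vs 106/1,434 conversions.
print(chance_to_beat(118, 1529, 106, 1434))  # roughly 0.6, nowhere near conclusive
```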

Establishing statistical significance between two sets of results allows us to be confident that we have results that can be relied upon.

As an example, suppose you have an A/B test with two different page designs. Analysing the data gives two results:

Page 1 - 1,529 generations with 118 responses or actions - giving a conversion rate of 7.72%.
Page 2 - 1,434 generations with 106 responses or actions - giving a conversion rate of 7.39%.

Looking at the two results, which do you think is the better? Is page 1 better because it has a higher conversion rate than page 2? Using statistics and firing those two results through a basic statistical significance calculator (I'm using Google's Optimizer test duration calculator) tells us that the two results are 0.335218 standard deviations apart and therefore not statistically significant. This suggests it is highly likely that noise is causing the difference in conversion rates, so plough on with your testing. If 95% statistical significance is achieved you can safely say the test is conclusive, with a clear winner. That is also indicative of a strong signal and gives you a result on a wholly statistical basis, as opposed to human interpretation.
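
If you'd rather not rely on a calculator, the figure above is just a two-proportion z-test and is easy to reproduce yourself; a minimal sketch (the result matches the calculator's output to rounding):

```python
from math import sqrt

def z_score(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: how many standard deviations apart
    the two observed conversion rates are."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)  # overall rate if the pages were identical
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

print(z_score(118, 1529, 106, 1434))  # ~0.335, far below the ~1.96 needed for 95%
```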

The HiPPO factor

Just learnt a great new acronym today: 'HiPPO', the Highest Paid Person's Opinion. Never has it been more apt than when you're doing an optimisation exercise on your website and the HiPPO blows your proposed new design out of the water because they don't like it, based upon purely subjective thinking.

I found this on Dave Chaffey's site in an interview with Avinash Kaushik, author of Web Analytics: An Hour a Day.

The reptilian brain & your inner buy button

When you do A/B testing or multivariate testing you are generally experimenting with a person's unconscious response to your test content: does a red apply button work better than an orange one, for instance? Marketing experts refer to this as connecting to, or appealing to, the primitive or 'reptilian' brain. When someone arrives on a landing page and makes a quick, intuitive decision, you are generally dealing with the inner, more reactive brain, not the higher conscious brain. You're not asking for any in-depth analysis of your page content; you just measure what works best for getting a person to perform the desired action in your test. It seems that for the majority of the time the reptilian brain is at the driving wheel, so we need to talk directly to the driver.

This is considered to be neuromarketing in its most basic form: the study of how we respond to adverts and products at a neurological level. In research in this area, volunteers have their brain activity monitored via brain scanners while being exposed to marketing media, to measure their responses. The ability to understand how the inner workings of the brain process images and messages and reach decisions potentially gives marketers a new tool to fine-tune ads and marketing campaigns, bolster and extend brands, and design better products. Marketers' use of neuroscience technologies has alarmed some consumer groups, who fear it could lead to the discovery of an 'inner buy button' which, when pressed, would turn us into automated shoppers. Such fears spring from the increase in marketing-related problems such as pathological gambling, obesity and Type 2 diabetes.