In our prior blog we talked about the sheer scale of the customer experience optimization problem.
We did this in the hope that the A/B and MVT testing community would realize that experimenting with only a few page elements is a drop in the bucket, and that effective optimization initiatives require a tool that can experiment with a broad spectrum of variations at the same time.
Optimization experts, who are often the biggest proponents of A/B testing, will challenge the premise above with the argument that the number of elements tested and MVT capabilities do not matter. Instead, they will contend that optimization success depends more on the quality of the optimization variables.
We absolutely agree that the quality of the optimization variables matters. No optimization tool can turn ‘dirt’ into gold. However, good optimization variables, formulated on the basis of an expert’s empirical experience, are just the starting point of the optimization journey. They are only hypotheses, albeit good ones. Experienced and informed designers are not clairvoyants, and no sane person can guarantee that they will always produce positive results.
Each company has its own unique set of properties, and therefore the results are unique to each company. It is not easy to uncover experiences that perform better than what you already have, and the process of optimization carries a certain level of risk. The probability of success increases not only with the quality of the optimization variables, but also with the ability to use a ‘bigger fishing net’ that can try larger numbers of options at the same time, and with smart technology that can learn and adapt to visitor preferences in real time.
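To make the “learn and adapt in real time” idea concrete, here is a minimal epsilon-greedy bandit sketch. The variant names and conversion rates below are invented for illustration; real systems use more sophisticated allocation, but the principle of shifting traffic toward better-performing experiences is the same:

```python
import random

# Hypothetical conversion rates for three page variants (unknown to the algorithm).
TRUE_RATES = {"A": 0.09, "B": 0.10, "C": 0.12}

def epsilon_greedy(epsilon=0.1, visitors=50_000, seed=42):
    """Serve variants to simulated visitors, learning conversion rates as we go."""
    rng = random.Random(seed)
    shown = {v: 0 for v in TRUE_RATES}
    converted = {v: 0 for v in TRUE_RATES}
    for _ in range(visitors):
        if rng.random() < epsilon:                      # explore a random variant
            variant = rng.choice(list(TRUE_RATES))
        else:                                           # exploit the current best estimate
            variant = max(shown,
                          key=lambda v: converted[v] / shown[v] if shown[v] else 1.0)
        shown[variant] += 1
        if rng.random() < TRUE_RATES[variant]:
            converted[variant] += 1
    return shown, converted

shown, converted = epsilon_greedy()
print(shown)  # most traffic should flow to the best-performing variant
```

Unlike a fixed-split test, most visitors end up seeing the winning experience while the test is still running, which is the appeal of adaptive approaches.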
In this post we will add another dimension to customer experience optimization: the importance of interactions between visitor attributes and site changes.
The exciting digital world
By its nature, the digital medium is interactive. Visitors have the opportunity to click and control their buying journey, and e-Commerce companies have the opportunity to track, segment, and record visitor actions, and ultimately to act, making site changes to produce better outcomes.
We use the term customer experience optimization to describe the acting part: the set of site optimization activities whose role is to find which version of the customer experience performs best.
The exciting opportunity is that in the online world almost every aspect of web traffic and the visitor’s buying journey can be tracked, and sites can be changed dynamically. The biggest challenge stems from the same fact: precisely because everything is trackable and dynamic, making the right decision is not a trivial undertaking.
One common way of trivializing the optimization problem is to ignore interactions that exist between visitor attributes and page element changes.
In our desperate need for simplicity, clarity, and certainty, we would rather accept the premise that everything we do is a set of independent activities than face the reality that every activity or change interacts with everything else.
Digital marketing fuzzy math: 4.83% + 3.12% = -1.83%
As we already mentioned in our prior blog, it is common practice to invest in a premium multivariate testing (MVT) solution, discover that multivariate testing is a very slow process, and retreat to running simple A/B tests.
The rationale is that a company cannot wait forever for statistically significant results from an MVT solution. Instead, companies want to innovate in an agile way through quick A/B tests, learn quickly, compile the winning outcomes into a winning combination, implement, and prosper.
This would be perfectly fine if not for that pesky thing called page element interaction.
Here is a real-life example. The numbers are real, but the web page was changed to anonymize the client’s identity.
A major brand ran two independent A/B tests on their product detail page:
Test #1: Free shipping banner
Test #2: Call-to-Action button
As many companies normally do, they combined the test results and created a ‘winning’ new version of the page. This time they also ran a follow-up test, only to find that the new version of the product detail page underperformed the baseline:
Although the results from the follow-up experiment were surprising at first, they can be explained rationally: when presented one at a time, each winning element was able to attract the visitor’s attention and produce the expected result. Once placed together, however, these elements compete for the visitor’s attention, creating subconscious anxiety and lowering results.
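This failure mode can be reproduced with simple arithmetic. The rates below are hypothetical, chosen to echo the “fuzzy math” in the heading: each change wins in isolation, yet a negative interaction term makes the combination lose:

```python
# Hypothetical conversion rates illustrating an interaction effect:
# each change wins on its own, yet the combination loses.
base        = 0.0900                 # baseline conversion rate (assumed)
banner_only = base * (1 + 0.0483)    # +4.83% lift when tested alone
button_only = base * (1 + 0.0312)    # +3.12% lift when tested alone
combined    = base * (1 - 0.0183)    # -1.83% when both changes ship together

def lift(p, baseline=base):
    """Relative lift over the baseline, in percent."""
    return (p / baseline - 1) * 100

# What naively "adding the lifts" predicts for the combined page:
naive = base * (1 + 0.0483 + 0.0312)

# The interaction term is whatever remains after both individual lifts:
interaction = combined / (base * (1 + 0.0483) * (1 + 0.0312)) - 1

print(f"naive prediction: {lift(naive):+.2f}%, actual: {lift(combined):+.2f}%")
print(f"interaction term: {interaction:+.2%}")
```

The point of the sketch: the two individual lifts tell you nothing about the interaction term, and only an experiment that tests the combination itself can measure it.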
The Science Behind Multivariate Optimization
When a visitor sees a web page for the first time, the eye is not able to absorb the entire screen image all at once.
Instead, the human eye, wired through the fight-or-flight response, notices highly visible things first. For example, certain colors like red draw attention to elements such as calls to action or banner ads. Less visible items are noticed and processed later. This also means that seeing a web page is very different from reading a book. The human eye will not process the web page from left to right and top to bottom; rather, it will process visual data according to the visual order of page objects.
The order in which visual images are absorbed determines the visitor’s reaction. The goal of your customer experience optimization initiative is to create the optimal order in which the web page is absorbed by the visitor, and by doing so deliver a great customer experience that leads to great end results.
When you make multiple simultaneous changes to your web page, you are in essence ‘shuffling the deck’ and changing the order in which the page elements are perceived. Often a small, virtually unnoticeable change can significantly alter the order in which the page image is absorbed, and that can trigger a significant increase in performance.
To illustrate the point, let’s use an example below that shows a typical product detail page with key elements marked in red:
Furthermore, let’s assume that the baseline user experience, the order in which page elements are noticed by the human eye, follows the sequence below:
- Upon loading the web page, the visitor’s eye will see the big product picture first
- Next, the visitor will notice the social widgets
- Then the visitor will focus on the different product views
- If the visitor likes the product, they will notice the free shipping offer
- The visitor will pay attention to the product price
- The visitor will click the Add-to-Cart button
- Before leaving the page, the visitor will also notice the breadcrumbs placed above the product picture
The measurable average result of this progression is that 9.23% of all visitors will select and buy a product.
Now, let’s play out a multivariate experiment by introducing the following page changes:
- Breadcrumbs: reduce congestion at the top of the web page by moving the breadcrumbs to the bottom of the page
- Free Shipping: make the free shipping banner more visible by placing an orange background behind the free shipping label
- Social Links: reduce clutter by moving the social links below the product views
- Add to Cart: make the button more visible through a maroon color
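With each of these four elements either changed or left at baseline, the experiment space is a small full factorial: 2^4 = 16 distinct page versions, which is what a multivariate test actually explores. A sketch, with hypothetical labels:

```python
from itertools import product

# The four proposed changes; each can be applied or left at baseline.
changes = ["move breadcrumbs", "orange free-shipping banner",
           "move social links", "maroon Add-to-Cart button"]

# Every combination of on/off states is one page version the MVT explores.
variants = list(product([False, True], repeat=len(changes)))
print(len(variants))  # 2**4 = 16 page versions, vs. 2 at a time in an A/B test
```

Four sequential A/B tests see only 8 of these 16 versions in total, and never the one where all four changes interact at once.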
Visually, the new version of the web page looks as shown below:
The visual impact of the orange free shipping banner and the maroon Add to Cart button could be that they made other page elements less visible, such that the new visitor experience looked like the picture below:
The new experience can be described as follows:
- The visitor noticed free shipping first, which created an incentive to proceed
- They liked the product
- The price was right
- They made a purchase
The final outcome of this new experience was that the percentage of visitors who purchased the product grew from 9.23% to 11.56%, a very significant lift of +25.16%.
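A quick sanity check on the arithmetic, assuming hypothetical visitor counts (the example gives rates only): computed from the rounded rates, the relative lift is roughly +25.2%; the +25.16% quoted above presumably comes from unrounded underlying data. A two-proportion z-test is one standard way to confirm such a difference is not noise:

```python
from math import sqrt

# Conversion rates from the example above; the visitor counts are
# hypothetical assumptions, chosen only to make the arithmetic concrete.
n_base, p_base = 20_000, 0.0923   # baseline experience
n_new,  p_new  = 20_000, 0.1156   # new multivariate experience

relative_lift = (p_new / p_base - 1) * 100
print(f"relative lift: {relative_lift:+.2f}%")

# Two-proportion z-test (normal approximation).
pooled = (n_base * p_base + n_new * p_new) / (n_base + n_new)
se = sqrt(pooled * (1 - pooled) * (1 / n_base + 1 / n_new))
z = (p_new - p_base) / se
print(f"z = {z:.1f}")  # |z| > 1.96 means significant at the 95% level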
Note: no one can actually visualize the average user experience; the example above is our attempt to illustrate the importance of the multivariate approach and the magnitude of element interaction. We routinely see 10%–20% revenue lift created by changes that most perceive to be “small” or “invisible”.
One view of the interactions between web page changes is that they are a ‘pain in the butt’ because they complicate the customer experience optimization process.
The other view is that element interactions, if properly harnessed, provide a huge new opportunity for revenue growth.