If Amazon jumped off a bridge, would you do it too?

In the e-commerce world, the answer might just be a resounding “YES!” I mean, why not? Buying on Amazon is a no-brainer. If even your mom can handle purchasing her toilet paper online, it’s tempting to conclude that Amazon’s design choices are gold-standard e-commerce principles that will work for any audience. There’s no risk in following in Amazon’s highly tested and researched footsteps, right?

Wrong.

What works for Amazon’s customers will not necessarily work for your customers. I repeat. What works for Amazon’s customers will not necessarily work for your customers.

Let me share a personal experience.

It was a day like any other. The members of the HiConversion team were working steadily on their projects. Suddenly, there was a shift in the wind. Out of the corner of our eyes, we saw that a client had changed its Add-to-Cart transition to push visitors directly to the cart page. Then another one. And another one. Soon it was clear that this was no coincidence. After asking around, we learned that a study was circulating which claimed that pushing customers straight to the cart page was the best way to go (citing Amazon as an example).

It’s definitely not a bad idea. As a concept, sending visitors to the page that moves them forward in the transaction makes plenty of sense on paper. However, “how much sense it makes” isn’t enough of a reason to implement a new idea without testing. Human beings can never accurately predict consumer preferences, which is why even the best ideas are just hypotheses worth testing.

The fact is, making a major change on one site based on the performance of another site is nothing more than a shot in the dark. The effectiveness of such a change could be as grand, as measly, or even as negative as anyone’s best guess. Unfortunately, a best guess is still just a guess.

In fact, it could be less than a best guess. Companies publishing these studies are under a lot of pressure to make an impact with big ideas tested on other flashy e-commerce sites. There is little incentive to warn readers that “the data being presented may or may not actually apply to you.”

What it comes down to is this: you can make all of the changes you want based on whatever study you want. However, at the end of the day, without a way to verify performance changes (and year-over-year is not an accurate comparison), you’ll never know if your change was a success or a flop. Making changes without testing is not progress; it’s just change for the sake of change.

Luckily, we talked to our clients, and many of them agreed (or had even already suggested) to test the transition to see what would work best for their customers. Let’s take a closer look at a couple of these cases:

Case 1: Minicart Popup

One of our clients had an existing minicart that was potentially being underused. To verify this, we compared the current transition of sending users to the cart page after adding to cart against the new test treatment of keeping users on the same page and displaying the minicart.

At HiConversion, we have multivariate capabilities, meaning that we can test multiple elements of a site at once and see the true impact of each element in combination with one another. In addition to testing the transition, we determined the effect of different success message colors, buttons within the minicart, and other aesthetic changes that were brainstormed with the client. Below you will see the winning combination of the many separate variables tested.

[Image: RK MiniCart, the winning combination of the tested minicart variables]

This multivariate test combined four separate A/B tests (five treatments in total: one element had two treatments, the other three had one each); counting each element’s baseline, that amounts to 3 × 2 × 2 × 2 = 24 different test idea combinations.
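For readers who want to see the combinatorics, here is a minimal Python sketch of how a full-factorial multivariate layout expands into 24 combinations. The element names and variant lists are illustrative assumptions (not the client’s actual test ideas, and not HiConversion’s tooling), chosen only to reproduce the 3 × 2 × 2 × 2 arithmetic:

```python
from itertools import product

# Hypothetical test elements; each list holds the baseline plus its
# treatments. One element has two treatments (3 variants), the other
# three have one treatment each (2 variants): 3 * 2 * 2 * 2 = 24.
elements = {
    "transition": ["push_to_cart", "minicart_popup", "minicart_slideout"],
    "success_color": ["baseline", "green"],
    "checkout_button": ["absent", "present"],
    "message_copy": ["baseline", "item_added"],
}

# Full-factorial expansion: every combination of one variant per element.
combinations = [dict(zip(elements, combo))
                for combo in product(*elements.values())]
print(len(combinations))  # -> 24
```

The point of running these as one multivariate campaign rather than 24 unrelated arms is that lift can be attributed to each element, and to interactions between elements, from the same pool of traffic.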

The results themselves were very interesting: going against the Amazon grain, a large majority of the test idea combinations showed a very positive lift against the baseline of pushing users directly to the cart. The chart below shows the relative performance of the many different combinations, all compared to a completely clean, test-free baseline (the orange line). As you can see, the majority have a positive revenue per visitor (RPV) lift.

[Image: RK Results, relative RPV performance of the tested combinations against the test-free baseline (orange line)]

The overall winner of this test provided an RPV lift of 36.59% against the baseline, which accounted for $24,597.43 of additional revenue generated during this campaign alone.
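As a quick sanity check on the arithmetic: RPV is simply total revenue divided by total visitors, and lift is the relative change against the baseline. The traffic and revenue figures below are invented to match the published 36.59% and $24,597.43 numbers; the campaign’s actual volumes weren’t disclosed:

```python
def rpv(revenue: float, visitors: int) -> float:
    """Revenue per visitor: total revenue divided by total visitors."""
    return revenue / visitors

def rpv_lift(variant: float, baseline: float) -> float:
    """Relative RPV lift of a variant over the baseline."""
    return (variant - baseline) / baseline

visitors = 10_000  # hypothetical traffic per arm, chosen for illustration
baseline_rpv = rpv(67_225.00, visitors)            # ~$6.72 per visitor
winner_rpv = rpv(67_225.00 + 24_597.43, visitors)  # ~$9.18 per visitor

print(f"RPV lift: {rpv_lift(winner_rpv, baseline_rpv):.2%}")             # 36.59%
print(f"Added revenue: ${(winner_rpv - baseline_rpv) * visitors:,.2f}")  # $24,597.43
```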

Case 2: No Minicart

In this second instance, the client hadn’t made the “Amazon change” outright, but wanted to test some transition options. This client sells contact lenses, which customers typically purchase one prescription order at a time. In fact, the client had back-end statistics showing this was true for a strong majority of their customers. That data supports the concept of pushing a visitor directly to the cart. It’s logically sound, but even the best, most data-supported hypotheses can prove untrue.

The original site had an orange header bar that would drop down when an item was added to the cart. One treatment tested against this baseline was, of course, pushing to the cart page, but we also tested a different iteration of the stay-on-the-page success message: instead of the orange success bar, a small blue “Checkout” button appeared below the cart icon.

The site’s baseline shows a success message in the form of an orange banner that drops down when a user adds to cart.

First Treatment – a blue checkout button appears below the cart icon after an item has been added to cart. The second treatment, pushing to cart page, is not shown.

This test was run as an A/B/n test, since it had only two treatments that would never overlap (there were no combinations to explore). Both of these treatments had strong lifts against the orange header baseline. Interestingly, the two treatments and the baseline all had nearly identical conversion rates; it was the average order value (AOV), and thus the RPV, that set the different transitions apart.

Key: Cart Popup is the blue line, Redirect to Cart is red, and the Baseline (orange dropdown banner) is orange. The overall results can be viewed in the table below the chart.
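The decomposition behind that observation is worth making explicit: RPV factors into conversion rate times AOV, so two arms can convert identically and still earn very different revenue per visitor. Here is a minimal sketch with made-up numbers (the real rates and order values weren’t published), sized so the winning arm shows roughly the 10% RPV lift mentioned later in this post:

```python
def rpv(conversion_rate: float, avg_order_value: float) -> float:
    # RPV = (orders / visitors) * (revenue / orders) = revenue / visitors
    return conversion_rate * avg_order_value

# Hypothetical arms: identical conversion rates, different AOVs.
arms = {
    "baseline (orange banner)": (0.040, 55.00),
    "redirect to cart":         (0.040, 58.00),
    "blue checkout button":     (0.040, 60.50),  # wins on AOV alone
}

base = rpv(*arms["baseline (orange banner)"])
for name, (cr, aov) in arms.items():
    lift = (rpv(cr, aov) - base) / base
    print(f"{name}: RPV ${rpv(cr, aov):.2f} ({lift:+.1%})")
```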

This test was particularly interesting because it tested two different versions of the stay-on-the-page Add-to-Cart transition. Redirecting to the cart had a strong lift against the orange stay-on-the-page baseline, but lost to the blue stay-on-the-page checkout button. Had we not tested the blue checkout button, the redirect would have won. Clearly, not only is transition type important, but visual appearance plays a key role in performance as well.

Ultimately: It’s not what you test, but that you test.

Both clients in these cases started from similar premises, but with completely different executions. One used a multivariate testing technique, which put all possible combinations in play and found a winning combination with a 36% RPV lift. The second ran only two treatments, yielding a winning treatment with a 10% RPV lift.

What’s most important here, however, is not the method by which these tests were run, or even the amount of revenue lift (okay, that’s pretty important), but the fact that these companies did not take a study at face value. Through testing, they were able to determine the right solution for their customers.

At the end of the day, these two completely different clients found a similar winner – a winner that went against the results of a well-researched study.

The real winners, though, are the publishers of the study that sent everyone into this push-to-cart frenzy. They got to keep all of the credibility without actually having to prove anything – my hat goes off to them.

That’s not to say that these studies are pointless. In fact, I’d say the opposite. They provide wonderful ideas for your company to test (each one even comes with a strong hypothesis based on someone else’s performance, to boot). Take these studies with a grain of salt, and let them fuel your testing idea fires.

Happy testing!