By using A/B testing to see what works and what doesn’t work online, you can improve your customers’ experience and, hopefully, drive revenue…
Gaining a competitive edge in business these days is all about producing the best online experience for your customers. But how do you optimise the online experience for different types of users, including first-time and returning visitors, especially if they are based in different parts of the world and using different devices to access your content? One way is with A/B testing (or “experimentation”), where you can test what works best before implementing any changes to your website or app.
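Under the hood, most experimentation platforms assign each visitor to a variant deterministically, so the same person sees the same experience on every visit. A minimal sketch of that idea in Python – the function and experiment names here are hypothetical, not Cro Metrics’ implementation:

```python
import hashlib

def assign_variant(user_id: str, experiment: str,
                   variants=("control", "treatment")) -> str:
    """Deterministically bucket a visitor: hashing the user and
    experiment IDs means the same user always gets the same variant,
    so their experience stays consistent across visits."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# The same visitor always lands in the same bucket for a given test.
v1 = assign_variant("visitor-123", "toggle-vs-checkbox")
v2 = assign_variant("visitor-123", "toggle-vs-checkbox")
```

Hashing rather than random assignment also means no per-user state needs to be stored to keep buckets stable.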
Based in the United States, Cro Metrics is one of the leading experimentation specialists, working with a number of brands including home goods leader Clorox, food delivery app DoorDash, clothing brand Tommy Hilfiger, and children’s charity Unicef USA. We talk to Cro Metrics’ Senior Growth Strategist Ryan Lucht about how A/B testing can help give your business an advantage when it comes to delivering online services.
TD: What do you see as the key differences between marketing your site for mobile and marketing for desktop users?
RL: I think it mostly comes down to real estate on the screen. Mobile users are far more easily overwhelmed. Things that work well on a desktop website such as having a live chat widget can become quite annoying on mobile. We find that with mobile, generally speaking, less is often more. Often our work is about what we can take away from a mobile page in order to make it more streamlined.
TD: Can the lack of time that phone users tend to have be a factor when it comes to optimising a mobile website?
RL: It can be, but sometimes mobile users are more likely to browse than desktop users. For example, one of the clients we recently worked with ran a test for their SMS marketing which usually takes customers to a product listings page. We thought we should reduce the number of products on there, but in testing we found that no matter how many products we put on the page there were a large number of users who scrolled right through to the bottom.
TD: Is that because those who are coming to a site via SMS marketing are returning, rather than first time, visitors?
RL: Yes, the outcome may have been different if we were driving Facebook ads to a page, compared to SMS traffic, where we know this is a returning visitor and we have more captive attention. These are things that are really important to think about when optimising for mobile – such as what context the user has when they come to the page. For example, if they’re a first-time visitor it’s important to make everything really clear and to have the right elements above the fold, whereas this might not be so important if it’s the visitor’s tenth time on the site in a month.
TD: How much do marketers need to segment their online audience?
RL: There’s an extreme view that says every individual user needs to be treated differently. I don’t necessarily subscribe to that viewpoint. For example, it’s not always the case that mobile and desktop pages have to be substantially different. The only way you can find out whether they should be is to test different approaches and then look at which segments of a user base respond differently from others. Sometimes the differences are drastic and directional, so that what can be a great idea for Facebook traffic might be hugely detrimental to SMS traffic, direct traffic or email.
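Breaking a single test’s results down by traffic source, as described above, is essentially a group-by over per-visit records. A minimal sketch with made-up data and function names – real platforms do this at much larger scale:

```python
from collections import defaultdict

# Hypothetical per-visit records: (traffic-source segment, converted?)
visits = [
    ("sms", True), ("sms", True), ("sms", False),
    ("facebook", False), ("facebook", True),
    ("email", False), ("email", False),
]

def conversion_by_segment(visits):
    """Aggregate conversion rate per traffic-source segment, so one
    test result can be read in the context each user arrived with."""
    totals = defaultdict(lambda: [0, 0])  # segment -> [conversions, visits]
    for segment, converted in visits:
        totals[segment][0] += int(converted)
        totals[segment][1] += 1
    return {seg: conv / n for seg, (conv, n) in totals.items()}

rates = conversion_by_segment(visits)
```

Comparing these per-segment rates between variants is what reveals the “directionally different” responses Lucht mentions.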
TD: How much of a factor is someone’s location?
RL: It might be, though I don’t personally see huge swings by location, except where we can remove irrelevant content which isn’t applicable to a user’s location, or if the site sells a location-dependent product. For example, I work with a gym chain where one of the biggest considerations that users have is where the gyms are located. Are the gyms close to their home, or to their work? This is important so users can ensure that it fits in with their lifestyle and plans.
RL: It’s an evolving landscape with battles between Apple and Facebook about what sort of information will be available to personalise and what users will need to opt into. However, something you will always be able to do is personalise for context. We will probably always know that a user came to our site from social media or from search results or that this is their second or third time to the site versus their first time. This type of information is not personally identifiable but it’s often more powerful than knowing, say, how old someone is who is visiting.
TD: Do the A/B testing services you offer require a massive investment?
RL: It’s not so much about the size of the investment – what’s magical about A/B testing is that it allows you to test ideas far more cheaply than if you actually went out and built them. I would say that at Cro Metrics only around 30 to 40% of the ideas we try do we then recommend the client actually goes out and permanently implements as part of their site. However, the reason why our services tend to skew towards bigger companies is that the quality of insights, and the amount you are able to learn, depends on how much data you have. You really need a large number of visitors to your site to detect small changes with statistical confidence.
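The point about traffic volume can be made concrete with the standard two-proportion sample-size formula (here at roughly 95% confidence and 80% power). This is a generic statistical sketch, not Cro Metrics’ methodology:

```python
from math import sqrt, ceil

def sample_size_per_variant(base_rate, relative_lift,
                            z_alpha=1.96, z_beta=0.84):
    """Approximate visitors needed per variant to detect a given
    relative lift over a baseline conversion rate, using the
    standard two-proportion sample-size formula."""
    p1 = base_rate
    p2 = base_rate * (1 + relative_lift)
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p2 - p1) ** 2)

# Detecting a 3% relative lift on a 2% baseline takes orders of
# magnitude more traffic than detecting a 30% lift:
small_effect = sample_size_per_variant(0.02, 0.03)
big_effect = sample_size_per_variant(0.02, 0.30)
```

This is why small sites struggle to measure the 2–3% wins that are meaningful for a high-traffic e-commerce business.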
TD: So would you say that companies are using your services as much to rule out what doesn’t work as to rule in what does?
RL: Yes that’s a big part of it. That and building competitive advantage too. Every test that you run, you are learning something that your competitors don’t know unless they’ve also run a similar test. As an example, we know in the world of online mattresses there are only a couple of companies that run robust testing programs, while all the other brands just blindly follow them. But we know that what works on one site doesn’t always necessarily work on the others. If you’ve got reasonable UX (user experience) skills, you can look at these sites and tell which brands are careful about what’s on their sites, and which just throw every idea on their site because someone else did it first.
TD: So is it just about marginal gains?
RL: Sometimes, but other times it’s about big outliers – finding small changes that can have a big effect. One example I love to bring up is about a non-profit organisation we work with. On their online donation form there’s a checkbox to make your donation recur monthly instead of one time only. We changed that checkbox to a toggle and it resulted in a 30% increase in the number of monthly donations they received, leading to millions of dollars in additional revenue. There’s no UX best practice that says one is better than the other, so this is something that a pure design approach, without testing, would never have uncovered.
“I’m always suspicious of trends and quick wins. I say trust nothing and test everything.”
Ryan Lucht, Cro Metrics
TD: Obviously a lot of companies you work with are very much e-commerce focused. What kind of uplift are they seeing when they make changes following A/B testing?
RL: When we’re talking about a big win in e-commerce we could be talking about something that’s just a 2 or 3% lift in checkouts. But when we’re talking about a B2B site where we’re helping to generate leads, then we’re probably looking for somewhere nearer a 25% or 30% increase. But it could be as much as 300% if we manage to find a real outlier winner.
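Whether a measured lift like the 2–3% mentioned above is a real effect or just noise is typically checked with a two-proportion z-test. A sketch with invented numbers, not client data:

```python
from math import sqrt, erf

def lift_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?
    Returns (relative_lift, two_tailed_p_value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal CDF via erf; two-tailed p-value.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return (p_b - p_a) / p_a, p_value

# A 3% relative lift on 10,000 visitors per variant (hypothetical):
lift, p = lift_significance(500, 10_000, 515, 10_000)
```

With these numbers the p-value comes out well above 0.05 – a 3% lift simply isn’t detectable at this traffic level, which echoes the earlier point about data volume.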
TD: One of the big problems with e-commerce sites that we’ve read about is cart abandonment – leaving a website with goods still in your basket.
RL: Yes, everyone likes to talk about cart abandonment, and, to some extent, it is a real issue and there are improvements you can make to reduce it. However, not a lot of marketers stop to think about how some visitors use the cart as a kind of shortlist they can go back to later. One past example – we were working with a medical supplies company that had an option on their product page to email a link to a friend, which seemed odd when you are talking about specialist industry-specific products. But when we tested removing the link it was catastrophic, because we learned that people were using it to email the product to themselves.
TD: Finally, are there any quick wins you can recommend to companies to help boost their online profile and/or sales?
RL: I’m always suspicious of trends and quick wins. I say trust nothing and test everything. Certainly, companies should try whatever is new. One thing I see a lot of companies trying right now is these social-proof pop-ups, such as “100 people ordered this item in the last hour” or “someone who lives in X bought Y product”. But you should always test – what works for one company might not work for yours!
For more information about A/B testing visit Cro Metrics at www.crometrics.com