Within the domain of digital marketing, there is one discipline that – unlike the others – does not focus on getting more traffic to a website. CRO, or conversion rate optimization, concentrates its efforts on getting more visitors to ‘convert’: to make a purchase, fill in a form, sign up for the monthly newsletter, or complete any other important goal you want your site to achieve. There are several ways to do that: by deploying persuasion tactics like social proof or reciprocity, for example, or by focusing on user experience and creating excellent customer journeys.
So instead of conversion rate optimization, we might as well call it customer response optimization, because narrowing its purpose down to conversions alone misses the point. We can use CRO to make on-page processes and funnels better and easier – reducing the time it takes for a new visitor to become a lead, for example. Or we can find ways to increase the average order value of each customer. The million-dollar question, though, is: how do we achieve that?
This is where a much-discussed CRO tool (or rather a method for discovery) comes into play: A/B testing. There are many forms of A/B testing, like multivariate testing or split-URL testing, but what it all comes down to is this. Using software, you send 50% of your visitors to your standard webpage (A). The other 50% are sent to the same webpage, but this variation (B) contains some interesting changes compared to version A. Maybe we use different images or headlines, or change the order of sections – anything’s possible. We run A/B tests to figure out whether our proposed changes to a webpage actually have an effect – and if so, what effect and to what extent. If variation B does not lead to more conversions, we take away the learnings. This is one of the most important aspects of CRO: not every idea is golden – in fact, most of them are flawed. The purpose of A/B testing is to keep learning as we iterate and experiment, until we find out what the optimal situation actually looks like.
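To make “does it actually have an effect?” concrete: the standard way to read an A/B test result is a two-proportion z-test on the conversion rates of A and B. The sketch below uses only Python’s standard library, and all visitor and conversion numbers in it are made up for illustration – your testing tool will report its own.

```python
from math import sqrt, erfc

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test: is B's conversion rate really different from A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)                 # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))   # standard error under H0
    z = (p_b - p_a) / se
    p_value = erfc(abs(z) / sqrt(2))                         # two-sided p-value
    return z, p_value

# Hypothetical numbers: 10,000 visitors per variant,
# 400 conversions on A (4.0%) vs 460 on B (4.6%).
z, p = two_proportion_z_test(400, 10_000, 460, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05 here, so the uplift is unlikely to be chance
```

Note that a non-significant result is still a result: it tells you this particular idea did not move the needle, which is exactly the learning the paragraph above is about.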
Of course, we do not want to blindly start testing stuff on our websites – that is not what a culture of experimentation is about. To begin the process, make sure proper event tracking is set up on your site. Start off with a quantitative assessment of the available data; analytics is a great starting point. Besides that, you’ll want to do qualitative research: on-site user polls, heatmaps and session recordings are great tools for uncovering the problems and roadblocks on your site. From there we have quite a bit of data from which to identify the issues we want to tackle. The next step is creating a backlog full of hypotheses and testing ideas, and prioritizing them. Once we have our priorities, the tests can finally be built by development and activated. We now have our experiment running, and in a few weeks’ time we’re sure to have some interesting results! In practice there are a few more steps, but to keep things simple I’m leaving them out.
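The prioritization step can be as simple as scoring each hypothesis and sorting. Here is a minimal sketch using a PIE-style score (Potential, Importance, Ease, each rated 1–10, averaged); the hypotheses and scores are entirely made up for illustration.

```python
# A CRO backlog as a list of hypotheses with PIE ratings (all numbers hypothetical).
backlog = [
    {"hypothesis": "Shorter checkout form lifts completions", "potential": 8, "importance": 9, "ease": 6},
    {"hypothesis": "Social proof on product pages lifts add-to-cart", "potential": 7, "importance": 6, "ease": 9},
    {"hypothesis": "New hero image lifts engagement", "potential": 4, "importance": 5, "ease": 8},
]

# PIE score = average of the three ratings.
for item in backlog:
    item["score"] = (item["potential"] + item["importance"] + item["ease"]) / 3

# Highest score first: that's the test we build first.
for item in sorted(backlog, key=lambda i: i["score"], reverse=True):
    print(f'{item["score"]:.1f}  {item["hypothesis"]}')
```

Whether you use PIE, ICE, or your own weighting matters less than being consistent: the point is to let data and expected impact, not gut feeling alone, decide what gets tested first.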
The main point of CRO is to discover and learn how to improve and iterate – not to ram changes we personally think are cool down our visitors’ throats. Once you start practicing CRO, you’ll quickly find that most of your ideas are flawed. They’re good ideas in theory, but they do not work for your target audience – or they do not work for most demographics within that audience. I honestly believe there is a correlation between a culture of humility and experimentation and prospering websites. Take Booking.com, one of the biggest travel and leisure companies worldwide: experimentation has an excellent place in their culture. In 2020 they ran over 1,000 tests concurrently. The same goes for Dutch giants Coolblue and Bol.com. These hugely successful e-commerce platforms are testing and iterating every single day – not because they’re insecure and don’t know what to do, but because they know they have to experiment in order to find the optimal situation.
Life is all about uncertainty, but for some reason we feel we always have to be strong and self-assured when it comes to business: unmoved by the opinions of others, convinced our own thinking is the only way. Personally, I would love to hear more marketeers say: “I don’t know what the best approach is here. But I know how we can find out.” I honestly believe that is the way forward – towards better-focused innovation and cost-effective digital strategies that revolve around user-centered design.
A culture of experimentation and humility has its merits – not just for marketeers and innovators, but for every layer of the company. Conventional assumptions about culture and process have to be challenged, top-down. That’s the hard part.