Decisions are hard - or how I learnt to love A/B tests

About a month ago, we completely redesigned domcomp’s main page. If you haven’t seen the old version, don’t worry, there are screenshots below. The decision was not taken lightly; we knew that we were messing with what our users know and love. At the time, it was very difficult to know whether our ideas were changes for the better or the worse. We agonised over our choices and went back and forth between different designs. This is a short account of the motivations for the change, why it was necessary and difficult, and how we discovered that A/B tests are pretty awesome.

A Table is a Table is a Table

We launched with only a handful of providers. The goal was to find a no-nonsense way to show domain registration prices for each of these providers. For this purpose, displaying the data in a big table was an obvious choice: the 5-or-so providers along the top and a huge list of domain extensions down the side.

big table!

The big table, which I shall refer to as Layout 1 from here on, served us well for a long time. It was great when it worked, but the table started to show its cracks as domcomp grew. Each additional provider added a column and expanded the table’s width. This was fundamentally not scalable: once the table width exceeded the screen’s, the user would have to scroll horizontally. This is far from ideal for obvious reasons; you’d no longer get an at-a-glance view of all prices, and a huge number-filled table is not easy to parse. Further, in a mobile-first world, big tables are not…well, very mobile-friendly.

In other words: a table is a table is a table.

Our philosophy is to take user experience very seriously. At the same time, requests to add providers were coming thick and fast. We knew that we had to come up with a long-term solution, but that didn’t stop us from trying to stave off the inevitable with a number of tactical fixes. For instance, we experimented with fixing the left-most column so that the TLDs would still be visible as users scrolled horizontally. We tried swapping the X and Y axes so that providers ran vertically and TLDs along the top. For a while, we only showed 8 columns and let the user configure which providers to see. All of these patches on the existing layout solved one problem but introduced another, or simply did not address the underlying problem.

A Better Solution?

We looked for a more radical redesign. One such idea was to depart from a provider-centric view and, instead, order by price for each TLD. This would solve the scalability issue, as the layout would no longer be constrained by the number of providers. The idea, though, was controversial, as it did have drawbacks compared to Layout 1. For instance, it would be more difficult to get an overview of prices for a single provider across all TLDs. It all boiled down to how our users used domcomp and what they were after.

Change for the better?

We discussed the pros and cons in many sessions and asked for a lot of opinions. However, we were never able to reach a unanimous decision as we argued, in circles, about the benefits and drawbacks of each. In fact, even my own ‘gut feeling’ switched from one day to the next. This even led me to doubt my own ability for objective analysis, as I wondered whether I was psychologically attached to the original layout that I had created. Then, browsing one night, I came across this Airbnb article about their use of A/B tests, and it made for interesting reading. We decided to give it a go.

A/B Tests to the Rescue

In case you’re not familiar, an A/B test is, as Wikipedia puts it, “jargon for a randomized experiment with two variants”, which itself is jargon for “letting other people tell you what they think without knowing it”. Fortunately for us, Google Analytics, through which we track domcomp stats, provides built-in support for A/B tests, which it calls ‘Experiments’. First, we defined ‘Goals’: metrics by which we would judge whether one layout was better than the other. We had a few, but we’ll focus on one of them: the click-through rate to providers’ sites. We reasoned that, since domcomp exists to help users register domains, more users clicking through to providers reflects success. This is by no means the only measure of quality, but it is a valid example. We set up the experiment to send half of domcomp’s users to Layout 1 and the other half to Layout 2.
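As an aside, a 50/50 split like this is often implemented by hashing a user identifier rather than storing per-user state, so that each visitor always sees the same variant. Google Analytics Experiments handled all of this for us; the sketch below, with made-up names, just illustrates the idea:

```python
import hashlib

def assign_variant(user_id: str, experiment: str = "layout-test") -> str:
    """Deterministically bucket a user into one of two variants.

    Hashing (experiment, user_id) gives a stable, roughly 50/50 split
    with no per-user storage: the same user always lands in the same
    bucket for a given experiment.  Illustrative only -- not the
    mechanism Google Analytics necessarily uses internally.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return "layout-1" if int(digest, 16) % 2 == 0 else "layout-2"
```

A nice property of this approach is that changing the experiment name reshuffles users into fresh buckets, so successive experiments don’t inherit each other’s splits.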

And we waited.

Based on our settings, the experiment would run until it reached a 95% confidence level. We expected to wait around a week or so to collect enough data to reach that threshold. However, as luck would have it, domcomp was featured on ProductHunt the next morning. This generated a lot of traffic, all of which fed into our experiment. The test reached the significance threshold in a few hours. And the results surprised us.
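The statistics behind a threshold like that can be sketched with a standard two-proportion z-test on the click-through rates, where a two-sided p-value below 0.05 corresponds to the 95% confidence level. This is an illustration of the underlying test with invented numbers, not what Google Analytics necessarily computes internally:

```python
from math import erf, sqrt

def click_through_z_test(clicks_a, views_a, clicks_b, views_b):
    """Two-proportion z-test comparing click-through rates.

    Returns (z, p_value); a two-sided p_value below 0.05 means the
    difference is significant at the 95% confidence level.
    """
    p_a, p_b = clicks_a / views_a, clicks_b / views_b
    # Pooled rate under the null hypothesis that both layouts are equal.
    p_pool = (clicks_a + clicks_b) / (views_a + views_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / views_a + 1 / views_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF, via erf.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical counts: 3.0% vs 4.5% click-through over 4,000 views each.
z, p = click_through_z_test(120, 4000, 180, 4000)
```

With numbers like these, p comes out well under 0.05, which is why a sudden traffic spike can push an experiment over the line in hours rather than days: the standard error shrinks as the view counts grow.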

Taking our metric of provider link click-throughs as an example, Layout 2 vastly outperformed Layout 1. The results were similar across our other goals. With a huge number of views behind the data, it became a no-brainer to switch to Layout 2.

1... or 2?

Currently, domcomp is collecting and comparing prices from 18 providers, and counting. Layout 2 has allowed us to scale the number of providers without worry and has brought many benefits, such as a responsive, mobile-friendly design. Now, a month after switching to Layout 2, we can confirm that the Goals we defined during the test are still consistent with the measurements taken during the experiment.

So what did we learn? A/B tests are pretty cool. Of course, there will always be a need for some gut-feeling judgement, but A/B tests certainly add a lot of value and make that process easier. In numbers we trust.

Check out Layout 2 for yourself!