Let me tell you about the time a Google experiment saved my ass.

For those of you who don’t know, Google Analytics has an A/B testing feature. You can create multiple variants of a web page and then set up an experiment where Google Analytics directs your traffic to the different versions of the page. Google records how your visitors respond to the differences on some dimension that’s important to you, like signups for your email list or purchases from your website. Google takes care of all the mechanics and statistics and tells you the winning variant, or that it was too close to call.
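
If it helps to see the idea in code, here’s a minimal sketch of what an A/B split boils down to: assign each visitor to a variant and tally conversion rates per variant. This is only an illustration of the concept, with function names I made up; it is not how Google implements the feature.

```python
import random

def assign_variant(visitor_id, variants=("original", "variation")):
    """Deterministically bucket a visitor so they see the same variant on every visit."""
    return random.Random(visitor_id).choice(variants)

def conversion_rates(events):
    """events: iterable of (variant, converted) pairs, one per visitor."""
    totals, conversions = {}, {}
    for variant, converted in events:
        totals[variant] = totals.get(variant, 0) + 1
        conversions[variant] = conversions.get(variant, 0) + int(converted)
    return {v: conversions[v] / totals[v] for v in totals}
```

The part you really don’t want to do yourself is the statistics: deciding whether a gap between two rates is a real difference or just noise. That’s the piece Google handles for you.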

Anyway, back to the story.

My goal

I am responsible for an ecommerce site, and we thought it would help our conversion rates if we displayed some testimonials next to our products. Most of our customers are first-timers, so testimonials seemed like a great addition to the site.

Implementation

We didn’t have any testimonials, so I painstakingly contacted customers who had reported high satisfaction with our products in a recent survey and asked if I could use some of their comments on our website and promotional materials. I collected 94 excellent testimonials in all. It took days.

I put them in a database table, and we found a blank area on the site where the testimonials would fit. Then I designed and implemented a feature to pull five random testimonials from the database and render them on the product page. Each product page got its own five testimonials, cached for 24 hours, after which another five random testimonials were picked for each product. I thought it was pretty slick.
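
Our real code is tied to our platform, but the feature amounted to something like the sketch below. The table name, column names, and function are made up for illustration, and it assumes a simple SQLite store.

```python
import random
import sqlite3
import time

CACHE_TTL = 24 * 60 * 60       # refresh each product's testimonials every 24 hours
_cache = {}                    # product_id -> (expires_at, list of testimonials)

def testimonials_for_product(db_path, product_id, count=5):
    """Return `count` random testimonials for a product page, cached for 24 hours."""
    cached = _cache.get(product_id)
    if cached and cached[0] > time.time():
        return cached[1]

    conn = sqlite3.connect(db_path)
    try:
        rows = conn.execute("SELECT customer_name, quote FROM testimonials").fetchall()
    finally:
        conn.close()

    picks = random.sample(rows, min(count, len(rows)))
    _cache[product_id] = (time.time() + CACHE_TTL, picks)
    return picks
```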

The Google experiment

I’m not really a front-end guy, so I rely on Google experiments to test major changes to our site. I’ve been surprised over and over again by how many ‘good ideas’ end up hurting our conversion rates.

So I set up a Google experiment for the testimonials feature.

Results of our Google experiment

The version with the testimonials converted 47% worse than the original. Ouch!

Here’s a screen capture:

[Google experiment results graph]

So that wasn’t even close. Clearly my instincts for conversion rate optimization are terrible.

What did I learn?

Well, the obvious thing was that the testimonials didn’t work as we implemented them.

I also wasted a ton of time collecting and organizing all those testimonials. I conducted the Google experiment as an afterthought, to prove that the testimonials were better than the original version of the site. I never considered the possibility that they wouldn’t work. With the benefit of hindsight, if I were to try this again I would just collect a handful of testimonials and hack them into the site for the Google experiment. Then, if the experiment worked, I’d go back, collect more testimonials, and improve the code to use the database and caching.

I think the bigger lesson is that feedback is extremely valuable. The Google experiment took maybe two hours, and it told me I was about to tank our conversion rate.

Let’s think about what would have happened if I hadn’t run the experiment and had just launched the feature. We probably would have noticed sales were off within a couple of days. But then what would we have done? Probably sat on it for a few more days to see if the conversion rate bounced back. And then we might have speculated about the weather, or a drop in our search rankings, or our AdWords not working as well any more. There are so many things changing in our business that I don’t know if we would ever have suspected the testimonials were killing us.

I speak from experience

I wanted to run the experiment on the testimonials because I had been surprised by website changes in the past.

The big redesign

Several years back we decided to completely redesign our ecommerce site. It looked dated and lacked features we thought were important. So we hired a graphic designer. She and I worked on the design with feedback from other employees, and I implemented a new version of our whole site from scratch. The whole thing cost a small fortune in time and treasure. And the impact on conversion rates was…basically zero, according to Google Analytics.

I just couldn’t believe it. I studied the stats trying to find a way to explain why they were invalid or misleading, but after a couple of hours I had to admit there was no evidence that the new site converted any better than the old one. Many of my co-workers disagreed with that assessment, but no one could find fault with the math, so we all stopped talking about it.

Enlightenment

I was so confused by this result that I found a couple of books on conversion optimization and read them cover to cover. And it didn’t take long for me to realize how little we knew about building websites. We had the mechanics down but we didn’t understand conversion optimization at all.

So I evaluated our site with my new knowledge, and a bunch of things looked like the books’ examples of what not to do. One thing in particular caught my attention. We had a giant banner on the home page advertising one product that we thought our customers really loved. On the screen sizes popular at the time, the header and this banner took up the entire first screen of the home page. I tried to explain that the banner was an example of what not to do, but people loved the banner (they were proud of it) and I couldn’t get permission to remove it.

My first Google experiment

I had read about Google experiments in those conversion optimization books, and I convinced my boss to give it a try on the banner. We ran the experiment, and removing that banner and one other item from the page boosted our conversion rate by 72%! Can you believe that? I was sold on A/B testing from that day on.

What’s the bigger lesson here?

Aside from my plug for Google experiments and A/B testing changes to your website, I think the bigger lesson is that working with feedback is super important. How many things are we doing in our jobs that aren’t contributing to our success? If we aren’t checking if our changes or initiatives are working better than what we did before, how do we know they are better? We need feedback.

Since I started testing, I’ve found quite a few things that aren’t worth doing. For example, we used to produce a paper price list/catalog and mail it to customers. It cost a small fortune every year. But one year we randomly selected a number of names from the database, didn’t send them the price list, and tracked their sales through the year. The result: no difference. Again, people questioned the result. It just seemed impossible that it didn’t help, even a little. But in the end we couldn’t escape the math, so we scrapped the price list and never looked back.
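
The mechanics of a holdout test like that are simple. Here’s a rough sketch of the idea with made-up names; on real data you’d also want a significance test before trusting the comparison.

```python
import random
from statistics import mean

def split_holdout(customer_ids, holdout_fraction=0.1, seed=2024):
    """Randomly pick a holdout group that will NOT be mailed the catalog."""
    rng = random.Random(seed)
    shuffled = list(customer_ids)
    rng.shuffle(shuffled)
    cutoff = int(len(shuffled) * holdout_fraction)
    return shuffled[:cutoff], shuffled[cutoff:]   # (holdout, mailed)

def average_sales(sales_by_customer, group):
    """Average yearly sales for a group; customers with no orders count as zero."""
    return mean(sales_by_customer.get(c, 0) for c in group)
```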

How many things are you doing at your company that might be exactly like our price list: a complete waste of time and money? Could any of those things be affecting your constraint? Can you think of a way to test whether they perform as people suspect?

Wrapping up

Google makes it easy to see whether version A or version B of a web page performs better. That’s valuable, and you should definitely run those kinds of experiments where the outcome matters. But the bigger takeaway is that A/B testing can help you solve really big problems for your company. I guarantee you won’t regret adopting a testing mindset.