I sometimes hear that optimization is hard work and that what works for one site will not work for another. Here is a short story about glass.net, an automotive glass replacement service we worked with, that paints a completely different picture. This test was a highly successful replication, inspired purely by past test data. Low effort. High probability. Just the way I like it.
This Canned Response pattern (one of many patterns we use to make our optimization work easier) was the starting point for our test. The pattern simply suggests placing a large textarea input box with default copy on a lead generation form. More specifically, the default copy should be written as a friendly first-person narrative, refer to something personalized (e.g., a specific product already selected), and end with a response-expecting statement.
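To make the pattern concrete, here is a minimal Python sketch of default copy that follows the three rules above (first-person narrative, a personalized reference, a response-expecting ending). The function name and wording are hypothetical illustrations, not GoodUI's or glass.net's actual copy:

```python
def canned_response_default(product_name: str) -> str:
    """Build friendly first-person default copy for a lead form textarea.

    Follows the Canned Response pattern: first-person narrative,
    a personalized reference (the product the visitor selected),
    and a closing statement that expects a reply.
    """
    return (
        f"Hi, I'm interested in the {product_name} I just selected. "
        "Could you send me a quote? I look forward to hearing back from you."
    )
```

This string would be pre-filled as the textarea's value, with the visitor free to edit it before submitting the form.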
When we were considering testing this pattern, it was already skewed toward a positive outcome by three winning tests (a really good thing). Our conversion patterns gain their predictive strength from exactly such past test data: the more positive tests we have for a given pattern, the more likely it is that the pattern will also work in future tests. These patterns often contain a mix of winning and losing tests with different degrees of effect, and we factor this in.
For simplicity, the key measure we looked at was the repeatability score. Generally, a highly valid test under a given pattern receives a full +1 point if it won or −1 if it lost, and we sum these points across tests. The Canned Response pattern had a relatively high repeatability of 1.75, telling us that it would very likely win if we tested it. And so we did.
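The scoring described above can be sketched as follows. This is a minimal interpretation of my own, assuming that less valid tests contribute a fraction of a point proportional to their validity (which would explain a non-integer score like 1.75); the exact weighting scheme isn't spelled out in the article:

```python
def repeatability_score(tests):
    """Sum a pattern's test outcomes into a repeatability score.

    `tests` is a list of (won, validity) pairs, where `won` is True
    for a winning test and False for a losing one, and `validity` is
    a weight in [0, 1]. A highly valid test contributes a full +1 or
    -1 point; the validity weighting for less valid tests is an
    assumption on my part.
    """
    return sum(validity if won else -validity for won, validity in tests)
```

For example, one fully valid win and one fully valid loss cancel out to 0.0, while a fully valid win plus wins weighted at 0.5 and 0.25 would sum to 1.75 (one of many combinations of three wins that could produce that score).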
The Pattern Already Worked On Vivareal.com.br
One of the highly successful tests where Pattern #20 Canned Response worked very well was on Vivareal.com.br – a top Brazilian real estate and apartment rental directory. The pattern gained a full repeatability point from this test alone. Better still, using the effect data from this test (+16% more leads), we also got a sense of what we could expect in a future test. Armed with this knowledge, we moved on to replicating the pattern on glass.net.
It Replicated On Glass.net Like Butter
When we tested this pattern on glass.net a few weeks later, it turned out to be a solid replication success. Not only was the positive outcome predicted; the effect size was also very close. Vivareal.com.br recorded a 16% lead gain from its test, and Glass.net measured a 17% lead increase (such a tight replication wasn't expected at all). This was an example of what I would call a really nice copy-paste experiment.
The Key Takeaway
I think the key insight here is that general patterns do exist. The common belief that what works for one site does not work for another has been challenged. We transplanted a similar idea from a real estate website in Brazil into the automotive industry in the United States. Sure, there will be cases where patterns won't repeat as well (and we have examples of this too). The important thing is to measure the degree of repeatability and begin to separate the more predictive patterns from the less effective ones. Using past a/b test data to predict future a/b tests is now a possibility. If done right, it's a perfect way to minimize effort, minimize risk, and maximize impact.
Next Steps: Test Patterns On Your Site
See Which Conversion Patterns Are Relevant For You
I encourage you to run a/b tests. We continuously identify conversion patterns for you to test on GoodUI Fastforward. Many have positive data, and some have negative effects as well (members get to see all tests for a given pattern, which increases the chances of replication).
Get An A/B Testing Tool Such As VWO
To test your ideas and patterns you need a good testing tool such as VWO (our recommendation if you don't have one yet) that won't hurt your wallet. Some pattern ideas are really simple and can be set up in minutes or hours.