It was last month that I took two screenshots of the Netflix homepage, clearing my cookies in between. By doing so I managed to detect an active A/B test without anyone even knowing that I knew – oooohh the excitement. Netflix is probably too big to share their results publicly (although they do show their amazing process), but I knew it was just a matter of time before they would make a visible decision. And so I waited. When February came and all traces of the experiment disappeared, it became clear that their decision was made – they rejected the B version, keeping the existing control (A). Here are some possible explanations as to why this might have happened (with references to actively tested UI patterns, of course).
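For what it's worth, the detection trick can be roughly automated. The sketch below is purely illustrative and rests on my own assumptions (the URL, the idea that the variant shows up in the raw HTML); in practice I simply compared screenshots by hand, and a heavily client-rendered page may bucket visitors in JavaScript rather than in the markup.

```python
# Illustrative sketch only: fetch the same page twice with completely fresh
# sessions (no cookies carried over between visits) and diff the responses.
# A stable difference between two cookie-less visits hints at random
# bucketing into an experiment variant.
import difflib
import requests

URL = "https://www.netflix.com/"  # assumed target for this example

def fetch_fresh(url):
    """Fetch a page with a brand-new session, i.e. no prior cookies."""
    with requests.Session() as session:
        return session.get(url, timeout=10).text

first = fetch_fresh(URL)
second = fetch_fresh(URL)

diff = list(difflib.unified_diff(first.splitlines(), second.splitlines(), lineterm=""))
print("\n".join(diff[:40]) if diff else "No difference visible in the raw HTML.")
```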

Netflix.com Comparison

The Changes

One word of caution before jumping to the conclusion that every attribute of version A is positive. Let’s keep in mind that although leap experiments such as this one are very valuable (they often have a bigger effect potential, for better or for worse), they do suffer from diluted causation by merging multiple variables together. In the end we can’t really tell which of these individual changes was truly flat, positive or negative, because they were all grouped together. To learn the concrete effect of an individual change, it has to be tested in isolation – and that’s why we track individual patterns. Having said this, here is why I think version A might have been implemented by Netflix, with B being rejected.
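To make the attribution problem concrete, here is a minimal, hypothetical sketch in Python. The numbers are completely made up (nothing here comes from Netflix's data): a bundled A/B test produces a single aggregate comparison between control and variant, so even a clearly significant result says nothing about which of the individual changes did the work.

```python
# Minimal sketch of a two-proportion z-test between two page variants.
# The counts below are invented for illustration; this is not Netflix's analysis.
from math import sqrt, erf

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test comparing the signup rates of two variants."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)               # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 1 - erf(abs(z) / sqrt(2))                    # two-sided p-value
    return p_a, p_b, z, p_value

# Version A (control) vs. version B (all the changes bundled together).
# Whatever this test says, it cannot tell us which change drove the difference.
print(two_proportion_ztest(conv_a=3210, n_a=100000, conv_b=3050, n_b=100000))
```

Each individual change would need its own isolated test (or a factorial design) before we could say it helped or hurt.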

  1. Headline: Watching Latest Movies Vs. Watching Anywhere
    The headline or value proposition on the A version promises viewers that they can “See what’s next”, as opposed to “Watching on any device”. Given that A was implemented, this might be a subtle signal that new or latest content is valued more than how and where something is watched. Viewers might also already expect to watch anywhere, with or without Netflix telling them so. We are tracking a series of headline patterns right here.
  2. Visible Movie Preview Thumbnails – Similar to Pattern #95
    One of the first dramatic differences between the two versions is that A (the control) has more visible movie thumbnails. The thumbnails sit higher up and take up more space than in the B variant, where they are positioned lower down. Using realistic imagery to reinforce the value proposition (“Seeing what’s next”) might be more powerful than abstraction in this case.
  3. Consistent Vs. Varied Calls To Action
    The A version shows 2 consistent instances of the primary call to action, “Join free for a month”. The B variant on the other hand shows 4 instances of the primary call to action with 4 diverse messages. In retrospect I suspect that such diversity of messages might raise the level of uncertainty: does clicking separate buttons have different consequences for the user (could users somehow subconsciously feel that “joining for a free month” is mutually exclusive with “trying without committing”)? I think this would be a great follow-up experiment on its own.
  4. Ghost Buttons
    All I’m going to say is that we started to track the effects of ghost buttons as a separate pattern.
  5. Answering Price Questions Vs. Raising Price Concerns
    Although the A version does not show the price immediately, it does make it accessible within 1 click on the “Pick your price” tab. The B version mentions “one low fee” but does not really provide an answer, leaving users in a state of anxious uncertainty. We started a collection of pricing patterns to begin answering related questions.
  6. Cancellation How-To Visual
    I love the way the A version reinforces how and where in the UI users can cancel easily – whenever they like. This keeps the barrier to entry ultra low and builds trust. I will definitely be turning this into a unique pattern of its own to inspire future experiments.

Share Your Thoughts

Do you think there is anything else between these two variations that might have played a key role in A leading the way? Share your thoughts. Oh, and of course, if you are interested in learning which patterns tend to win or lose, please see the wealth of evidence-based work we are actively publishing for your use.




Source link https://goodui.org/blog/the--netflix-homepage--experiment-that-nobody-even-/
