
Optiverse Updates: Revenue Metrics, Tips & Tricks


The “Optiverse” is a community resource where users post and discuss quirks and questions about their Optimizely experiences. That, in a nutshell, is what this brand new series of blog posts throughout 2016 is about: cherry-picking the most interesting and relevant topics to help you use the tool effectively and, ultimately, run awesome A/B and multivariate tests.

About The "Optiverse"

Just over a month ago, in the very first blog post of its kind, I covered some of the exciting things to come from Optimizely in 2016. If you missed it, check it out here.

As Solutions Partners of the A/B and multivariate testing tool Optimizely, we love hearing about product updates, news and recent case studies from our friends over at the Optiverse. Much like the actual universe, the Optiverse is constantly expanding and you probably won't have time to explore it all, so due to popular internal demand we have decided to make these posts available to all! Some posts, like this one, will cover a broader spectrum of topics, while others will have a more central focus on one or two areas.

What's New: Better Revenue Metrics

The main story to come out of the Optiverse at the start of the year was improved revenue metrics, and Optimizely have done well to up their game in this space.

View the distribution of revenue events across variations:

- Historically you could look at overall revenue from a variation vs. the control, but this won’t tell you where the impact is actually coming from.

- With the new revenue distribution view, you can see whether an uplift is coming from a large number of visitors making small purchases, or vice versa.

- Revenue events and conversion rates are distributed by quartile (i.e. divided equally into four groups – each group comprising a quarter of the data).

- View the data by quartiles to see where the biggest improvements are happening, or overall, to identify outliers (a rough sketch of the quartile split follows the screenshots below):

[Screenshot: Revenue Metrics - Buckets]

[Screenshot: Revenue Metrics - Distribution]
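
To make the quartile view concrete, here is a minimal, illustrative sketch of how revenue events can be split into quartiles. The function name and data shape are assumptions for illustration, not Optimizely's actual implementation:

```javascript
// Illustrative only: splitting revenue events into quartiles.
// `events` is assumed to be an array of revenue values, one per purchase.
function quartileTotals(events) {
  const sorted = [...events].sort((a, b) => a - b);
  const size = Math.ceil(sorted.length / 4);
  const buckets = [[], [], [], []];
  sorted.forEach((v, i) => buckets[Math.min(3, Math.floor(i / size))].push(v));
  return buckets.map((bucket) => ({
    purchases: bucket.length,
    revenue: bucket.reduce((sum, v) => sum + v, 0),
  }));
}

// quartileTotals([5, 9, 12, 20, 22, 35, 40, 300])
// A top quartile dominated by one large value suggests an outlier,
// rather than a broad uplift across many small purchases.
```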

Measure the impact of your tests on purchases:

- Optimizely now calculates statistical significance on purchases. Utilise this to quantify the impact each variation has on the total number of purchases in your test.
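
For intuition only, here is what a significance calculation on purchase rates looks like as a classical fixed-horizon two-proportion z-test. This is a simplification, not Optimizely's method: their Stats Engine uses a sequential approach, so its reported significance will not match this calculation exactly.

```javascript
// Classical two-proportion z-test on purchase rates (illustrative only;
// Optimizely's Stats Engine uses a different, sequential method).
function purchaseZScore(purchasesA, visitorsA, purchasesB, visitorsB) {
  const pA = purchasesA / visitorsA;
  const pB = purchasesB / visitorsB;
  const pooled = (purchasesA + purchasesB) / (visitorsA + visitorsB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / visitorsA + 1 / visitorsB));
  return (pB - pA) / se; // |z| > 1.96 is roughly significant at the 95% level
}

// purchaseZScore(200, 5000, 245, 5000) ≈ 2.2 -> significant at 95%
```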

Understand the effects of your tests on revenue per paying visitor:

- In addition to revenue per visitor, a statistical significance value will be calculated on revenue per paying visitor.
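
The difference between the two metrics is easiest to see side by side. A minimal sketch, assuming one revenue value per visitor (zero for non-purchasers):

```javascript
// Revenue per visitor vs revenue per paying visitor, from the same data.
function revenueMetrics(revenues) {
  const paying = revenues.filter((r) => r > 0);
  const total = revenues.reduce((sum, r) => sum + r, 0);
  return {
    revenuePerVisitor: total / revenues.length,
    revenuePerPayingVisitor: paying.length ? total / paying.length : 0,
  };
}

// revenueMetrics([0, 0, 0, 50, 150])
// -> { revenuePerVisitor: 40, revenuePerPayingVisitor: 100 }
```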

Interested? Here’s the fine print:

- Available on all Optimizely price plans (web and app).

- For experiments started after 26 January 2016.

- Revenue tracking must be enabled for the reports to populate; more info on that here.

- Final, crucial point: revenue is unlike other Optimizely goals in that it does not de-duplicate (more on that later). The Uniques / Totals toggle is meant for binary goals, as a way of seeing a de-duplicated count of conversions, and to avoid confusion it should therefore be ignored when looking at revenue. If you want to see your unique or total purchase events, the best thing to do is set up a custom event or pageview goal on the confirmation/thank-you page.
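
For reference, enabling revenue tracking at the time meant pushing a revenue event to the snippet's API on the order confirmation page, along the lines of the sketch below. Treat the exact call as an assumption and check Optimizely's documentation for your snippet version; the convention was to pass revenue in cents:

```html
<!-- On the order confirmation page, after the Optimizely snippet has loaded -->
<script>
  window['optimizely'] = window['optimizely'] || [];
  // Revenue is conventionally passed in cents, e.g. £19.99 -> 1999
  window['optimizely'].push(['trackEvent', 'purchase', { revenue: 1999 }]);
</script>
```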

Tips & Tricks

Here at Periscopix we like to get the most out of the tools we use, and amongst the sea of questions discussed over at the Optiverse recently I have picked out a few gems to help keep your Optimizely experience running smoothly:

- "How long does it take Optimizely to refresh the number of unique visitors to my experiments?"

- Many dynamic factors affect event processing across experiments, so Optimizely do not have a specific SLA for this. They do, however, state that updates occur in “near real-time”.

- So in reality, expect to see new data appear on the Results Page in as little as 3 minutes.

- If you still don’t see any data after half an hour, it’s likely that your experiment is configured incorrectly (checking the Audience conditions would be a good place to start).

- "How does Optimizely measure my monthly quota?"

- If you are running multiple experiments, have you ever wondered why the visitor count shown in the interface (Account Settings [dropdown menu] > Account > Plan & Billing) is lower than the sum of visitors across your individual experiments?

- The explanation is that Optimizely “shares” your visitors between experiments: if a user is bucketed into two experiments on two different pages, they still only count once towards your quota (see the sketch below).

- This is good for paying customers (not all A/B testing platforms provide this feature).
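
A tiny sketch of the idea, with made-up visitor IDs: the billed count is the size of the union of visitors across experiments, not the sum of the per-experiment counts.

```javascript
// Hypothetical visitor IDs bucketed into two experiments.
const experimentA = new Set(['u1', 'u2', 'u3']);
const experimentB = new Set(['u2', 'u3', 'u4']);

const perExperimentSum = experimentA.size + experimentB.size;          // 6
const billedVisitors = new Set([...experimentA, ...experimentB]).size; // 4
```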

- "Why do visitor counts differ between Optimizely and Google Analytics"

As a dedicated GA team we receive this question quite a lot. Often you will see more visitors to an experiment in Optimizely than when segmenting in GA using the Optimizely custom dimension (if you haven't yet linked your GA/UA and Optimizely accounts there are many reasons why you should).

1. The snippet might be misplaced on the page – make sure, as below, that you load the Optimizely snippet before your GA/GTM snippet to prevent GA from winning the race condition:

[Screenshots: correct Optimizely snippet placement, and the “bad times” case where GA fires first]

Note that it can take up to 24 hours for Optimizely data to appear in GA. More importantly, if GA/GTM loads and fires its tracking request before Optimizely has set the custom dimension, the hit will be recorded with no experiment information (bad times).
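
A minimal sketch of the ordering in the page head; the project ID in the snippet URL is a placeholder:

```html
<head>
  <!-- 1. Optimizely first: variations run and the GA custom dimension is
       set before any analytics hit can fire (XXXXXXX = your project ID) -->
  <script src="https://cdn.optimizely.com/js/XXXXXXX.js"></script>

  <!-- 2. GA (or the GTM container) snippet second -->
  <script>
    /* analytics.js or GTM container snippet goes here */
  </script>
</head>
```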

Even with a “perfect” implementation, the numbers you see within the reports will probably differ slightly. So it’s important to keep these nuances in mind:

2. Unique vs non-unique counting. For example, an overzealous user clicks a ‘Get started’ button on one of your pages 50 times.

- By default, Optimizely visitors and conversions are scoped at the user-level and so from Optimizely’s perspective, this action has been performed by a unique visitor and is therefore counted as one conversion.

- By contrast, within your analytics platform this may be counted as 50 clicks or conversions.

[Screenshot: Optimizely vs GA duplication]
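
In code terms, the difference looks something like this sketch (the user ID and event name are made up):

```javascript
// One user clicking 'Get started' 50 times.
const clicks = Array(50).fill({ userId: 'u1', event: 'get_started_click' });

const totalHits = clicks.length;                                    // 50 (hit-scoped count)
const uniqueConverters = new Set(clicks.map((c) => c.userId)).size; // 1 (user-scoped, de-duplicated)
```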

3. The definition of a “visitor”. Another key reason for observed volume discrepancies is how the platforms define a “visitor”:

- GA uses a session-based model, which means that a single “visitor” can trigger multiple visits over a given time period.

- In contrast, Optimizely deploys a cookie with a 10-year expiry date and counts unique users.

- If you’d like to learn more about how GA defines a Session, check out this article. If you still encounter issues, I’d recommend turning your attention to Optimizely's troubleshooting page.

- "What do to when an experiment is finished?"

It may sound like a daft question, but acting on a conclusive result is not necessarily straightforward. In fact, there are three potential options:

1. Submit a request to your developers to make the change permanent in the code (or, indeed, make it yourself if you're more of a lone wolf).

2. Push the winning variation traffic allocation to 100% within Optimizely. This may be preferable if your dev. team is queued up on tasks, or if you can’t wait to push the experiment live.

3. Iterate. In one of our own recent experiments, condensing an introductory paragraph into succinct, bulleted USPs on a mobile landing page increased clicks to the account signup page from 9.8% to 11.2% (a relative uplift of around 14%), so experimenting with the text size or the order in which the USPs are presented is a logical next step.

That’s all for now…

Thanks for reading. It won't be long before my next post comes around; in the meantime, why not get stuck into Optimizely’s Testing Ideas & Successes section for loads of A/B test ideas and case studies?

If starting, or doing more, website testing and personalisation is on your 2016 to-do list and you’re looking for expert support, give us a shout. There’s nothing we like more than using data to drive improvements!

To view this blog written by Chris Woods on the Periscopix website, please click here.
