A/B Testing with Opera North

Author: Kate Arnold

Over the past six months at Substrakt we have had our fingers in many pies, from launching new websites to implementing our Quick Donation tool. These unprecedented times have made it more crucial than ever to analyse and optimise purchase flows and key user journeys, and that has given us the chance to work on some genuinely data-driven projects: that’s what I’ve had the pleasure of working on!

Identifying ways to maximise revenue

Enter Lily, a Web and Digital Officer from Opera North. After spending quite some time discussing the best way to optimise their new donation component, we decided now would be the perfect time to try some A/B testing. Our objective was to increase both the number of donations made and the donation revenue. We wanted to make some serious content and design decisions based on more than just our gut instincts!

More specifically, we wanted to understand which donation increments users would respond to best, and whether this would differ from page to page. We selected four pages on which to test our donation component, then decided on two sets of donation increments: £10, £25, £50, or user input, versus £20, £50, £100, or user input.

In user interface design, this is known as The Anchoring Principle. Applied to donations, anchoring can be useful because:

“People tend to focus on a single, initial piece of information, which influences how they estimate value and make subsequent decisions…

…A suggested donation value can also prod users in the right direction on nonprofit websites, by providing an anchor for deciding how much they should give. This default takes the decision burden off users: depending on their situation and level of interest in the cause, they can donate less or more than the recommended value.”

The technical bit

To run our A/B test, we decided to use Google Optimize: a free tool from Google that allows you to run various experiments on your website. You can link the experiments up to multiple conversion points (e.g. purchases or newsletter sign-ups), so you can see how even small changes affect user behaviour.

When setting up our experiment, then, we wanted to see how the different sets of suggested donation amounts affected the number of successful donations made.

[Image: Donation Component Variant 1]

[Image: Donation Component Variant 2]

One of the most important things to keep in mind when running an A/B test is that, apart from the variant itself (in our case, the suggested donation amounts), everything else about the user journey should be the same. This way, you can be confident that it was the change you made that improved or worsened the user experience.

Naturally, we ran into some complications. When we updated the suggested donation amounts through Google Optimize, an additional click crept into the user journey for the updated donation increments.

This added click was a result of the way Google Optimize works: rather than editing the HTML your site actually serves, it applies a layer of changes over the elements that already exist on the page after it loads. Because of this, the new donation increment buttons no longer redirected automatically: a user would have to select the donation amount and then click the donate button.

Having one extra click might not seem like that big of a deal, but an extra step in the checkout flow can be the difference between a successful donation and an abandoned cart. We had to get creative!

Within its editor, Google Optimize lets you change a page’s markup, add JavaScript to any elements on the page, and set the weighting of each variant (in our case, 50:50). We had to keep in mind that it was also important for us to be able to monitor these trends in Google Analytics: we needed to see clearly which variant generated the most interactions as well as donations.

Having this data in Google Analytics also meant we could view it through various lenses: device breakdown, channel acquisition, and any other events a user might have triggered in their journey.
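As a rough sketch of what that reporting can look like: the ga() call below uses the standard analytics.js event signature, but the category, action, and label names are our own illustration, not Opera North’s actual tracking setup. A single listener can send the variant along with every click on a tagged button:

// Sketch: report clicks on tagged donation buttons to Google Analytics.
// Assumes analytics.js is loaded; event names are illustrative only.
document.addEventListener('click', function (event) {
  var button = event.target.closest('[data-variant]');
  if (!button) return;

  // Appears in GA as Category "Donation", Action "increment-click",
  // Label "variant1" or "variant2"
  ga('send', 'event', 'Donation', 'increment-click', button.dataset.variant);
});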

After many trials, tribulations, tears, and some help from the wonderful dev team at Substrakt, we came up with some clever JavaScript that would give us all the information we needed about each variant while keeping the user journey consistent across both.

On the original donation amounts, we simply added a data attribute that let us know which variant each element was in:

// Tag each control button so we can tell which variant it belonged to
element.dataset.variant = 'variant1';

On the new donation amounts, we set a few more properties: these changed the donation amount as well as telling us which variant each element was in. In the example below, the script changed the donation amount to £100 and added a data attribute to let us know it was in the second variant:

element.value = 100;                   // update the amount the button submits
element.innerText = '£100';            // update the label the user sees
element.dataset.donateAmount = 100;    // keep the component's data attribute in step
element.dataset.variant = 'variant2';  // tag the button for variant reporting
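Putting it together, a variant script running in Google Optimize can loop over the existing increment buttons and rewrite each one in place. The sketch below shows the general shape: the .donation-increment selector, and the assumption that the user-input option sits after the three suggested amounts, are ours for illustration rather than the component’s actual markup:

// Sketch of a variant-2 script: rewrite each suggested amount in place.
// The selector and markup assumptions are illustrative only.
var newAmounts = [20, 50, 100];

document.querySelectorAll('.donation-increment').forEach(function (element, index) {
  if (index >= newAmounts.length) return; // leave the user-input option untouched

  var amount = newAmounts[index];
  element.value = amount;
  element.innerText = '£' + amount;
  element.dataset.donateAmount = amount;
  element.dataset.variant = 'variant2';
});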

With these changes now in place, all that was left for us to do was to run the experiment and watch the data roll in.

Measuring success

At the time of writing, we are still waiting for more sessions before a clear winner can be called. Google Optimize needs a minimum of one experiment session per day, per variant, to determine a winner. In our case, the question we are trying to answer is: “Which variant generated the most donations, and does this differ from page to page?”

What we learned

Creating experiments like this enables us to make decisions about the website based on real data from real users, instead of just our own assumptions and hopes.

Once the final tallies are in, we will be able to make an informed decision as to which donation suggestions generate the most donations in total, and so improve the overall performance of the website: something that certainly cannot be overlooked in these unprecedented times.

Get in touch

If you’re looking to improve the effectiveness of your fundraising ask, content, or calls to action, we’d love to hear from you and work with you to evaluate and optimise. Get in touch by emailing team@substrakt.com