
6 Digital Fundraising Experiments by GetUp

Introduction

Digital campaigners and fundraisers know that the difference between a successful online action and a flop isn’t always in the issue, the strategy, or the content — sometimes, it can be in the colour of the button. And the only way to find out which factor makes the difference is through testing and experimentation.

GetUp runs a sophisticated internal program of experiments and analytics, focussed on optimising their emails and pages for member action. Below, you’ll find a collection of experiments they ran between 2013 and 2016 – including their methodologies and results – along with insights into the inner workings of their analytics program.

Select your donation amount in email: Does it work?

Background

GetUp has seen numerous other online campaigning organisations give email recipients the option of selecting their donation amount within the email before clicking through to the donation page. GetUp had never done this, so we conducted two experiments to see whether implementing the feature would improve donations. Neither experiment showed a positive effect.

Hypothesis

If email recipients are able to select exactly how much money they wish to donate from within the email, it may increase clicks or donations.

Structure of experiment – experiment one

The first experiment was on an email sent to our entire list in New South Wales regarding changes to NSW electoral funding legislation. Half of the list was sent a standard email with hyperlinked text asking people to donate without specifying an amount, linking to a standard page. The other half was sent an email which included buttons for the five standard donation options. The buttons were pure CSS, not images, so they wouldn't have affected the open rate.

Structure of experiment – experiment two

The second experiment was on an email sent to the large majority of the Great Barrier Reef list regarding the campaign to convince Australian banks not to invest in the Abbot Point project. The list was split into three parts. One third was sent a ‘control’ email that only contained a standard link to the donation form. The other two versions both offered donation links for specific donation amounts: one in the form of hyperlinked text, and the other in the form of hyperlinked buttons.
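
As a rough illustration of the randomised split these email experiments rely on, the sketch below divides a list of member IDs into three equal groups. It is a minimal Python example; the function name, seed and list size are hypothetical, not GetUp's actual tooling.

```python
import random

def split_list(member_ids, n_groups=3, seed=42):
    """Randomly split a list of member IDs into n roughly equal groups."""
    shuffled = list(member_ids)
    random.Random(seed).shuffle(shuffled)   # fixed seed so the split is reproducible
    return [shuffled[i::n_groups] for i in range(n_groups)]

# Hypothetical example: split a 200,000-member list into control, text-link and button groups.
control, text_links, button_links = split_list(range(200_000))
print(len(control), len(text_links), len(button_links))
```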

Results

So far, we have not identified any advantage in providing specific donation amount links in emails.

Experiment one

| Sample | Sends | Opens | Clicks | Donations | Total $ | Avg $ |
| --- | --- | --- | --- | --- | --- | --- |
| Control | 123,046 | 24,252 | 1,287 | 706 | $12,933 | $18.32 |
| Variation | 120,496 | 23,337 | 724 | 420 | $10,056 | $23.94 |

For experiment one, there was a massive decline in the clicks-per-open rate for the variation, from 5.3% to 3.1%. The rate of donations per click increased slightly, from 54.9% to 58%, and the money raised per click increased from $10.05 to $13.89.

Despite the average donation size and the average money raised per click both increasing, the amount of money raised for a similar number of opens dropped by 23%. Examining the breakdown of donations by dollar amount, there was a proportional decline in the number of donations at the $5, $10, $20 and $50 amounts, and a massive drop-off in the number of $3 donations. There was no drop-off in the number of $30 or $100 donations, which were the largest options offered in the email.

This suggests that providing the donation buttons reduced the number of small donations, particularly at the very low $3 level. The decline in the total sum raised is therefore less dramatic than the decline in clicks and donations would suggest, but it is still a decline.
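
As a quick sanity check of those derived figures, the short Python sketch below recomputes the per-open and per-click rates directly from the experiment one table; the dictionary layout is purely illustrative.

```python
# Figures copied from the experiment one table above.
rows = {
    "Control":   {"opens": 24_252, "clicks": 1_287, "donations": 706, "total": 12_933},
    "Variation": {"opens": 23_337, "clicks":   724, "donations": 420, "total": 10_056},
}

for name, r in rows.items():
    clicks_per_open     = r["clicks"] / r["opens"]      # ~5.3% vs ~3.1%
    donations_per_click = r["donations"] / r["clicks"]  # ~54.9% vs ~58%
    dollars_per_click   = r["total"] / r["clicks"]      # ~$10.05 vs ~$13.89
    print(f"{name}: {clicks_per_open:.1%} clicks/open, "
          f"{donations_per_click:.1%} donations/click, ${dollars_per_click:.2f}/click")
```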

Experiment two

For our second experiment, we considered the possibility that members may have been confused by the use of images as ‘buttons’ – something GetUp had never done before. So this time we tested two variations – one using hyperlinked buttons, and the other using hyperlinked text.

| Sample | Sends | Opens | Clicks | Donations | Total $ | Avg $ |
| --- | --- | --- | --- | --- | --- | --- |
| Control | 64,882 | 16,629 | 1,702 | 749 | $26,699 | $35.65 |
| Var. A (button) | 64,507 | 16,443 | 1,717 | 734 | $27,374 | $37.29 |
| Var. B (text) | 64,503 | 16,599 | 1,890 | 716 | $25,970 | $36.27 |

The number of opens was very similar for all three options. Variation B (the email including hyperlinked text for each donation amount) produced 10.1% more clicks, but this increase in clicks did not translate to any increase in the number of donations, the total dollars raised, or the average donation size.

The three versions have very similar statistics, and there is no clear winner or loser in terms of statistical significance.
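
To illustrate the kind of significance check behind that statement, the sketch below runs a standard two-proportion z-test on the donations-per-open figures for the control and variation A from the table above. It is a generic pooled-variance test written for this article, not GetUp's analytics code.

```python
from math import sqrt, erfc

def two_proportion_z_test(successes_a, n_a, successes_b, n_b):
    """Two-sided two-proportion z-test using a pooled variance estimate."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = erfc(abs(z) / sqrt(2))   # two-sided p-value from the normal distribution
    return z, p_value

# Donations per open: control (749 of 16,629) vs. variation A (734 of 16,443).
z, p = two_proportion_z_test(749, 16_629, 734, 16_443)
print(f"z = {z:.2f}, p = {p:.3f}")     # a large p-value means no significant difference
```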

Conclusion

At the moment, we haven’t been able to identify any added value in providing specific donation amounts in emails.

Our first experiment produced a significant decline in the number of clicks, and consequently in the amount of money raised, from the email with the amount options included. The second experiment did not repeat this unusual result, so it is too soon to conclude that donation amount links have a negative impact on the results of an email, but it again did not demonstrate a positive impact.

It is possible that there could be other ways to structure this process which would produce more benefits, such as integrating the option with the ‘quick donate’ system, to allow people to effectively make the donation just by clicking on a dollar amount.

GetUp’s successful trial of a multi-step donation form

Background

GetUp has been experimenting with a new donation form structure. Our previous donation form sat entirely on one page: the donor would input their email address, then provide any additional personal information, then select a donation amount, then provide credit card details.

A screenshot of part of the old GetUp donation form.


The GetUp tech team has developed a new form that separates the donation process into three steps, each on its own page:

  • Select donation amount
  • Provide personal information
  • Provide credit card details

The first version of the multi-step form (first page).


We ran two experiments to test the effectiveness of the new form. After an initial experiment proved inconclusive, the second demonstrated that the new form was slightly better than its predecessor.

Structure of experiment – round one

In the first round, visitors to donation pages were sent either to the original donation form or to the new donation form. At the time, the new donation form listed donation amounts in descending order, from the largest to the smallest. This reflected a previous experiment on the original form which found that descending amounts were superior to ascending amounts.

Results of experiment – round one

The multi-step form produced slightly more donations than the conventional form, at a statistically significant level. However, there was a slight decline in the average size of donations, from $33.78 to $31.96.

| Version | Visitors | Donors | Total $ raised | Avg $ | Convert % |
| --- | --- | --- | --- | --- | --- |
| Conventional | 39,583 | 7,604 | $256,828 | $33.78 | 19.21% |
| Multi-step | 39,385 | 7,919 | $253,122 | $31.96 | 20.11% |

This data excludes a small number of large donations that skewed the average donation size.

Further analysis revealed that, on the new form, more visitors chose the default $10 amount or one of the other suggested amounts, and fewer chose to input their own amount.

Conclusions from experiment – round one

We concluded that, while the new form did produce an increase in donations, it was concerning that this increase did not translate into an increase in the total amount raised: donors chose the default (rather small) amount rather than inputting their own amount, which is usually higher. You can see in the above screenshot that the first version of the multi-step form had a separate box where the donor could fill in their own donation amount.

Following the first experiment, the tech team redesigned the ‘other’ part of the form to integrate it more closely with the other donation buttons. You can see the final version of this part of the form on the GetUp website.

This version was used in the second major experiment, which we designed to find ways to encourage more donors to choose larger donation amounts while preserving the higher conversion rate.

Structure of experiment – round two

In addition to the conventional form and the multi-step form as originally designed, two additional versions of the form were added:

  • A multi-step form where donation amounts were sorted in an ascending order (as opposed to the descending order previously used)
  • A multi-step form with smaller buttons (basically the same buttons as appeared on the conventional form)

| Version | Visitors | Donors | Total $ raised | Avg $ | Convert % |
| --- | --- | --- | --- | --- | --- |
| Conventional | 4,693 | 818 | $30,482 | $37.26 | 17.43% |
| Multi-step | 4,617 | 829 | $28,042 | $33.83 | 17.96% |
| Small buttons | 4,780 | 784 | $25,940 | $33.09 | 16.40% |
| Ascending | 5,154 | 1,040 | $45,872 | $44.11 | 20.18% |

The multi-step form with ascending amounts was a clear winner. The average donation size and conversion rate were both highest for this form.

As in the first test, the originally-designed multi-step form produced slightly more donations than the conventional form, but at a lower average amount. The small-button version produced the lowest average donation size and the lowest conversion rate. The multi-step form with ascending amounts resulted in over 20% of visitors donating, compared to 16-18% for the other three versions. It also increased the average donation size to over $44, well ahead of the other versions.
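
One way to see why the ascending version is the clear winner is to combine conversion rate and average donation into dollars raised per visitor. The figures below are derived from the round two table; "dollars per visitor" is our own summary metric, not one reported in the original results.

```python
# Dollars raised per visitor, derived from the round two table above.
versions = {
    "Conventional":  (30_482, 4_693),
    "Multi-step":    (28_042, 4_617),
    "Small buttons": (25_940, 4_780),
    "Ascending":     (45_872, 5_154),
}
for name, (total_dollars, visitors) in versions.items():
    print(f"{name}: ${total_dollars / visitors:.2f} per visitor")
# Ascending comes out well ahead (~$8.90 per visitor vs. $5.40-$6.50 for the others).
```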

Conclusion

This experiment did not produce ground-shaking results, but does appear to have produced an increase in the number of donations through our new donation form.

We have implemented this new form, with ascending amounts, across our website.

The final version of the form.


How personalised donation amounts boost digital fundraising

Background

A variety of organisations employ ‘highest previous donation’ (HPD) technology to determine the donation amounts suggested to potential donors. GetUp has been discussing employing such a technology, and earlier this year we implemented a tool in the backend of our website where campaigners could specify different amounts for visitors who could be identified as previous donors.

These amounts could either be in the form of a percentage, or a dollar amount. In the former case, donation amounts would be suggested as a percentage of the person’s highest previous donation. Where a dollar amount was included, that would be shown to all donors regardless of their previous donation – in that case, the tool effectively offered a different, but static, set of suggested amounts to donors as opposed to non-donors.
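
To make the two modes concrete, here is a minimal sketch of how suggested amounts might be built from a visitor's highest previous donation. The percentage schedule, static amounts and cap are illustrative values, not the ones GetUp configured.

```python
def suggested_amounts(highest_previous=None,
                      percentages=(0.5, 1.0, 1.5, 2.0),
                      static_amounts=(30, 50, 100, 250),
                      cap=500):
    """Return suggested donation amounts for a visitor.

    If the visitor has a known highest previous donation (HPD), scale the
    percentage schedule off it (capped to avoid unreasonable asks);
    otherwise fall back to the static set shown to non-donors.
    """
    if highest_previous is None:
        return list(static_amounts)
    return [min(round(highest_previous * pct), cap) for pct in percentages]

print(suggested_amounts())        # non-donor: [30, 50, 100, 250]
print(suggested_amounts(80))      # $80 HPD:   [40, 80, 120, 160]
```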

Experiment design

At the same time as this tool was implemented, an experiment was also created in which visitors to the donation page were offered one of four versions of the suggested amounts:

  • “control” – no personalisation at all – previous donors were offered the same amounts as non-donors.
  • “relative” – full personalisation – the donor was offered a series of amounts all relative to their highest previous donation.
  • “static” – previous donors were offered a higher amount than non-donors, but it was the same for all previous donors.
  • “static_less_amounts” – same as static, but with a smaller number of options.

The experiment started in February 2015, and has been running for the last six months.

Experiment results

| Version | # of donations | Total donated | Average donation size |
| --- | --- | --- | --- |
| Control | 9,771 | $274,235 | $28.07 |
| Relative | 8,879 | $329,495 | $37.11 |
| Static | 8,911 | $305,149 | $34.24 |
| Static less amts | 8,909 | $319,550 | $35.87 |

All three variations produced fewer donations than the control – each dropping by about 9%.

However, the amounts of money donated tell a different story. All three variations resulted in substantially more money being donated, ranging from an 11% increase for the static version to a 20% increase for the relative version.

These changes translate into a 32% increase in average donation sizes for the relative version compared to the control.

Analysis

We are most interested in the comparison between the “relative” option (fully-implemented HPD) and the control. It is clear that the HPD option results in a drop in the number of people donating, but this is significantly outweighed by an increase in the size of donations, resulting in a substantial increase in the amount of money raised.

In addition, the same figures were generated for shorter time periods within the overall experimental period, and analysis was also performed on particular donation pages which raised large amounts of money. In every case the same pattern (HPD producing more money, the control producing more individual donations) held, although in some cases the figures were not as strong.

For example, the increase in the amount of money donated across individual donation pages ranged from 13% to 53%. Donations on the HPD version were only up 6% in the last two months, whereas in the month before that they were up 32%.

Recommendations

While there may be ways in which the personalised donation amounts system can be improved, it clearly produces a boost in the amount of money raised when it is used.

There may be circumstances where a campaign specifically wants to bring in small donations on a new issue and wants to avoid using HPD; campaigners have the ability to turn off the feature in those cases, but it should be used for most fundraising.

GetUp should also continue to monitor feedback from donors on whether the HPD system is irritating members by asking them for unreasonable amounts of money, and consider modifications that taper off the upper end of donation asks to handle this problem without affecting fundraising.

Finally, we should continue to run this experiment for another two months, and monitor the results over this period to check that there isn’t a continued decline in the effectiveness of the HPD system, considering that the performance has been less impressive over the last two months.

We will also be doing some more investigation to check how the conduct of this experiment interacted with campaigners overriding default donation amounts, to ensure that the same results are found when excluding non-default donation amounts on pages.

In the near future, if these results hold, GetUp should consider implementing relative amounts as a default, and also run further experiments to determine the ideal level at which to set these relative amounts.

To blurb or not to blurb in fundraising modules

Experiment design

Our donation form includes a section near the top where the campaigner can insert some explanatory text. This can be text required for legal or regulatory reasons, or an explanation of the campaign to complement the campaign page that sits in the main body of the page (with the donation form on the right-hand sidebar).

For this experiment, some visitors to the website were randomly allocated to a donation page which did not include any text at the top of the donation form. The two versions of the form can be seen below:

[Screenshot: the donation form shown with and without the blurb at the top]

Top-line experiment results

Combining all visitors to the website who participated in the experiment, the experiment produced the following results.

| Version | Participants | Conversions | Conversion % |
| --- | --- | --- | --- |
| Control | 53,639 | 8,807 | 16.4% |
| Variation | 54,045 | 9,160 | 16.9% |

This result is close to being statistically significant. Unfortunately, the gap has closed slightly since the preliminary analysis.

Detailed experiment results

We have also used Google Analytics to track the experiment results, broken down by different donation pages.

The problem with this experiment is that each donation page has different words in the blurb, and the content on the main body of the page varies from page to page. It appears that the removal of the blurb is more useful on pages with a relatively short body text.

There are five pages on our website which raised significant amounts of money during this experiment. The results vary significantly between pages.

“Thanks for completing our Vision Survey!”

The body text on this page is very short. The inclusion of a blurb can push parts of the donation form below the bottom of the browser window on many screens. Removing the blurb removes the need to scroll down, and we see a big increase in the proportion of visitors donating.

| Version | Sessions | Donations | Dollars | Donations % |
| --- | --- | --- | --- | --- |
| Control | 7,761 | 265 | $6,927.68 | 3.41% |
| Variation | 8,134 | 397 | $9,299.81 | 4.88% |

“Help us fight Adani”

The body text on this page takes up about as much vertical space as the donation form without the blurb, so the inclusion of a blurb may require otherwise unnecessary scrolling. We saw a sizeable increase in the proportion of visitors donating.

| Version | Sessions | Donations | Dollars | Donations % |
| --- | --- | --- | --- | --- |
| Control | 3,643 | 1,020 | $50,588.76 | 28.00% |
| Variation | 3,633 | 1,108 | $58,768.20 | 30.50% |

“Make sure our Senate see this ad”

On this page, the inclusion of the ad image means that the body of the page covers a larger area. If you scroll down to read the ad, the version of the donation form including the blurb lines up perfectly. If anything, the donation form without a blurb sits too high. Here the control produced a slightly higher donation rate.

| Version | Sessions | Donations | Dollars | Donations % |
| --- | --- | --- | --- | --- |
| Control | 3,407 | 526 | $17,049.25 | 15.44% |
| Variation | 3,569 | 542 | $16,891.59 | 15.19% |

“Chip in to get this ad in the paper!”

The page text is substantially longer than the donation form, so a person who reads to the end can no longer see the donation form. The proportion of donations is about the same between the two versions.

| Version | Sessions | Donations | Dollars | Donations % |
| --- | --- | --- | --- | --- |
| Control | 3,089 | 769 | $23,494.38 | 24.89% |
| Variation | 3,208 | 793 | $26,105.40 | 24.72% |

“Let’s get loud: will you help slam the brakes on the TPP”

This page also contains much more text, and we see little difference between the control and variation.

| Version | Sessions | Donations | Dollars | Donations % |
| --- | --- | --- | --- | --- |
| Control | 1,460 | 332 | $12,147.32 | 22.74% |
| Variation | 1,276 | 291 | $11,425.89 | 22.81% |

Conclusion

While the overall results suggest a moderately positive impact for removing unnecessary text from the donation form, the more detailed results suggest a more complicated picture.

In cases where the main body of the page contains a lot of text or images, there is no benefit from removing text on the donation form, and it’s possible that a donation form that extends to the bottom of the page could work better.

Recommendations

  1. Create an alert for campaigners who, when building a donation form, include a blurb over a set character limit, to encourage them to think carefully about making the form longer.
  2. Investigate making the donation form “sticky” so it moves with the rest of the donation page if the user scrolls.

Does button colour on pages impact action rates?

Background

GetUp has recently implemented a new tool that allows us to run A/B tests in the background of all of our pages without needing to interfere with the structure of email sends.

As the first test of this new technology, we implemented an A/B test on the “donate” button on all of our donation pages.

At the moment, our “donate” button by default is blue. In the experiment, half of all visitors to donation pages were served a button coloured orange.
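
A page-level test like this typically buckets each visitor deterministically, so they keep seeing the same colour on repeat visits. The sketch below shows one common way of doing that by hashing a visitor identifier; it is illustrative only and is not the tool GetUp built.

```python
import hashlib

def assign_variant(visitor_id, experiment="donate_button_colour",
                   variants=("blue", "orange")):
    """Deterministically bucket a visitor into a variant of an experiment."""
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]   # stable ~50/50 split

print(assign_variant("member-12345"))   # the same visitor always gets the same colour
```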

Hypothesis

There are graphical changes that can be made to our donation buttons that result in larger donations and more donations.

Results

Approximately 57,000 visits were recorded to donation pages as part of this experiment.

There was a slightly higher rate of conversions on the experimental page (orange button). The average size of donations was also larger. These two factors combined to increase the average amount of money raised per visit from $1.71 for the control to $1.86 for the new version. This was an increase of 8.7%.

| Version | $ raised | Conversions | Visits | Conv. % | $/click | Average donation |
| --- | --- | --- | --- | --- | --- | --- |
| Control | $48,549 | 1,260 | 28,392 | 4.44% | $1.71 | $38.53 |
| Variation | $53,329 | 1,330 | 28,683 | 4.64% | $1.86 | $40.10 |

Conclusion

The orange button produced an improvement in donations. However, we cannot say that the increase in conversions was statistically significant; a longer experiment may be required to produce such a result.

Overall the improvement is subtle.

It is important to note that we don’t know how much of the effect is due to orange buttons being generally better than blue buttons, and how much is due to the novelty of the change in the form.

Future steps

We are now running another experiment on a new donation form that uses a multi-step design. This will likely change behaviour in more substantial ways.

Once this is completed, we will look at running further experiments on buttons with different colours and wording to further improve the design.

It may be worth running this test again in the future after the novelty of an orange button has worn off.

How much should your default donation ask be?

For the 2013 federal election, an email was sent to the GetUp list asking for donations to fund election advertising.

The list was split into three groups: those who had never donated, those who had previously donated large amounts (over $400), and all other donors.

The group of people who had given large amounts was quite small, so the focus was on the other two groups.

In both cases, we split the list in half (roughly). All recipients received the same email, but the default donation ask on the page they clicked on was different. One confounding factor was that donation pages had very slightly different page names. All other content was consistent.

  • For non-donors, the amounts were either $5 or $30.
  • For donors, the amounts were either $30 or $70.

There is evidence that the increased default ask was effective for previous donors. While the proportion of members who donated was lower for the $70 ask (14.7% of clicks, as against 15.1%), the average amount donated was higher ($71 vs $51), and the total amount raised was higher.

  • The $70 ask resulted in $10.50 raised per person who clicked on the page.
  • The $30 ask resulted in $7.70 per click.

There was a hope that reducing the default ask for non-donors from $30 to $5 would result in a larger number of donors giving money, which would hopefully result in an increased amount of money raised, even if the average donation was smaller.

However, non-donors were slightly less likely to donate on the page with the $5 default ask (3.2% of clicks) as compared to the $30 default ask (3.3%). The average donation was also lower for the $5 default ask ($37 vs $47).

  • The $30 ask resulted in $1.55 raised per person who clicked on the page.
  • The $5 ask resulted in $1.16 per click.
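
The per-click figures above follow from multiplying the conversion rate by the average donation. The short sketch below reproduces that arithmetic from the rounded figures quoted here, which is why its output differs by a few cents from some of the reported values.

```python
# Dollars raised per click = conversion rate x average donation.
segments = {
    "Previous donors, $70 ask": (0.147, 71),   # ~$10.44 per click (reported as $10.50)
    "Previous donors, $30 ask": (0.151, 51),   # ~$7.70 per click
    "Non-donors, $30 ask":      (0.033, 47),   # ~$1.55 per click
    "Non-donors, $5 ask":       (0.032, 37),   # ~$1.18 per click (reported as $1.16)
}
for name, (conversion, avg_donation) in segments.items():
    print(f"{name}: ${conversion * avg_donation:.2f} per click")
```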

This experiment shows evidence for the idea that pre-existing donors have the potential to increase their donations with a relatively small impact on the number of donors giving money. No evidence is apparent that a very low default ask on the donation page is effective at broadening the number of donors.

There are a couple of design elements which may be hindering any potential for this to be more effective:

  • Flagging default donation asks in email – Non-donors may not be clicking on the link, as the email does not refer to the default ask, and someone who may be more willing to donate if a smaller ask is made may already be deterred before getting to the page.
  • Changing the position of the donation form so the person selects their dollar amount before giving credit card details – at the moment the form asks for the dollar amount at the end, and on a 13-inch laptop screen this part normally does not appear without scrolling. This may result in people being deterred before realising that the ask is for only $5.

It is also possible that more sophisticated division of the GetUp list could be helpful in effectively differentiating GetUp members to determine the ideal default ask.