Background
GetUp! recently decided to begin using image ‘banners’ at the top of our emails to improve their look and feel and to create some aesthetic consistency. This decision was not made solely to improve the performance of our emails, but as part of the process we decided to run experiments to test whether banners had any measurable impact on how many people clicked on our emails or took action as a result.
Conclusion
A majority of experiments saw better performance for emails containing image banners, but the result was not universal: the final test produced more actions from the non-banner email. In addition, we had some minor issues with the randomisation of email lists. GetUp! has now implemented image banners in our emails, but will continue to run tests.
Structure of experiments
We conducted experiments on four different email pushes, sending some people a version of the email with the banner and others a version without it.
[Image: banner used in the final experiment]
Each banner was designed to be used for an entire campaign, not for specific emails. Because of previous issues with open-rate tracking, we are comparing clicks to the total number of sends.
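The post does not describe the tooling used to split lists, and the randomisation problems noted below suggest the mechanics matter. As a minimal sketch, assuming a simple shuffle-and-cut split (all names here are illustrative, not GetUp!'s actual tooling):

```python
# Illustrative sketch only: a shuffle-and-cut A/B split of an email list.
import random

def split_list(recipients, banner_fraction=0.5, seed=42):
    """Randomly assign recipients to banner / no-banner versions."""
    rng = random.Random(seed)          # fixed seed so the split is reproducible
    shuffled = list(recipients)
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * banner_fraction)
    return shuffled[:cut], shuffled[cut:]

members = [f"member{i}@example.org" for i in range(100)]
banner_group, no_banner_group = split_list(members)
print(len(banner_group), len(no_banner_group))  # 50 50
```

An even split like this avoids the unequal list sizes seen in experiments one and three, though unequal sizes do not by themselves bias the comparison as long as assignment is random.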
Results of experiments
Experiment one – email to our media list regarding the SBS campaign
| Version | Sends | Clicks | Actions | Clicks/sends | Actions/clicks |
| --- | --- | --- | --- | --- | --- |
| Banner | 57,389 | 13,591 | 13,566 | 23.68% | 99.82% |
| No banner | 125,432 | 24,007 | 23,740 | 19.14% | 98.89% |
This email asked people on our media campaign list to sign a petition regarding cuts to the SBS public broadcaster. There was a problem with the send: one version went to a much larger list than the other, but other testing indicates that the list split was mostly random. This result was statistically significant.
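The post does not say which significance test was used. As a minimal sketch, assuming a two-tailed two-proportion z-test on clicks/sends, applied to experiment one's figures:

```python
# Hypothetical check of experiment one: is the clicks/sends difference
# between the banner and no-banner versions bigger than chance would allow?
from math import sqrt
from statistics import NormalDist

def two_proportion_z(clicks_a, sends_a, clicks_b, sends_b):
    p_a, p_b = clicks_a / sends_a, clicks_b / sends_b
    pooled = (clicks_a + clicks_b) / (sends_a + sends_b)
    se = sqrt(pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-tailed
    return z, p_value

z, p = two_proportion_z(13_591, 57_389, 24_007, 125_432)
print(f"z = {z:.1f}, p = {p:.2g}")  # z is roughly 22, p effectively 0
```

At these sample sizes even a modest difference in click rate clears conventional significance thresholds, which is consistent with the result reported above.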
Experiment two – email to our budget list
| Version | Sends | Clicks | Actions | Clicks/sends | Actions/clicks |
| --- | --- | --- | --- | --- | --- |
| Banner | 15,000 | 1,499 | 1,611 | 9.99% | 107.47% |
| No banner | 15,000 | 1,219 | 1,369 | 8.13% | 112.31% |
This result was also statistically significant. This email also asked members to sign a petition, a very simple ask, so both versions had a high action rate.
Experiment three – email to our Great Barrier Reef and mining lists
| Version | Sends | Clicks | Actions | Clicks/sends | Actions/clicks |
| --- | --- | --- | --- | --- | --- |
| Banner | 64,428 | 2,325 | 1,330 | 3.61% | 57.20% |
| No banner | 147,250 | 3,947 | 2,231 | 2.68% | 56.52% |
This email asked members to contact members of Parliament, a higher-barrier ask than signing a petition. Again, the banner version won on both clicks and actions. Unfortunately, we had similar problems with the lists being split into unequal sizes, but randomisation was mostly successful.
Experiment four – email to our pensions list
| Version | Sends | Clicks | Actions | Clicks/sends | Actions/clicks |
| --- | --- | --- | --- | --- | --- |
| Banner | 12,907 | 1,108 | 364 | 8.58% | 32.85% |
| No banner | 12,989 | 1,081 | 379 | 8.32% | 35.06% |
Again, this ask was for members to email their MP. While there was a slight increase in clicks for the email using banners, this did not carry through to an increase in actions.
Other attempted experiments
Another experiment was conducted where, due to the content of the email, clicks were not tracked. While the version of the email including a banner saw an increase in opens, we did not include this experiment in our conclusions. We also conducted two experiments on our weekly surveys, but these combined the banner with a button experiment, so it is not possible to separate out the impact of the banner.