[Image: a laptop computer on a table, with the screen displaying graphs. Photo: Carlos Muza]

Do banner images in emails improve click rates?

Background

GetUp! recently decided to begin using image ‘banners’ at the top of our emails, to improve their look and feel and to create some aesthetic consistency. The decision was not made solely to improve email performance, but as part of the process we decided to run experiments to test whether banners had any measurable impact on how many people clicked on our emails or took action as a result.

Conclusion

A majority of the experiments showed better performance for emails containing image banners, but the result was not universal: the final test produced more actions from the non-banner email. We also had some minor issues with the randomisation of email lists. GetUp! has now implemented image banners in our emails, but will continue to run tests.

Structure of experiments

We conducted experiments on four different email pushes, sending some people a version of the email with the banner, while others received the email without the banner.
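
As a rough illustration, a random split of this kind could be set up along the lines of the sketch below. This is our own minimal example, not GetUp!'s actual mailing tooling; the function name and the fixed seed are assumptions made for the illustration.

    import random

    def split_list(member_ids, seed=42):
        """Randomly assign each member to the 'banner' or 'no banner' variant."""
        rng = random.Random(seed)
        groups = {"banner": [], "no_banner": []}
        for member_id in member_ids:
            groups[rng.choice(["banner", "no_banner"])].append(member_id)
        return groups

    # Example: split a list of 30,000 member IDs and check the group sizes.
    groups = split_list(range(30_000))
    print({variant: len(ids) for variant, ids in groups.items()})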

[Image: banner used on the final experiment]

Each banner was designed to be used for an entire campaign, not for specific emails. Because of previous issues with open rates, we compare clicks to the total number of sends rather than to opens.
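
The two rates reported in the tables below are clicks divided by total sends and actions divided by clicks. A minimal sketch of that arithmetic, using the banner figures from experiment one (the function name is ours):

    def rates(sends, clicks, actions):
        return {
            "clicks_per_send": clicks / sends,      # click rate measured against all sends, not opens
            "actions_per_click": actions / clicks,  # conversion from a click to a completed action
        }

    # Banner version of experiment one: 57,389 sends, 13,591 clicks, 13,566 actions.
    print(rates(sends=57_389, clicks=13_591, actions=13_566))
    # {'clicks_per_send': 0.2368..., 'actions_per_click': 0.9981...}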

Results of experiments

Experiment one – email to our media list regarding SBS campaign

Version     Sends     Clicks   Actions   Clicks/Sends   Actions/Clicks
Banner      57,389    13,591   13,566    23.68%         99.82%
No banner   125,432   24,007   23,740    19.14%         98.89%

This email asked people on our media campaign list to sign a petition regarding cuts to the SBS public broadcaster. There was a problem with the send that meant one list received far more emails than the other, but other testing indicates that the list split was mostly random. The result was statistically significant.
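
The article does not say which significance test was used, but one way to check a click-rate difference of this kind is a two-proportion z-test; the sketch below applies statsmodels to the experiment one figures and yields a p-value far below 0.05.

    from statsmodels.stats.proportion import proportions_ztest

    clicks = [13_591, 24_007]    # banner, no banner
    sends = [57_389, 125_432]    # total sends for each version

    # Two-sided z-test for a difference between the two click-through proportions.
    z_stat, p_value = proportions_ztest(count=clicks, nobs=sends)
    print(f"z = {z_stat:.1f}, p = {p_value:.3g}")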

Experiment two – email to our budget list

Version     Sends    Clicks   Actions   Clicks/Sends   Actions/Clicks
Banner      15,000   1,499    1,611     9.99%          107.47%
No banner   15,000   1,219    1,369     8.13%          112.31%

This result was also statistically significant. This email also asked members to sign a petition, a very simple ask, so both versions had a high action rate.

Experiment three – email to our Great Barrier Reef and mining lists

Version     Sends     Clicks   Actions   Clicks/Sends   Actions/Clicks
Banner      64,428    2,325    1,330     3.61%          57.20%
No banner   147,250   3,947    2,231     2.68%          56.52%

This email asked members to contact members of Parliament, a higher-barrier ask than signing a petition. Again, the banner version won on both clicks and actions. Unfortunately we had similar problems with the lists being split into unequal sizes, but the randomisation was mostly successful.

Experiment four – email to our pensions list

Version     Sends    Clicks   Actions   Clicks/Sends   Actions/Clicks
Banner      12,907   1,108    364       8.58%          32.85%
No banner   12,989   1,081    379       8.32%          35.06%

Again, this ask was for members to email their MP. While there was a slight increase in clicks for the banner version, this did not carry through to an increase in actions.

Other attempted experiments

Another experiment was conducted where, due to the content of the email, clicks were not tracked. While opens increased for the banner version of that email, we did not include the experiment in our conclusions. We also ran two experiments on our weekly surveys, but these combined the banner with a button experiment, so it is not possible to separate out the impact of the banner.


  • Author:
  • Organisation: Online Progressive Engagement Network
  • Release Date: 2015

© All Rights Reserved

Contact a Commons librarian if you would like to connect with the author