Slide presentation title slide - title reads 'Surviving the Pile-On: Navigating Online Culture Wars'. Quiip logo is top left. Photo on right of a young person sitting outside on a shop ledge looking at their phone.

Surviving the Pile-On: Navigating Online Culture Wars

Introduction

Learn how to navigate online culture wars and survive social media pile-ons with tips and information from Larah Kennedy, an online community and social media specialist who is General Manager at Quiip.

Larah gave this presentation at FWD+Organise 2024, a conference hosted by Australian Progress in Naarm/Melbourne, Australia.

Social Media Trains Us for Outrage

Humans are motivated by social reward such as praise, recognition, attention or acceptance. We feel socially rewarded when we have our opinions validated, receive a strong reaction to something we’ve said or get a sense of belonging/identity from feeling like we are part of a group. We are particularly sensitive to social reward when it comes to expressions of outrage.

Social media algorithms amplify content that sparks outrage, because they are programmed to facilitate social reward. So not only are we motivated by interactions that result in social reward, but we are also more likely to see outrage-driven content across social media platforms.

Participating in moral outrage can be a learned behaviour, and on some platforms expressions of outrage become normative (Twitter is a prime example). This is often where we see a strong outrage culture, where the default is to react with heightened anger to a perceived injustice.

Researchers at NYU (Dr William Brady et al) have found that the use of moral and emotional language increases virality on social media, which is also where things like click-baiting and rage-baiting come into play. More problematically, they also found that moral outrage in online environments is linked to hate speech.

A study out of NYU analysed 500,000 tweets and found that each additional moral-emotional word increased retweeting by 15-20%.
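
To get a feel for what that per-word effect means in practice, here is a back-of-the-envelope sketch; the 17% figure and the assumption that the effect compounds multiplicatively are illustrative choices, not results from the study itself.

```python
# Rough illustration: if each additional moral-emotional word lifts expected
# retweets by roughly 15-20%, the effect compounds with the number of such words.
# The 17% per-word lift below is an assumed midpoint for illustration only.

def expected_retweet_multiplier(n_words: int, per_word_lift: float = 0.17) -> float:
    """Estimated multiplier on expected retweets for a post containing
    n_words moral-emotional words, assuming a constant per-word lift."""
    return (1 + per_word_lift) ** n_words

for n in range(5):
    print(f"{n} moral-emotional words -> ~{expected_retweet_multiplier(n):.2f}x expected retweets")
# 0 -> 1.00x, 1 -> 1.17x, 2 -> 1.37x, 3 -> 1.60x, 4 -> 1.87x
```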


Pile-Ons

Defining Key Concepts in a Pile-On

There are three key concepts to understand in a pile-on:

Outrage

An extremely strong reaction of anger, shock, or indignation. (Oxford dictionary)

  • Moral Outrage
    The feeling of outrage triggered by a sense of right and wrong that makes us want to hold someone accountable or punish them.
  • Manufactured Outrage
    Deliberately spreading or creating outrage for the purposes of creating profit, disruption, or to advance specific ideologies or political objectives. Chaos entrepreneurs benefit from the conflict and instability that ensues.

Misinformation & Disinformation

  • Misinformation
    False or misleading information that is shared regardless of intention to deceive.
  • Disinformation
    False or misleading information shared knowingly with the intent to deceive.

Cancel Culture

Boycotting or shunning an individual based on something they’ve said or done. There is a tension between being held accountable and being bullied.

Main Components of a Pile-On

Presentation slide: 'Main Components of a Pile On' – The Instigator, The Mistruth, The Angry Mob (detailed below). Quiip logo bottom right.

There are three main components of a pile-on:

The Instigator

The ringleader: the initial person or organisation who sparks the moral outrage.

In many examples of outrage culture online there is a catalyst that kicks off the moral panic. Sometimes this is an individual (well known or not), sometimes it is a media organisation. There are many well-known examples of this, e.g. J.K. Rowling or Elon Musk on Twitter.

The Mistruth

An outright lie, or more often a bending of the truth.

Misinformation is a massive factor in online outrage culture. Most moral panics involve at least one piece of information that is either completely fabricated or only partially correct. The mistruths most effective at sparking outrage contain a kernel of truth.

The Angry Mob

The pile on of a group of people who are angered by the mistruth outlined by the instigator. This can be amplified by disingenuous accounts (bots, fake profiles etc).

For the outrage machine to really kick off, there needs to be a group of people or accounts seeking justice or accountability for a perceived wrong.

Pile-On Tactics

Pile-on tactics include:

  • Astroturfing
    Creating a false impression of grassroots support or opposition when it is actually a coordinated and manufactured campaign. We saw examples of this during the Voice referendum.
  • Bandwagoning
    When a large influx of new users jump on a given topic because it’s become a hot trend, then take over discussions around it, often creating worse content for the original fan base of that topic.
  • Sock Puppets
    Fake online identities used to deceive others about the identity of the user, often with multiple profiles. These allow people to say things while avoiding accountability.
  • Brigading
    Coordinated group efforts to influence or disrupt online discussions or content.
  • Threats to Safety
    These often come at the peak of a pile-on.
  • Sea-lioning
    A form of trolling where someone persistently and politely demands information or justification to disrupt or frustrate a discussion, often while ignoring previous responses.
  • Trolling
    Deliberately posting provocative or offensive messages to elicit emotional responses and disrupt discussions.
  • Doxxing
    The publication of personal information online in order to facilitate threats to a person's safety.
  • Black Pilling
    Promoting a deeply pessimistic worldview, often to discourage or demoralise others.
  • Shitposting
    Posting irrelevant or inflammatory content to derail discussions or provoke reactions.

Various tactics can be used in combination. These tactics are also being compounded by the use of bots and generative AI.

Online Pile-Ons in Action

Know what sparks outrage and be prepared. Some current themes that spark outrage include:

  • Nuclear power
  • Immigration
  • Transgender Rights
  • Palestine/Israel
  • Race (including DEI)
  • Reproductive Rights
  • Renewables

Hot-button or contentious topics are much more likely to attract outrage. If you have a well-moderated and managed online space with strong governance, it is possible to post about contentious issues without sparking outrage. However, this requires a foundation of consistent community management, and even then you should be prepared for coordinated attacks.

What you can do:

  • Understand the difference between organic outrage and manufactured outrage and know which one you are dealing with before taking any action.
  • Look for extremes and moderate them swiftly.
  • Be aware of moral and emotional words, e.g. evil, disgrace, how dare you, disgusting, woke (a simple triage sketch follows this list).
  • Encourage a diversity of opinions expressed with empathy so that outrage doesn’t become normalised.
  • Be prepared for outrage. It largely comes down to having complete processes and governance, such as escalation procedures, pre-approved responses, and a Tone of Voice (TOV) aligned to organisational values.
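
To make the moral-and-emotional-words check concrete, here is a minimal triage sketch in Python; the word list, the substring matching and the review threshold are all illustrative assumptions to adapt to your own community, not a definitive moderation rule.

```python
# Minimal sketch: flag comments that lean on moral-emotional language so a
# moderator can review them before outrage spreads. Word list is illustrative.

MORAL_EMOTIONAL_TERMS = {"evil", "disgrace", "disgusting", "woke", "shameful", "how dare"}

def outrage_score(comment: str) -> int:
    """Count moral-emotional terms in a comment (case-insensitive substring match)."""
    text = comment.lower()
    return sum(1 for term in MORAL_EMOTIONAL_TERMS if term in text)

def needs_review(comment: str, threshold: int = 2) -> bool:
    """Queue for moderator review when multiple trigger terms appear together."""
    return outrage_score(comment) >= threshold

print(needs_review("How dare they. This policy is a disgrace."))  # True
print(needs_review("Thanks for clarifying, that makes sense."))   # False
```

A simple count like this will not catch sarcasm or context, so treat it as a way to prioritise human review, never as an automatic removal rule.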

Example: Gamergate

  • The Instigator
    Eron Gjoni published "The Zoe Post".
  • The Mistruth
    In the blog post, indie games developer Zoe Quinn was accused of trading sex for positive game reviews.
  • The Angry Mob
    Lots of gamers got angry under the guise of ‘ethics in the gaming industry’ but ultimately expressed moral panic about women in a traditionally male space.

Gamergate happened a decade ago and is the OG play from the online outrage playbook. If you haven’t been chronically online like me for the past decade: Gamergate centred on Zoe Quinn, an independent games developer, and a blog post by her ex-boyfriend that included a bunch of unfounded accusations that Quinn had entered into an unethical romantic relationship with a games journalist.

This escalated into a year-long, coordinated and systematised harassment campaign against Quinn and other women in the gaming industry. The harassment was dished out under the premise that games journalists were unethical, using the #gamergate hashtag, which amplified an already well-organised outrage machine. It’s also worth noting that a lot of the abuse was targeted at women in the games industry who weren’t journalists.

Gamergate utilised all of the main tactics we see in hate campaigns – sock puppeting (fake accounts), bandwagoning (an influx of people jumping on a topic and creating worse content), heckling, threatening, and doxxing (which forced some of the women to move house) – eventually escalating to real-world harms, including planned attacks at in-person gaming conventions.

We now have a much better understanding of how online harms translate into real-world harms, and we also have stronger online-safety legislation thanks to the eSafety Commissioner, but there is still the question of the role of law enforcement and how to effectively handle online harassment.

The outrage playbook is full of examples of racial minorities and women being publicly punished when they step outside of gender and racial hierarchies.

Social media exacerbates this by making it easy to silence marginalised voices through coordinated attacks/harassment.

Gamergate is just one example of this, and it played out largely on Twitter, but there are many others. The outrage over Leslie Jones, a woman of colour, playing a Ghostbuster ultimately forced her off Twitter for a period, and the backlash against Yassmin Abdel-Magied’s social media post about ANZAC Day ultimately led to her leaving the country.

Gamergate is the perfect example of manufactured outrage where the angry online mob continued to generate reasons to be outraged in order to carry out justice (read: harass and abuse) on their targets.

How to Address a Pile-On

Here are actions you can take to address a pile-on:

Research and Address Misinformation & Disinformation

Mis- and disinformation have exploded in the past few years, and there are many opportunities for us as social media managers to address them better. Some of the options we do have are:

  • knowing what is likely to spark outrage and being prepared
  • removing misinformation before it has a chance to gain traction
  • correcting with factual information (and empathy)
  • changing the narrative – we do this with pre-bunking and debunking.

Media literacy is a key part of being able to identify mis/disinformation – learn to recognise misinformation markers.

Use First Draft’s SHEEP checklist – Source, History, Evidence, Emotion and Pictures:

Infographic – text reads: 'Don't get tricked by online misinformation. Remember these checks when browsing social media.

  • Source – Look at what lies beneath. Check the about page of a website or account, look at any account info and search for names or usernames.
  • History – Does this source have an agenda? Find out what subjects it regularly covers or if it promotes only one perspective.
  • Evidence – Explore the details of a claim or meme and find out if it is backed up by reliable evidence from elsewhere.
  • Emotion – Does the source rely on emotion to make a point? Check for sensational, inflammatory and divisive language.
  • Pictures – Pictures paint a thousand words. Identify what message an image is portraying and whether the source is using images to get attention.

Think SHEEP before you share.' First Draft logo in bottom left.

or

use the SIFT method – Stop, Investigate the source, Find better coverage, Trace claims to the original context:

Infographic titled 'SIFT' with four icons: Stop; Investigate the source; Find better coverage; Trace the original context.

and check against fact-checking sources like RMIT Fact Check.

Moderate Extremes (Ban, Delete, Hide)

Extreme views and extreme language are more likely to generate outrage; moderate these quickly and stop the spread of emotional contagion. Look for predictors of divisiveness and step in.

Use the Platform Tools Creatively

Close or limit comments; be creative with keyword filters (see the sketch below).
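
One way to be creative with keyword filters, sketched below, is to normalise common obfuscations (leetspeak, spacing, punctuation) before matching, so disguised variants still hit the filter; the deny-list and substitution map are illustrative assumptions, and in practice the expanded terms would usually be pasted into the platform's native comment-filter settings rather than run as custom code.

```python
import re

# Illustrative sketch: catch obfuscated variants of filtered terms, e.g. "d1sgust!ng" or "w o k e".
LEET_MAP = str.maketrans({"0": "o", "1": "i", "3": "e", "4": "a", "5": "s", "7": "t", "!": "i", "$": "s", "@": "a"})
FILTERED_TERMS = {"disgusting", "traitor", "grifter"}  # example deny-list only

def normalise(text: str) -> str:
    """Lowercase, undo simple leetspeak, and strip everything but letters."""
    return re.sub(r"[^a-z]", "", text.lower().translate(LEET_MAP))

def hits_filter(comment: str) -> bool:
    """True if the flattened comment contains any filtered term."""
    flattened = normalise(comment)
    return any(term in flattened for term in FILTERED_TERMS)

print(hits_filter("This is d1sgust!ng"))                    # True
print(hits_filter("Great initiative, thanks for sharing"))  # False
```

Flattening text this aggressively can create false positives across word boundaries, so route filtered comments to a review queue rather than deleting them outright.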

Have Good Governance + Act Quickly

Be in alignment and have complete processes with clear lines of approval. Make sure spokespeople are briefed with the correct information.

Get a Seat at the Table

Social media managers are well placed to understand and identify what will spark outrage. That knowledge can be invaluable to a business and can help shape how you communicate business decisions that could get caught up in outrage culture.

Identify & Remove Disingenuous Contributions

Identify bots and remove them; look for patterns in language that indicate potential manufactured outrage.

Check where the profile is from (for example, a profile based in the US can be a red flag in an Australian debate), whether the account is newly created, and whether it has any profile information, friends, etc.
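
As an example of how those manual checks could be combined into a rough triage score, here is a minimal sketch; the Profile fields, thresholds and expected-country default are assumptions for illustration, and the goal is to prioritise human review rather than to auto-ban accounts.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Profile:
    created: date        # when the account was created
    has_bio: bool        # any profile information filled in?
    friend_count: int    # friends/followers
    country: str         # self-reported or inferred location

def suspicion_score(p: Profile, today: date, expected_country: str = "AU") -> int:
    """Add one point per red flag; higher scores get reviewed first."""
    score = 0
    if (today - p.created).days < 30:   # newly created account
        score += 1
    if not p.has_bio:                   # empty profile
        score += 1
    if p.friend_count < 5:              # few or no connections
        score += 1
    if p.country != expected_country:   # out-of-region profile in a local debate
        score += 1
    return score

suspect = Profile(created=date(2024, 11, 20), has_bio=False, friend_count=2, country="US")
print(suspicion_score(suspect, today=date(2024, 12, 5)))  # 4 -> review first
```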

Encourage Self-governance

Actively encourage regular page contributors to shut down outrage, redirect conversation away from negativity, and correct misinformation. They can also report issues to the page administrator.

Be in Alignment

Make sure everyone in your organisation is aligned on values and positioning statements. A recent example is when the Olympics opening ceremony in Paris was accused of mocking The Last Supper.

A spokesperson for the Paris 2024 Olympics apologised if people had taken offence, rather than correcting the record that the scene was intended to depict Dionysus, the Greek god of wine: “Clearly there was never an intention to show disrespect to any religious group. On the contrary, I think that (artistic director) Thomas Jolly really tried to celebrate community tolerance,” Descamps said at a press conference. “We believe that this ambition was achieved. If people have taken any offense, we are really sorry.”

Pre-bunking, Debunking + Changing the Narrative

Pre-bunking

Pre-bunking relates back to being prepared, having aligned messaging, good governance, and proactively preparing content to address key misconceptions.

It works well when we have good insight into the audience and the wider media and political landscape, but it can be hard to do entirely proactively. Another form of pre-bunking can be a bit more reactive: we start to see emerging themes and create educational content to try and head them off at the pass.

Debunking

We only ever debunk when not doing so would have a significant ongoing impact.

When doing this, the best approach is to:

  • not assume bad intent
  • stay calm
  • respond with empathy
  • ask reasoning questions, and
  • state facts.

It can also be effective to point out that something is misinformation or not entirely correct, but don’t get involved in a back-and-forth or scold. The Fact, Myth, Fallacy, Fact framework can be used to debunk misinformation. Always end with the fact, not the misinformation.

Context is so important here, though: first investigate whether the comment is from a disingenuous player and gauge the likely impact. Is it getting traction? How persuasive is the argument?

Changing the Narrative

Storytelling, lived experience, and being on the front foot with messaging can be really important.

Encouraging Positive Behaviour

Help maintain psychological safety and self-governance by modelling the kind of comments you want to see and by celebrating and responding positively to civil comments. You can also directly message regular users who deal with pile-on tactics in an effective way to thank them for their efforts.

Mental Health

Outrage is infectious. Don’t let yourself get infected by it.

Witnessing social media outrage can have impacts on our mental health. Constant exposure to negativity, divisiveness, and aggression can create mental fatigue, a sense of hopelessness, or anger. Anger impairs strategic decision-making. Angry decision-makers are more likely to distrust and blame others and oversimplify complicated issues.

Tips for Coping

  • Take breaks
  • Pause before responding and read responses out loud
  • Lean into tools for emotional regulation – breathing, talk therapy, exercise
  • Short shifts for difficult or disturbing work and share the workload across team members if possible
  • Practice good digital hygiene when not working
  • Cultivate empathy
  • Where needed, ask team members and allies to help you so that you are not taking on the full burden.

Positivity can be a useful counteraction to outrage.

Humour and fun can help inject positivity and defuse an online pile-on. One of my favourite examples of this has been previously in the Sephora Beauty Talk online community where members have been known to respond to trolls by flooding the thread with cute pictures of llamas to drown out the negativity.

Download Full Presentation Slides


Explore Further