
Source: Massey University

The history, community dynamics and impact of online abuse are explored in a new book (Photo: Erik Mclean/Unsplash)

Dr Kevin Veale is based in Massey University’s School of Humanities, Media and Creative Communication and specialises in researching storytelling and popular culture in the digital space. His new book, Gaming the Dynamics of Online Harassment (Palgrave Macmillan), explores how online abuse works, revealing the deadly impact it can have while increasing profits for the social media companies that enable it.

His research uncovers a deeply disturbing culture of online abuse and harassment that can result in people losing jobs and even being driven to suicide. Among the more sinister insights is the notion that “There’s no real distinction between online and offline spaces anymore, or online and offline harm.”

While most of the examples and evidence he cites are American, the 2019 Christchurch mosque killings by an Australian white supremacist who live-streamed the massacre of 51 people are the introductory scene-setter for the chilling chapters that follow, and a frequent reference point throughout.

He cites cases in the United States where people have been fired because of techniques that corrupt victims’ online profiles, framing them through Google searches as paedophiles, terrorists or animal abusers. Alongside the constant threats of death and rape in online hate campaigns, targeted people also receive dead animals in their mailboxes. And though it might seem incredible that this could happen, Dr Veale says increasingly refined harassment techniques succeed because “Google seems reliable – we rely on it to verify a lot of things, but it can be taken over.

“It’s hard to challenge accusations that you’re a monster when Google’s saying it too, and it’s hard to complain that you’ve been treated unfairly if you’re fired because workplaces can defend themselves by saying they verified information with online searches before acting.”

Who are the perpetrators and why do they do it?

Online harassment is rooted in a “kaleidoscope of everyday bigotries – primarily sexism, racism, homophobia, transphobia and ableism, among others,” says Dr Veale. There are, he explains, two central, overlapping spines to organised online harassment: white supremacists and anti-trans bigots. He says they may present as entirely normal people in real life. Some are social misfits, but not all.

“It’s often not obvious who online harassers are, and it can be surprising and hurtful to find that family or friends have been online hurting people for fun,” he says.

They are spurred by a variety of motivations, but “at heart, they think it’s fun to be part of a community that shares the same views they have, and to solve problems together. It’s just that those problems are other people, and the fact they exist.”

What are Alternate Reality Games?

Alternate Reality Games (ARGs), says Dr Veale, can be “wonderful” and are by no means inherently bad – in their most benign form they operate as a form of online community problem-solving. “What you are doing is solving a problem by telling a story. People who are part of the community who are telling the story collaborate with those who are building the story.”

By drawing parallels with the world of ARGs, he sets out to provide a comprehensive understanding of how online abuse works – and has come up with solutions that could help protect people from its worst excesses. The ARG model applies to the world of online harassment when “somebody picks a target and says ‘this person deserves to have their life ruined’ and a bunch of people chime in and say, ‘yeah, OK – how can we do that?’”

The economics of abuse

He explores the contention that the worst people on a social network are the most valuable. The monetisation of abuse has arisen, he says, because social media companies make money by selling advertisements and sponsored content based on high levels of engagement, or activity.

“If you get one person being dog-piled by thousands of people sending them images of the Holocaust, that looks great if you just call it ‘engagement’.”

In the introductory chapter, Dr Veale explores the long history of abuse online, together with the cultural dynamics which fuel it. “The goal is to provide context, given that the modern environment of online harassment represents a refinement in how abuse is delivered rather than an increase in its volume.”

Subsequent chapters explore the concept of the ‘networked public’ as a vital feature for understanding online harassment, as well as why the tools frequently suggested for solving online harassment don’t work, or make the situation worse. He argues that “one of the central ‘elephants in the room’ regarding online harassment is the fact that social media companies and online platforms profit from it directly, which makes them very unmotivated to try and stop it.”

In another section he uses case studies to suggest ways of making online harassment harder by giving people as much control as possible over their engagement with online spaces. In conclusion, he considers the impact of the Christchurch Call to Action Summit of May 2019, a response to the white supremacist terrorist attack in New Zealand, at which governments and technology companies discussed solutions to the ways the internet and social media have been used to coordinate, inspire and organise offline terrorist attacks.

Despite losing a friend in the US to suicide several years ago as a result of online harassment, he doesn’t think the Internet has made humans worse overall. “The beauty is that it puts people in touch with one another. We can accomplish great things – the flip side is we can also accomplish great things that are also monstrous.”

Why do we need to understand this?

Dr Veale, who has even had to warn his academic colleagues and manager of possible backlash from Internet harassment campaigns as a result of publishing his damning insights, says online harassment is “everywhere and has serious impacts on people’s lives. We’ve seen recent rapid escalations in how organized the delivery of online harassment can be, particularly connected to white-supremacist groups, anti-trans bigots, and major political campaigns.

“People are learning from these campaigns and applying their techniques to achieving political ends – something we can see in the way people involved in the QAnon conspiracy were part of the fascist, white-supremacist attacks against Washington DC recently. Meanwhile, social media platforms and online companies profit from misery and threats to American democracy while claiming that nothing can be done to change the status quo.”

The point of the book, he says, is “to provide historical context for online harassment, show how the community dynamics of hate-mobs and online harassment campaigns work, and to show that we don’t just have to tolerate the way things are because there ARE things that can be done to make things better. However, online companies will provide a significant hurdle to that change, which we can also try to circumvent.”

Dr Veale also writes that the subject of the book is “a very personal experience for me, and one I’ve grown up beside. Normally, I write about storytelling, and the ways that storytelling can be influenced by the forms of media it’s communicated through – meaning you can use those media forms as storytelling tools in themselves. I’ve done work in the past on how ARG communities work, which meant that when organized online harassment campaigns really started kicking off in the 2010s and started targeting my friends and people I knew, I could see that they were functionally ARGs focused on ruining people’s lives.

“Since then, online harassment has harmed uncountable numbers of people, driving them out of industries, forcing them to move homes or countries, or in some cases driving people to suicide. Friends of mine have lost their lives because of online harassment that sought to kill them. This book is about documenting all of that, and what can be done to change it.”