After the Christchurch terrorist attack and the Capitol Hill insurrection, we spoke about regulating the internet. Yet, we’re no closer to holding social media companies accountable. 

I’m all for social media as a tool for activism, communication, and connecting with others. But because social media is exceedingly democratic, every user has the ability to broadcast whatever lies, conspiracy theories, and rubbish they want. The problem is that, without gatekeepers, a minority of anti-vax, COVID-is-a-myth, Trump-won-the-election nutcases can band together and form political echo chambers that propagate fake news, lies, and real-world extremism.

But while the average person can generally engage their frontal lobes and scroll past the insanity on their screens, the investigation into Russia’s interference in the 2016 US election showed us that it doesn’t take much to subtly influence our viewpoints through careful, deliberate use of social media and trolls.

Furthermore, the rise and fall of Parler – the unregulated free-speech ‘utopia’ that allowed the extreme right to coordinate their insurrection against American democracy – demonstrates the necessity of strong platform regulation.

Traditionally, our information and perspectives on world issues were based on what we read in the newspaper and heard on the radio – institutions bound by clear legal parameters – but largely unregulated social media platforms now operate outside those boundaries.

Worth noting, too, is that the spiral into hyper-partisanship within US media followed the Reagan Administration’s move to abandon the FCC’s ‘Fairness Doctrine’, which required balanced reporting and treated broadcasters as “public trustees.”

However, in most countries (including NZ and Australia), ‘old media’ (think TV, newspapers, radio) are liable to prosecution if they publish blatant falsehoods, and are required to provide balanced reporting. NZ, for example, has two complaints bodies, the Press Council and the Broadcasting Standards Authority, which foster and uphold the public’s role in holding news media accountable, in accordance with collectively held ethical standards. So, while ‘old media’ has clear legal boundaries, complaints authorities, and well-established ethical obligations, social media does not.

Admittedly, the global nature of companies such as Facebook makes them incredibly difficult to regulate. Despite NZ leading international calls for better regulation of social media following the Christchurch terrorist attack, we are yet to see new legislation introduced. The terrorist live-streamed his atrocities on Facebook for nearly 17 horrifying minutes, and although the original video was removed fairly quickly, 1.5 million copies had to be taken down from the platform within 24 hours.

Jacinda Ardern condemned the concerning trend of social media playing a part in terrorist attacks, writing in a New York Times opinion piece that social media needs reform and that no one should be able to broadcast mass murder. But despite conferences in Paris bringing together world leaders and social media executives, little in the way of definitive legislative change has occurred. Facebook refused to put a time delay on live-streaming, and the modus operandi steadily returned to business as usual as Christchurch began to fade in the memories of those privileged enough not to have experienced it.

Australia is ahead of us in that sense, with its Criminal Code amendment on the ‘Sharing of Abhorrent Violent Material’, passed in 2019. It allows social media companies to be fined up to 10% of their annual revenue, and their executives to be jailed for up to three years, if they fail to remove abhorrent violent material “expeditiously.”

While clearly a much-needed piece of legislation, it only deals with one facet of social media harm.

But in NZ, and around the world, the age-old push and pull between anti-hate-speech and pro-free-speech interests persists. Both sides of the political aisle continue to raise important and valid points, but no middle ground seems to emerge from these discussions.

The real reason compromise seems such a struggle is that we are approaching the issue from the wrong angle. Most of the debate is predicated on whether the government should be regulating social media.

The compromise is to place the onus on social media giants to monitor their own platforms, with the threat that both they and the individual who posts something in breach of the regulations will face prosecution.

That way, we start treating social media platforms the same way we treat news media entities: if something ‘offensive’ appears on their platform, website, or broadcast, they themselves are accountable.

But even then, news broadcasters such as Sky News Australia included footage of the Christchurch attack in their reports and faced little more than a slap on the wrist and a strongly worded email. Clearly, news media are wriggling through gaps in the pre-existing legislation, and must be held just as accountable as social media should be.

The establishment of complaints and regulatory bodies is a no-brainer, but it cannot be left up to the government to pore through social media looking for harmful material. It simply doesn’t have the time or the resources.

Social media is undoubtedly responsible for the seamless distribution of hate messages (the Christchurch gunman’s accompanying manifesto), the coordination of extremism (the Capitol riots), the spread of deadly misinformation (the COVID-19 conspiracies), and the creation of political echo chambers (Parler, Twitter, et al.), and it has served as a tool for political and social manipulation (Cambridge Analytica). The platforms must take at least some responsibility for opening this Pandora’s box of internet anarchy.

Let’s rein in that anarchy and provide a legislative framework of responsibilities within which social media companies must operate. Australia’s Abhorrent Violent Material legislation is a good basis for gearing new laws so that platforms have sufficient incentive to regulate the content they host.

There is a very real risk, however, that social media companies will threaten to withdraw their services in the face of proposed changes. Google, of course, threatened to pull the plug on its search engine in Australia in response to legislation that proposed to make Google, Facebook, and potentially other tech companies pay media outlets for their news content. We should be prepared for a similar backlash to the introduction of harsh penalties for ineffective regulation of social media platforms, and if they raise the middle finger in response, that’s a price I think society should be prepared to pay.

Social media must be treated like ‘old media’ with the same ethical responsibilities and consequences.

All that stands in the way of taming the lawlessness of the internet’s wild west is political will.

Keelan Heesterman was the winner of Education Perfect’s ‘Student Voice’ writing competition.
