
Oliver Dowden – 2021 Article on Free Speech

This article was written by Oliver Dowden, the Secretary of State for Digital, Culture, Media and Sport, on 18 January 2021.

If the last decade has been defined by anything, it’s the power of social media. Its opening saw the hope of the Arab Spring, whilst its closing witnessed last week’s disgraceful scenes at the US Capitol.

Both were the product of social media’s unprecedented ability to spread ideas and bring people together, for both good and bad. Put simply, we have a new printing press – but it’s an invention whose implications society and governments are just beginning to grapple with.

With so many of us now consuming our news and information through social media, a small number of companies wield vast power in shaping how we see the world. To an outsider, it doesn’t always seem this power is wielded transparently or consistently.

Iran’s Ayatollah has a Twitter account, whilst the elected President of the United States has been permanently suspended from the platform. Trump’s supporters have labelled that move censorship; the other half of the country has asked what took so long.

Norway’s Prime Minister has had posts defending freedom of expression deleted on Facebook because they contained the iconic “Napalm Girl” photo – an unintentional violation of the site’s child nudity policy – whilst in Myanmar, the same platform has been used to whip up hatred towards Rohingya Muslims.

Those facts alone should make anyone who loves democracy pause for thought. The idea that free speech can be switched off with the click of a button in California is unsettling even for the people with their hands on the mouse.

Just this week, Twitter’s CEO, Jack Dorsey, said that while he felt that it was right for his platform to ban Trump, leaving platforms to take these decisions “fragments” the public conversation and sets a dangerous precedent.

So as we enter a new era in our relationship with tech, who should decide its rules?

We need to be able to define what social media is and isn’t. Given it is now so crucial a part of public discourse, should we compare it to a utility? Or should we see social media companies as publishers, akin to newspapers – and therefore liable for everything they publish?

In reality, neither the passive “platform” nor the editorialised “publisher” truly hits the mark. Holding companies liable for every piece of content – 500 hours of video are uploaded to YouTube every minute – would break social media.

But equally, when these companies are curating, editorialising, and in some cases removing users, they can no longer claim to be bystanders with no responsibility whatsoever.

However we categorise social media, one thing is clear: as with other forms of mass communication, democratically elected governments must play a role in regulating it.

In the UK, we are leading the world by starting to deal with this dilemma. I have been clear that we are entering a new age of accountability for tech.

At the end of last year, we outlined plans for a groundbreaking new rulebook for social media companies: one that would make sites like Facebook and Twitter responsible for dealing with harmful content on their platforms, while also holding them answerable for their wider role and impact on democratic debate and free speech.

We can no longer outsource difficult decisions. There’s now a burning need for democratic societies to find ways to impose consistency, transparency and fairness in the online sphere.

And regulation needs to be flexible enough to adapt as social media evolves. There’s always another platform on the horizon. No-one in the UK had used TikTok before 2018, but now 17 million of us do: almost double the total newspaper circulation in 2019.

It also means navigating some complex philosophical disputes. How do you resolve the inherent tension, for example, between protecting people from dangerous misinformation in a global pandemic, whilst also protecting their right to express an opinion?

There are no easy answers. But we are setting out the parameters.

The first is that we need to do everything we can to protect our most vulnerable citizens, and particularly children, from harm. Our upcoming Online Safety Bill makes their protection our number one priority.

The second is the protection of free speech. As Lord Justice Sedley put it in 1999, that definition has to include “not only the inoffensive but the irritating, the contentious, the eccentric, the heretical, the unwelcome and the provocative.” Without the latter, we are making an empty promise.

So, under our legislation, social media giants will have to enforce their terms and conditions consistently and transparently. This will prevent them from arbitrarily banning any user for expressing an offensive or controversial viewpoint.

If users feel like they’ve been treated unfairly, they’ll be able to seek redress from the company. Right now, that process is slow, opaque and inconsistent.

And it’s absolutely vital that internet regulations can’t be used as a tool to silence an opponent or muzzle the free media. So news publishers’ content on their own sites will be exempt.

The decisions governments around the world take will shape democracies for decades to come. As the UK takes up the G7 Presidency this year, we want to work with our democratic allies to forge a coherent response.

We are just taking the first steps in this process. But decisions affecting democracy should be made democratically – by governments accountable to parliament, not executives accountable to shareholders.