Imagine a car manufacturer called Fastbook. Its job is to design and build high-performing cars. The best cars in the world, even.
It's not Fastbook's intention for people to drive its cars at irresponsible speeds. Unfortunately, that does sometimes happen. But it's not its job to enforce what people can or can't do with their cars – that's where governments come in.
Governments can create their own guidelines for how fast people can drive and Fastbook is happy to participate in that conversation. But it doesn't come up with the guidelines or prevent its customers from driving its cars past a certain speed.
Even if it were plausible that it could do this to our satisfaction, do we really, truly, actually want Facebook, a for-profit tech company, to be the moral arbiter of what content is acceptable in our respective societies? To decide which views are acceptable and which are not? To do this in every country around the world, with all the nuances of their respective cultures and histories?
No. This is what we have governments for. Facebook is a business and has a primary objective of making profit. There are organisations whose primary objective is an ethical one; these are called charities. Governments are there to protect their citizens.
If we look at other industries (eg press, radio and television), they are regulated to protect the public interest. When they break the guidelines, they are penalised for it.
Facebook and other social platforms aren't policed by a dedicated regulatory system and have largely relied on self-governance. Now, we're asking them to come up with all-or-nothing rules about what's acceptable, and when they do it's still not enough. This is literally the government's role: to provide that governance (the clue's in the name).
Who exactly gave Facebook that responsibility – and why? Its background is in tech, not in governing society.
Free speech and regulatory frameworks
The issue of government involvement in limiting freedom of expression is highly complex; the debate is centuries old. That’s exactly why privately run tech companies are not equipped to make these types of decisions for society.
Mark Zuckerberg himself has said that Facebook, and other online content, should be regulated by a system that sits somewhere between the ones used for telecommunications companies and media industries.
The 2011 Leveson Inquiry into the culture, practices and ethics of the UK press is a great example of why very serious discussions need to be had around regulation and free speech wherever publication platforms exist.
The result of the inquiry was not unequivocal; the UK now has both the state-approved regulator Impress and Ipso as an independent regulator. Meanwhile, the broadcasting and telecoms industries all operate under the government-approved Ofcom. It's unlikely that the US could ever have had such an inquiry, since the First Amendment says that Congress shall make no law abridging the freedom of the press.
While there may not be one universally accepted standard of government involvement, at least the UK government and the press have continuously discussed the matter with the nuance it requires. We should all agree that it’s the government’s role to safeguard the lives of its citizens – and it should make decisions off the back of that. Why else do we elect them to power?
Whether it’s regulation, a code of ethics, an independent body or more sanctions, it’s clear that a more standardised approach to social platforms is necessary.
Is Facebook really the enemy?
Since everyone seems to be enjoying hating on Facebook, I feel it’s important to remember what it does get right.
I’m not denying its flaws – far from it. I just tire of everyone always looking for the negatives. Call me a relentless optimist, if you will.
It has advanced the world of technology (eg machine learning to combat fake news, natural language processing, artificial intelligence for object recognition), the economy (millions of jobs, billions in economic impact) and beyond. It is leaps ahead of other companies in terms of investment in research and development, and is paving the way for global internet connectivity, developing robotics and sustainable energy. As Ben Thompson pointed out, the main reason that we've all heard of George Floyd is that the video of his murder was posted on Facebook – allowing it to get the attention it deserves.
And since advertisers are the ones calling Facebook out this time, I just want to remind them that media advertising before Facebook (and Google) was slow, inefficient and rotten to the core, with backhanders and dodgy deals. (In fact, it’s still not great outside Facebook and Google, but that’s for another rant.) Facebook has helped to define a new era of advertising, defined by data, measurability, relevance and transparency.
Can we stop with the virtue-signalling, please?
Some companies seem to jump on the anti-Facebook bandwagon when it suits them to do so (and they get press coverage for it), when their own moral compass has historically been questionable.
A handful of the headlining organisations have been criticised in the past for doing far more damage to the world than Facebook. They sell products that cause diabetes, destroy our planet by cultivating palm oil, conduct animal testing, contribute massively to worldwide pollution, have been embroiled in tax-evasion scandals, consistently put independent competitors out of business, are known for poor labour conditions…
The list goes on, and on, and on.
And now they claim that advertising on Facebook wouldn't "add value to society", or they demand "greater accountability". When will it be their turn to meet those criteria?
While we're at it, why only gang up on Facebook, a user-generated content platform? Let’s boycott NBC and any streaming platform that has aired the episodes of 30 Rock that featured derogatory humour including blackface – they actually curate their own content and still manage to get it wrong.
More work is needed all around
Government intervention and support are required. At the end of the day, governments are here to make difficult decisions like these on our behalf and hold organisations accountable.
Now, I don't mean a government that doesn't understand social media. The Competition & Markets Authority's recent report rightly says that current laws are not suitable for regulating the ad market. We need people who understand tech, data and digital content – both their limitations and the space needed for future innovation. This goes beyond what governments have traditionally been capable of and definitely requires collaboration with platforms (such as the European Union Code of Conduct on countering illegal hate speech).
On the other side of the coin, Microsoft chief marketing officer Chris Capossela has said that, in his company's experience, effecting change comes from direct dialogue with media partners. Unilever's former chief marketing officer Keith Weed said something similar in 2017: that the best way to negotiate is one-to-one and in private, not through public statements. This position has been reinforced by Facebook's recent memo to advertisers, in which it reiterated that it does not make policy changes based on revenue pressure.
In the meantime, advertisers and agencies alike should take this opportunity to look inwards, not just outwards. Let's all take an active role in promoting positive change in our industries.
Daniel Gilbert is founder and chief executive of Brainlabs