Facebook – Dark arts cast a shadow online

Facebook’s corporate online newsroom has shown an unusual willingness to tell the world what the company is doing to fight fake news and Russian trolls. Too bad the messages do not necessarily match the reality, says Jason Sumner.

Spend an hour browsing Facebook’s corporate online newsroom (http://newsroom.fb.com), and even the most ardent Facebook sceptic could experience a wobble.

Here the company has built a formidable resource to convince the world that it is doing all that it can to fight fake accounts, fake news, election interference, privacy breaches and misuse of data:

  • ‘News’ promotes positive features but currently shows Facebook mostly in defence mode, and you can trace the history of recent scandals through some of the links: ‘Response to Six4Three Documents’; ‘Elliot Schrage on Definers’; ‘New York Times Update’ (in amongst fluffier titles such as ‘A New Way to Share Gift Ideas on Facebook’).
  • ‘Inside Feed’ is Facebook’s attempt to let us peer behind the curtain. ‘With Inside Feed, we aim to shed more light on the people and processes behind our products’. The section is effective at doing this, with well-made videos showing real employees doing real things to ‘make advertising transparent’, fight child exploitation on the network, hunt false news (‘both those we caught, and some we caught too late’), etc. One video, ‘Facing facts’, is particularly persuasive. (https://newsroom.fb.com/news/2018/05/inside-feed-facing-facts/)
  • ‘Hard Questions’ is a forum for Facebook, and external contributors, to explain the company’s approach and thinking about controversial subjects, such as ‘Who Reviews Objectionable Content on Facebook – And Is the Company Doing Enough to Support Them?’ or ‘How Does Facebook Investigate Cyber Threats and Information Operations?’ The section appears to steer away from specific controversies, leaving those for ‘News’. We praised ‘Hard Questions’ in a BC tip for allowing, and responding to, criticism in the comments below these pieces. (https://www.bowencraggs.com/Our-thinking/BC-Tips/Unfriendly-engagements)

To their credit, the articles and videos are full of facts. An article published in November by Facebook’s head of cybersecurity policy said the company, in the previous week, had removed ‘36 Facebook accounts, 6 pages and 99 Instagram accounts’. Another article, by the vice president of product management, said that in ‘Q3 2018, we took action on 15.4m pieces of violent and graphic content… more than 10 times the amount we took action on in Q4 2017’; Facebook also took down 800 million fake accounts in Q2 and 754 million in Q3.

Even if you disagree with some or all of the above, it is hard to come away from the site arguing that Facebook is ignoring its problems. The newsroom is unconventional, unusually open and thoughtful; it acknowledges criticism and does not try to shift blame. For those reasons, it is a largely successful piece of online communication.

The problem is that anyone who has been on the internet or near a newspaper recently will know that what the company says in its online newsroom does not necessarily reflect what is happening behind the scenes.

Facebook’s travails show that it does not matter how sophisticated the online communications are: when the message does not reflect reality, companies will be caught out.

First published 12 December, 2018