Edelman’s Barometer has highlighted that 7 out of 10 respondents to its survey are worried about the spread of fake news, with many apprehensive about the use of false information to attack others – so-called “weaponisation”.
The public is also troubled about the spread of fake news across social media platforms, with 48% of respondents now considering social media to be a part of news media and no longer just a platform.
Whilst last year’s Barometer had business leaders in the stocks, the heat is now on Facebook, Twitter, LinkedIn and other social networking platforms, which stand accused of doing too little to stop fake news reaching their users. This raises an interesting question – who holds responsibility for the content these platforms host?
Traditionally, social media platforms have been seen as sites that facilitate communication, rather than as publishers of content. However, over the last few years this has changed.
Calls for social media operators to take responsibility for the content published on their platforms are getting louder. Facebook has begun to address this, implementing a number of changes to the way news is published on its platform.
Not only is Facebook reducing the number of news stories that appear on a user’s feed, but it is also encouraging its users to personally determine what news they think is reliable and trustworthy. Mark Zuckerberg, founder of Facebook, has said this change will ‘shift the balance of news you see towards sources that are determined to be trusted by the community.’
But critics argue that this is not enough. In July of last year Germany passed a new law to pressure companies to put their own houses in order, with measures to fine companies operating in Germany up to €50 million (£43 million) if they fail to remove hate speech and fake news. This controversial law places the onus squarely on the companies themselves. It will be interesting to see whether other countries follow Germany’s example.
In response, Elliot Schrage, Facebook’s communications and public policy chief, recently addressed an audience of Europe and Silicon Valley’s tech elite in Munich, and challenged this burden of responsibility. Schrage argued that ‘the law places the responsibility on us to be judge and jury and enforcer determining what is legally compliant and not. I think that is a bad idea.’
All these measures are designed to restore trust in social network outlets. But the Barometer’s findings show that it may be too late, revealing that the public’s trust in social media, and in the media itself, is at an all-time low. Media is now the least trusted institution globally, with 63% of respondents unable to tell good journalism from rumours and fake news.
The question we must ask ourselves is whether this can be remedied – how can you report the news in an environment that increasingly distrusts what you say?
With the public now turning to social media platforms for news as well as networking, responsibility for resolving the fake news dilemma has to be embraced by Facebook and its peers. By encouraging users to determine whether a source is reliable – as Facebook suggests – there is a glimmer of hope that providers can regain the public’s trust in the news they provide. However, it may not be enough.
With the Barometer showing that more than half of users already struggle to tell good journalism from fake news, social networks and global media providers may well need to show a greater willingness to police their own sites if they are to regain the public’s trust in the news they provide.