Fake News Fact Checkers Working With Facebook

London-based, registered charity ‘Full Fact’ will now be working for Facebook, reviewing stories, images and videos, in an attempt to tackle misinformation that could “damage people’s health or safety or undermine democratic processes”.

Why?

The UK Brexit referendum, the 2017 UK general election, and the 2016 U.S. presidential election were all found to have suffered interference in the form of so-called ‘fake news’ / misinformation spread via Facebook, which appears to have affected the outcomes by influencing voters.

For example, back in 2018, it was revealed that London-based data analytics company Cambridge Analytica, which once had Trump’s key adviser Steve Bannon on its board, had illegally harvested 50 million Facebook profiles in early 2014 in order to build a software program used to predict and generate personalised political adverts aimed at influencing choices at the ballot box in the 2016 U.S. election. Russia was also implicated in trying to influence voters via Facebook.

Chief executive of Facebook, Mark Zuckerberg, was made to appear before the U.S. Congress in April 2018 to talk about how Facebook is tackling false reports, and, more recently, a video shared via Facebook (which had 4 million views before being taken down) falsely suggested that smart meters emit radiation levels that are harmful to health. Many people believed the claim even though it was false.

Scoring System

Back in August 2018, it was revealed that for two years Facebook had been trying to manage some misinformation issues by using a system (operated by its own ‘misinformation team’) that allocated a trustworthiness score to some members. Facebook is reported to be working with fact-checkers in more than 20 countries already, and to have had a working relationship with Full Fact since 2016.

Full Fact’s System

This new system from third-party Full Fact will now focus on Facebook in the UK. When users flag content to Facebook that they suspect is false, the Full Fact team will identify and review public pictures, videos or stories and use a rating system to categorise them as true, false or a mixture of accurate and inaccurate content. Users will then be told if a story they’ve shared, or are about to share, has been checked by Full Fact, and they’ll be given the option to read more about the claim’s source, but they will not be stopped from sharing anything.

Also, the rating system should mean that content rated false will appear lower in news feeds, so it reaches fewer people. Content from a page or domain that is a known satire publication will not be penalised.

Like other Facebook third-party fact-checkers, Full Fact will be able to act against pages and domains that repeatedly share false-rated content, e.g. by reducing their distribution and their ability to monetise and advertise. Full Fact should also be able to stop repeat offenders from registering as a news page on Facebook.

Assurances

Full Fact has published assurances that, among other things, they won’t be given access to Facebook users’ private data for any reason, Facebook will have no control over what they choose to check, and they will operate in a way that is independent, impartial and open.

Political Ad Transparency – New Rules

In October last year, Facebook also announced a new rule for the UK: anyone who wishes to place an advert relating to a live political issue or promoting a UK political candidate (referencing political figures, political parties, elections, legislation before Parliament, or past referenda that are the subject of national debate) will need to prove their identity and prove that they are based in the UK. The adverts they post will also have to carry a “Paid for by” disclaimer to enable Facebook users to see who they are engaging with when viewing the ad.

What Does This Mean For Your Business?

As users of social networks, we don’t want to see false news, and false news that influences the outcome of important votes (e.g. elections and referendums) has a knock-on effect on the economic and trade environment which, in turn, affects businesses.

Facebook appears to have lost a lot of trust over the Cambridge Analytica (SCL Elections) scandal, and over findings that Facebook was used to distribute posts of Russian origin to influence opinion in the U.S. election and that the platform was also used by parties wishing to influence the outcome of the UK referendum. Facebook, therefore, must show that it is taking the kind of action that doesn’t stifle free speech but does go some way to tackling the spread of misinformation via its platform.

There remains, however, some criticism in this case that Facebook may still be acting too slowly and not decisively enough, given the speed with which some false content can amass millions of views.