In a recent announcement, Facebook revealed new measures to help combat the problem of ‘fake news’ on the website, amid growing concern from both sides of the political spectrum over the spread of misinformation. We often hear the phrase ‘undermining democracy’ thrown around in the media, but what exactly does this mean, and can Facebook’s new measures do anything to stop this in its tracks?
In order to understand whether the new measures will successfully reduce the prevalence of fake news, we must understand the causes of the problem. The issue of inaccurate information being disseminated as ‘news’ on Facebook can be traced back to its decision to reshape the way it delivers the ‘trending’ feature in 2016. When it was first introduced in 2014, the ‘trending’ topics were determined by a team of people within Facebook, who were obligated to check that a story was credible with a respectable news outlet before putting it into the trending section. However, after complaints that the team showed anti-conservative bias in their editing, they were replaced by an algorithm that did not perform the same rigorous accuracy checks on stories. Almost immediately, ridiculous hoaxes skyrocketed to the top of the trending list: false headlines about Fox News’ Megyn Kelly were promoted, along with false reports of the existence of an inappropriate video involving a McDonald’s sandwich.
Considering this, Facebook’s plan to take down the ‘trending’ feature seems like a sensible first step in tackling the problem of fake news; it will be much harder for fake news stories to reach a large number of people quickly. However, the company acknowledged that this would by no means eradicate the problem, and is currently working on a breaking news feature to highlight important news stories from legitimate sources, so that users are encouraged to read accurate updates on developments both locally and around the world. This measure has the potential to be highly successful, as it recognises the growing importance of Facebook as the primary news source for many in the Western world, and assumes some responsibility for pointing its users towards reliable news. Ultimately, however, we must acknowledge that it is largely down to the individual user to apply a greater level of scepticism to the information they view on Facebook. It would be unreasonable to expect the company to take responsibility for every single piece of false information spread on its website, especially considering the sheer mass of content uploaded every second.
Now let’s think about how fake news might ‘undermine democracy’, and why it is important that Facebook and its users work together to reduce the prevalence of false stories. If people are frequently exposed to misinformation, many are likely to be influenced by it in some way, whether that means outright believing something false to be true, or forming subconscious biases against certain things, people, or groups of people. Thus, those affected by fake news are more likely to make less informed choices when voting, which threatens the benefits our society gains from democracy; for example, it is much more difficult to use your vote to hold someone in power to account if you are misinformed about the actions of that individual.
While this danger to the democratic process can seem frightening, the future of Facebook and fake news is looking increasingly positive. For the first time, Facebook is taking direct action, and doing what it can to limit the spread of misinformation. The planned steps address both the original cause of the problem, and the changing nature of Facebook as a hub for news stories as well as updates on family and friends. However, all this will be for naught if we as individuals do not recognise our own responsibility to not take everything at face value, and actively seek out reliable information. So, if you plan to share this article, how about taking a few minutes to do a quick Google search and check that what I’ve told you is accurate?