There is no such thing as #NoFilter

When Ian Hargreaves came to speak to us about the challenges facing the creative economy and the creative industries, he mentioned ‘filter bubbles’ as something that might have a detrimental effect on digital democracy. And indeed, these effects may already have shown themselves in real-life democracy.

Tunnel vision

We all know 2016 was a weird year. First Brexit happened. Then Trump was elected President of the USA. Both events came as huge surprises; hardly anyone thought either would actually happen. From where we sat, in the middle of our very own ‘filter bubble’, we could not see that anyone actually held a different perspective from ours.

This is exactly what ‘filter bubbles’ do. In actual fact, they are not bubbles but algorithms that personalise what we see online, be it our Facebook feeds or our Google searches. These algorithms increase our chances of encountering news or media that carry the same ideas, themes or perspectives as the content we have previously engaged with. As a result, we develop tunnel vision as news consumers: the only news we come across shares our own views, and we become “victims of our own biases”, as Wired puts it.
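To see how such a feedback loop can arise, here is a deliberately simplified sketch. The real ranking systems behind Facebook’s feed or Google’s results are proprietary and far more complex; this toy version only assumes that each story carries a single topic tag and that past clicks boost similar stories.

```python
from collections import Counter

# Toy illustration only: a user's click history builds an "affinity"
# score per topic, and the feed is sorted so familiar topics rise to
# the top -- the self-reinforcing loop behind a filter bubble.

def rank_feed(stories, click_history):
    """Order (title, topic) stories so previously clicked topics come first."""
    affinity = Counter(topic for _, topic in click_history)
    return sorted(stories, key=lambda s: affinity[s[1]], reverse=True)

clicks = [("story A", "politics-left"),
          ("story B", "politics-left"),
          ("story C", "sports")]

feed = [("new op-ed", "politics-right"),
        ("match report", "sports"),
        ("new column", "politics-left")]

print(rank_feed(feed, clicks))
# The "politics-left" column rises to the top, while the
# "politics-right" op-ed (never clicked) sinks to the bottom.
```

Even this crude version shows the dynamic: every click on a familiar topic widens the gap, so the opposing view the user never clicks on drifts further out of sight.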

Eli Pariser coined the term back in 2011. In a TED Talk from the same year, he explains that he first noticed these algorithms when his conservative Facebook friends’ posts started disappearing from his news feed. As a liberal, he had enjoyed sharing opinions and debates with them, but Facebook no longer showed him their point of view.


The dangers of living in a bubble

This lack of differing opinion is what makes filter bubbles so scary and, as Hargreaves says, possibly harmful for democracy. As more and more people consume news through social media, our picture of the news grows less and less accurate. People might think they are receiving a representative view of the world when, in reality, they are only seeing the insides of their own bubbles. This is what many believe is to blame for the surprise results of Brexit and Trump’s election.

During this same period, digital society has also seen the rise of fake news. Thanks to social media and their filter bubbles, people now consume news from fewer media outlets than before and are willing to believe far more outlandish stories. For example, after the US presidential election it was found that some pro-Trump news stories that had been actively shared on Facebook prior to the vote were indeed fake. Although filter bubbles aren’t fully to blame for fake news, “they incubated them and helped them spread”, in the words of Pariser, and they are damaging people’s faith in journalism.

Eli Pariser at Knight Foundation’s Media Learning Seminar 2012. Source: WikiMedia Commons. Copyright: CC BY-SA 2.0.

Even though Facebook claims its vision is to “make the world more open and connected”, it still refuses to see itself as a media corporation. Given these developments, that is a worrying stance. Still, the company has said it is working on new ways to filter out fake news.

How to burst the bubble

Already in 2011, Pariser called on the coders of the Internet to embed ethics into their algorithms. He argued that the Internet, which was supposed to open doors previously controlled by gatekeepers, was now controlled by algorithmic gatekeepers that did not yet have the ethics of the editors of pre-Internet media outlets. As this has yet to happen, it is time we, as citizens of the Web, took back control and burst our own filter bubbles.

Here are some ways in which you as a social media user and news consumer can do that:

  • Don’t rely solely on social media. Go find your own news by looking at varied, even contradicting, news sites. This will make your filter more inclusive.
  • If you do see a news story on social media and want to share it, do a quick Google search to see if ‘trustworthy’ news outlets are reporting the same story.

Cover photo made by the author is free to use under CC BY 2.0.