Facebook filter-bubbles distort opinions

Content curation and news filtering by Facebook leads to ideological biases in the information presented to users...
08 May 2015

Content curation and news filtering by Facebook, as well as other social media websites, likely leads to ideological biases in the information individuals see and read, a new study has shown.

Nearly one and a half billion people use Facebook regularly. The information they see in their "news feed" includes stories shared with them by other users they have designated as friends.

Facebook's computer systems then filter this material further, using an algorithm that draws on each user's profile and patterns of previous activity to prioritise some items and suppress others.
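
Facebook has not published the details of that ranking algorithm, but the general idea, scoring each candidate story against a user's past behaviour and showing only the highest-scoring ones, can be sketched roughly as follows. The feature names, weights and example stories here are purely illustrative assumptions, not Facebook's actual system.

```python
# Illustrative sketch of relevance-based feed ranking.
# All features, weights and data below are invented for illustration;
# this is not Facebook's actual algorithm.
from dataclasses import dataclass

@dataclass
class Story:
    title: str
    sharer_affinity: float   # how often the user interacts with the sharer (0-1)
    topic_affinity: float    # how often the user engages with this topic (0-1)
    recency: float           # 1.0 = just posted, decaying towards 0

def score(story: Story) -> float:
    """Weighted sum of engagement signals; higher scores surface earlier."""
    return 0.5 * story.sharer_affinity + 0.3 * story.topic_affinity + 0.2 * story.recency

def build_feed(candidates: list[Story], limit: int = 10) -> list[Story]:
    """Keep only the top-scoring stories; the rest are effectively suppressed."""
    return sorted(candidates, key=score, reverse=True)[:limit]

feed = build_feed([
    Story("Election analysis", 0.9, 0.8, 0.6),
    Story("Cup final report", 0.2, 0.1, 0.9),
], limit=1)
print([s.title for s in feed])   # only the story the user is most likely to engage with
```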

The possible result is an online "echo chamber", also dubbed a filter bubble, in which individuals are not exposed to a representative range of perspectives but instead see only a narrow swathe of opinions and topics, restricted largely to those aligned with their own viewpoint.

While the impact may be minor for softer subjects, such as sport or entertainment, for harder-hitting areas, such as world affairs, politics or finance, skewing opinion in this way could have a more dramatic effect.

To measure the potential scale of any such effect, Facebook employee Eytan Bakshy and his colleagues, writing in Science, analysed data from 10 million US-based Facebook users who had declared their ideological affiliation.

The team extracted seven million links to news stories shared by these users and divided them into softer- or harder-hitting items.

About 13% fell into the latter category, and the team then followed up how the 10 million users in the study shared and responded to these items in their news feeds.
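
The hard/soft split itself is a straightforward classification and counting exercise. The sketch below is a toy illustration of that step; the topic labels and sample links are invented, and only the counting logic mirrors the idea described in the study.

```python
# Toy illustration of splitting shared links into hard and soft news
# and measuring the hard-news share. All data here is made up.
HARD_TOPICS = {"politics", "world affairs", "economy"}   # assumed "hard" categories

shared_links = [
    {"url": "example.com/budget-vote", "topic": "politics"},
    {"url": "example.com/cup-final",   "topic": "sport"},
    {"url": "example.com/film-review", "topic": "entertainment"},
]

hard_links = [link for link in shared_links if link["topic"] in HARD_TOPICS]
print(f"hard-news share: {len(hard_links) / len(shared_links):.0%}")  # the study found about 13%
```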

If a user acquired information from a random selection of other users, the team found, there would be a 40-45% chance that the content would include hard-hitting material from other viewpoints.

However, the reality was significantly different: only 24% of the hard content shared by liberals represented other viewpoints. For right-wing users, that figure was a more liberal (!) 35%.

Taking into account the volume of material being shared and the number of users of each ideology, the team found that individuals on either side see between 5% and 10% less cross-cutting content as a result of this filtering process.
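
Taken at face value, these figures can be combined into a rough back-of-the-envelope comparison. The snippet below treats the 5-10% figure as a relative reduction caused by the news-feed ranking; that interpretation, like the calculation itself, is an assumption for illustration only.

```python
# Back-of-the-envelope check using the figures quoted in the article.
# Assumption: "5-10% less" is a relative reduction attributable to ranking.
random_baseline = (0.40, 0.45)                      # cross-cutting share from a random selection
shared_by_friends = {"liberals": 0.24, "conservatives": 0.35}
ranking_reduction = (0.05, 0.10)                    # relative drop from algorithmic filtering

for group, share in shared_by_friends.items():
    low = share * (1 - ranking_reduction[1])
    high = share * (1 - ranking_reduction[0])
    print(f"{group}: {share:.0%} cross-cutting from friends, "
          f"roughly {low:.0%}-{high:.0%} after ranking "
          f"(vs {random_baseline[0]:.0%}-{random_baseline[1]:.0%} for a random selection)")
```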

The effect could be significant. "A healthy functioning democracy," says Harvard political scientist David Lazer, "requires that we listen to multiple perspectives. This requires vigilance. A small effect today might become a large effect tomorrow."
