Has Facebook finally popped its filter bubble? Not quite
« on: May 11, 2015, 12:43:05 am »
Is Facebook to blame for people failing to read content that questions their world view? Apparently not, say researchers at the social media giant. They say user choice is the main driving force that skews what someone sees. Does that mean the social network is off the hook? Not quite.

The study, published yesterday in the journal Science, dives into the Facebook activity of 10.1 million users in the US, looking at what liberals and conservatives read, shared and engaged with on the platform over six months in 2014. The aim was to explore "filter bubbles" – the idea, put forward by Upworthy CEO Eli Pariser, that online algorithms limit exposure to information that conflicts with a person's political leaning.

The authors found that Facebook's algorithm had a modest effect on the kind of content people saw, filtering out 5 per cent of the news stories that conflict with conservative views and 8 per cent of those that conflict with liberal ones. However, they found that self-screening had a much bigger effect. People showed a clear preference for stories that fit their own world view: liberals clicked on only 7 per cent of the conflicting content they were shown, while conservatives clicked on 17 per cent.
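
To make that comparison concrete, here is a minimal back-of-the-envelope sketch in Python that chains the two sets of percentages above. The 100-story starting figure and the stories_read helper are illustrative assumptions for this post, not anything taken from the paper, which models exposure rather differently.

# Toy calculation, not the study's methodology: of the cross-cutting stories
# friends share, the News Feed algorithm hides a small fraction, and the
# user then clicks only a fraction of what remains.

def stories_read(shared, algorithmic_cut, click_rate):
    shown = shared * (1 - algorithmic_cut)   # stories surviving the algorithm
    return shown * click_rate                # stories the user actually opens

SHARED = 100  # hypothetical number of cross-cutting stories shared by friends

for label, cut, clicks in [("liberals", 0.08, 0.07), ("conservatives", 0.05, 0.17)]:
    read = stories_read(SHARED, cut, clicks)
    unfiltered = stories_read(SHARED, 0.0, clicks)  # same clicking, no algorithmic cut
    print(f"{label}: read {read:.1f} of {SHARED} stories; "
          f"{unfiltered:.1f} if the algorithm filtered nothing")

On those numbers the algorithm costs a liberal reader roughly half a story per hundred shared, while their own clicking leaves about 85 of the 92 stories that do get through unread – which is the gap between the two effects the authors are pointing to.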

The authors conclude that individual choices, more than algorithms, limit exposure to diverse content. Or, as Christian Sandvig, an internet policy researcher at Harvard University, put it in a blog post yesterday: "It's not our fault."

While the researchers I spoke to highlighted the study's merits as a big data analysis, they noted some important limitations.
Not representative

Nick Diakopoulos at the University of Maryland points out that the study only looked at people who listed their political affiliation on their profile and who fit the researchers' five-point ideological scale. That turns out to be just 4 per cent of users, a group that is probably not representative of the US as a whole.

"What about the other 96 per cent?" he says. "It's important to get that out there and say that this is based on a very specific sample, a very specific type of person."

In addition, the study can only reflect the algorithm's influence over the six months that the researchers looked at. But Facebook can and does tweak its algorithm regularly. As David Lazer, a political scientist at Northeastern University in Boston, points out in a companion essay in Science Express, Facebook announced changes just last month that could skew results – for example, by programming a preference for updates from "the friends you care about", who may tend to be more ideologically aligned with you.

Facebook should get credit for giving a glimpse of how its algorithm works, says Karrie Karahalios, a computer scientist at the University of Illinois in Urbana. After last year's uproar over the company's emotional contagion study – when it manipulated the amount of positive or negative news content to see the impact on posting behaviour – the company could have easily decided not to publish its research.

Read more: http://www.newscientist.com/article/dn27487-has-facebook-finally-popped-its-filter-bubble-not-quite.html

Seriously, read it. It's not long, and it's well worth it.