According to a recent study by Pew Research Center, 44% of Americans regularly get their news from Facebook.
Facebook's algorithm gathers popular topics from its users' posts and surfaces them in a "trending" box, where people can see the day's hot topics at a glance. But what happens when false news circulates among Facebook users? Will the algorithm know not to promote the story further by labeling it "trending"?
Kate Starbird, assistant professor of Human Centered Design & Engineering, researches how rumors and misinformation spread on social media. Starbird told BuzzFeed News that relying on an algorithm instead of human editors to do the filtering may be a mistake.
“We’re just beginning to understand the impact of socially and algorithmically curated news on human discourse, and we’re just beginning to untie all of that with filter bubbles and conspiracy theories,” she said.
Starbird suggests training the algorithm to identify information that is deemed interesting and valid by a diverse group of people, rather than by isolated, small groups.
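To make that idea concrete: one simple way to operationalize "diverse group" is to measure how a topic's engagement is spread across user communities, for example with Shannon entropy. The sketch below is purely illustrative and is not Facebook's actual system; the function names, the community labels, and the entropy threshold are all hypothetical assumptions.

```python
import math
from collections import Counter

def community_diversity(shares):
    """Shannon entropy (in bits) of a topic's shares across communities.

    `shares` is a list of community labels, one label per share (hypothetical
    input format). Higher entropy means engagement is spread across many
    groups; near-zero entropy means the topic is circulating inside one
    isolated cluster.
    """
    counts = Counter(shares)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def should_trend(shares, min_entropy=1.0):
    """Flag a topic as trend-worthy only if its audience is diverse enough.

    The 1.0-bit threshold is an arbitrary illustrative choice, not a value
    from any real trending system.
    """
    return community_diversity(shares) >= min_entropy

# A topic shared 100 times inside a single cluster: entropy 0, not promoted.
isolated = ["group_a"] * 100

# A topic shared evenly across four communities: entropy 2 bits, promoted.
broad = ["group_a", "group_b", "group_c", "group_d"] * 25
```

Under this scheme, a rumor that is wildly popular within one insular community would score low on diversity and stay out of the trending box, even though its raw share count is high.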