
Yes, There's Lots Of Fake News On Facebook, But Is It Really Changing Anyone's Mind?

Source: Techdirt.com
We’ve already written about how silly and dangerous it is that some people (especially journalists) rushed to blame Facebook for their disappointment that Donald Trump won the election. I’ve explained why I think the whole “fake news” problem is completely overblown — but the issue has gotten a new blast of energy from an interesting analysis done by Craig Silverman at BuzzFeed, saying that in the weeks leading up to the election there was more engagement with fake news on Facebook than real news. Here’s the key chart that everyone’s passing around.

That’s a pretty scary looking chart. But it’s not clear it really supports the argument that fake news was actually an influencing factor. First of all, there are some questions about the methodology here, and about whether BuzzFeed is overselling the story with its headline (and, yes, there’s irony in a story claiming that fake news is shared more than real news having a misleading or “fake” title of its own…). Beyond questions of how you track Facebook “engagement,” it’s also not clear that all engagement is the same. Hell, what if many of the comments on a fake news story are some version of “this is fake”? That counts as engagement, but it undermines the idea that people interact with fake news only because they believe it. Even the author of the piece, Silverman, weighed in on Twitter with a bunch of caveats about what the story doesn’t actually show (even as many people reading it assume it does).

But there’s also the much larger question of whether fake news actually has an impact, or whether it’s just being shared. In another interesting article, the folks over at the Guardian got a group of people who identified as either strongly “left” or strongly “right” and had them use a Facebook feed designed for someone at the other end of the spectrum. And guess what? It didn’t change people’s minds. In most cases, it just caused people to dig in deeper on their positions, getting angrier at the other side for the stuff that was published over there. It would be funny how “tribal” people get if it weren’t resulting in a huge and ridiculous division in our society. In that article, people on both sides used the “opposing” side’s feed as evidence of just how dishonest/angry/evil the other side was. The only person who changed his mind did so merely to decide not to vote for president at all.

The problem here doesn’t seem to be “fake news.” The problem (and we see this in our own comments as well) is that people have decided that there are monolithic groups, “the left” and “the right,” and they ascribe all sorts of evil motivations and intentions to them. I can’t tell you the number of times we’ve seen comments about “the left blah blah blah” or “the right blah blah blah,” and they’re always extreme and ridiculous stereotypes with very little basis in reality. People are rooting for their “team” rather than for good ideas or good policy. That’s dangerous. And, yes, fake news that people can share to reinforce their viewpoints is a reflection of that attitude, but it’s not the cause of it. If people want to fix the “problem” of fake news getting shared, maybe work on ways to get people to stop playing “red team / blue team” and to start recognizing that what matters is actual policy decisions and actions. That’s likely wishful thinking, but it would be nice if people were at least working toward a path away from politics as team sports.

