Facebook has become an increasingly powerful media source, a major outlet for receiving the news. And unfortunately, not all the news that circulates on the social media and marketing giant is true. Facebook’s algorithm is reshaping how the populace views the current state of the country and what’s to come – not purposely, but reshaping it nonetheless.
Roughly 30% of the nation gets its news directly from Facebook. That’s scary, especially given that about 59% of people will share an article based solely on its title. Much of the problem can be attributed to what’s called “clickbait”: a headline or piece of content written strategically and provocatively to rack up shares, likes, and clicks and drive more traffic toward a business’s website or blog. And it’s become dangerous, especially alongside Facebook’s algorithm, which is designed to surface the news it thinks users need to know – news that, unfortunately, isn’t always true.
If you’re on Facebook, you’ve seen the social chaos, the political debates, and the anger swirling around your newsfeed. You’ve also seen the countless articles shared daily about the election, global issues, celebrity gossip, and more – articles that often spark these angry debates. Many of them are either misinformed or deliberately designed to misinform. News sites are competing to get their content read, and they get the reactions they want by using clickbait and writing content built around whatever is “trending” on social media. What’s trending right now is anger, along with deep disagreement over the future of our country. These articles are designed both to intrigue us and to back up our opinions, and psychologically we feel inclined to share claims that do both.
There are plenty of reasons people share misinformed articles. We lead busy lives, and it’s difficult to fact-check every article that crosses our newsfeed. We assume the news sites are doing the fact-checking for us. But many of them are simply writing intriguing articles and citing other news sites that are merely citing other news sites. Where do these trails lead? Quite possibly to outlets spreading false claims to serve their own agenda. This is how lies end up spreading even faster than the truth.
When news sites see their Facebook reach decline, many turn to bad practices: using clickbait or sharing misinformed claims that grab a large audience’s attention and expand their reach on social media. These articles get liked, shared, and commented on because they are intriguing or because they reinforce existing beliefs. The sites behind them then earn even more organic reach, because Facebook’s algorithm sees users engaging more and more with their content. And the cycle continues: the misinformed misinform, and so on.
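To see how that spiral works, here is a tiny, purely illustrative sketch in Python – a toy model we made up, not anything resembling Facebook’s actual ranking code. It assumes only one thing: that a post’s future reach grows with the engagement it just earned. Under that assumption alone, the clickbait post steadily outpaces the sober one.

    # Toy model of an engagement-driven feedback loop (illustrative only;
    # this is not Facebook's ranking logic). Assumption: a post's reach in
    # the next cycle grows with its share of the engagement it just earned.

    posts = {
        "sober_report":    {"reach": 1000.0, "engagement_rate": 0.02},
        "clickbait_claim": {"reach": 1000.0, "engagement_rate": 0.10},
    }

    for cycle in range(5):
        total_engagement = sum(p["reach"] * p["engagement_rate"] for p in posts.values())
        for p in posts.values():
            engagement = p["reach"] * p["engagement_rate"]
            # Reach compounds with the post's share of total engagement.
            p["reach"] *= 1 + engagement / total_engagement
        print(cycle, {name: round(p["reach"]) for name, p in posts.items()})

Run the loop a few more cycles and the gap only widens – which is exactly the reward structure these sites are counting on.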
And from the looks of it, this is creating a heated division between us. People feel obligated to back up their opinions, so they share articles they believe have been fact-checked but haven’t. The problem has become especially visible lately because of “election anxiety” and stress over the future of our country. The changes we may be facing are enormous and command the entire nation’s attention. Some news sites are exploiting this, sharing outdated information or claims that simply aren’t supported, knowing they will be rewarded for it. But at what cost? The future of our country, people’s rights, economic change – these are topics that should never be lied about. All the mayhem and angry social media debate could very well be a product of this problem.
It is now up to us to fact-check what we consider sharing. We may always be divided in opinion, but that division shouldn’t be so strong that it leads to verbal or physical violence. Yet some news sites seem to be fueling the fire just to serve their own agendas. This needs to stop. And Facebook needs to take more responsibility for its role in allowing false information to circulate on its platform.
Many have been quick to point fingers at Facebook for its role in allowing misinformation to cycle through people’s newsfeeds. The company has been criticized for firing the human editors who once curated its “trending news” section and replacing them with an algorithm that is, as we can see, easy to fool. Now that it has become a media source, Facebook should accept the editorial role it plays in the content broadcast to its users.
Adam Mosseri, Facebook’s VP of product management, responded to a series of questions from TechCrunch about this very problem. In his response, Mosseri says they “use various signals based on community feedback to determine which posts are likely to contain inaccurate information, and reduce their distribution.” He wraps up his statement by conceding that, despite the efforts so far, there is much more that needs to be done.
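What might “reducing distribution” look like in practice? We can only guess – Facebook hasn’t published its implementation – but the basic idea can be sketched as a simple down-weighting rule, where a post’s ranking score shrinks as more viewers flag it. The function name, signals, and thresholds below are invented purely for illustration.

    # Hypothetical sketch of down-weighting a post based on community-feedback
    # signals. Everything here is invented for illustration; Facebook's actual
    # signals, weights, and thresholds are not public.

    def adjusted_score(base_score: float, report_count: int, impressions: int) -> float:
        """Shrink a post's ranking score as user reports accumulate."""
        if impressions == 0:
            return base_score
        report_rate = report_count / impressions
        # The more often viewers flag the post as false, the harder it is damped.
        damping = 1.0 / (1.0 + 50 * report_rate)
        return base_score * damping

    # Example: a post flagged by 2% of its viewers keeps only half its score.
    print(adjusted_score(base_score=100.0, report_count=20, impressions=1000))

Whatever signals Facebook actually uses, the principle Mosseri describes is the same: let community feedback drag questionable posts down the feed instead of letting engagement push them up.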
And we should expect more. News sites that misinform should be penalized. Facebook’s algorithm needs work to tackle this problem. And we’re happy that Facebook recognizes this and plans to do more.
We urge everyone who turns to Facebook for news to fact-check the articles they read. Debate is perfectly natural, until it turns verbally or physically violent, which is what we’ve been witnessing. It is easy to see how misinformation can lead to social upheaval. The news sites engaging in these bad practices need to be stopped – and it may be up to us to stop them. Keep it simple: don’t share an article or a claim until you, yourself, have fact-checked it. Lies are circulating on Facebook – false claims made by seemingly credible news sites – and they are taking a psychological toll on a great many people. We hope this ends, sooner rather than later.