There’s been extensive discussion over the past few weeks about fake stories on Facebook and whether they contributed to the result of the US election. Mark Zuckerberg has decided to address the issue and announce Facebook’s future plans.
There is considerable controversy over fake stories on social media and how they may sway users’ opinions on a subject, and Facebook has received a significant amount of criticism for its lack of measures to prevent misinformation from going viral.
Mark Zuckerberg has decided to speak about the issue, in an attempt to defend the popular social platform and, most importantly, to announce the network’s plans for handling similar cases in the future.
In fact, he mentioned seven key points about Facebook’s future plans.
Stronger detection
Facebook is determined to improve its ability to classify misinformation. This will be achieved through enhanced technical systems that can detect the stories people are likely to flag as false before they actually do so.
Easy reporting
Facebook insists that its focus is on its community, which is why it plans to make reporting fake stories easier.
Third-party verification
Mark Zuckerberg mentioned that Facebook is planning to consult fact-checking organisations to learn more from them about how to deal with misinformation.
Warnings
Facebook is considering displaying warnings on stories that users have flagged as false. This would keep all stories on the platform while helping users judge their level of accuracy.
Quality of related articles
Another attempt to improve the quality of Facebook’s content is raising the bar for the stories that appear as related articles in the news feed. This may reduce the chances of a false story going viral.
Disrupting fake news economics
According to Mark Zuckerberg, most false stories stem from financially motivated spam, which is why Facebook plans to improve its ad farm detection and use its ad policies to disrupt the economics behind fake news.
Listening
This may be obvious, but it’s always useful to keep listening to experts (journalists, publishers, etc.) in order to understand how fact checking and editorial control work best. This benefits both Facebook’s reputation and its users.
Are these enough to deal with a growing filter bubble?
Although the suggested ideas are encouraging, we can’t help but wonder whether they are enough for such a large (and powerful) platform to deal with the way fake news spreads through it.
Facebook may not be keen on accepting its responsibility as a curated media publisher (and it doesn’t see the platform as such), but it certainly needs to admit that its ambitious plan to reach more people and attract more publishers has consequences.
How do you protect both the audience and the publishers from the so-called false stories of financially motivated spam? Is there a way to be stricter about fake news and fact checking? How about investing in more people to oversee how information spreads and react when needed?
However, the problem is complicated, and it’s not just Facebook’s fault that fake news stories spread the way they do.
The age of social media and the increasingly competitive pressure to publish quickly are certainly the main reasons fact checking has become a luxury for many publishers, while users have grown addicted to ever-greater content consumption (and a filter bubble that serves them the relevance they like).
Moreover, the nature of the media itself is changing, and the integration of digital technology is both demanding and challenging.
How does a traditional publication approach social media? What’s the best way to apply traditional editorial guidelines to digital platforms? And how does a thriving online publisher maintain editorial judgement when clickbait is so appealing?
Last but not least, it may be a good time for all of us to acknowledge our own responsibility to be critical of everything we read on Facebook or any other social platform.
There’s no need to believe every story we come across, and even less need to rule out every opinion except our own. It’s not always easy, but it may be worth the effort.
source https://searchenginewatch.com/2016/11/22/did-facebook-find-a-way-to-deal-with-fake-news/