Facebook has announced another new policy to combat the misinformation surrounding the COVID-19 coronavirus pandemic. The platform will now place pop-ups linking to the WHO’s COVID-19 myth-busting page in the news feeds of users who have recently engaged with content classified as misinformation.
A Facebook spokesperson has said that this is being done to “connect people who may have interacted with harmful misinformation about the virus with the truth from authoritative sources, in case they see or hear these claims again off of Facebook.”
This new directive expands Facebook’s policy of removing content that has the potential to cause immediate physical harm. That policy was itself an extension of Facebook’s earlier efforts to combat misinformation, which included placing warning messages beneath flagged links to call the validity of the source into question.
“Facebook Sits at the Epicenter of the Misinformation Crisis”
The change is in response to a study that recently linked Facebook to the spread of falsehoods surrounding COVID-19. The study, conducted by the independent activist group Avaaz, revealed some concerning insights into the way Facebook had been combating COVID-19 misinformation. It found that the platform was failing to label 29% of identified pieces of false content on the English-language version of the website. The percentages increased to 50%, 68% and 70% for the Portuguese, Italian and Spanish versions, respectively.
Avaaz also found that there is a significant delay between content being identified as false and the warning label appearing, sometimes up to 22 days. Avaaz’s campaign director, Fadi Quran, noted that “Facebook sits at the epicenter of the misinformation crisis” before conceding, “But the company is turning a critical corner today to clean up this toxic information ecosystem, becoming the first social media platform to alert all users who have been exposed to coronavirus misinformation, and directing them to life-saving facts.”
The New Policy is Great, but is it Too Late?
Facebook is struggling to find a balance between curbing the spread of misinformation and avoiding censorship. Every step it takes toward achieving both goals should be applauded. However, these new policies have all been reactive in nature, and misinformation spread through the social media platform has already done damage.
A study by Zignal Labs has identified the most popular piece of COVID-19 misinformation on Facebook as the conspiracy theory that links Bill Gates to the spread of the virus. While it may seem preposterous to most, the theory is gaining traction within the alt-right and is now being re-circulated by figureheads of that movement.
Other pieces of misinformation on Facebook have had more dire immediate consequences than the ideological ramifications of the Bill Gates conspiracy. The conspiracy theory that ties the rollout of 5G cellphone towers to the spread of COVID-19 has led to cases of arson in the UK targeting the new towers. While the three cases have not been linked to each other directly, they are certainly tied to the COVID-19 conspiracies, as evidenced by the threats received by the mayor of Liverpool (one of the cities home to an attacked 5G tower) that were related to the “bizarre theory.”
The recent policy change by Facebook is evidently very necessary. Although the company is operating from a reactive rather than a proactive position, that is most likely for the best when it comes to preserving the free flow of information. This new directive is aimed solely at the COVID-19 pandemic, but that has not stopped people from wondering if it would — or should — be enacted on a broader scale.
“It’s Got to be Worth a Try”
It remains to be seen how effective the pop-ups will be. Emily Taylor, an associate fellow at the international affairs think-tank Chatham House and a noted expert in social media misinformation, is not entirely sold on these new measures. In her statement, she succinctly sums up the ongoing link between Facebook and misinformation, the general consensus on the new directive, and the reason why the platform’s actions are important:
“I think this latest step is a good move from Facebook and we’ve seen a much more proactive stance to misinformation in this pandemic than during other situations like the US elections … We don’t know if it will make a huge difference but it’s got to be worth a try because the difference between misinformation in a health crisis and an election is literally that lives are at stake.”
Mark Zuckerberg has stressed that his companies are working to address the crisis, emphasizing that, “On Facebook and Instagram, we’ve now directed more than two billion people to authoritative health resources via our Covid-19 Information Center and educational pop-ups, with more than 350 million people clicking through to learn more.”
That click-through rate — roughly one in six — is not exactly what one would call substantial, but it is certainly better than nothing.