The Washington Post has reported that Meta is reversing its Covid-19 misinformation policies in countries such as the US, where the pandemic's national emergency designation has been revoked, as the company's independent Oversight Board advised this past April.
Meta has updated its July announcement, in which it said it had asked the Oversight Board whether it was safe to wind down the rules, to note that the World Health Organization's global public health emergency declaration has ended. The update reads: "Our Covid-19 misinformation rules will no longer be in effect globally as the global public health emergency declaration that triggered those rules has been lifted."
Meta now says that, going forward, its rules will be applied on a region-by-region basis. On its transparency centre page, which responds to the board's recommendations, the company explains that because the WHO has downgraded the pandemic's emergency status, it will not immediately act on some of the board's concerns.
Among those recommendations is one advising that Meta re-evaluate the false material it removes and make government requests to delete content from the platform more transparent. Instead, Meta claims that its response to the board's fourth recommendation (that the company develop a process to assess the risks of its misinformation moderation policies) addresses the intent of the first. To determine the status of Covid-19 globally, it says it will be "consulting with internal and external experts" and will publish information about localized enforcement in "future Quarterly Updates."
The WHO ended its declaration of a global emergency on May 5th, 2023, roughly six months after Twitter stopped enforcing its own Covid misinformation policies, shortly after Elon Musk acquired the company in November 2022. TikTok and YouTube both still maintain their Covid misinformation policies, though YouTube recently loosened its rules on election misinformation.