Facebook imposes stricter restrictions on vaccine misinformation targeting children
Just as the U.S. Food and Drug Administration officially approved Pfizer’s COVID-19 vaccine for children aged 5 to 11, Meta, Facebook’s newly renamed parent company, announced that it will introduce stricter policies on vaccine misinformation targeting children. The platform had already restricted COVID-19 vaccine misinformation at the end of 2020, but it did not have policies specifically addressing content aimed at children.
Meta said in a new blog post that it is working with the Centers for Disease Control and Prevention (CDC) and the World Health Organization (WHO) to remove harmful content about children and the COVID-19 vaccine circulating on the platform. This includes any posts suggesting that the COVID-19 vaccine is unsafe, untested, or ineffective for children. In addition, Meta will show reminders in English and Spanish that the vaccine has been approved for use in children, along with information about where the vaccine is available.
Meta pointed out that since the beginning of the pandemic, its teams have removed a total of 20 million pieces of COVID-19 and vaccine misinformation from Facebook and Instagram.
However, these figures are hard to square with what internal documents leaked from Facebook have shown. The documents make clear how unprepared the platform was for misinformation related to the COVID-19 vaccine. Had Facebook been better prepared, it might have launched efforts to combat misinformation earlier in the pandemic, including efforts aimed at both children and adults, and might have removed far more false content as a result.