Facebook has long refused to intervene in the spread of misinformation on its platform. But this year, as rumors and myths about COVID-19 took over people’s feeds, adding more confusion to a time already filled with uncertainty and fear, Facebook has decided to step in.
The social media giant announced on Tuesday that it will no longer allow ads on its platform that explicitly discourage people from getting vaccinated. This follows its decisions over the past few months to remove false information related to COVID-19 and to prohibit ads about vaccine hoaxes.
“Our goal is to help messages about the safety and efficacy of vaccines reach a broad group of people, while prohibiting ads with misinformation that could harm public health efforts,” said Kang-Xing Jin, Facebook’s head of health initiatives, in a company blog post. “We don’t want these ads on our platform.”
The new policy does not ban anti-vaccine posts. It also does not prohibit ads that advocate for or against legislation and government policy around vaccines, including a COVID-19 vaccine. These ads are required to be authorized and will include a “Paid for by” label so that people can see who is behind them, Jin said.
A day after Facebook’s decision, YouTube announced that it would ban “any content that includes claims about COVID-19 vaccinations that contradict expert consensus from local health authorities or the World Health Organization.”
Instagram, owned by Facebook, took similar actions last year by blocking hashtags it found to be spreading misinformation about vaccines, including #VaccinesCauseAutism.
Besides banning anti-vaccination ads, Facebook is also taking a more active role in helping increase immunization rates. One of its initiatives is a flu vaccine information campaign that shares reminders in the News Feed and points users to the nearest location where they can get vaccinated.
Facebook will also be working with global partners including WHO and UNICEF on vaccine education campaigns.
“Building demand for vaccination in communities worldwide is key to saving lives,” said Diane Summers, UNICEF’s senior advisor on vaccine acceptance and demand, in the blog post. “Our collaboration with Facebook is part of our efforts to address vaccine misinformation and share resonant and reassuring information on vaccination.”
Facebook's new efforts to crack down on misinformation come after years of public pressure and severe consequences. During the 2016 presidential election, Russian operatives used social media platforms, including Facebook, to spread disinformation in an attempt to divide the American electorate.
Since then, engagement with deceptive outlets on Facebook has risen, roughly tripling from 2016 to 2020, according to a recent study by the German Marshall Fund.
Besides taking action on COVID-19-related falsehoods, Facebook has made other decisions in recent weeks in an attempt to minimize potential harm from posts on its platform, especially with regard to the 2020 US presidential election.
Last week, it announced that it would suspend political ads indefinitely after polls close on Nov. 3 to keep political candidates from using the platform to manipulate the election’s outcome and its aftermath.
As the US faces uncertainty regarding its upcoming presidential election, and the world continues to wait for the distribution of a successful COVID-19 vaccine, Facebook and other social media platforms are becoming increasingly aware of their role in ensuring a stable and informed society.