With COVID-19 vaccinations just beginning, Twitter will ramp up its efforts to tamp down conspiracy theories that might discourage people from getting the vaccine.
The newly expanded rules apply to debunked information about the adverse effects of getting vaccinated, misleading tweets claiming the vaccine is not necessary and conspiracies that claim COVID-19 vaccines are used to “intentionally cause harm to or control populations.” Twitter’s updated policy will go into effect on December 21.
Twitter will require users who tweet something that falls into one of those categories to delete the content before being allowed to tweet again. For vaccine misinformation that doesn’t meet the threshold for removal, Twitter says it will begin placing warning labels on “unsubstantiated rumors, disputed claims, as well as incomplete or out-of-context information about vaccines” starting in early 2021. Those tweets may also be hidden, have their engagement limited and be accompanied by public health information labels.
The company said that it will prioritize removing misinformation with the greatest potential to do harm, and we’ve asked Twitter if that decision is made based on how much exposure a tweet is getting or the nature of its content. The new policies will be enforced through a hybrid approach of automation and human moderation.
Early in the pandemic, as COVID-19 misinformation was just beginning to take off, Twitter created a set of content policies specific to it. While bogus and potentially harmful misinformation about how the virus was transmitted was the big worry then, the company’s new policy update addresses concerns that online misinformation might lead a significant portion of the population to refuse the vaccine.