TikTok update targets Covid vaccine misinformation

TikTok is cracking down on Covid-19 vaccine misinformation through a suite of new changes aimed at protecting vulnerable users from harmful conspiracy theories.

The video-sharing app will introduce a new tool to detect content relating to the Covid-19 vaccine as part of a series of updates being released later this month.

Once the tool launches, any relevant videos will carry a banner message stating: “Learn more about Covid-19 vaccines.”

Coronavirus vaccinations developed by Pfizer and BioNTech began to be administered in the UK last week and have since begun in the US and other countries.

However, growing resistance to the immunisation campaign has been fuelled by false claims spread online, with a poll conducted last week finding that only half of American adults plan to get the Covid-19 vaccine.

Misinformation surrounding coronavirus vaccinations has plagued social media platforms in recent months, with Facebook already introducing similar measures to tackle the issue earlier this month.

The US technology giant said it would remove any Facebook or Instagram posts that contain false information relating to vaccines.

Facebook’s policy applies only to misinformation that could lead to “imminent physical harm”, while TikTok’s new banner will apply to all content relating to the Covid-19 vaccine, misleading or not.

“We need to be realistic that there will always be a small minority of people who will try to use our platform to share content that goes against our policies,” Kevin Morgan, TikTok’s head of product, wrote in a blog post explaining the move.

“From the NGOs and experts that we talk to, we know it is more important than ever to ensure that misinformation that could harm wider public safety is not allowed to proliferate online.”

TikTok will update its in-app information hub to direct users to trusted sources on 17 December, while the new Covid-19 vaccine banner will begin rolling out globally from 21 December.

The update announcement coincided with new proposals put forward by the UK Department for Digital, Culture, Media and Sport that would require technology firms to protect their users or face fines of up to 10 per cent of their annual revenue.

Part of the department’s Online Harms Bill highlighted vaccine misinformation, stating that Big Tech companies must enforce “clear terms and conditions which explicitly state how they will handle content which is legal but could cause significant physical or psychological harm to adults.”
