Slow the Spread: The Rise of Covid-19 Vaccine Misinformation on Social Media

With more than 1,000 daily virus deaths and 2.8 million Covid-19 cases in the UK just a week into the New Year, citizens are arguably more eager than ever for the pandemic to end.

Yet the global health crisis is still in our midst. With new lockdown measures in place in the United Kingdom, and the vaccination programme in its earliest stages, we must continue to focus on slowing the spread.

This process may well begin on social media.

The Progression of Covid-19 in the United Kingdom

The first week of 2021 marked the highest daily death toll from Covid-19 in the UK since April 2020.

On 6 January, more than 62,000 cases were recorded – the highest daily spike since mass testing became available, even as MPs voted to support the lockdown in England.

These figures are dire. Current virus patient numbers are nearly 39% higher than the peak recorded last spring. Meanwhile, the number of patients who have died within 28 days of testing positive for Covid-19 is 37% higher than it was a week ago.

With the latest lockdown in full swing, it’s become apparent that we must focus on what we can control. In the first week of January, Prime Minister Boris Johnson announced that 1.5 million people in the UK had been vaccinated – roughly 2% of the total population.

And while the government has announced the seven locations of mass vaccination hubs in England, it has arguably yet to address one of the key risks people face: the spread of misinformation on social media – particularly where the Covid-19 vaccines are concerned.

The Perils of Social Media During the Pandemic

Do social media platforms have an obligation to address anti-vaccination (‘anti-vax’) propaganda?

The truth is that policy depends entirely on the platform, and this patchwork presents a significant threat to users’ well-being. Sites like Pinterest enforce strict policies against anti-vaccine content, yet Facebook allows anti-vax sentiments to flow freely.

It’s worth noting that the tech giant, which also owns Instagram, has addressed this in the same way it responded to Holocaust denial content: by stating that banning false claims will simply push them to other corners of the internet.

Over time, however – as the Covid-19 death toll skyrocketed – Facebook pledged to target vaccine-related misinformation. In early December 2020, the company announced a policy to reduce ‘imminent physical harm’ linked to false public health claims on Facebook and Instagram. This includes limiting the reach of designated Groups and Pages that spread vaccine misinformation, with the goal of ensuring fewer people see the content.

Similarly, Twitter began to remove dangerous Covid-19 vaccine misinformation from its platform – even going so far as to label misleading tweets with a disclaimer.

But how effective are these relatively recent measures? Twitter, for example, has previously said it would not respond to every post featuring disputed information, focusing most of its efforts instead on removing coordinated conspiracy theories.

And so we have to ask: Can these platforms truly slow the super-spread of vaccine misinformation?

Video Platforms and Covid-19 Misinformation

As we enter the second year of the Covid-19 pandemic, it’s become increasingly clear that social media sites are something of a double-edged sword. Residents will likely be spending more time online during the current UK lockdown – making it all the more vital that the main platforms use their power for good.

It turns out that social media can, in fact, be used strategically for the greater good. Video platforms especially – among them YouTube and TikTok – were some of the first to establish policies specifically covering the Covid-19 vaccine.

In October 2020, YouTube expanded its medical misinformation policy to prohibit claims that contradict the World Health Organisation and local health authorities. (The Google-owned site has also pledged to remove all Covid-19 anti-vaccination videos.)

TikTok, in turn, developed a broader set of rules against vaccine misinformation.

‘We recognise the responsibility we have to our community to be nimble in our detection and response when new kinds of content and behaviours emerge,’ said Kevin Morgan, TikTok’s European head of product and process, in a recent blog post.

The app claims not only to remove all false information involving the Covid-19 vaccine, but also to suspend the accounts responsible for spreading fabricated claims. Both TikTok and Instagram add notices to many posts concerning the pandemic, directing users to reputable sources from local and global health agencies.

Social Media Can Be Used to Promote Vaccine Uptake

In an effort to combat false Covid-19 vaccination claims (and after removing more than 29,000 videos about the virus posted by European users in the summer of 2020), TikTok has attracted a number of healthcare professionals.

For what purpose? From misleading views about mask-wearing to false claims about the Covid-19 vaccine, scientists and healthcare workers are creating content designed specifically to slow the spread of misinformation.

There’s undeniably a lot to sift through on TikTok. The app’s ‘For You’ page offers a seemingly endless selection of content from dance videos to historical tours. Yet, medical professionals are leveraging their expertise to – hopefully, thoughtfully – bridge the gap between vaccine rollout and the end of the pandemic.

Take scientist Morgan McSweeney, PhD, known as @dr.noc on TikTok. Dr Noc makes a point of dispelling common myths on the platform, incorporating animation into his content to engage users in a casual yet science-backed format.

According to McSweeney, the app is an ideal platform for connecting with young people – particularly those who may be more likely to come across misinformation. His content, he finds, is more personal than the posts official organisations create. ‘When it’s just you in front of a camera, it’s a little bit more like a conversation,’ he told Time magazine.

The benefits of this approach are expansive. Many children and young adults are searching for reliable information, yet they don’t know where to look for it – and doctors and scientists can leverage social media to connect with them directly. Healthcare workers are taking this as an opportunity to educate, promote vaccine uptake and ultimately help people from around the world make a difference in their respective communities.

Some healthcare providers – including Christina Kim, an oncology nurse practitioner at Massachusetts General Hospital in the United States – are fighting misinformation head-on by connecting with their own audiences. Under the handle @christinaaaaaaanp, Kim hand-picks questions from her 228,000-person following to help slow the spread of Covid-19.

‘I genuinely want this pandemic to end,’ she said recently. ‘I want people to recognise what we need to do to make it end. And I have a responsibility, now with this platform that I have, I think it would almost be irresponsible to step away from that.’

The pandemic is still ongoing, and the Covid-19 vaccine rollout has yet to reach much of the UK. But during this interim period of continued lockdown, we can turn to the healthcare professionals on social media sites like TikTok for invaluable, personalised information. Together, perhaps we can slow the spread of the virus.

Do you have insights, questions or comments about Covid-19 vaccine misinformation on social media? Please contact me for more information – or subscribe to stay up-to-date on all things Unsocial Media.