
Health misinformation creates 'whack-a-mole' situation for tech platforms

World Health Organization analysis found misinformation in up to 51 per cent of social media posts
This March 18, 2010, file photo shows the YouTube website in Los Angeles. THE CANADIAN PRESS/AP/Richard Vogel

When Dr. Garth Graham thinks about health misinformation on social media platforms, he envisions a garden.

No matter how bountiful or verdant that garden is, even the head of YouTube's global health division admits it's often in need of tending.

"How do you weed and pull out the bad information?" he asked.

"But also, how do you plant the seeds and make sure people have access to good information as well as high-quality information?"

For social media companies, these have become perennial questions that have only grown in importance as the number of platforms multiplied and people began spending increasing amounts of time online.

Now, it's not uncommon to spot misinformation with almost every scroll.

A 2022 paper published in the Bulletin of the World Health Organization reviewed 31 studies examining how prevalent misinformation is. The analysis found misinformation in up to 51 per cent of social media posts associated with vaccines, up to 28.8 per cent of content associated with COVID-19, and up to 60 per cent of posts related to pandemics.

An estimated 20 to 30 per cent of YouTube videos about emerging infectious diseases were also found to contain inaccurate or misleading information.

The consequences can be harmful, if not deadly.

Research the Council of Canadian Academies released in 2023 said COVID-19 misinformation alone contributed to more than 2,800 Canadian deaths and at least $300 million in hospital and ICU visits.

Platforms take the risks seriously, Graham said in an interview. "We are always concerned about anything that may produce harm."

That concern often leads platforms to remove anything violating their content policies.

YouTube, for example, has banned content denying the existence of some medical conditions or contradicting health authority guidance on prevention and treatment.

Examples embedded in its medical misinformation policy show the company removes posts promoting turpentine, gasoline and kerosene as a treatment for certain conditions because these substances cause death. Ivermectin, used to treat parasitic worms in animals and humans, and hydroxychloroquine, a malaria drug, are also barred from being promoted as COVID-19 cures.

When it comes to vaccines, YouTube bans videos alleging immunizations cause cancer or paralysis.

Facebook and Instagram parent company Meta Platforms Inc. declined to comment for this story, and TikTok did not respond to a request for comment, but in broad strokes these companies have policies similar to YouTube's.

Yet Timothy Caulfield, a University of Alberta professor focused on health law and policy, still spots medical misinformation on platforms. He recently asked his students to search for stem cell content and several posts spreading unproven therapies came up easily.

Still, he sympathizes with some of the challenges tech companies face because he sees conquering health misinformation as a game of "whack-a-mole."

He says there's a nimbleness to spreaders of misinformation, who are often motivated to keep finding ways to circumvent removal policies because their posts can boost profits and brands or spread an ideology.

"They can work around the moderation strategies, but that just shows how we're not going to fix this with one tool," Caulfield said.

"This is going to be an ongoing battle."

In its misinformation policy posted on its website, Meta acknowledges the difficulties, saying "what is true one minute may not be true the next minute."

"People also have different levels of information about the world around them and may believe something is true when it is not," the policy says.

In an attempt to keep up, Meta relies on independent experts to assess whether content is true and whether it is likely to directly contribute to imminent harm before it is removed. Third-party fact-checking organizations are also contracted to review and rate the accuracy of its most viral content.

At YouTube, human staff, including an "intelligence desk" that monitors posts and news to detect trends that might need to be mitigated, work alongside machine learning programs, which the company says are well suited to detecting patterns in misinformation.

Some responsibility is also put on credible health-care practitioners and institutions, whose content platforms highlight to make it easier for users to find trustworthy information.

YouTube, for example, has partnered with organizations including the University Health Network and the Centre for Addiction and Mental Health in Toronto.

CAMH runs a YouTube channel where medical professionals explain everything from schizophrenia to eating disorders. Production funding came from YouTube, but the institution's resources were used for script writing and clinical review, CAMH spokeswoman Hayley Clark said in an email.

Graham sees it as a good example of the health-care profession "meeting people where they are," which he said is "how we battle misinformation."

"(Credible information) has to be in the palm of people's hands so that they can have dinner conversations, so when they're sitting down on their couch that they're empowered," he said.

But when it comes to other organizations and doctors, "we can't assume that all of them have the capacity to do this," said Heidi Tworek, an associate professor at the University of British Columbia, whose research focuses on the effects of new media technologies.

These organizations want to get credible information out, but in a health-care industry strapped for both cash and time, there's always another patient to help.

"Some health-care institutions would say, 'OK, we've got X amount of money, we've got to choose what we spend it on. Maybe we want to spend it on something other than communications,'" Tworek said.

In some instances, doctors are also "doing it off the side of their desk" because they think it is valuable, but that subjects them to new risks like online attacks and sometimes even death threats.

"Some people don't want to enter those spaces at all because they see what happens to others," she said.

To better combat medical misinformation, she would like platforms to act more responsibly because she often notices their algorithms push problematic content to the top of social media timelines.

However, she and Caulfield agree health misinformation needs an all-hands-on-deck approach.

"The platforms bear a lot of responsibility. They're becoming like utilities and we know the impact that they have on public discourse, on polarization," Caulfield said.

"But we also need to teach critical thinking skills."

That could begin at school, where students could learn how to identify credible sources and detect when something could be incorrect, lessons he's heard Finland begins teaching in kindergarten.

No matter when or how that education takes place, he said the bottom line is "we need to give citizens the tools to discern what's misinformation."
