by AFP
As the new coronavirus spreads globally, the online battle to keep misinformation about the disease in check is also stepping up.
Google, Facebook and other platforms are struggling to keep ahead of scammers, trolls, and others with ill intent who routinely use major tragedies or disasters as opportunities to swindle or manipulate people.
“The public concern about coronavirus is being used as a vehicle to get people to transmit misinformation and disinformation,” said University of Washington biology professor Carl Bergstrom.
Internet companies took part in a meeting with the World Health Organization last week at Facebook offices in Silicon Valley to discuss tactics such as promoting reliable information and fact-checking dubious claims about COVID-19, the disease caused by the new coronavirus.
“(We must) combat the spread of rumors and misinformation,” WHO Director General Tedros Adhanom Ghebreyesus told AFP recently.
“To that end, we have worked with Google to make sure people searching for information about coronavirus see WHO information at the top of their search results.”
Google search ranks authoritative sources higher when people are seeking information on health and labels results or news stories that have been fact-checked.
Ghebreyesus said that social media platforms including Twitter, Facebook, Tencent and TikTok have also taken steps to limit the spread of misinformation about the coronavirus.
Facebook said in a recent online post that it is focusing on claims which, if relied on, could increase the likelihood of someone getting sick or not getting proper treatment.
“This includes claims related to false cures or prevention methods — like drinking bleach cures the coronavirus — or claims that create confusion about health resources that are available,” Facebook head of health Kang-Xing Jin said in the post.
“We will also block or restrict hashtags used to spread misinformation on Instagram, and are conducting proactive sweeps to find and remove as much of this content as we can.”
Selling snake oil
Bergstrom said some virus misinformation is “people trying to sell snake oil products” such as bogus cures or treatments, while others use attention-grabbing deceptions to drive online traffic that yields money from advertising.
Misinformation is also spread by “actors” out to fuel distrust of the establishment in China or foment societal instability overall, according to Bergstrom.
“There’s appetite for up-to-date, real-time information,” said Jevin West, co-author of a book on misinformation with Bergstrom.
“These actors can take advantage of that; things with crazier scenarios are more likely to be clicked on than the report from that doctor at WHO trying to calm down the fears.”
Facebook said that when users seek information related to the virus, the social network will show “educational pop-up” boxes with information considered credible.
Facebook is also giving free advertising credits to organizations running coronavirus education campaigns.
Google-owned video sharing platform YouTube has been modifying policies and products for several years to remove harmful content and give priority to authoritative content deemed trustworthy.
“We currently do not allow content promoting dangerous remedies or cures, like videos which claim that harmful substances or treatments can have health benefits,” YouTube said.
YouTube last year began providing links to reliable information along with videos on “subjects prone to misinformation,” and added coronavirus to that list.
Fact-checks helping or not?
Social media giants have also beefed up their ranks of fact-checkers, hiring outside parties such as AFP, to sort truth from fiction, even as questions remain about the effectiveness of such efforts.
A recent study published in the journal Science Advances suggested fact-checking did little to stem the tide of misinformation about other epidemics such as Zika, Ebola and yellow fever.
The researchers said that “current approaches to combating misinformation and conspiracy theories about disease epidemics and outbreaks may be ineffective or even counterproductive,” and could even cause “collateral” damage by undermining trust in fact-based disease information.
Bergstrom and West questioned whether social media giants optimized for virality could stem the tide of disinformation and deceit.
“A social media company claiming it’s an active participant in the fight against misinformation is like (tobacco maker) Philip Morris saying they’re an active participant in the fight against lung cancer,” Bergstrom said.