Facebook downgrades posts that promote miracle cures

Facebook has revealed that it’s downgrading content that makes dubious health claims, including posts that try to sell or promote “miracle cures.”

The big technology platforms have faced growing criticism over the spread of fake or misleading content — reports emerged last year that Facebook had been featuring homemade cancer “cures” more prominently than genuine information from renowned organizations such as cancer research charities. And a few months back, a separate report found that YouTube videos were promoting bleach as a cure for autism.

Elsewhere, Facebook also recently said that it would crack down on anti-vaccine content.

Fight against misinformation

The fight against this kind of misinformation is ongoing, and isn’t limited to spurious health cures. Back in January, YouTube announced plans to curb conspiracy theory video recommendations, including claims that the moon landings were faked and the Earth is flat. At the time, YouTube also confirmed that it would reduce video recommendations for phony miracle health cures, revealing the extent of YouTube’s multi-faceted fake-information problem.

Facebook’s latest announcement appears to be a response to a Wall Street Journal (WSJ) investigation into the spread of bogus cancer treatments on both Facebook and YouTube, with Google’s video-streaming offshoot telling the publication that it has cut off advertising revenue for such videos as part of its efforts.

“In order to help people get accurate health information and the support they need, it’s imperative that we minimize health content that is sensational or misleading,” Facebook product manager Travis Yeh wrote in a blog post.

Yeh said that the company had made “two ranking updates” last month that seek to reduce the visibility of posts that exaggerate or sensationalize a particular health-related remedy. Related to that, the company added that it will specifically target posts that try to sell products or services based on such claims, whether that’s a purported cancer cure or a pill claiming to help users lose weight.

“In our ongoing efforts to improve the quality of information in News Feed, we consider ranking changes based on how they affect people, publishers and our community as a whole,” Yeh added. “We know that people don’t like posts that are sensational or spammy, and misleading health content is particularly bad for our community.”

As with other content-moderation initiatives on Facebook, the company is utilizing an automated approach to this downgrading, and has identified phrases that are commonly used in such posts to “predict which posts might include sensational health claims or promotion of products with health-related claims,” Yeh said.
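
Facebook hasn’t published the details of how these phrase-based predictions work, but as a rough illustration, the sketch below shows how a simple keyword-matching heuristic might flag and demote such posts in a ranked feed. The phrase lists, scoring, threshold, and demotion factor are all invented for the example and aren’t drawn from Facebook’s actual system.

```python
# Illustrative sketch only: a toy phrase-matching heuristic for flagging posts
# that look like sensational health claims or miracle-cure promotion, and then
# demoting them in a ranked feed. All phrases, weights, and thresholds here are
# assumptions for the example, not Facebook's actual signals or ranking logic.

from dataclasses import dataclass

# Hypothetical phrases of the kind such a system might look for.
SENSATIONAL_PHRASES = [
    "miracle cure",
    "cures cancer",
    "doctors don't want you to know",
    "lose weight fast",
]

PROMOTIONAL_PHRASES = [
    "buy now",
    "order today",
    "limited supply",
]


@dataclass
class RankedPost:
    post_id: str
    text: str
    base_score: float  # score coming from the rest of the ranking pipeline


def health_claim_score(text: str) -> float:
    """Return a rough 0..1 score for how 'miracle cure'-like a post reads."""
    lowered = text.lower()
    hits = sum(1 for phrase in SENSATIONAL_PHRASES if phrase in lowered)
    promo_hits = sum(1 for phrase in PROMOTIONAL_PHRASES if phrase in lowered)
    # Weight product promotion based on health claims more heavily, mirroring
    # the second of the two ranking updates described above.
    raw = hits + 1.5 * promo_hits
    return min(raw / 2.0, 1.0)


def downrank(posts, threshold=0.5, demotion=0.3):
    """Demote (not remove) posts whose health-claim score exceeds the threshold."""
    scored = []
    for post in posts:
        score = post.base_score
        if health_claim_score(post.text) >= threshold:
            score *= demotion  # reduce visibility rather than delete the post
        scored.append((score, post))
    # Higher adjusted score ranks earlier in the feed.
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [post for _, post in scored]


if __name__ == "__main__":
    feed = [
        RankedPost("1", "This miracle cure melts tumors, buy now!", 0.9),
        RankedPost("2", "New trial results published in a peer-reviewed journal.", 0.7),
    ]
    for post in downrank(feed):
        print(post.post_id, post.text)
```

In this toy version, the flagged post drops below ordinary content even though it started with a higher base score, which is the basic effect a ranking demotion is meant to have.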

Of course, if past efforts are anything to go by, this often turns into a game of whack-a-mole, with content creators and promoters finding new ways to circumvent algorithmic detection. Today’s news comes just a day after an independent audit found that Facebook’s policy on white nationalist content was too narrow, as it only prohibits explicit praise or support for “white nationalism” where that specific terminology is used. “The narrow scope of the policy leaves up content that expressly espouses white nationalist ideology without using the term ‘white nationalist,’” the auditors wrote. “As a result, content that would cause the same harm is permitted to remain on the platform.”


