Facebook To Take Action Against Users Repeatedly Sharing Misinformation

FILE PHOTO: A Facebook logo is displayed on a smartphone in this illustration taken January 6, 2020. REUTERS/Dado Ruvic/Illustration

Facebook Inc said on Wednesday it would take “stronger” action against people who repeatedly share misinformation on the platform.

Facebook will reduce the distribution in its news feed of all posts from accounts that frequently share content flagged as false by one of the company's fact-checking partners, the social media giant said in a blog post.

It added that it was also launching ways to inform people if they are interacting with content that has been rated by a fact-checker.

False claims and conspiracies have proliferated on social media platforms, including Facebook and Twitter, during the COVID-19 pandemic.

“Whether it’s false or misleading content about COVID-19 and vaccines, climate change, elections or other topics, we’re making sure fewer people see misinformation on our apps,” the company said in a statement. 

Earlier this year, Facebook said it took down 1.3 billion fake accounts between October and December, ahead of scrutiny by the U.S. House Committee on Energy and Commerce into how technology platforms are tackling misinformation.

Reporting by Tiyashi Datta in Bengaluru; Editing by Amy Caren Daniel; Reuters
