Facebook announces that health groups will no longer appear in recommendations
Facebook said on Thursday that it would no longer feature groups focused on health topics in the recommendations it shows users, adding that people should get their health information from "reliable sources."
Facebook is the world's largest social network, with around 2.7 billion monthly active users. Over the past year, the company said in a blog post, it took down more than one million groups that violated its policies on misinformation and harmful content.
Avaaz, a global advocacy group, said in a report released last month that misleading health content drew 3.8 billion views on Facebook over the past year, with views peaking during the COVID-19 pandemic.
Under pressure to curb such misinformation on its platform, Facebook has made surfacing credible health information a key part of its response. It has also removed false claims about COVID-19 that it judged likely to cause imminent harm.
Facebook also said it will bar the administrators and moderators of groups that have been taken down for policy violations from creating new groups for a period of time.
Facebook said in its post that it is now also removing groups tied to violence from its recommendations and search results, and will soon reduce their content in the News Feed. Last month, the company took down nearly 800 groups linked to the QAnon conspiracy theory over posts that celebrated violence, showed intent to use weapons, or sought to attract followers with violent content.
Twitter also said in a tweet on Thursday that impressions of QAnon-related tweets had dropped by more than 50% as a result of its work to reduce the reach of content and accounts tied to the conspiracy theory. In July, the company announced it would stop recommending QAnon content and accounts, a move it expected to affect some 150,000 accounts.
In a blog post on Thursday, Twitter described how it determines when groups are coordinating to cause harm, saying it needs to find evidence that individuals associated with a group or campaign are engaged in some form of coordination that could harm others.
The company said this coordination can be technical, for example one person operating multiple accounts to tweet the same message, or social, for example using a messaging app to organize many people to tweet at the same time.
Twitter said it will prohibit all forms of technical coordination, but for social coordination to violate its rules there must be evidence of potential harm, such as physical or psychological harm, or "informational" harm caused by false or misleading content.