Meta refuses to block a feature that helped target children


Meta executives are refusing to disable Facebook's People You May Know recommendation feature after an employee said it could lead to child exploitation, according to a new report from The Wall Street Journal.

David Erb, Facebook's then-head of engineering, founded a team in 2018 dedicated to identifying risky user behavior within the platform.

By studying inappropriate interactions between adults and minors on Facebook, the team found that the "People You May Know" recommendation algorithm was a common way adults located children to target on the platform.

“It's worse than we thought. There are millions of pedophiles targeting tens of millions of children,” Erb told the Wall Street Journal.

In some cases, adults offered teenage girls money in exchange for lewd photos and then threatened to publish the images, The Wall Street Journal reported.

At the time, Meta executives were discussing encrypting Facebook messages to protect user privacy.

Fearing that the encryption project would make child exploitation harder to detect, Erb sought advice on how to combat it from colleagues at WhatsApp, which already uses end-to-end encryption.

Erb's team proposed that the "People You May Know" feature stop recommending minors to adults, but Erb said Meta's leadership rejected the proposal.

Meta decided to move forward with encrypting Facebook messages. Erb was removed from his role and resigned shortly afterward, leaving Facebook in December 2018.

Andy Stone, a spokesman for Meta, said Erb's claims were false, adding that former employees are entitled to their own opinions but not to their own facts.

Stone said the company has long invested in child safety, that in 2018 it began restricting recommendations involving suspicious adults, that it has removed large numbers of violating accounts, and that it has supported efforts to improve case reporting to the National Center for Missing & Exploited Children.

Since Meta announced in December that it would make end-to-end encryption the default in its Messenger app, the company has been divided over how to combat child exploitation.

“The extra layer of security provided by end-to-end encryption means that the content of your messages and calls with friends and family is protected from the moment it leaves your device to the moment it reaches the recipient's device,” Meta said. “This means that nobody, including Meta, can see what's sent or said, unless you choose to report a message to us.”

The newspaper reported that the company developed an internal tool called Masca to detect accounts that engage in suspicious activities involving minors.

A Meta spokesman told the newspaper that the company has removed 160,000 accounts linked to child exploitation since 2020.

Documents obtained by The New York Times indicate that Meta knowingly allowed millions of underage users on Instagram.

Thirty-three US states have accused Meta of routinely collecting personal data from children under 13, which is prohibited without parental consent under the US Children's Online Privacy Protection Act (COPPA).


