
Re: Strategyone post# 49203

Thursday, 01/18/2024 11:55:15 AM

Post# of 49396
Children on Instagram and Facebook Were Frequent Targets of Sexual Harassment, State Says -- WSJ
Mentioned: META
By Katherine Blunt and Jeff Horwitz

Children using Instagram and Facebook have been frequent targets of sexual harassment, according to a 2021 internal Meta Platforms presentation that estimated that 100,000 minors each day received photos of adult genitalia or other sexually abusive content.

That finding is among newly unredacted material about the company's child-safety policies in a lawsuit filed last month by New Mexico that alleges Meta's platforms recommend sexual content to underage users and promote underage accounts to predatory adult users.

In one 2021 internal document described in the now unredacted material, Meta employees noted that one of its recommendation algorithms, called "People You May Know," was known among employees to connect child users with potential predators. The New Mexico lawsuit says the finding had been flagged to executives several years earlier, and that they had rejected a staff recommendation that the company adjust the design of the algorithm, known internally as PYMK, to stop it from recommending minors to adults.

In comments appended to the report, one Facebook employee wrote that the algorithm had in the past "contributed up to 75% of all inappropriate adult-minor contact."

"How on earth have we not just turned off PYMK between adults and children?" another employee responded, according to the lawsuit. "It's really, really upsetting."

Meta declined to comment on the newly unsealed references to internal documents, referring the Journal to a previous statement in which it said New Mexico "mischaracterizes our work using selective quotes and cherry-picked documents." Calling child predators "determined criminals," the company has said it has long invested in both enforcement and child safety-focused tools for young users and their parents.

New Mexico alleges that Meta has failed to address widespread predation on its platform or limit design features that recommended children to adults with malicious intentions. Instead of publicly acknowledging internal findings such as the 100,000 child-a-day scale of harassment on its platforms, the suit alleges, Meta falsely assured the public that its platforms were safe.

Much of the internal discussion described in the newly unredacted material focused on Instagram. In an internal email in 2020, employees reported that the prevalence of "sex talk" to minors was 38 times greater on Instagram than on Facebook Messenger in the U.S. and urged the company to enact more safeguards on the platform, according to documents cited in the lawsuit.

One employee that year reported that an Apple executive had complained that the executive's 12-year-old child was solicited on Instagram. The Meta employee, tasked with addressing the issue, noted that "this is the kind of thing that pisses Apple off to the extent of threating [sic] to remove us from the App Store," and asked whether there was a timeline for when the company would prevent adults from messaging minors on the platform.

A November 2020 presentation titled "Child Safety: State of Play" said that Instagram employed "minimal child safety protections" and described policies regarding "minor sexualization" as "immature." It further noted the platform's "minimal focus" on trafficking.

Despite knowing the scale of the problem, New Mexico alleges, Meta leaders didn't take action to prevent adults from sexually soliciting children until late 2022 -- and they stopped short of the broad messaging limitations its safety staff had recommended. Rather than broadly stopping the recommendation of children's accounts to adults, Facebook and Instagram sought to block such suggestions only to adults who had already demonstrated suspicious behavior toward children.

Meta's approach of limiting contact with only known suspicious accounts was bound to be less effective than shutting down the recommendations, New Mexico says, because both malicious adults and children routinely lied about their age. Meta internally acknowledged in 2021 that the majority of minors on Meta's platforms falsely claim to be adults, New Mexico's complaint says, and a study of accounts disabled for grooming children found that 99% of those adults failed to state their age.

Meta in June established a task force to address child-safety problems on its platforms after an article in The Wall Street Journal revealed that Instagram's algorithms connected and promoted a vast network of accounts openly devoted to the commission and purchase of underage-sex content.

Additional Journal articles last year showed that Meta is struggling to fix problems on Instagram as well as on Facebook, where it recently introduced encryption for direct messages. The company's safety staff had long warned of the dangers of enshrouding exchanges that could be used to prosecute child exploitation, the Journal reported. Meta said it had spent years developing safety measures to prevent and combat abuses.

In addition to New Mexico's suit, more than 40 other states sued Meta in October alleging that it misled the public about the dangers its platforms pose to young people.

Meta this month said it would start automatically restricting teen Instagram and Facebook accounts from harmful content including videos and posts about self-harm, graphic violence and eating disorders.

Write to Katherine Blunt at katherine.blunt@wsj.com and Jeff Horwitz at jeff.horwitz@wsj.com

(END) Dow Jones Newswires

January 17, 2024 22:38 ET (03:38 GMT)
Copyright (c) 2024 Dow Jones & Company, Inc.