Fri. Feb 23rd, 2024

Meta's “historical reluctance” to protect children


Meta founder Mark Zuckerberg is expected to testify before a Senate committee on child safety in late January.

Agence France-Presse

Newly unredacted documents from New Mexico's lawsuit against Meta highlight the company's “historical reluctance” to keep children safe on its platforms, according to New Mexico's attorney general.

Raúl Torrez sued Facebook and Instagram owner Meta in December, claiming the company failed to protect young users from exposure to child sexual abuse material and allowed adults to solicit explicit images from them.

In portions of the suit unredacted Wednesday, internal messages and employee presentations from 2020 and 2021 show that Meta was aware of numerous issues, including the ability of adults to contact children on Instagram, the sexualization of minors on the platform, and the dangers of the “People You May Know” feature, which recommended connections between adults and children.

But Meta dragged its feet when it came time to act, the passages show.

Instagram, for example, did not begin restricting the ability of adults to message minors until 2021. An internal document cited in the suit shows Meta scrambling in 2020 to respond to an Apple executive whose 12-year-old was solicited on the platform, noting that “this is the kind of thing that annoys Apple to the point of threatening to remove us from the App Store.”

According to the suit, Meta knew that adults soliciting minors was a problem on the platform, yet treated it as urgent only when it had to.


In a July 2020 document titled “Child Safety – State of Play,” Meta listed immediate product vulnerabilities that could harm children, including the difficulty of reporting disappearing videos, and acknowledged that protections available on Facebook were not always present on Instagram.


At the time, Meta's reasoning was that it didn't want to prevent parents and older relatives on Facebook from contacting their younger family members, according to the suit. The report's author called that reasoning unconvincing and said Meta had sacrificed children's safety in a bet on growth. It was not until March 2021 that Instagram announced it was barring adults over the age of 19 from messaging minors.

Meanwhile, during an internal conversation in July 2020, an employee asked: “What specifically are we doing about child grooming [something which I just heard about and which happens a lot on TikTok]?”

Another employee's response, according to the lawsuit: “Somewhere between zero and negligible. Child safety is not an explicit goal of this half” [presumably referring to a half-year planning period].

In a statement, Meta said it wants teens to have safe, age-appropriate online experiences. The company says it has spent a decade working on these issues and hiring people who have dedicated their careers to keeping young people safe and supported online.

“The complaint misrepresents our work by using selective citations and documents that support its position.”

A quote from the Meta press release


Instagram began restricting the ability of adults to message minors in 2021.

Instagram also failed to address the issue of inappropriate comments under minors' posts, the complaint states. Arturo Béjar, a former director of engineering at Meta known for his expertise in combating online harassment, raised the same issue before a Senate committee, recounting his own daughter's disturbing experiences on Instagram.

“I stand before you today as a father with first-hand experience of a child receiving unwanted sexual advances on Instagram,” he told a panel of U.S. senators in November. “She and her friends began to have horrible experiences, including unwanted and repeated sexual advances, harassment.”

A March 2021 child safety presentation noted that Meta was “not sufficiently invested in addressing the sexualization of minors on [Instagram],” including sexualized comments on content posted by minors. “Not only is this a terrible experience for creators and internet users, but it is also a vehicle for bad actors to identify and connect with each other.”


Mark Zuckerberg, CEO of Meta (File photo)

Meta, based in Menlo Park, California, has updated its protections and tools for younger users, although critics say it hasn't done enough. Last week, the company announced that it would begin hiding inappropriate content from teen accounts on Instagram and Facebook, including posts about suicide, self-harm and eating disorders.

New Mexico's complaint follows a lawsuit filed in October by 33 states claiming that Meta harms young people and contributes to their mental health problems by knowingly and deliberately designing features on Instagram and Facebook that get children addicted to its platforms.

“For years, Meta employees tried to sound the alarm about how decisions made by Meta executives subjected children to dangerous solicitations and sexual exploitation,” Raúl Torrez said in a statement.

“As the company continues to minimize the illegal and harmful activities children are exposed to on its platforms, internal data and Meta presentations show the problem is serious and pervasive.”

A quote from Raúl Torrez, Attorney General of New Mexico

Meta founder Mark Zuckerberg, along with executives from Snap, Discord, TikTok and X, are expected to testify before a U.S. Senate committee on child safety in late January.
