In a July 2020 document titled Child Safety – State of Play, Meta listed immediate product vulnerabilities that could harm children, including difficulty in reporting disappearing videos, and confirmed that the protections offered on Facebook were not always present on Instagram.
At the time, Meta's reasoning was that it didn't want to prevent parents and older relatives on Facebook from contacting their younger family members, according to the suit. The report's author called that reasoning unconvincing and said Meta had sacrificed children's safety on a bet for growth. In March 2021, however, Instagram announced that it was barring users over the age of 19 from messaging minors.
Meanwhile, during an internal conversation in July 2020, an employee asked: What are we doing specifically about child grooming [something I just heard about and which happens a lot on TikTok]?
Another employee responded: somewhere between zero and nothing. Child safety is not an explicit goal this half [presumably meaning a half-year planning period], according to the lawsuit.
In a release, Meta said it wants teens to have safe, age-appropriate online experiences. The company says it has spent a decade working on these issues and hiring people who have dedicated their careers to keeping young people safe and supported online.
The complaint misrepresents our work by using selective citations and documents that support its position.
A quote from the Meta press release
Instagram began restricting the ability of adults to message minors in 2021.
Instagram also failed to address the issue of inappropriate comments under minors' posts, the complaint states. Arturo Béjar, a former engineering director at Meta known for his expertise in combating online harassment, raised the issue before a committee, recounting his own daughter's disturbing experiences with Instagram.
I stand before you today as a father with first-hand experience of a child receiving unwanted sexual advances on Instagram, he told a panel of U.S. senators in November. She and her friends began to have horrible experiences, including repeated unwanted sexual advances and harassment.
A March 2021 child safety presentation noted that Meta is not sufficiently invested in addressing the sexualization of minors on [Instagram], including sexualized comments on content posted by minors. Not only is this a terrible experience for creators and internet users, but it is also a vehicle for bad people to identify and connect with each other.
Mark Zuckerberg, CEO of Meta (File photo)
Meta, based in Menlo Park, California, has updated its protections and tools for younger users, although critics say it hasn't done enough. Last week, the company announced that it would begin hiding inappropriate content from teen accounts on Instagram and Facebook, including posts about suicide, self-harm and eating disorders.
New Mexico's complaint follows a lawsuit filed in October by 33 states alleging that Meta harms young people and contributes to their mental health problems by knowingly and deliberately designing features on Instagram and Facebook that make children addicted to its platforms.
For years, Meta employees tried to sound the alarm about how decisions made by Meta executives subjected children to dangerous solicitations and sexual exploitation, said Raúl Torrez in a statement.
While the company continues to minimize the illegal and harmful activity children are exposed to on its platforms, internal Meta data and presentations show the problem is serious and pervasive.
A quote from Raúl Torrez, Attorney General of New Mexico
Meta founder Mark Zuckerberg and executives from Snap, Discord, TikTok and X are expected to testify before a U.S. Senate committee on child safety in late January.