1. Meta intentionally designed its youth safety features to be ineffective and rarely used, and blocked testing of safety features that it feared might be harmful to growth.
2. Meta required users to be caught 17 times attempting to traffic people for sex before it would remove them from its platform, which a document described as “a very, very, very high strike threshold.”
3. Meta recognised that optimising its products to increase teen engagement resulted in serving them more harmful content, but did so anyway.
4. Meta stalled internal efforts to prevent child predators from contacting minors for years due to growth concerns, and pressured safety staff to circulate arguments justifying its decision not to act.
5. In a 2021 text message, Mark Zuckerberg said he wouldn’t say that child safety was his top concern “when I have a number of other areas, I’m more focused on like building the metaverse.”