In Silicon Valley, there has been a years-long debate over where a user’s right to privacy ends and a platform’s responsibility for public safety begins. New court documents, revealed as part of the State of New Mexico’s lawsuit against Meta, indicate how the social media giant has navigated these turbulent waters. The case concerns the implementation of end-to-end encryption on Messenger and Instagram, which, according to internal warnings from executives, may have drastically reduced the company’s ability to detect abuse of minors.
The disclosed correspondence shows that Meta’s top security and content policy executives expressed deep scepticism about Mark Zuckerberg’s plans as early as 2019. Monika Bickert, head of content policy, outright called the measures “irresponsible”. The concerns were not unfounded: internal analyses estimated that the number of child abuse reports sent to the relevant authorities could fall by as much as 65% after encryption was introduced. In absolute terms, that meant thousands of cases a year that might never reach investigators’ desks.
This case is a case study in product risk management on a global scale. Meta finds itself caught between two paradigms. On the one hand, end-to-end encryption is considered the gold standard for data protection, widely adopted by companies such as Apple and Google. On the other hand, the nature of Facebook and Instagram, platforms built on open social graphs, makes it far easier for strangers to connect than on services like WhatsApp. It is this ease of ‘finding victims’, as Antigone Davis, global head of safety, put it, that has been the main argument against default encryption in these particular ecosystems.
Meta’s official position highlights the evolution the company has undergone since the critical 2019 emails. Spokesperson Andy Stone points out that it was these internal concerns that spurred the creation of new safeguards ahead of the final rollout of encryption in 2023. Among other things, the company introduced restrictions on adult contact with minors, along with artificial-intelligence tools that detect suspicious patterns of behaviour without directly ‘looking’ at message content.
From a market perspective, the New Mexico dispute is just the tip of the iceberg. Meta currently faces a wave of lawsuits from more than 40 state attorneys general, as well as from school districts, accusing the company of harming young people’s mental health. The outcome of these cases may redefine ‘duty of care’ standards for the entire Big Tech sector.
