Jury trials are not the best way to settle big questions of public policy, such as how to protect children from the potential harms of social media. Nor does the evidence presented in a single courtroom, or the view of one set of jurors, necessarily tell us much about how future cases will go. But a recent verdict holding two tech companies liable for a user’s childhood mental health problems should nonetheless be a powerful spur for social media platforms to do more to protect minors.
At issue in the case, heard in a Los Angeles state court, was whether Meta’s Instagram and Google’s YouTube had designed their products to capture young people’s attention while ignoring clear warnings that doing so could cause psychological harm. After nine days of deliberation, the jury found against the companies by a 10-2 majority. Although the plaintiff’s lawyers had pushed for a large punitive award, the jurors awarded damages of $6 million; even so, they sent a clear warning in the first case of its kind to reach trial.
One important point the case established, though it is likely to be revisited on appeal, is that social media companies’ free speech rights do not shield them from liability for designing defective products. In its defence, Meta invoked Section 230 of the 1996 Communications Decency Act, which exempts tech platforms from liability for content posted by their users, as well as its First Amendment right to distribute information as it sees fit. But the jury accepted the plaintiff’s argument that, by building in features seemingly designed chiefly to keep users hooked, such as infinite scrolling and algorithmic recommendations, the companies had forfeited those protections.
The plaintiff’s lawyers predictably sought to compare the social media companies to the tobacco industry, which knowingly suppressed clear evidence of the harms its products caused. The disclosures aired at trial pointed to a growing association between social media use and worsening mental health among young people, though not the kind of proven causal link that Big Tobacco strove to hide.
Claims that the pursuit of profit had made the companies wilfully blind to their users’ wellbeing were met with arguments that they were in fact putting users first. Testifying at the trial, Meta chief executive Mark Zuckerberg was questioned over Instagram’s decision to allow “enhancement filters” that improve users’ appearance, despite warnings from 18 experts that they could harm some users’ mental health. He cited a reluctance to be paternalistic and a deep desire for Instagram’s community to express themselves. Future juries, confronted with similar evidence in a string of other pending cases, may reach different verdicts from this one.
As a first step, however, the companies must show that they are enforcing their own rules effectively. Testimony at the trial suggested some 4 million Instagram users were under its minimum age of 13. Meta has taken some steps to improve safety, such as dropping its practice of measuring success by how long it can keep a user on the platform.
Tech companies should not wait to see whether adverse verdicts pile up before extending their safeguards for teenagers. Parental concern is mounting, and legal pressure is converging with government action. Led by Australia, a dozen countries, along with the EU and more than 20 US states, have now proposed or enacted measures barring minors from social media. For big tech companies facing a growing “tech backlash”, keeping their users safe is not just a moral duty but also essential to protecting their business.

