Lessons from US courts on social media liability

Restricting access to social media for minors has been debated for years now. Hopefully, the US courts’ decisions in KGM and New Mexico — and more particularly, their grounding in access to material detailing the internal functioning of social media platforms — should ease the conundrum the government faces (the choice between light-touch regulation to promote innovation and the need to protect users) (REUTERS)

Even as Gen Z appears to have intuitively decided that posting online is passé and India ponders social media bans, courts in the US have taken a strong stand, holding social media platforms liable for online harms, addiction, and their health ramifications. The Los Angeles County Superior Court's recent verdict in KGM v. Meta et al (KGM) could prove a watershed moment for a clear-eyed re-examination of intermediary liability. The raison d'être of safe-harbour exemptions is often lost as social media platforms transition into business behemoths.


KGM has sparked a long-delayed deep dive into the functioning and controlling interests driving social media. KGM is not the lone judicial decision pointing fingers at social media platforms for driving content towards users and/or adopting technologies or processes that have causal connections to online harms afflicting users. In State of New Mexico v. Meta Platforms, Inc, the New Mexico department of justice succeeded in getting a $375-million penalty imposed on Meta for endangering children by exposing them to sexually explicit material and to paedophiles.

That the intermediary exemptions were predicated on third parties sharing information on their platforms is now reasonably well understood. The genesis of this principle — from Section 230 of the US Communications Decency Act, 1996, to our own safe-harbour exemption under Section 79 of the Information Technology Act, 2000 (as amended periodically), or the IT Act — has been acknowledged by the judiciary.

It is time India not only evaluated its social media regulations with more rigour but also analysed and delinked exemptions or protections given to social media — indeed, all intermediaries — for third-party information from the other business ventures of such intermediaries. Any action, promotion, or dissemination of information other than that of genuine third parties should automatically end protection under Section 79 of the IT Act for such intermediaries. For instance, a social media platform promoting advertisements isn't merely providing an objective platform for third-party content; it is engaging consciously with third parties to contractually promote their products or services. It is thus no longer a passive platform. A platform disseminating its own content, as opposed to merely providing a base for third parties to use, is also not a passive platform. The "intermediary" tag does not — and should not — protect platforms from liability where there is a blend of third-party content and self-generated content. A gaming platform, for instance, that provides its own games and also hosts third-party content should not be able to claim exemptions or safe-harbour protections with regard to its self-generated content.

India's safe-harbour norms under Section 79 of the IT Act are explicit in qualifying and limiting the protection to third-party content that a platform has no control over. However, a gap is perceived in its implementation — likely because of ignorance or misinterpretation — and needs to be resolved. The rules framed under Section 79 of the IT Act — the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, or the Intermediary Guidelines, which have been amended several times since 2021 — must stay within the remit of the parent Act when regulating third-party content.

The two American court judgments discussed here earlier posit that social media platforms ceased to be mere passive providers when they actively promoted content and pushed the same onto users. This moves the narrative from intermediary liability to product- or service-liability, wherein the platforms failed to warn users of the harms — stemming from addiction, sexually explicit content harmful to minors, and sexual predators.

These two decisions may set the trend for many more pending cases in the US. But they also become critical in the Indian context, where intermediary exemptions have been tightened periodically to reduce timelines for compliance and to provide alternative solutions, particularly takedowns of harmful content.

Restricting access to social media for minors has been debated for years now. Hopefully, the above decisions — and more particularly, their grounding in access to material detailing the internal functioning of social media platforms — should ease the conundrum the government faces (the choice between light-touch regulation to promote innovation and the need to protect users) and address regulatory impasse.

This will lay the foundation for stringent laws, not only in terms of restraints, if any, on minors' access to social media platforms but also on protecting the rights of all users against unscrupulous practices of social media platforms that harm their lives and safety. The Consumer Protection Act, 2019, leads the way for product and service liability in India, but this is limited to paid usage. Hence, a compensation framework for online harms caused through non-passive processes is essential for intermediaries such as social media platforms, which offer their services free but earn substantially from their user base. Indian users can still benefit from tort claims, including class-action litigation, in instances similar to those examined judicially in KGM and New Mexico.

NS Nappinai is a senior advocate practising before the Supreme Court of India, and is the founder of the non-profit, Cyber Saathi. The views expressed are personal
