
Hooked By Design: The Need For A Law Against Addictive Social Media Features

  • Writer: Centre for Advanced Studies in Cyber Law and AI (CASCA)

By Shoptorishi Dey (3rd year member, CASCA) and Uday Gupta (2nd year member, CASCA)



Introduction


Over the past decade, there has been an alarming rise in social media usage among children. In India alone, an estimated 76% of adolescents have accounts on various social media platforms. This is a cause for concern, as a growing number of users are becoming addicted to social media, exhibiting compulsive behaviour popularly termed ‘doomscrolling.’ This behaviour is primarily attributable to addictive features built into social media platforms to maximise screen time.


In a landmark step aimed at curbing teenagers’ addiction to social media, Florida has filed a lawsuit against Snapchat, accusing the platform of deploying “features designed to foster addiction in minors.” This blog analyses the allegations against Snapchat and examines the rise of social media addiction among children. It further explores Florida’s House Bill 3 (“H.B. 3”), under which Snapchat has been sued, and compares it with similar laws in other jurisdictions. Finally, it proposes a legislative roadmap for India to regulate addictive features.


Florida v. Snapchat


The State of Florida’s recent legal action against Snap Inc., the parent company of the popular social media app Snapchat, comes at a pivotal moment of unprecedented social media usage and a consequent need for better legislation governing social media. Florida Attorney General James Uthmeier sued Snapchat for alleged violations of H.B. 3, which seeks to restrict minors’ access to social media platforms deemed harmful due to their addictive functionalities.


The law requires social media platforms to prohibit users under 14 years old from creating accounts and to obtain parental consent for account registrants who are 14 or 15 years old. It also imposes age verification requirements on online services that knowingly distribute a significant amount of “harmful” content.


With respect to Snapchat specifically, the state contends that the platform employs features intended to promote addiction in minors and creates accounts for users under the age of 13 without parental consent. The lawsuit describes Snapchat’s conduct as “particularly egregious”, stating that the app allows easy access to explicit content, including illegal substances and pornography, exposure to which can be significantly harmful to minors.


The lawsuit also highlights the disappearing nature of content shared on Snapchat, which encourages users to return to the app regularly, preying on minor users who are anxious about missing content. Furthermore, during the course of the proceedings, the plaintiffs alleged that Snapchat lenses, which significantly alter a user’s appearance, have given rise to what has been termed “Snapchat dysmorphia”, leading to anxiety and sometimes driving young teens and adolescents to seek real-life cosmetic procedures.


Understanding H.B. 3: Florida’s Attempt to Curb Digital Addiction


Defining addictive features


H.B. 3 applies only to social media platforms that employ at least one of its five enumerated “addictive features”, and does not apply to platforms whose “exclusive function” is email or private direct messaging. These features are designed by social media platforms to encourage compulsive use. According to the lawsuit, Snapchat utilises four of the five:

  • “infinite scroll”: pages with no visible or apparent end or page breaks;
  • “push notifications”: alerts that appear on a user’s phone screen when they are not using the app, designed to entice them back onto it;
  • “personal metrics”: interactive engagement metrics, with the complaint singling out Snapchat’s Snapstreaks feature as the most problematic; and
  • “autoplay”: which, much like infinite scroll, encourages compulsive use by eliminating the friction between pieces of content.


Other Provisions

In addition to the social media provisions, Section 2 of H.B. 3 requires businesses that publish “content harmful to minors” to conduct age verification to ensure that users seeking such content are 18 or older. This directive applies only to companies that knowingly post such content and whose online services consist of more than 33% of such material. It therefore excludes file hosts and cloud and internet service providers, so long as they are not responsible for creating the content itself.


Opposing Views


Despite the law’s focus on addictive design, it has garnered criticism, particularly over its blanket ban on under-14 users: critics argue that it restricts access to lawful content protected by the First Amendment and that parents, not the state, should oversee children’s online activities. The narrow targeting of social media platforms also risks creating a fragmented regulatory landscape. Crucially, the law may ultimately fail the very adolescents it aims to protect. By pushing young users off mainstream platforms, it could drive them to less regulated, potentially more dangerous corners of the internet, leaving them more vulnerable, not less. In seeking to shield children, the ban risks isolating them from the digital literacy and support networks they need most.


Global Perspectives


UK’s Age-Appropriate Design Code (“AADC”)


The Age-Appropriate Design Code (or Children’s Code) was implemented by the UK and contains 15 standards that online services must follow if children are likely to access their service, even if children are not the target audience. Unlike H.B. 3, the AADC seeks to protect children within the digital world, not from it. It therefore places greater emphasis on ensuring that children can access digital services while minimising the use of their data by default.


Its standards include age-appropriate application, transparency, and curbs on the detrimental use of data, among others. This framework, rooted in the UK GDPR, avoids constitutional pitfalls by focusing on mitigating harms through design adjustments rather than outright bans. This contrasts with H.B. 3’s more overtly punitive approach towards platforms employing prolonged-engagement features.


UK’s Online Safety Act


The UK’s Online Safety Act 2023 also regulates age-appropriate experiences for children online; indeed, its strongest protections are designed for children. Platforms are required to evaluate the potential risks their services may pose to children and to implement suitable age restrictions so that young users have safe, age-appropriate experiences and are protected from harmful material. If a provider restricts access for users below a certain age, it must clearly outline in its terms of service the specific methods used to verify age and must apply these measures consistently.


European Union’s Digital Services Act (“DSA”)


The Digital Services Act was implemented with a view to preventing harmful activities online and ensuring user safety. Although it does not directly target the addictive features of social media platforms, it nevertheless creates a safer space for children online through provisions prohibiting targeted advertising to children. It also provides for regular mandatory audits, ensuring digital platforms’ compliance with its provisions.


These rules work alongside the General Data Protection Regulation (GDPR), which gives special protection to children’s data. Like the AADC, the DSA emphasises the “best interests of the child” in its design. In contrast to H.B. 3, it does not bar children from using social media; rather, it creates a safe space on the platforms themselves, where children’s data is not exploited to promote compulsive or prolonged usage.


Why Indian Children Need Stronger Digital Safeguards


Presently, in India, the Digital Personal Data Protection Act, 2023 is the only statute that provides some degree of protection to children in the digital sphere. However, it is limited to protecting children’s personal data; it does not address the broader problem of platform design that fosters addiction and is silent on minors’ addiction to social media.


In light of heightened legal scrutiny in jurisdictions such as the US and the UK to regulate digital platforms and protect vulnerable sections of society from the adverse impacts of social media addiction, India cannot afford to fall behind. Children are particularly susceptible to technology addiction compared to adults, whose more developed brains are better able to control impulses. Young minds therefore fall prey to features such as endless scrolling, which are purposefully designed to keep users engaged. This traps the mind in a dopamine-driven reward loop of compulsive scrolling with no natural stopping point. Such excessive and uncontrolled use of addictive applications results in stunted cognitive and emotional growth in children, and is further linked to a rise in symptoms of anxiety and depression, especially among teenagers.


Although this addiction is a growing concern, the blame cannot be placed solely on the tech companies. Their primary business model is to maximise user engagement in order to increase advertising revenue. Without any law regulating them, they therefore have free rein to do so by any means necessary, which, in this case, comes at the cost of children’s mental health. As the Utah federal district court observed in NetChoice, LLC v. Reyes, “the whole point of the platforms is providing interactive, immersive, social interaction services.”


Way Forward


To safeguard children from the negative effects of social media, India needs a law that protects not just their data but also their mental health. It should directly target the addictive features of social media platforms. The following recommendations are proposed to develop an effective framework.


1.   Defining and Prohibiting Addictive Features


First and foremost, India needs to properly define what counts as an addictive feature. Taking a cue from Florida’s H.B. 3, the definition can cover features that are inherently manipulative and designed to hook children into using an application for prolonged periods. These should include features similar to the five defined in H.B. 3, such as infinite scrolling, autoplay videos, streaks, and algorithmically curated content designed to maximise time spent on the platform. The law should further ensure that algorithms do not use children’s data in ways that incentivise them to remain engaged, similar to the provisions of the UK’s Children’s Code. Indian legislation should mandate that the best interests of children be prioritised over the commercial objectives of Big Tech companies. Such provisions would prevent online platforms from using a child’s browsing history and engagement patterns to keep them engaged for prolonged periods, to the detriment of their mental health.


2.   Mandatory Audits of Platform Algorithms


Similar to the European Digital Services Act, the legislation should create a regulatory authority and include provisions for mandatory audits to ensure compliance with the prescribed standards. Moreover, the law must require platforms to be transparent about their algorithmic systems, providing clarity on how their recommendation systems work. This would allow the regulatory authority to detect whether social media platforms are deploying addictive features to manipulate children’s usage behaviour. Such transparency measures would also ensure the accountability of the tech companies.


Conclusion


With the increasing use of social media among children, India needs new legislation that protects minors beyond just their personal data. Although the Digital Personal Data Protection Act is a significant measure for safeguarding minors’ personal data, it addresses only part of the issue. The challenge today is not solely the gathering of children’s data, but how that data is used to encourage addiction.


The Snapchat case is a landmark moment that shows governments beginning to take cognisance of the issue and starting to hold platforms liable. India must address the impact of digital platforms on children’s mental health and development. With the rising use of social media platforms and growing global calls for regulation, India must develop a framework to safeguard its children. We cannot afford to lag behind while digital platforms continue to profit at the expense of the mental health and well-being of the country’s most vulnerable citizens.

 
 
 


