
Navigating Deception: Dissecting the Implications of India’s Guidelines on 'Dark Patterns'

The guidelines must comprehensively address the risks associated with dark patterns, particularly focusing on the threat posed by the collection of excessive and unnecessary personal data without informed consent.

Dark patterns refer to deceptive user interface designs employed by online services such as websites or apps to influence users to make decisions they otherwise might not. These misleading tactics are pervasive, extending from popular news websites to your favourite food delivery app. Virtually every online service incorporates some form of deception or user manipulation to enhance their profits.

In response to this issue, the Department of Consumer Affairs (DoCA) introduced Guidelines for Prevention and Regulation of Dark Patterns under Section 18 of the Consumer Protection Act of 2019. Their primary objective is to curb dishonest practices and promote transparency in the online marketplace. This marks a significant and commendable stride, considering that the Indian legal framework had been largely silent on the matter of dark patterns until now. The guidelines define dark patterns as “any practices or deceptive design patterns using UI/UX (user interface/user experience) interactions on any platform; designed to mislead or trick users to do something they originally did not intend or want to do; by subverting or impairing the consumer autonomy, decision making or choice; amounting to misleading advertisement or unfair trade practice or violation of consumer rights.”

The proposed guidelines are intended to apply to a broad spectrum of entities, encompassing sellers, advertisers, and platforms systematically providing goods and services in India. Notably, this extends beyond businesses physically located in India, as the guidelines also cover businesses operating outside India that target Indian citizens for the sale of goods or services. The discourse surrounding dark patterns is intricately connected to user autonomy within India’s privacy framework. The use of dark patterns, leading to manipulation and diversion of attention, has the potential to nullify user consent, resulting in the unlawful collection and processing of personal data by platforms.

The guidelines underscore the government’s commitment to safeguarding consumer privacy and purchasing autonomy on the Internet by placing a significant focus on user interface design. This initiative is particularly pertinent in light of the relationship between dark patterns and user consent within India’s privacy regime. In addition to the guidelines, DoCA had already issued “Additional Influencer Guidelines for Health and Wellness Celebrities, Influencers, and Virtual Influencers.” These guidelines outline specific responsibilities for influencers, emphasizing caution when making claims related to healthcare and medicine.

The guidelines state that their application is wide-ranging, extending to “all platforms”, “advertisers” and “sellers”. Yet they do not define all of these terms and refer to dark patterns only vaguely, leaving their applicability open to question; the definition clause merely adopts the meanings assigned under the Consumer Protection (E-Commerce) Rules, 2020. Compounding the vague applicability and definitions, the guidelines do not clearly specify penalty thresholds, precise definitions of infractions or criteria for punishment.

As it stands, the Consumer Protection Act of 2019 would impose severe penalties for any of the dark patterns listed in Annexure 1. Non-compliance with the Act’s directions can lead to imprisonment for up to six months, a fine of up to Rs 20 lakh, or both. Publishing a false or misleading advertisement that is prejudicial to the interests of consumers is also a punishable offence, attracting imprisonment of up to two years and a fine of up to Rs 10 lakh, with both the imprisonment term and the fine increasing for subsequent offences.

Legislation such as the Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011 (“SPDI Rules”) mandates the informed consent of users before collecting their sensitive personal data. Similarly, the newly enacted Digital Personal Data Protection Act, 2023 places importance on obtaining explicit consent from the Data Principal before processing such data. Implicit or opt-out consent, lacking a clear, positive, and affirmative action, is not considered valid under these regulations.

The global trend towards prioritising users’ free and informed consent is evident in data protection laws worldwide. Various jurisdictions, including the US and the EU, have responded to the risks posed by dark patterns through legislation aimed at protecting both privacy and consumer rights. For instance, in the US, the California Privacy Rights Act, 2020, and the California Consumer Privacy Act, 2018, recognise dark patterns and discourage their use by rendering consent obtained through them invalid. In the EU, legislation such as the General Data Protection Regulation, the Digital Services Act, the Digital Markets Act, and the Unfair Commercial Practices Directive provides safeguards against manipulative tactics online. Notably, enforcement under the French Data Protection Act resulted in an €8 million fine on Apple for enabling the ‘personalised ads’ setting by default without prior consent and making it difficult to change, requiring multiple steps.

The guidelines must comprehensively address the myriad privacy risks associated with dark patterns, particularly focusing on the threat posed by the unauthorised collection of excessive and unnecessary personal data without the informed consent of consumers. Even with the implementation of the Digital Personal Data Protection Act, 2023, there remains a potential gap as it does not specifically govern the interfaces or designs responsible for collecting personal data.

Annexure 1 of the guidelines serves as an indicative, though not exhaustive, list of dark patterns. However, recognising the evolving nature of deceptive practices, the guidelines should acknowledge the possibility of additional dark patterns. To facilitate this, consumers, civil society, or market players should be empowered to report new instances of dark patterns through an institutionalised feedback mechanism to the ministry. Once identified, the responsibility for limiting or discontinuing the deployment of dark patterns should squarely rest with market participants. This approach encourages industry self-regulation, with necessary oversight from the ministry and valuable consumer feedback. This collaborative effort ensures that the guidelines remain dynamic and responsive to emerging challenges in the realm of online consumer protection.

Bhuvnesh Kumar and Sharad Panwar are fourth-year law students at National Law University, Jodhpur.
