
Regulation of Dark Patterns

19 December 2023

by Paritosh Chauhan and Rohan Verma

E-commerce has grown exponentially in significance over the last 15 (fifteen) years and has permanently changed the way most people purchase goods and services. This has led businesses to devise online practices, strategies, and tools to maximise their e-commerce gains (for example, the use of browsing or demographic data for targeted advertising).

One such practice, which has recently become the subject of regulatory scrutiny in several jurisdictions, is the use of ‘dark patterns’.

What are ‘Dark Patterns’?

‘Dark Patterns’ are deceptive user interface/ user experience (‘UI/UX’) designs (such as pre-selected checkboxes and variations in visual prominence) which induce users to make purchases (or otherwise act in ways) that they did not initially intend.

The use of dark patterns is neither new nor uncommon. Companies such as Google, LinkedIn, Amazon, Facebook, and Apple have all been identified as having utilised dark patterns at some point[1]. In a paper published in October 2022 by the Organisation for Economic Co-operation and Development (‘OECD’), it was noted that 57.4% (fifty-seven point four per cent) of the cookie consent notices on Europe’s most popular websites used interface designs which led users to accept options detrimental to their privacy[2].

Restrictions on the use of dark patterns in the EU and USA

Internationally, the use of dark patterns has already attracted regulatory and judicial scrutiny.

The EU Digital Services Act, 2022 (‘DSA’), prohibits ‘providers of online platforms’ from designing, organising, or operating their online interfaces ‘in a way that deceives or manipulates the recipients of their service or in a way that otherwise materially distorts or impairs the ability of the recipients of their service to make free and informed decisions.’[3] Examples of dark patterns under the DSA include: (a) making certain choices more prominent when seeking a decision from a service recipient; and (b) making the procedure for the termination of a service more difficult than the procedure for acceptance of said service[4].

Similarly, the California Consumer Privacy Act (‘CCPA’) (effective from 1 January 2020), read with the California Consumer Privacy Regulations (‘CCPR’) (effective from 29 March 2023) (collectively, the ‘California Consumer Privacy Law’), identifies a ‘dark pattern’ as a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice. The CCPA provides that consent obtained through the use of dark patterns will not be considered valid.

In March 2023, the U.S. Federal Trade Commission imposed a penalty of $245,000,000 (two hundred forty-five million dollars) on Epic Games Inc. (the creator of the popular online game ‘Fortnite’)[5], pursuant to a complaint alleging that Epic Games Inc. had violated the U.S. Federal Trade Commission Act, 1914 through its use of dark patterns to deter users from cancelling or requesting refunds for certain in-game charges[6].

Restrictions on the use of dark patterns in India

In India, the Department of Consumer Affairs (under the Ministry of Consumer Affairs, Food and Public Distribution) recently proposed the draft ‘Guidelines for Prevention and Regulation of Dark Patterns, 2023’, which aimed to prevent the use of dark patterns by platforms, advertisers, and sellers. Subsequently, the final version of the guidelines was notified by the Central Consumer Protection Authority on 30 November 2023 (‘Guidelines’).

Under the Guidelines, the term ‘dark patterns’ has been defined to mean practices or deceptive design techniques which: (a) use UI/UX interactions on a platform; (b) are designed to mislead users into committing actions they did not intend to carry out by subverting/ impairing their autonomy as consumers; and (c) amount to misleading advertisements, unfair trade practices, or violation of consumer rights.

In addition to the definition, the Guidelines set out an illustrative list of dark patterns in Annexure A. These include:

(a) false urgency: creating or implying a sense of urgency or scarcity, including by depicting false popularity of a product or service;

(b) confirm shaming: instilling a sense of shame, guilt, or fear in a user’s mind through the use of a phrase, video, audio, or other means;

(c) interface interference: a design element that highlights certain information and obscures other relevant information, with the aim of misdirecting the user;

(d) disguised advertisements: masking advertisements as user-generated content or news articles so that they blend in with the rest of the user interface, in order to trick consumers into clicking on them;

(e) forced action: forcing a user, in order to buy or subscribe to the product or service they intended, to buy additional goods, sign up for an unrelated service, or share personal information (for example, requiring a user to share details of their contacts or social networks, or making it difficult for consumers to alter or understand their privacy settings);

(f) SaaS billing: generating and collecting recurring payments from consumers as surreptitiously as possible; and

(g) rogue malware: using ransomware or scareware to mislead or trick users into believing there is a virus on their computer, with the aim of convincing them to pay for a fake malware-removal tool which instead installs malware on their computer.

‘Designed to Mislead’: The Relevance of Intent  

Some laws which regulate dark patterns in other jurisdictions (such as the DSA and the California Consumer Privacy Law) have recognised that dark patterns may adversely impact consumer choices regardless of whether they were designed with the intention to mislead consumers[7]. In fact, under the CCPR, the intent behind designing an interface is not determinative in deciding whether such an interface amounts to a dark pattern (though it is a factor to be considered). For example, a user interface may be considered a dark pattern (regardless of intent) if a business is aware that the interface has the effect of subverting or impairing a user’s choice but does not remedy it. Similarly, if a business deliberately ignores the effects of its user interface, this will also give rise to a presumption that a dark pattern exists.

In India, however, one of the necessary ingredients to establish that a UI/UX utilises a dark pattern (and is therefore prohibited) is that it was ‘designed to mislead’.

While this position differs from that under the DSA and the California Consumer Privacy Law, the requirement to prove the intention behind a design might help ensure that a careful analysis is carried out before a particular UI/UX design choice is treated as a ‘dark pattern’. On the other hand, it may prove practically difficult to establish in each case that the intention behind the design was to mislead the user. It remains to be seen how the Central Consumer Protection Authority interprets this requirement.

Closing thoughts

There have been some questions from stakeholders on the need for the Guidelines, given that dark patterns could be regulated or restricted under existing laws. For instance, the Consumer Protection Act, 2019 (‘CPA’) already contains provisions restricting unfair trade practices, misleading advertisements, and violations of consumer rights. The Guidelines for Prevention of Misleading Advertisements and Endorsements for Misleading Advertisements, 2022 (issued under the CPA) could also cover dark patterns. Furthermore, certain dark patterns described in the Guidelines, such as forced action resulting in the sharing of personal information, will also fall within the ambit of the Digital Personal Data Protection Act, 2023.

However, given the technical nature of the issue, addressing it only as part of a wider subject may not be feasible, and the Guidelines are therefore a welcome step. Dark patterns now have a distinct status under Indian law. This is useful not just for consumers but also for e-commerce businesses, which now have clarity on the scope (and limits) of the meaning of dark patterns and can more effectively evaluate the legal permissibility of their online practices, tools, strategies, and marketing measures.

While the Guidelines are a progressive step, other measures could further address the issues arising from the use of dark patterns, such as raising public awareness about dark patterns or incentivising the use of ‘light patterns’. ‘Light patterns’ in UI/UX design are the opposite of dark patterns and entail the use of consumer-friendly architecture and best practices on online platforms (for instance, pre-selecting the options in a UI/UX interaction that are most likely to preserve consumer autonomy).

[The authors are Associate Partner and Principal Associate, respectively, in the Corporate and M&A practice team of Lakshmikumaran & Sridharan, New Delhi]

 

[1] Deceptive Patterns - Hall of Shame.

[2] OECD (2022), ‘Dark commercial patterns’, OECD Digital Economy Papers, No. 336, OECD Publishing, Paris, <https://doi.org/10.1787/44f5e846-en>, p. 19.

[3] Article 25, Regulation (EU) 2022/2065.

[4] Ibid.

[5] In the matter of Epic Games Inc., U.S. Federal Trade Commission (Docket No. C-4790), paras. 37-44, pp. 9-11.

[6] Ibid., p. 27.

[7] EU Digital Services Act, 2022, Recital 67.
