Law and Technology

The Guidelines on Dark Patterns – Effective or Incomplete?


Dylan Sharma*


It is a practice that goes unnoticed – a popup on a website, fine print on an invoice, or an advertisement in the middle of a video. What most consider a “nuisance”, the consumer protection authorities of various countries consider “dark patterns”. These are deceptive practices used by companies to drive sales by influencing consumer decisions. To ban such practices, the Indian Government released the Guidelines for Prevention and Regulation of Dark Patterns, 2023. These guidelines provide a clear definition of “dark patterns” as well as a comprehensive list of practices that will be considered dark patterns. This article critically examines the Guidelines and draws comparisons with relevant laws from the US and the EU. By learning from countries that formulated legal frameworks to tackle dark patterns well before India did, India will hopefully be able to balance consumer and business interests.

Introduction

“Dark pattern” has suddenly become a buzzword. So much so that, as companies prepare to comply with the Digital Personal Data Protection (“DPDP”) Act, 2023, they are also forced to re-examine their user interface (UI) and user experience (UX) to remove existing embedded dark patterns, owing to the new guidelines released by the Department of Consumer Affairs (“DoCA”) – the Guidelines for Prevention and Regulation of Dark Patterns, 2023 (“Guidelines”).

The scope of this article is to understand the origin and meaning of dark patterns, highlight India’s attempt at creating a law that defines various types of dark patterns, and compare the Guidelines with some important regulations, reports, and judicial decisions from the EU and the US. It is crucial to look at these Guidelines since this is India’s first attempt at regulating dark patterns. Given that the EU and the US had a slight head start, the article also examines their best practices to evaluate how India stands to gain from their experience.

1. What are Dark Patterns?

    In 2010, Harry Brignull, an expert on user experience and interface design, realized that some software developers were intentionally creating and embedding patterns in code to subtly trick users. He called these “dark patterns” and described them as a deceptive strategy to obtain unethical results. He believed that these could be eradicated by directing companies and designers to follow a code of ethics.[1] A decade later, governments worldwide are being forced to frame regulations and guidelines that prohibit, police, and penalize the use of dark patterns.

    Dark patterns encompass a variety of design and UI-based practices adopted to intentionally manipulate consumer choices and influence purchase decisions. A report[2] by the Organisation for Economic Co-operation and Development (“OECD”) defines dark commercial patterns as “business practices employing elements of digital choice architecture in online user interfaces, that subvert or impair consumer autonomy, decision-making or choice”, and notes that these “may appear in e-commerce websites, apps, cookie consent notices, search engines or online games, and can intervene at different stages of a transaction, such as the advertising, pre-purchase, payment, or post-purchase stages”.

    The common goal of all these techniques is to exploit the psychological and cognitive predisposition of a consumer, and convince them to do something which they otherwise may not. Recognizing the ill-effects, the Indian Government deliberated on the need for a stringent oversight and regulatory mechanism that could penalize and thus eradicate the use of dark patterns.

    2. India’s Approach to Dark Patterns

    In June 2023, the DoCA issued a press release[3] directing online/e-commerce platforms not to use any design or pattern that can deceive or manipulate consumer decisions and consequently be classified as a “dark pattern”. It was clarified that such conduct is detrimental to the consumer’s interest and would be tantamount to an “unfair trade practice” under the Consumer Protection Act, 2019 (“CPA”).

    Thereafter, the final Guidelines were notified on November 30, 2023. These Guidelines define “dark patterns” as any practice or deceptive design pattern using UI or user experience interactions on a platform, designed to “mislead or trick users to do something they originally did not intend or want to do, by subverting or impairing the consumer autonomy, decision-making, or choice, amounting to misleading advertisement or unfair trade practice or violation of consumer rights”,[4] and prohibit any person / platform from engaging in these.[5] They apply to advertisers, sellers, and all platforms / online interfaces that offer goods and services in India through a website, software, or application.[6]

    Annexure 1 of these Guidelines lists certain activities that are construed as dark patterns, namely:

    (a)       False urgency: Creating a false sense of scarcity to manipulate a consumer into acting or purchasing impulsively. For instance, a notification that falsely states that the quantity of a product is limited.

    (b)       Basket sneaking: Including additional items such as insurance coverages, payments to charity or donations at the time of checkout to increase the amount payable by the consumer without notifying the consumer. 

    (c)       Confirm shaming: Using words, phrases, or visual and audio means to create a sense of fear, shame, or guilt in the mind of the consumer to manipulate them into taking an action that they would not ordinarily choose. For instance, when booking an airline ticket, the consumer is given the option to purchase an insurance coverage. If the consumer declines, a message may appear that reads “Are you sure you are willing to take a risk?” Such message is intended to create a sense of fear and influence the consumer into purchasing the insurance policy.  

    (d)       Forced action: Forcing a consumer to purchase additional goods, subscribe to a service, or share personal information in order to purchase the product or service originally intended. For instance, a website may require the consumer to subscribe to the monthly newsletter to complete the purchase of a product.

    (e)       Subscription trap: Deceptive means utilized in the subscription process. For instance, making the cancellation of a paid subscription complex and lengthy, or forcing a consumer to provide payment details or authorization for auto payments to avail a free subscription.

    (f)        Interface interference: Highlighting certain aspects or hiding relevant information to mislead consumers. For instance, an option to opt into sharing personal information is in bold and highlighted in bright red, while the option to opt out is shaded in a light grey inconspicuous color.

    (g)       Bait and switch: Displaying a certain offer, option, or outcome based on the consumer’s preferences, and then providing an alternative, less preferable outcome. For instance, a booking website may advertise a pool-view king-size suite at INR 10,000 per night. However, at checkout, just before the payment is made, the website notifies the consumer that the king-size suites are sold out and that one queen-size pool-view suite is available at INR 10,000.

    (h)       Drip pricing: Certain elements of the final price are not disclosed upfront, thus leading to a higher amount being charged at the time of checkout.

    (i)        Disguised advertisement: Advertisements are masked as content such as news articles or user reviews to mislead the consumer into engaging with such advertisement.

    (j)        Nagging: Sending the consumer repeated requests, options, or information, usually unrelated to the intended purchase, disrupting the transaction and causing the consumer to act out of annoyance or lack of patience. For instance, repeated notifications and e-mails asking a consumer to download an app.

    (k)       Trick question: Using confusing wording or complex language to misdirect a consumer. For instance, when a consumer wants to cancel a subscription, the website may present two options – “continue” and “cancel”. This could confuse the consumer about the correct option to select to cancel the subscription.

    (l)        SaaS billing: Billing a consumer on a recurring basis in a software-as-a-service business model without any notification or intimation of such payment.

    (m)      Rogue malware: Deceiving users to believe there is a virus on their computer and providing them with a link to a fake malware removal software which instead installs a malware on their computer.

    3. A Critical Analysis of the Guidelines

    While the Government must be appreciated for creating a law to deal with dark patterns, there are some red flags which should be addressed sooner rather than later.

    For starters, there are drafting inconsistencies which could result in contrary interpretations even by consumer courts. For instance, the Guidelines state that dark pattern practices and illustrations provided in Annexure 1 are only indicative and that different facts and circumstances could lead to different interpretations. Conversely, Guideline 5 acts as a deeming provision since it specifically states that any person/platform indulging in any practice specified in Annexure 1 is “considered to be engaging in dark pattern practice”.

    The Guidelines include illustrations and definitions, some of which are quite generic and overlapping. For instance, the Guidelines include an illustration of SaaS billing as “using shady credit card authorization practices to deceive consumers”.[7] However, none of the terms used here has been defined. In fact, this sentence could also be confused with credit card fraud, which is regulated by the Reserve Bank of India. Similarly, the Guidelines list “rogue malware”[8] as a dark pattern but do not properly define it; as a consequence, it can be interpreted as the use of any malicious software for financial leverage, irrespective of whether it occurred in the course of a transaction.

    With the rapid evolution of technology, new forms of dark patterns are likely to emerge. The Guidelines provide a very inflexible definition of each dark pattern, and a minor variance in a dark pattern could result in the practice falling outside the scope of the Guidelines. Ideally, the Guidelines ought to have taken a principle-based approach instead of specifically defining different types of dark patterns.

    The Guidelines also lack explicit parameters, failing to give any guidance on how consumers ought to distinguish between acceptable business practices and dark patterns. For example, “subscription traps” are a type of dark pattern and have been defined as “forcing a user to provide payment details or authorization for auto debits for availing a free subscription”.[9] However, several businesses globally ask for credit card details even while offering free trials so that they can confirm the potential subscriber’s ability to pay in the future. The intent may not be to deceive the user but simply to conduct basic diligence. Given that the burden of proof will be on the consumer, it will be difficult to prove that such practices were adopted with the “intent” to deceive or manipulate.

    Finally, the Guidelines do not give any specifics or guidance on how compensation will be granted or how penalties will be imposed. As a result, a consumer may not believe it is worth their time and money to hold a company accountable for dark patterns if the actual “harm” suffered is nothing more than mere annoyance. For instance, if a website nags a consumer into downloading an application,[10] the consumer may get irritated but not suffer any tangible harm. Consequently, the consumer may be disincentivized and may not report such a dark pattern.

    Thus, it is evident that the law on dark patterns has a long way to go in India and needs to be extensively tested in courts.  

    4. Global Regulations on Dark Patterns

    While India introduced the Guidelines only in 2023, the EU and the US have been investigating dark patterns since well before that. Both jurisdictions have introduced detailed regulations to prevent and regulate dark patterns.

    The Directive on Unfair Commercial Practices (“UCPD”)[11] was enforced in the European Union to identify and regulate unfair practices in commercial transactions between consumers and businesses. In 2021, the EU issued a “guidance notice”[12] to clarify certain aspects of the application of the UCPD. Through this, dark patterns were brought within the scope of the UCPD.

    It provides that (a) the UCPD applies to commercial practices irrespective of whether the transaction involves the purchase of a product, and thus covers practices to capture the consumer’s attention resulting in a transactional decision, as well as advertising methods used to persuade a consumer to engage with the seller’s content; (b) any practice that misleads and wrongly changes the economic behavior of the average consumer violates the seller’s professional diligence requirement under Article 5 of the UCPD and amounts to a misleading or aggressive practice[13]; (c) there is no need to prove “intention”, and the seller is expected to exercise a level of professional diligence in interface design and must ensure that its design does not manipulate consumer decisions; and (d) while the UCPD does not explicitly define the term “dark patterns”, Annex I provides a list of commercial practices that are categorized as dark patterns and prohibited.

    Furthermore, the European Data Protection Board (“EDPB”) Guidelines[14] provide practical suggestions for users to recognize and avoid dark patterns on social media platforms, especially when these violate GDPR requirements. The guidelines provide a list of dark patterns such as “overloading”[15] and “emotional steering”[16], and detail how each affects personal data and violates the GDPR. While noting that the list is not exhaustive, the guidelines also identify the relevant GDPR provisions for assessing dark patterns. In line with the objective of holding social media companies accountable for GDPR compliance, the guidelines give recommendations on how companies can improve UI design to eradicate dark patterns. These include allowing easy withdrawal of consent, pre-selecting/highlighting the least data-invasive features and options by default, and providing neutral explanations of the consequences users may face if they withdraw consent or refuse an option, including complete information about how their personal data will be processed and how websites will act based on the choice.

    While the GDPR does not specifically deal with dark patterns, authorities have correlated dark pattern practices with violation of the GDPR, especially around “consent”. In 2023, the Italian Data Protection Authority (“IDPA”) passed an order[17] against Ediscom S.p.A and held, amongst other things, that Ediscom had used deceptive design interfaces to trick consumers into providing consent. For example, (a) the privacy policy consent was pre-selected, and (b) the website provided a single consent box for two distinct purposes, i.e., to consent to the processing of personal data for promotion and marketing, and for transfer of data to third parties. These design practices manipulated the user into giving consent and, thus, were held to be violative of Article 7 of GDPR. The company was fined €300,000. In principle, IDPA concluded that proper systems of data processing and data privacy are important to eradicate the effects of dark patterns.

    In comparison, while there is no federal law on dark patterns in the United States, some states have incorporated regulatory frameworks. Authorities such as the Federal Trade Commission (“FTC”) and civil courts (such as the District Court for the Western District of Washington in a pending lawsuit[18] against Amazon) have played a role in identifying harmful dark patterns, holding large corporations accountable for deceiving online consumers, and adequately compensating affected consumers.

    The California Consumer Privacy Act (“CCPA”), 2018, as amended by the California Privacy Rights Act (“CPRA”), 2020 (effective January 2023), is a primary example of a State law regulating dark patterns. It defines the term “dark pattern” in a manner similar to the Indian Guidelines. It also states that any consent obtained from a consumer by the use of dark patterns will not constitute valid consent.[19] It further requires that websites not use any dark patterns (including a popup, notice, or any other design element) that obstruct the consumer’s intended interaction with the website.[20] Businesses are mandated to design UI in a simple, clear, and unambiguous manner to make it easy for consumers to opt out of sharing their personal data.

    Despite the lack of a uniform federal law, the FTC, as a central authority, conducted a public workshop on digital dark patterns on April 29, 2021, and released a report[21] detailing the key discussion points. The report specifically stated that consent should be procured and cancelled without the involvement of any dark patterns, and further covered commonly used dark patterns and how they affect consumer rights. While this report is not legally binding, it does provide comprehensive recommendations on how privacy and consumer protection laws ought to function sans any dark patterns.

    Dark pattern norms have also evolved through adjudication. For instance, the FTC passed an order in US vs. Epic Games Inc.,[22] holding that Epic Games had been using design-based dark patterns in its video game “Fortnite” to trick users, primarily children, into making in-game purchases. Epic Games was directed to pay $245 million to consumers as refunds.

    5. Key Learnings for India

    The Indian Guidelines are still in their early days with no reported cases of penalties or prosecution for indulging in dark patterns. While the laws and regulations in the US and EU are not flawless, they do have a head start and there is much to learn from them.

    (a)        Principle-based approach: The Guidelines provide an exhaustive but restrictively defined list of dark patterns. Consequently, even a minor deviation from a restrictively defined dark pattern would give a clean chit to the company. Ideally, the Guidelines should follow a principle-based approach wherein certain key characteristics should be defined and any business practice falling within it should be prohibited. For instance, the FTC staff report lists key features of common dark patterns to enable consumers to identify indicators and avoid variants. It mentions “design elements that induce false beliefs” and “design elements that hide or delay disclosure of material information”, and provides a detailed explanation of each with illustrations and the consequential impact.

    (b)       Distinguish acceptable commercial practices from dark patterns: There is often a thin line between an acceptable business practice and a dark pattern. While it is important to prevent any kind of dark pattern, it is equally important to ensure that businesses can adopt practices to engage with customers. For example, Article 5(2) of the UCPD provides that a commercial practice is unfair and prohibited if it “materially distorts or is likely to materially distort the economic behavior with regard to the product of the average consumer whom it reaches or to whom it is addressed”. Article 5(3) provides that, to distinguish prohibited practices, such as creating false urgency, from common and legitimate practices, such as making exaggerated statements not meant to be taken literally, the practice must be assessed from the perspective of the average well-informed, reasonably observant, and vigilant consumer. Therefore, the fact that a practice falls within the scope of the broad definition of a dark pattern should not, by itself, be enough. It should also be tested from the perspective of whether it actually causes any harm to the consumer.

    (c)       Interplay between data protection laws and the Guidelines: Given that the DPDP Act falls under the Ministry of Electronics and Information Technology, while the Guidelines fall under the Ministry of Consumer Affairs, it is important that these aspects are not looked at in silos. There may be instances where businesses use dark patterns to obtain a consumer’s consent to share and process their personal data. To ensure proper reporting and penalization of such cases, the Guidelines need to be reconciled with the DPDP Act. For example, if the Data Protection Board (“Board”) finds that a data fiduciary took consent using a dark pattern, it should not be required to have the consumer courts first “declare” the existence of the dark pattern (since dark patterns are defined and regulated under the Guidelines) before passing its order and imposing a penalty. The Board should have the power to declare a practice a dark pattern even though such practice is not defined under the DPDP Act.

    The Government has already thought outside the box and created an alternative WhatsApp-based mechanism to allow consumers to report dark patterns. Consumers can simply send a message to 1915 or 8800001915, and the chatbot takes over. If this reporting mechanism proves useful (which, of course, time will tell), it will be a significant deterrent against companies engaging in dark patterns, as opposed to the onus being on the consumer to fight a lengthy, expensive, and time-consuming litigation. In fact, such reporting mechanisms could also be incorporated in the Guidelines to give them statutory backing and create a more accountable system.


    [1] Harry Brignull, “Bringing Dark Patterns to Light” (2021), https://harrybr.medium.com/bringing-dark-patterns-to-light-d86f24224ebf, Last Accessed on December 27, 2023

    [2] Nicholas McSpedden-Brown and Brigitte Acoca, “Dark Commercial Patterns” (2022)

    [3] Ministry of Consumer Affairs, Food and Public Distribution, Press Release on “Department of Consumer Affairs urges online platforms to refrain from adopting dark patterns harming consumer interest” (2023), https://pib.gov.in/PressReleaseIframePage.aspx?PRID=1936432, Last Accessed on December 27, 2023

    [4] Guideline 2(e), Guidelines for Prevention and Regulation of Dark Patterns, 2023

    [5] Guideline 4, Guidelines for Prevention and Regulation of Dark Patterns, 2023

    [6] Guideline 3, Guidelines for Prevention and Regulation of Dark Patterns, 2023

    [7] Annexure 1 (12)(d), Guidelines for Prevention and Regulation of Dark Patterns 2023

    [8] Annexure 1 (13), Guidelines for Prevention and Regulation of Dark Patterns 2023

    [9] Annexure 1 (5)(iii), Guidelines for Prevention and Regulation of Dark Patterns 2023

    [10] Annexure 1 (10)(a), Guidelines for Prevention and Regulation of Dark Patterns 2023

    [11] Directive 2005/29/EC of the European Parliament and of the Council on unfair business-to-consumer commercial practices in the internal market (May 2005)

    [12] Guidance on the Interpretation and Application of Directive 2005/29/EC, C/2021/9320 (December 2021)

    [13] Under Article 8 of the UCPD, a commercial practice is considered aggressive if it significantly impairs the average consumer’s freedom of choice or conduct with regard to a product, and thereby causes / is likely to cause him to take a transactional decision that he would not otherwise take.

    [14] Guidelines 3/2022 on Dark Patterns in Social Media Platform Interfaces: How to Recognize and Avoid Them (March 2022), Amended on February 14, 2023

    [15] 4.1 defines “Overloading” as sending excessive requests, information, or options to users to confuse them or encourage them to rashly accept a certain data practice without further viewing the terms and options

    [16] 4.3.1 defines “Emotional Steering” as using words or visuals in a way to project information in an either highly positive or highly negative manner, to influence the users’ emotional state and cause them to act against their data protection interests

    [17] Garante Per La Protezione Dei Dati Personali, Prescriptive and sanctioning provision against Ediscom SpA (doc. web no. 9870014) (2023), Last Accessed on December 28, 2023

    [18] Civil Action no. 2:23-cv-0932, Before the United States District Court Western District of Washington, In the matter of Federal Trade Commission vs. Amazon.com Inc. (June 21, 2023)

    [19] 1798.140(h), California Consumer Privacy Act, 2018

    [20] 1798.185(a)(20)(C), California Consumer Privacy Act, 2018

    [21] FTC Bureau of Consumer Protection, “Bringing Dark Patterns to Light” (2022), Last Accessed on December 28, 2023

    [22] Docket No. C-4790, Before the Federal Trade Commission, In the matter of United States of America vs. Epic Games Inc. (March 13, 2023)


    *Dylan Sharma is an Associate at PSA, a full-service law firm headquartered in New Delhi. He works extensively in the technology and consumer rights domain and has represented companies that have had to re-look at their user experience and interface to comply with the letter and spirit of the new dark pattern guidelines. Dhruv Suri, Partner at PSA, supervised Dylan through this article.