

Episode 27: Sandhya Brown of the FTC

Ashley Cianci
July 13, 2023
On this episode of the COMPLY Podcast, Sandhya Brown, Assistant Director in the Division of Financial Practices at the FTC, shares her perspective on what the agency is focused on these days and what lenders and others can keep in mind for compliance and consumer protection, and discusses four key areas that are important to the FTC.

Episode Description

This week’s podcast features a very special guest, Sandhya Brown, Assistant Director in the Division of Financial Practices at the FTC. This past year, the FTC issued their staff report on dark patterns (“Bringing Dark Patterns to Light”) and also brought a number of cases involving dark patterns: Credit Karma, Epic Games (Fortnite), and Vonage, to name a few. 

They’ve also initiated rulemakings in a few areas related to dark patterns – in particular, junk fees and click-to-cancel. Listen as she talks about the staff report, key aspects of the cases they brought, the rulemaking proposals, and tips for compliance.

Show Notes:

Subscribe to COMPLY: The Marketing Compliance Podcast

About COMPLY: The Marketing Compliance Podcast

The state of marketing compliance and regulation is evolving faster than ever, especially for those in the consumer finance space. On the COMPLY podcast, we sit down with the biggest names in marketing, compliance, regulations, and innovation as they share their playbooks to help you take your compliance practice to the next level. 

Episode Transcript:

Ashley:
Hey there, COMPLY Podcast listeners, and welcome to this week’s episode. This week’s podcast features a very special guest, Sandhya Brown, Assistant Director in the Division of Financial Practices at the FTC. This past year, the FTC issued their staff report on dark patterns and brought a number of cases involving dark patterns, including Credit Karma, Epic Games, and Vonage, to name a few. They’ve also initiated rulemaking in a few areas related to dark patterns, in particular junk fees and click-to-cancel. Listen as she talks about the staff report, key aspects of the cases they brought, the rulemaking proposals, and tips for compliance. Thanks for listening and enjoy.

Sandhya:
Hello everyone. It’s good to be back with you. My name is Sandhya Brown, and I am an Assistant Director in the Division of Financial Practices at the FTC. I work on and supervise a range of consumer protection matters in the financial services space, and I’m looking forward to sharing my perspective on what the FTC has been up to and what lenders and others in the financial marketplace should keep in mind in terms of compliance and consumer protection. Before diving in, I want to note that everything I say today reflects only my own thoughts and opinions. I’m not speaking on behalf of the commission or any individual commissioner, or the Bureau of Consumer Protection. I plan to focus my discussion today on dark patterns, a topic that we’ve been bringing a fair amount of attention to and that I touched on briefly the last time I spoke at COMPLY.

Sandhya:
First, I’ll briefly refresh on what dark patterns are and why we’re focused on them. Second, I’ll give a brief overview of our “Bringing Dark Patterns to Light” staff report that was issued in September. Finally, I’ll give you some recent case examples and describe some rulemaking we have in the works. Okay, so to refresh, what are dark patterns? Dark patterns are design practices that trick or manipulate consumers into making choices they would not otherwise have made and that may cause harm. They often take advantage of consumers’ cognitive biases to steer their conduct, or they delay or deny access to information needed for consumers to make fully informed decisions. So what’s new? Why is the FTC addressing dark patterns now, and why does it matter? Well, as more and more commerce has moved online, so have manipulative design practices, and compared to their brick-and-mortar analogs, these digital tools can be deployed at a much larger scale and with more sophistication, creating bigger problems for consumers.

Sandhya:
Also, data collection has become pervasive. Companies now have techniques to gather massive amounts of information about consumers’ identities and online behavior. They can then use that data to target their tactics at particular demographic groups or even individual consumers. Plus, companies that market online can experiment with digital dark patterns more easily, frequently, and at a much larger scale than traditional brick-and-mortar retailers to determine which design features most effectively influence consumer behavior. Finally, as consumers increasingly use and rely on newer technology, that affects the number and types of dark patterns that they are likely to encounter, and dark patterns tend to have stronger effects when they are stacked on top of each other.

Sandhya:
Next, I’d like to share a bit from the FTC’s dark pattern staff report that we released in the fall. It is, actually, a quick and easy read. Lots of pictures, and I would encourage you all to take a look at it. The staff report is organized around four categories of consumer protection harms implicated by dark patterns. I’m going to touch on each of those four briefly now, and I will offer some tips on compliance with respect to each. So, the first category includes design elements that induce false beliefs. This is where a company uses design features to make outright false claims or create a misleading impression about itself or the product. Classic examples include advertisements that are formatted to look like news articles and fake countdown clocks that make consumers think a good deal is about to disappear. Some tips here: to comply with the FTC Act, companies should make certain that their online interfaces do not create false beliefs or otherwise deceive consumers. Overall, when designing user interfaces, businesses should look not just at the effect their design choices have on sales, click-through rates, or other profit-based metrics but also at how those choices affect consumers’ understanding. If a business becomes aware that a particular design choice manipulates consumer behavior by inducing false beliefs, the company should remediate the problem, not capitalize on it for profit.

Sandhya:
The second category of dark patterns addressed in the report involves design elements that hide or delay the disclosure of material information. Examples of this are burying key information about a product in dense terms and conditions that consumers don’t see before purchase or tricking people into paying fees by hiding their existence behind obscure links in long blocks of text or at the bottom of a webpage that requires lots of scrolling to find. Another example is what’s called drip pricing, where companies lure consumers in by advertising only part of the product’s total price and don’t mention other mandatory charges until late in the buying process, after the consumer has sunk time into selecting the product and forgone other opportunities. So, takeaways for this category: companies should include any unavoidable or mandatory fees in the upfront advertised price, and companies should not deceive consumers into believing that optional products or fees are mandatory when they’re not.
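For teams building checkout flows, here is a minimal TypeScript sketch of that upfront-pricing takeaway. The Offer and Fee shapes and the advertisedPriceCents helper are hypothetical names used only for illustration, not anything from the FTC or this episode.

```typescript
// Hypothetical product and fee shapes -- illustrative only.
interface Fee {
  label: string;
  amountCents: number;
  mandatory: boolean; // unavoidable fees must be reflected in the advertised price
}

interface Offer {
  baseCents: number;
  fees: Fee[];
}

// The advertised price is the base price plus every mandatory fee,
// so the number shown upfront is the number the consumer will actually pay.
function advertisedPriceCents(offer: Offer): number {
  const mandatoryFees = offer.fees
    .filter((fee) => fee.mandatory)
    .reduce((sum, fee) => sum + fee.amountCents, 0);
  return offer.baseCents + mandatoryFees;
}

// Optional add-ons are listed separately and never presented as mandatory.
function optionalFees(offer: Offer): Fee[] {
  return offer.fees.filter((fee) => !fee.mandatory);
}

const offer: Offer = {
  baseCents: 9900,
  fees: [
    { label: "Service fee", amountCents: 1500, mandatory: true },
    { label: "Expedited shipping", amountCents: 800, mandatory: false },
  ],
};

console.log(advertisedPriceCents(offer)); // 11400 -- disclosed upfront, not dripped in at checkout
```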

Sandhya:
The third category of practices we talk about in the report is design elements that lead to unauthorized charges. This often involves tricking someone into paying for something they did not want or intend to buy. Examples include companies automatically adding items to a consumer’s online shopping cart that the person hadn’t selected or free trials that turn into recurring subscriptions without the consumer’s authorization. A related dark pattern involves making it hard for consumers to cancel existing subscription services, resulting in unwanted ongoing charges. So, how can companies obtain consumers’ express and informed consent to charges? At a minimum, companies should make sure their procedures for obtaining consent include an affirmative, unambiguous act of consent by the consumer who is being charged. Acceptance of general terms and conditions documents that contain unrelated information is not enough. With respect to cancellation, negative-option sellers should provide cancellation mechanisms that are at least as easy to use as the method the consumer used to buy the product or sign up for the service. I’ll be talking more about this in a bit.
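On the consent point, a minimal sketch of the idea, using hypothetical names such as CheckoutState and mayCharge, is that only a flag set by the consumer’s own affirmative act authorizes a charge, and acceptance of bundled terms and conditions is deliberately not treated as that consent.

```typescript
// Hypothetical checkout state -- illustrative only.
interface CheckoutState {
  // Set to true only by the consumer's own affirmative, unambiguous act,
  // such as ticking an unchecked box labeled with the specific recurring charge.
  recurringChargeConsent: boolean;
  // Acceptance of general terms and conditions is tracked separately and
  // is not treated as consent to be charged.
  acceptedGeneralTerms: boolean;
}

// Charge only when the specific, unambiguous consent exists; a general
// terms-and-conditions click-through is not enough on its own.
function mayCharge(state: CheckoutState): boolean {
  return state.recurringChargeConsent;
}
```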

Sandhya:
The final category the paper addresses involves design elements that obscure or subvert consumers’ privacy choices. Because of dark patterns, consumers may be unaware of the privacy choices they have online or what those choices might mean. This may result in a significant deviation from consumers’ actual privacy preferences, including unwanted sharing of their personal information. So a few tips when it comes to consumer privacy. First, companies should collect information only when the business has a justified need for collecting that data and should avoid default settings that lead to the collection, usage, or sharing of consumers’ information in ways they wouldn’t expect. Second, companies should make privacy choices easy to access and understand. Consumers should not have to navigate through numerous screens to find privacy settings or have to look for settings buried in a privacy policy or a company’s terms of service. They should be presented at a time and in a context in which the consumer is actually making a decision about their data. And lead generators must be honest about who they are and why they are collecting consumer information. If a company represents that it is collecting information for one audience or for one purpose, it cannot share that information with a different audience or for a different purpose without consent.
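A brief sketch of the default-settings tip, again with hypothetical field names, is simply that any collection or sharing beyond what the consumer would expect starts switched off and only the consumer turns it on.

```typescript
// Hypothetical privacy settings -- illustrative only.
interface PrivacySettings {
  shareWithPartners: boolean;
  personalizedAds: boolean;
}

// Defaults avoid collection, use, or sharing the consumer wouldn't expect;
// the consumer must affirmatively enable anything beyond that.
const defaultPrivacySettings: PrivacySettings = {
  shareWithPartners: false,
  personalizedAds: false,
};
```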

Sandhya:
Finally, I’m going to describe a few recent cases and rulemaking that the agency has announced to help illustrate the types of practices we have in our sights. Our recent case against Credit Karma is a good example of dark patterns that manipulate consumers into action by inducing false beliefs. We allege that Credit Karma knowingly ran a deceptive marketing campaign where they sent email credit card offers to their members saying that those members had been quote “pre-approved” for a credit card when they had not actually been approved at all. Up to one-third of those users who applied for certain credit cards were actually denied the credit card following a credit check. Worse, unbeknownst to the users, those credit checks were hard pulls, meaning the kind of credit check that can damage your credit score. So not only were those consumers left without the promised credit card, but also with potentially worse credit scores when they applied.

Sandhya:
We alleged that Credit Karma chose to make that quote “pre-approval” claim after it conducted comparison testing to see how consumers reacted to different types of claims about the credit card offer. It tested the “pre-approval” claim against, for example, a claim that told consumers they had quote “excellent odds” of being approved. Credit Karma’s testing showed that the false pre-approved claim yielded a greater click rate, and that’s the claim Credit Karma ultimately decided to use. This type of design experimentation, if it results in deceiving consumers or manipulating them into taking unwitting or detrimental actions, is a sign of dark patterns at work. Our settlement with Credit Karma puts a stop to the deceptive claims, requires them to preserve the results of all digital experimentation, and will have them pay $3 million that will be sent to consumers for their lost time applying for credit card offers they were ultimately not approved for.

Sandhya:
Another interesting case is our case against Vonage, which is a good example of dark patterns related to subscription services and cancellations. Vonage provides internet-based telephone services to consumers and businesses and bills their customers for these services on an automatic basis every month. We alleged that Vonage allowed consumers to easily sign up online but made the cancellation process much more difficult, leaving consumers and businesses on the hook for services they no longer wanted. Specifically, rather than letting them cancel online, which is how many of them had signed up, Vonage required canceling customers to speak to a live retention agent over the phone. But Vonage made finding and talking to those agents exceedingly difficult. The company obscured the cancellation contact information on its website, made those agents available only during set hours, created procedures that sent customers in a circle, and made customers suffer lengthy and repeated sales pitches before they could cancel.

Sandhya:
Many consumers also faced long hold times, dropped or unanswered calls, and unreturned chat messages. If customers somehow made it through all that anyway, they were then confronted with unexpected high-dollar early termination fees that were not disclosed clearly upfront. Under our settlement, Vonage has to stop unauthorized charges, be upfront with consumers about the subscription plans and how they work, put in place a simple cancellation process that is easy to find, easy to use, and available through the same method the consumer used to enroll, and they have to stop using dark patterns to frustrate consumers’ cancellation efforts.

Sandhya:
I will note here that we are actually in the midst of a rulemaking that would require companies to provide an easy way to stop recurring charges, what we call click-to-cancel. Designing cancellation as a sort of maze or endless loop to steer consumers away from what they are trying to do is a dark pattern, and this rule is aiming to put an end to that. Here’s the gist of the rule. There has to be a simple cancellation mechanism for consumers to immediately halt any recurring charges. The mechanism has to be at least as simple as the one used to sign up, and it has to be through the same medium that the consumer used to sign up, whether that’s through the internet, telephone, mail, in-person, or something else. For the internet, the mechanism must be accessible on the same website or app used for signup. For the phone, all calls have to be answered during normal business hours.
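To make those click-to-cancel requirements concrete, here is a minimal sketch, with hypothetical Subscription and cancel names, of recording the enrollment channel and honoring a one-step cancellation through that same medium.

```typescript
// Hypothetical subscription record -- illustrative only.
type Channel = "web" | "phone" | "mail" | "in_person";

interface Subscription {
  id: string;
  enrolledVia: Channel; // remembered so cancellation can be offered through the same medium
  active: boolean;
}

// A single-step cancellation that immediately halts recurring charges.
// Other channels may be offered as well, but the enrollment channel must work
// and the process must be no harder than signup was.
function cancel(sub: Subscription): Subscription {
  return { ...sub, active: false };
}
```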

Sandhya:
The next case, and the last case I will mention, is our case against Epic Games, the company behind Fortnite. This one is a good example of a dark pattern that tricks consumers into paying for something they do not want or intend to buy. In December, we announced a settlement that will require the company to pay more than 500 million dollars for charging users without authorization and for violating COPPA, which is a statute meant to protect children’s privacy online. We allege that Epic Games designed its purchase flows within Fortnite so that millions of users were charged for unwanted items while playing the game. Players were automatically and immediately charged for items based on the press of a single button, even when consumers had no idea they would be charged. For example, Epic Games changed the button configuration on the console so that a button that was normally used to preview an item was now configured as the purchase button.

Sandhya:
Users were also unexpectedly charged for pressing a button when trying to wake the game from sleep mode or during a loading screen. On top of that, Epic Games also made it hard for people to undo charges by shrinking and obscuring the cancel button, so fewer players would notice it. In internal communications, they even admitted that the cancellation process was a dark pattern. Epic Games received a high volume of complaints about all of this but refused to fix its practices for years. Under our order, Epic Games will be required to pay 245 million dollars, which we will make sure goes back to consumers to be refunded for charges they never authorized. Going forward, they must stop charging consumers through the use of dark patterns and without their authorization. Speaking of unauthorized charges, I will make a final note about another rulemaking that the FTC has underway. This one is a rulemaking where we are going to consider prohibiting unfair or deceptive fees, including fees that consumers have not consented to and fees for fake or worthless products and services. The comment period for that ANPR closed in January, and now the agency will be assessing the comments and considering a proposed rule. So that concludes my remarks today. Thank you so much for your time.

Ashley:
Thanks for listening to this episode of the COMPLY Podcast. After today’s podcast, if you are interested in learning more about dark pattern compliance, we have several resources that I will drop for you in today’s show notes. As always, for the latest content on all things marketing compliance, you can head to performline.com/resources, and for the most up-to-date industry news, events, and content, be sure to follow PerformLine on LinkedIn. Thank you again for listening, and we will see you next time.
