Field Notes from FLLR: Dark Patterns

A newsletter on the latest in privacy tech from FLLR Consulting

Welcome to the third issue of Field Notes from FLLR, from Nik Fuller and Dan Harms. Every month we’ll break down the latest privacy news and unpack trending concepts in the industry.

Nik Fuller, CEO, FLLR Consulting

Dan Harms, Managing Partner, FLLR Consulting

FLLR Updates

We released our Strategic Guide to Consent & Preferences last Thursday. In case you haven’t checked it out yet, feel free to download it using the link below!

Strategic Guide to C&P Outline:

  1. What businesses need to know about C&P / The state of C&P today

  2. Cookie basics - Terminology and more

  3. Legal landscape - US, Europe, Rest of world

  4. Best Practices and Next Steps

Andrew Clearwater, former OneTrust Chief Trust Officer, is featured in the guide with a section on best practices for handling consent & preferences.

Other features include John Henson, Henson Legal, PLLC, on the TCPA and Matthew Pearson, Partner, Womble Bond Dickinson, on CIPA.

OneTrust Webinar featuring FLLR on July 10

Dan Harms (Managing Partner, FLLR) and Zack Meszaros (Principal Solutions Engineer, OneTrust) will be discussing Northeast privacy regulations and what they mean for consent and preferences on July 10th.

If that’s something you’d like to join, register today using the link below!

Field Notes on Dark Patterns

This is a deep dive - you’ll have a PhD after reading this. 

A brief history on Dark Patterns

Harry Brignull, a UX designer, coined "dark patterns" in 2010 to categorize deceptive interfaces that manipulate user behavior. His strategic response was systematic: name, classify, and publicly expose these exploitative design practices through deceptive.design. Brignull's taxonomy has evolved from academic framework to regulatory enforcement tool. Privacy authorities now reference his classifications when investigating platform compliance. 

For organizations building privacy-first strategies, understanding dark pattern categories represents strategic risk management in an increasingly regulated digital landscape.

What does a dark pattern look like?

Dark patterns come in many forms and can show up in a variety of scenarios. There’s a high probability you’ve personally experienced one at some point. Recent CPPA enforcement actions against Honda and Todd Snyder both include examples of “misconfigured” consent & preference solutions that produced dark patterns.

The most common examples fall into the following buckets:

  1. Confirm Shaming: Emotionally manipulates users through guilt-laden language ("No thanks, I don't want to save money") to coerce acceptance of unwanted services or subscriptions.

  2. Hard to Cancel: Creates asymmetric friction where signup processes are streamlined but cancellation requires navigating deliberately complex, multi-step obstacles designed to retain revenue.

  3. Preselection: Automatically opts users into profitable add-ons, subscriptions, or data-sharing agreements without explicit consent, forcing them to actively discover and deselect unwanted options.

  4. Sneak Attack/Disguising: Conceals advertisements as native content or adds undisclosed items to purchase flows, exploiting user inattention to generate unintended revenue or data collection. SportsDirect, a UK retailer, used to add a magazine to carts for £1 without telling customers.

  5. Obstruction: This was called out in the Honda CPPA enforcement action: one click to accept all, but multiple clicks required to decline or opt out, leading to an asymmetrical experience.
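The preselection and obstruction patterns above lend themselves to automated checks. Here is a minimal sketch in Python that flags pre-checked optional purposes in a hypothetical consent-banner configuration; the field names (`id`, `required`, `default_on`) are illustrative, not any vendor's API:

```python
# Sketch: flag pre-checked optional purposes in a hypothetical
# consent-banner configuration. All field names are illustrative.

def find_preselected(purposes):
    """Return the ids of optional purposes that default to 'on'.

    Strictly necessary purposes may default on; optional ones
    (analytics, advertising, etc.) should start unchecked.
    """
    return [
        p["id"]
        for p in purposes
        if not p["required"] and p.get("default_on", False)
    ]

purposes = [
    {"id": "strictly_necessary", "required": True, "default_on": True},
    {"id": "analytics", "required": False, "default_on": True},   # dark pattern
    {"id": "advertising", "required": False, "default_on": False},
]

print(find_preselected(purposes))  # -> ['analytics']
```

A check like this can run against banner configurations in CI, so a pre-checked optional toggle is caught before it ships.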

Key Takeaways:

What jumps out at us is that addressing dark patterns requires attention in multiple areas:

Technical implementation matters as much as policy.

Avoiding dark patterns isn't just about having compliant language—it's about how that language is presented in the user interface. Organizations need to reassess their cookie banner implementations with symmetry and transparency in mind.
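One way to make "symmetry" concrete is to count the interactions each outcome requires. A hedged sketch (the click counts and the pass/fail rule are simplified assumptions, not a real audit tool):

```python
# Sketch: compare the interaction cost of accepting vs. declining.
# A symmetric banner requires no more effort to decline than to accept.

def is_symmetric(accept_clicks: int, decline_clicks: int) -> bool:
    """True when declining takes no more clicks than accepting."""
    return decline_clicks <= accept_clicks

# The pattern called out in the Honda enforcement action:
# one click to accept all, several to decline.
print(is_symmetric(accept_clicks=1, decline_clicks=3))  # -> False
print(is_symmetric(accept_clicks=1, decline_clicks=1))  # -> True
```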

UX/UI choices have become a compliance concern.

Regulatory bodies are scrutinizing the user experience of privacy controls more closely than ever. Smart organizations are bringing together privacy professionals and UX/UI designers to create experiences that are both user-friendly and compliant.

A one-size-fits-all approach doesn't work.

Compliance requirements vary by jurisdiction, but the trend toward prohibiting dark patterns is consistent. Organizations need to implement geotargeted consent mechanisms that adapt to different regulatory requirements while maintaining a consistent user experience.
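As a sketch of what "geotargeted" can mean in practice, here is a simple jurisdiction-to-consent-model lookup. The regions, labels, and defaults are simplified illustrations for this newsletter, not legal guidance:

```python
# Sketch: map a visitor's region to a consent model. The mapping is a
# simplified illustration, not legal guidance.

CONSENT_MODELS = {
    "EU": "opt_in",      # GDPR/ePrivacy: no non-essential cookies pre-consent
    "UK": "opt_in",
    "US-CA": "opt_out",  # CCPA/CPRA: honor opt-out of sale/sharing
    "US": "notice",      # simplified default for other US states
}

def consent_model(region: str) -> str:
    """Return the consent model for a region."""
    # Fall back to the strictest model when the region is unknown.
    return CONSENT_MODELS.get(region, "opt_in")

print(consent_model("US-CA"))  # -> opt_out
print(consent_model("EU"))     # -> opt_in
print(consent_model("BR"))     # unknown region falls back to opt_in
```

Keeping the regulatory mapping in one table like this makes it easy to adapt banner behavior per jurisdiction while the surrounding user experience stays consistent.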

The tactical changes we're recommending reflect this reality: compliance requires addressing interconnected technical systems and design choices across the entire organization.

Have any privacy tech questions? Reach out to us.

Nik Fuller: [email protected] | (404) 731-7814

Dan Harms: [email protected] | (770) 337-6719

This newsletter is for informational purposes only and does not constitute legal advice. For questions about specific compliance needs, please contact the FLLR Privacy Team.

Visit our website to learn more - www.fllrconsulting.com

Follow us on LinkedIn for regular updates - https://www.linkedin.com/company/fllr-consulting/.

Don't want to see this newsletter anymore? Click here to unsubscribe