The World Has Laws Against Dark Patterns. The US Is Playing Catch-Up — State by State.
In 2018, the Norwegian Consumer Council published a report called "Deceived by Design." It documented in meticulous detail how Facebook, Google, and Microsoft (in Windows 10) configured their privacy settings to steer users toward sharing the maximum amount of data — not through outright deception, but through interface design. Confusing options, buried controls, and carefully asymmetric button styling all pushed users in one direction. The report named these techniques and showed exactly how they worked.
The response was significant. European regulators cited the research in enforcement actions. France's data protection authority, the CNIL, fined Google and Facebook over consent interfaces that made accepting data collection easier than refusing it. The Norwegian Consumer Council's work became a reference point for legislators drafting new rules. In the years that followed, "dark patterns" moved from academic terminology to legal language.
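One way to make "easier to accept than to refuse" concrete is to count the interaction steps each choice requires. The sketch below is illustrative only — the function name and the example paths are hypothetical, not taken from any real audit tool or from the interfaces the CNIL actually examined.

```python
def interaction_steps(flow: list[str]) -> int:
    """Number of user actions (clicks/taps) a given path requires."""
    return len(flow)

# A typical asymmetric consent banner: one prominent button to accept,
# but refusal buried behind a settings screen with multiple toggles.
accept_path = ["click 'Accept all'"]
reject_path = [
    "open 'Manage options'",
    "untoggle 'Personalised ads'",
    "untoggle 'Audience measurement'",
    "click 'Confirm choices'",
]

# A ratio above 1 means refusing costs more effort than accepting --
# the kind of lopsidedness the consent fines targeted.
asymmetry = interaction_steps(reject_path) / interaction_steps(accept_path)
print(asymmetry)  # 4.0
```

A symmetric design would put "Accept all" and "Reject all" side by side, driving the ratio to 1.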
Today, the European Union's Digital Services Act — which came into full effect in 2024 — explicitly prohibits dark patterns. It defines them, lists specific practices that are banned, and establishes penalties for violations. The GDPR already required that consent be "freely given, specific, informed, and unambiguous," which courts have interpreted to prohibit the kind of lopsided consent interfaces that were common practice for years. The UK's Competition and Markets Authority has brought enforcement actions specifically targeting deceptive design in subscription services and online retail. These aren't theoretical prohibitions — they come with fines, and companies have paid them.
The United States has taken a different path. There is no federal dark pattern law. There is no federal agency with an explicit mandate to regulate deceptive interface design as a category. What exists instead is a patchwork — a collection of state laws and FTC enforcement actions that together provide some protection but nothing like the consistency or coverage of the European framework.
The Federal Trade Commission has the broadest authority in this space. Section 5 of the FTC Act prohibits "unfair or deceptive acts or practices," and the FTC has used that authority to bring actions against companies for dark patterns. Its 2022 staff report, "Bringing Dark Patterns to Light," named the category explicitly and signaled an increased enforcement focus. In 2024, the FTC finalized its "click to cancel" rule, which requires subscription services to make cancellation as easy as signup. And the $520 million Epic Games settlement rested in part on dark pattern claims related to in-app purchases and subscription enrollment.
But FTC enforcement is reactive, case-by-case, and limited to practices serious enough to pursue through a lengthy administrative process. It doesn't establish a clear standard that companies can design to. It doesn't tell a startup founder exactly what their checkout flow needs to look like. It tells them, after the fact, that they crossed a line — a line that wasn't clearly drawn in advance.
At the state level, California has gone furthest. The California Consumer Privacy Act, as amended by the CPRA, requires businesses to make it easy for consumers to opt out of the sale of their data and to honor those preferences. The California Age-Appropriate Design Code imposes specific requirements on products likely to be used by minors. The Colorado Privacy Act and Virginia's Consumer Data Protection Act have their own requirements, similar in spirit but different in detail. Connecticut, Utah, and Texas have passed related legislation. Each law has different definitions, different thresholds, different enforcement mechanisms, and different provisions for a private right of action.
For a business operating nationally — which most web products do — navigating this patchwork is genuinely difficult. The practical result is that a company either designs to the most stringent state standard (usually California) and applies that everywhere, or it tries to comply with each state individually, which is expensive and inconsistent, or it waits until it's large enough to attract regulatory attention and deals with the consequences then.
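The "design to the most stringent standard" strategy amounts to taking the union of every state's requirements and satisfying all of them everywhere. A minimal sketch of that logic, with made-up requirement flags and deliberately simplified state entries that are not a statement of what any statute actually requires:

```python
# Hypothetical, simplified requirement flags per state -- illustrative only.
STATE_RULES = {
    "CA": {"opt_out_link": True, "gpc_signal": True,  "minor_protections": True},
    "CO": {"opt_out_link": True, "gpc_signal": True,  "minor_protections": False},
    "VA": {"opt_out_link": True, "gpc_signal": False, "minor_protections": False},
}

def strictest_profile(rules: dict) -> dict:
    """Union of every state's requirements: if any state mandates a
    feature, the nationwide design includes it."""
    profile: dict = {}
    for reqs in rules.values():
        for key, required in reqs.items():
            profile[key] = profile.get(key, False) or required
    return profile

print(strictest_profile(STATE_RULES))
# {'opt_out_link': True, 'gpc_signal': True, 'minor_protections': True}
```

In practice the union is what makes this strategy expensive: one state's unique requirement gets applied to every user, which is exactly why many companies end up building to California and shipping that everywhere.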
The contrast with Europe is stark. A company launching in the EU has a single framework to understand. The prohibited patterns are listed. The consent requirements are clear. The enforcement history gives you a sense of where the line is. You can hire a lawyer or a consultant, get an audit, and know where you stand. In the US, none of that infrastructure exists at the federal level.
What this means in practice is that American consumers have uneven protection depending on where they live, and American businesses have unclear guidance about what they're actually required to do. The research base for what works — what regulatory approaches actually reduce dark pattern prevalence — already exists. Europe has been running that experiment for years. The Norwegian Consumer Council, the UK CMA, and EU enforcement bodies have produced a body of evidence about which interventions change behavior and which don't.
The US will almost certainly develop a more consistent federal framework eventually. The FTC's recent activity suggests the trajectory. But "eventually" is not now, and in the gap between now and a federal standard, third-party certification plays an important role. It's how industries have historically self-organized ahead of regulation — establishing credible, independent standards that companies can adopt voluntarily, demonstrate to customers, and use as evidence of good faith if regulators come looking.
That's the model Light Patterns is building toward. Not as a substitute for regulation, but as something that exists in the space regulation hasn't reached yet — and that gives businesses and consumers a common language for what ethical design actually looks like.