Disney’s $2.75m CCPA settlement exposes the real opt-out problem: propagation
California’s largest CCPA settlement to date wasn’t about having an opt-out link. It was about whether opt-out choices actually propagate across services, devices, identity states, and third-party data flows. The lesson for 2026: “effective opt-out” is an engineering standard, not a UI pattern.

On February 11, 2026, California Attorney General Rob Bonta announced a $2.75 million settlement with Disney over alleged violations of the California Consumer Privacy Act (CCPA). The core allegation wasn’t that Disney lacked an opt-out mechanism. It was that consumers’ opt-out choices did not consistently propagate across the services, devices, and data flows associated with a consumer’s Disney account.
The case matters because it reflects where privacy enforcement is going in 2026: away from policy language and toward implementation reality. Regulators are increasingly evaluating whether privacy controls are usable, technically effective, and consistent across the environments where a business actually interacts with consumers.
What regulators are testing now
The Disney matter is notable for its focus on mechanics. The Attorney General’s office looked at how many opt-out pathways existed, what each pathway actually did, and whether a consumer’s choice took effect broadly or only within a narrow surface.
According to the complaint and public statements, Disney provided multiple opt-out mechanisms (including toggles, a webform, and recognition of Global Privacy Control signals). The alleged gaps weren’t cosmetic. They were operational:
- Service-by-service and device-by-device behavior. Opt-out choices expressed via toggles or Global Privacy Control were allegedly applied only to the specific service or device used at the time, even when a consumer was logged into an account.
- Partial enforcement after webform opt-out. The state alleged that webform submissions stopped certain disclosures related to Disney’s own advertising offerings, while data continued to be shared with third-party ad tech partners whose code was embedded in Disney properties.
- Disconnected primary touchpoints. For some connected TV environments, the complaint alleges consumers were directed to the webform rather than being given an in-app path that would actually stop sale/sharing from those app-based surfaces.
The judgment requires that, once a consumer opts out, Disney must stop selling or sharing personal information and must stop conducting cross-context behavioral advertising for that consumer.
This is why the settlement lands as a technical precedent: it treats “opt-out” as a system behavior that must hold across a distributed environment, not as a single page or toggle.
The underlying failure mode: preference propagation
For most mature companies, the hardest part of compliance is not collecting a preference. It’s enforcing it across:
- multiple product surfaces (web, mobile, connected TV)
- multiple identity states (logged-in and logged-out)
- multiple internal systems (data warehouses, analytics, marketing automation, ad platforms)
- multiple third parties (pixels, SDKs, server-to-server integrations, batch exports)
When a company grows through acquisitions, new product lines, or platform diversification, privacy choices can become fragmented by default. “Opt-out” gets implemented differently by different teams, at different times, with different assumptions about identity and scope. That fragmentation becomes visible under enforcement.
The Disney complaint is explicit about user burden. It alleges that some subscribers would need to express their opt-out choice repeatedly across services and devices to achieve the outcome the law intends: stopping sale or sharing wherever it occurs.
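The propagation problem above can be made concrete. Here is a minimal sketch, assuming a central preference store that fans a single opt-out out to every enforcement point; all surface, partner, and account names are hypothetical:

```python
from dataclasses import dataclass, field

# Hypothetical enforcement points where "sale or sharing" can occur.
SURFACES = ["web", "mobile", "connected_tv"]
THIRD_PARTIES = ["adtech_pixel", "analytics_sdk", "batch_export"]


@dataclass
class PreferenceStore:
    """Single source of truth for account-level opt-outs."""
    opted_out_accounts: set = field(default_factory=set)

    def record_opt_out(self, account_id: str) -> list[str]:
        """Record the choice once, then emit a suppression directive for
        every enforcement point -- not just the surface where the toggle
        was flipped."""
        self.opted_out_accounts.add(account_id)
        return [f"{target}:suppress:{account_id}"
                for target in SURFACES + THIRD_PARTIES]


store = PreferenceStore()
directives = store.record_opt_out("acct-123")
# One opt-out yields one directive per surface AND per third-party flow.
```

The design point is the fan-out: when each team implements its own toggle, the directive list above silently shrinks to one entry, which is the fragmentation the complaint describes.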
What the judgment requires: account-wide opt-out when identity is available
The Final Judgment sets a clear enforcement standard for how opt-outs must be applied across identity states and surfaces.
In plain terms:
- If a consumer opts out while logged in, the business must honor that opt-out across all services linked to that account.
- If a consumer is not logged in, the business must either prompt login or collect the minimal information needed to effectuate the request more comprehensively.
- If the consumer declines to identify themselves further, the business must still apply the opt-out at the browser/app/device level.
This is a pragmatic view of identity resolution: regulators acknowledge that some scenarios are inherently device-scoped, but they expect businesses to avoid treating logged-in consumers as if they were anonymous.
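That three-branch standard can be sketched as a dispatch rule. This is a simplified illustration of the logic described above, not the judgment's text; the function and enum names are hypothetical:

```python
from enum import Enum, auto


class OptOutScope(Enum):
    ACCOUNT_WIDE = auto()  # all services linked to the account
    DEVICE_ONLY = auto()   # browser/app/device-level fallback


def resolve_opt_out_scope(logged_in: bool, provided_identifier: bool) -> OptOutScope:
    """Apply the opt-out at the strongest identity state available.

    If the consumer is not logged in, the business is expected to prompt
    login or collect minimal identifying information first; this function
    models only the final scoping decision.
    """
    if logged_in or provided_identifier:
        # Known identity: the opt-out must cover every linked service.
        return OptOutScope.ACCOUNT_WIDE
    # Consumer declined to identify further: still honor at device level.
    return OptOutScope.DEVICE_ONLY


# A logged-in opt-out must propagate account-wide:
assert resolve_opt_out_scope(logged_in=True, provided_identifier=False) is OptOutScope.ACCOUNT_WIDE
# An anonymous opt-out still applies, scoped to the device:
assert resolve_opt_out_scope(logged_in=False, provided_identifier=False) is OptOutScope.DEVICE_ONLY
```

The notable property is that there is no branch that returns "no opt-out": declining to identify yourself narrows the scope, it never voids the request.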

A practical checklist for teams trying to stay ahead
If you run multiple consumer products, apps, or brands, the Disney settlement is a reminder to audit “opt-out” the way you’d audit reliability:
- Map enforcement boundaries, not just UX flows. List every place you sell/share data or enable cross-context behavioral advertising, then verify where opt-out is enforced. Don’t assume.
- Normalize opt-out semantics across surfaces. A “Do Not Sell or Share” toggle should mean the same thing on web, mobile, and connected TV. If it doesn’t, that inconsistency will surface.
- Treat third parties as part of the enforcement scope. If third-party pixels or SDKs continue firing after opt-out, you may still be “sharing” in the way regulators care about. Enforcement doesn’t stop at your edge.
- Validate Global Privacy Control behavior end-to-end. GPC is increasingly treated as a real-world preference signal that must be honored in practice, not just recognized in theory.
- Monitor continuously. Privacy enforcement is starting to look like security enforcement: ongoing verification beats point-in-time documentation.
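On the GPC point: the signal arrives server-side as the `Sec-GPC: 1` request header (and client-side as `navigator.globalPrivacyControl`). A minimal server-side check looks like this; note that real frameworks normalize header casing, which this sketch skips:

```python
def gpc_opted_out(headers: dict[str, str]) -> bool:
    """Per the Global Privacy Control proposal, a request carrying
    `Sec-GPC: 1` expresses an opt-out of sale/sharing. Any other value,
    or the header's absence, does not."""
    return headers.get("Sec-GPC", "").strip() == "1"


# The signal must then drive enforcement (e.g., suppressing third-party
# tags server-side), not merely be recognized and logged.
assert gpc_opted_out({"Sec-GPC": "1"}) is True
assert gpc_opted_out({}) is False
```

Validating "end-to-end" means asserting the downstream effect too: after a request with `Sec-GPC: 1`, do third-party pixels and SDKs actually stop firing for that browser or account?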
Where Ethyca fits
It’s easy to frame cases like this as a failure of a specific interface or a single vendor category. That’s usually the wrong diagnosis. The failure mode is broader: rights were captured inconsistently and enforced incompletely across a complex enterprise system.
Ethyca’s approach is privacy engineering: making privacy choices executable and enforceable across infrastructure. That means treating opt-out as a durable control that can be propagated, audited, tested, and monitored across systems and partners.
For companies operating across multiple services and devices, the question is no longer “Do we offer opt-out?” The question is:
Can we prove — technically — that opt-out is effectuated everywhere we sell or share data?
That’s the standard regulators are increasingly moving toward. And it’s the standard modern privacy programs should be building toward now.
If you operate across multiple consumer services, apps, and ad tech integrations, the operational question is simple: can you prove that opt-out is effectuated everywhere data is sold or shared, including through third-party code paths?
Ethyca helps teams implement and verify rights enforcement across systems and partners, so privacy choices behave consistently across identity states and product surfaces. If you're pressure-testing your opt-out propagation, speak with us today; we're happy to compare notes.
