Privacy flushed away

Date: 2025/12/02
Author: Roel van Cruchten
Reading Time: 11 min read
Tags: privacy

The persistence of ‘Privacy Washing’: a look at some standout examples

Imagine purchasing a high-end “smart toilet gadget” for your home. It promises to analyze your health using an internal camera. The manufacturer assures you: “Your data is secure. We use End-to-End Encryption.” It sounds safe, right?

Until news breaks, as it did recently with Kohler, that the reality is quite different. Kohler launched a supposedly End-to-End encrypted smart toilet camera for health diagnostics, but its “encryption” was merely a marketing term: in practice, the collected data (including images) is accessible to the company, and with the user’s consent it can also be used to train the company’s AI models.

This might sound like a funny headline, but it is the perfect illustration of a persistent problem in the tech world: Privacy Washing. We are flushing our privacy down the drain, often without even realizing it.

To illustrate the reality of privacy washing, we’ve compiled a list of standout examples below. Yes, this may feel like a “Top 10” kind of article (which is why we stopped at 8), but we hope the examples serve as a serious reminder to look beyond the marketing and spot the empty promises beneath.

What is Privacy Washing?

We are all familiar with Greenwashing: a polluting company slapping a green leaf on its logo to appear sustainable. Privacy Washing is its digital sibling.

It is important to note that this is not a new phenomenon. As early as January 2016, at the CPDP privacy conference in Brussels, it became clear that this would become a significant trend. As noted in a report from that time, experts predicted that “privacy washing”, pretending to practice real data protection without actually doing so, was poised to become “big.” Nearly a decade later, that prediction has unfortunately become our daily reality.

Privacy washing is the practice of companies aggressively marketing themselves as champions of privacy (“We value your data!”, “Security first!”) while simultaneously engineering their products to collect, share, or exploit as much user data as possible.

The signs are often subtle:

  • Overcomplicated privacy settings
  • Privacy settings that default to “Share” or “On”
  • Data minimization in theory, but not in practice
  • Complex legal terms that negate the marketing privacy promises
  • AI integration used as a new excuse to capture data
  • And a very deceptive variant: Encryption Washing.

Motivation

Why do companies engage in this deception? It is a calculated strategy to have it both ways. Companies know that strong privacy messaging builds trust and appeases regulators, yet they are unwilling to sacrifice their primary revenue streams: advertising, algorithm training, user profiling and monetization. Privacy washing allows them to project a safety-first image without giving up the data advantage.

Danger

This is dangerous because it creates a false sense of security, tricking users into sharing sensitive content or opting into invasive AI features they don’t fully understand. Ultimately, it muddies the market, making it nearly impossible for consumers to distinguish between empty marketing promises and genuine privacy-first solutions.

The fake ‘encryption’ (Encryption Washing)

Encryption Washing is a specific type of deception where companies use terms like “Military Grade Security” or “End-to-End Encryption” (E2EE) to create a false sense of safety.

The crucial difference lies in who holds the keys:

  • True end-to-end encryption (or zero-knowledge encryption): only you (the sender) and the recipient have the keys to unlock the data. The service provider (e.g., Signal or WhatsApp) sees only scrambled code. Even if served with a warrant, they cannot access the content.

  • The “washed” version (transport encryption): your data is secure while traveling from your device to the company’s server, but once it arrives, the company holds the keys. They can decrypt, view, analyze, and potentially sell that data.
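
To make the key-holding distinction concrete, here is a minimal Python sketch (a simplified illustration, assuming the third-party cryptography package is installed). It uses a shared symmetric key purely to show who can decrypt what; real E2EE systems such as Signal negotiate keys between devices with asymmetric cryptography, but the principle is the same: whoever generates and stores the key can read the data.

```python
# Minimal sketch of "who holds the key", using the third-party `cryptography`
# package (pip install cryptography). A shared symmetric key is used purely
# for illustration; real E2EE relies on key exchange between user devices.
from cryptography.fernet import Fernet

message = b"my toilet-camera health data"

# --- The "washed" version: transport encryption ---
# The key is generated and stored on the provider's servers. TLS protects the
# message on its way there, but once it arrives the provider can decrypt it.
provider_key = Fernet.generate_key()          # lives in the provider's cloud
provider = Fernet(provider_key)
stored_ciphertext = provider.encrypt(message)
print("provider can read:", provider.decrypt(stored_ciphertext).decode())

# --- True end-to-end encryption ---
# The key is created on the user's device and shared only with the intended
# recipient. The provider merely relays ciphertext it holds no key for.
device_key = Fernet.generate_key()            # never leaves the two endpoints
sender = Fernet(device_key)
recipient = Fernet(device_key)
relayed_ciphertext = sender.encrypt(message)  # all the provider ever sees
print("recipient can read:", recipient.decrypt(relayed_ciphertext).decode())
```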

To get a better feel for this kind of washing, here are some brief, real-world examples of encryption washing.

Zoom

For years, Zoom’s website claimed their video calls were “End-to-End Encrypted.” In reality, Zoom held the keys to the servers. They eventually settled a multi-million dollar lawsuit for this deception, but the pattern was set.

Telegram

Often cited as the “secure” alternative, Telegram’s standard chats are not E2EE. They are cloud chats, meaning Telegram can access the message history. Only if you manually start a “Secret Chat” do you get true end-to-end encryption, a nuance that most users do not notice.

“New” Outlook

Microsoft’s latest email client for Windows routes credentials and emails from third-party providers (like your private Gmail) through Microsoft’s cloud servers to enable “smart features.” Convenience is prioritized over the architectural privacy of keeping data local.

Other striking examples of privacy washing

The toilet camera example is far from an isolated incident: as the Zoom, Telegram and new Outlook cases show, major industry players have been caught in similar contradictions. Beyond those, several other striking cases of privacy washing, many involving IoT devices, have made headlines. A selection:

1. The “Smart” Vibrator: We-Vibe

This is widely considered the most ironic privacy fail in IoT history. Standard Innovation, the makers of the We-Vibe (a smartphone-connected vibrator), marketed the device as a way for long-distance couples to connect intimately and securely.

At the DEF CON hacking conference, researchers revealed the device was sending highly sensitive usage data, like temperature changes and vibration intensity settings, back to the company’s servers in real time. Standard Innovation claimed the data was for “diagnostic purposes,” but users had no idea a database existed that essentially tracked their orgasm habits. The company settled a class-action lawsuit for $3.75 million.

2. Roomba’s “Toilet Shot”

iRobot (makers of Roomba) has long marketed its vacuums as using advanced AI to “navigate your home intelligently” while respecting privacy. To train that AI, Roombas record images that are sometimes sent to human contractors for labeling. In 2022, it was revealed that gig workers in Venezuela were sharing funny or embarrassing images captured by the robots on private Discord servers.

While the company claimed the data was secure and anonymized, one leaked image clearly showed a woman sitting on the toilet, taken from the low angle of a vacuum cleaner. iRobot admitted the leak but blamed the third-party contractor (Scale AI) for violating agreements.

3. Google’s “Guy Incognito” joke

This comes directly from the Brown v. Google lawsuit regarding Chrome’s “Incognito Mode.” Google marketed Incognito Mode as a way to browse without being tracked, and its “spy” icon implies you are invisible.

Internal emails released during the trial showed Google engineers mocking their own marketing. One engineer wrote that the Incognito icon should be changed to “Guy Incognito”. This was a reference to a Simpsons character who is just Homer Simpson in a cheap disguise (a fake mustache and top hat).

The engineer noted that Guy Incognito “accurately conveys the level of privacy it provides”, which is to say, none, because Google was still tracking users for ad targeting the whole time.

4. The ransomware chastity cage: Qiui Cellmate

This is perhaps the only example that rivals the We-Vibe for dark irony. The “Qiui Cellmate” was a smart male chastity cage that allowed a partner to lock and unlock the device remotely via an app. It was marketed as a high-tech way to build trust and intimacy between long-distance partners.

The API had no password protection. In 2020, security researchers at Pen Test Partners found that hackers could permanently lock the devices remotely. Hackers actually did this, demanding a ransom of 0.02 Bitcoin (about $270 at the time) to unlock the devices. Because the device is a physical metal ring, users who didn’t pay (or who couldn’t get unlocked, since the hackers had no actual unlock mechanism) had to use bolt cutters or angle grinders on their own sensitive areas to free themselves.

5. Tesla’s “gossip” cam

Similar to the Roomba “toilet shot,” Tesla owners believed the cameras on their cars were processed by unfeeling AI for self-driving safety. Tesla’s privacy policy assured users that “camera recordings remain anonymous and are not linked to you or your vehicle.”

In 2023, a major Reuters investigation, based on accounts from former employees, revealed that staff privately shared sensitive videos recorded by customers’ cars in internal chat rooms for entertainment. They shared clips of road rage, crashes, and even a naked man approaching his vehicle, turning the footage into memes and joking about it.

6. The “Espionage” doll: My Friend Cayla

This example is funny because of the government response. “My Friend Cayla” was an interactive doll that used the internet to answer children’s questions (like Siri). It was sold as a safe, educational toy that could “talk” to your child.

But the doll had an insecure Bluetooth connection that allowed anyone within 30 feet to hack it and listen to the child or speak to them through the doll. It was so insecure that the German Federal Network Agency (Bundesnetzagentur) officially classified the doll as a “concealed espionage device”. German authorities didn’t just ban sales; they recommended that parents destroy the doll to avoid prosecution.

7. The “listening” air fryer

In late 2024, the UK consumer group “Which?” analyzed smart appliances and found absurd levels of data collection in kitchen gadgets. Smart air fryers (specifically from Xiaomi and Aigostar) claimed to offer “connected cooking” for better recipes.

But the apps demanded permission to record audio on users’ phones (for no functional reason: you don’t talk to an air fryer) and sent data to servers in China. They also included trackers from Facebook and TikTok.

8. Smart TVs: the “screenshot” spy

This applies to almost every major TV manufacturer (Vizio, Samsung, LG), but the mechanism is surprisingly manual and invasive. “Smart Interactivity” or “Viewing Data” features are pitched as a way to get better show recommendations. This feature is called ACR (Automatic Content Recognition). The TV literally takes a screenshot of what you are watching several times a second and uploads it to the manufacturer to match against a database. This happens even if you are using an HDMI cable to watch a DVD, play a video game, or use the TV as a PC monitor for private work. Vizio paid a $2.2 million settlement to the US Federal Trade Commission for doing this without clear consent.
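
To show what that mechanism amounts to, here is a rough, purely conceptual Python sketch of an ACR loop: grab whatever is on screen, reduce it to a compact fingerprint, and upload that fingerprint for matching. The helpers capture_frame and upload_fingerprint are hypothetical placeholders, not any manufacturer’s actual firmware API.

```python
# Conceptual sketch of an ACR (Automatic Content Recognition) loop.
# capture_frame() and upload_fingerprint() are hypothetical placeholders,
# not any real TV manufacturer's API.
import hashlib
import time

def capture_frame() -> bytes:
    """Placeholder for grabbing the pixels currently on screen, regardless
    of the source: broadcast, HDMI input, game console, or PC desktop."""
    return b"<raw frame pixels>"

def fingerprint(frame: bytes) -> str:
    """Real ACR uses perceptual hashing of audio/video; a plain digest is
    enough to show that a compact identifier of what you watch leaves the TV."""
    return hashlib.sha256(frame).hexdigest()

def upload_fingerprint(fp: str) -> None:
    """Placeholder for sending the fingerprint to the manufacturer's servers,
    where it is matched against a database of known content."""
    print(f"uploading fingerprint {fp[:16]}...")

# The loop runs continuously in the background; it is bounded here so the
# sketch terminates.
for _ in range(3):
    upload_fingerprint(fingerprint(capture_frame()))
    time.sleep(0.5)  # multiple captures per second, as described above
```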

Higher awareness, but ‘Privacy Fatigue’

While public awareness of privacy issues is arguably higher than ever before, we are far from a solution. In fact, a kind of paradox has emerged.

In addition to the privacy washing examples above, high-profile data breaches at tech giants like Meta and Amazon, incidents at healthcare providers like Clinical Diagnostics, and growing concern over unsecured medical devices have made companies and the general public more skeptical than ever. However, this skepticism is often overridden by “privacy fatigue.” Users are tired of endless consent pop-ups and complex settings. When faced with the choice between reading a 50-page policy or clicking “Accept” to use a convenient tool, convenience usually wins.

As a handful of tech giants increasingly dominate the mainstream market, only a few truly privacy-safe alternatives remain viable. Despite lawsuits and fines, it seems more profitable for these companies to continue privacy washing than to make their products genuinely secure. And as users are worn down by “privacy fatigue”, practices are not evolving to become safer; instead, unsafe practices are being accepted as the “new normal”.

The AI Factor

With the explosive rise of AI, your data’s value has skyrocketed. Companies are more eager than ever to harvest it to train their models, wrapping invasive tracking in vague promises of “smarter” features and “enhanced” experiences. Unfortunately, even skeptical users often fall for these vague promises because the immediate benefit of the tool is so tempting.

The role of the DPO & CISO

For businesses, however, “privacy fatigue” is a risk they cannot afford. This is where the Data Protection Officer (DPO) and Chief Information Security Officer (CISO) must play a crucial, revitalized role within your company. In this era of AI and privacy washing, they cannot simply be the “compliance police” who check a box at the end of a project. They must be involved from the very start.

Crucially, this does not mean that the DPO or CISO should stand in the way of innovation. Their goal isn’t to hit the brakes, but to be an integral part of the development and implementation process. By baking privacy and security in from the start, they ensure that innovation is not just fast, but sustainable and trustworthy.

Conclusion

The word “privacy” has become a marketing buzzword, often completely divorced from technical reality. Companies know that privacy sells, but they also know that data is the new gold. This tension leads to Privacy Washing: selling the illusion of safety while leaving the back door wide open.

However, the deepest danger lies not in the deception itself, but in the normalization of these practices. Because a handful of tech giants dominate the market with few viable alternatives, and users are worn down by fatigue, we are witnessing a disturbing shift. Instead of unsafe practices evolving to become safer, we are seeing them accepted as the “new normal.”

When a tech giant decides to route your emails through their cloud, or a toilet camera manufacturer decides to use your footage for AI training, it sets a precedent. By accepting these terms out of convenience or lack of choice, we are collectively lowering the bar. We aren’t fixing the security flaws; we are simply getting used to them.

The advice: stop accepting “good enough” encryption and “mostly private” policies. The reality remains: if you don’t hold the key, it isn’t private.