
Shocking AI Apps Are Undressing Images Using Your Google and Apple Accounts: Are We Ignoring Privacy Issues?

Updated: April 20, 2026
7 min read


Over the past few weeks, I kept seeing the same unsettling claim pop up online: some AI apps are “undressing” people in photos—using your Google, Apple, or Discord login to make it easier to get started. And honestly, it’s hard to ignore. When something this invasive is just a sign-in away, it stops feeling like a fringe internet problem and starts looking like a real privacy issue.

These sites are often called “nudify” or “undress” tools. The pitch is usually simple: upload an image, let the model do its thing, and download the result. But the part that really set off alarm bells for privacy advocates is how many of them rely on single sign-on (SSO) from big platforms like Google and Apple (and sometimes Discord).

Why does that matter so much? Because SSO makes sketchy services feel safer. If it’s tied to a familiar login, people assume there’s some level of vetting or accountability behind it. In my experience, that assumption is exactly what gets exploited.

An investigation by WIRED reported that 16 of the most popular nudification/undressing websites were using single sign-on systems. That means users don’t have to create new accounts from scratch, don’t have to fill out forms, and don’t have to wonder as much about who’s behind the site—they just tap “Continue” and move on.

And that convenience has consequences. Privacy and cybersecurity experts have argued that these integrations can lower the barrier to entry for harmful tools, while also making it harder to trace abuse back to the original operator.

Let me be blunt: I don’t care how “AI” it is. If someone can take a photo and generate an explicit or sexualized version of a real person without their consent, that’s not just a privacy concern—it’s a consent and safety problem. And it can spiral fast.

Why using Google, Apple, and Discord logins makes this worse

Single sign-on is designed to make life easier. You sign in once, and you get access to lots of services without creating a new password every time. That’s convenient—when the service is legitimate.

But when SSO is used by nudify/undress apps, it creates a few practical risks:

  • Less friction for abuse: People can try the tool quickly, which means more experiments, more uploads, and more chances for misuse.
  • False sense of trust: The UI feels familiar. Users may assume the platform behind the login has checked the app.
  • Weaker accountability: If the app is removed, the underlying behavior can pop back up elsewhere—sometimes with the same workflow and similar infrastructure.
  • Data exposure: Even if the undressing result is the main harm, the uploaded images and metadata are still sensitive. Those files have to go somewhere to be processed.
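To make the "less friction" point concrete, here's a minimal sketch of what happens behind a "Continue with Google" button under OAuth 2.0 / OpenID Connect. The `client_id` and `redirect_uri` values are hypothetical placeholders, not a real registered app; the point is how little the app has to build, and how much identity data the standard `openid email profile` scopes hand over by default:

```python
# Sketch of the first step of a "Continue with Google" sign-in
# (OAuth 2.0 authorization-code flow). Values are illustrative only.
from urllib.parse import urlencode

def build_auth_url(client_id: str, redirect_uri: str) -> str:
    """Build the authorization URL the browser visits when the user
    taps "Continue with Google"."""
    params = {
        "client_id": client_id,           # identifies the third-party app
        "redirect_uri": redirect_uri,     # where Google sends the user back
        "response_type": "code",          # authorization-code flow
        "scope": "openid email profile",  # identity data the app will receive
        "state": "random-csrf-token",     # random per-request in practice
    }
    return "https://accounts.google.com/o/oauth2/v2/auth?" + urlencode(params)

url = build_auth_url("example-app.apps.googleusercontent.com",
                     "https://sketchy-site.example/callback")
print(url)
```

That's essentially the whole sign-up flow from the app's side: one redirect, no registration form, no password handling. Which is exactly why a sketchy service benefits so much from offering it.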

I’ve noticed that most people don’t think about what happens after they upload. They focus on the output. Meanwhile, the input—often a real person’s photo—can be the most dangerous part.

What WIRED found (and why it matters)

According to WIRED, 16 popular nudification/undressing websites were leveraging single sign-on systems. That’s a lot of apps, and it suggests the SSO problem isn’t isolated to one “rogue” site. It looks more like a pattern that developers can exploit.

When a major platform’s login is involved, the app can appear more legitimate than it should. That can boost user adoption—especially among people who aren’t actively looking for something harmful, but stumble into it because the sign-in flow feels normal.

The “it’s just an app” misconception

These tools are often marketed like novelty filters. But the reality is that generated explicit images can be used to target people. And even if the image is AI-generated, it’s still a form of sexual content tied to a real person’s likeness.

That’s where the harm gets serious—emotionally, socially, and legally.

What companies are doing about it

Some platforms have started taking action. For example, Discord and Apple have reportedly moved to terminate developer accounts associated with apps linked to this kind of content.

Honestly, I’m glad to see that—but it’s not the end of the story. Even when one account gets shut down, new apps can appear. The tech landscape moves quickly, and bad actors often adapt.

Regulation is hard—because the tech moves faster

One of the biggest challenges is figuring out how to regulate AI tools that can be repackaged, rebranded, or redistributed. Policymakers and platforms have to balance innovation with safety, but in cases like this, the harm is immediate and identifiable.

So the question becomes: how do you stop an app that can be created and shared faster than enforcement can keep up?

The spam spike that hinted this was growing

The growth didn’t happen quietly. A 2023 analysis found a more than 2,000% increase in spam links to these nudify/undress websites within a few months.

That number is eye-opening. Spam doesn’t usually surge unless there’s demand. It suggests people were actively clicking, trying the tools, and spreading links—whether out of curiosity, cruelty, or both.

And once a workflow exists—upload, generate, download—it’s not hard for scammers or abusers to scale it.

Beyond privacy: the real-world harm

Privacy is the headline, but it’s not the whole problem. These tools can be used for:

  • Cyberbullying (targeting someone with humiliating or sexualized content)
  • Revenge porn (creating explicit images without consent and using them to punish or threaten)
  • Coercion and harassment (sending “proof” or threatening to release generated content)

And here’s the part people underestimate: the victim doesn’t care that it was AI. The emotional impact is real. Even a single image can damage reputations, relationships, and mental health. Plus, in many places, creating or distributing non-consensual intimate images can lead to legal consequences.

In other words: it’s not “just synthetic.” It’s synthetic content used in very real ways.

Consent doesn’t disappear because the image is generated

I keep coming back to consent. If someone didn’t agree to have their image transformed into sexual content, that’s a violation. The technology doesn’t change the ethical baseline.

What you can do (practical steps)

If you want to protect yourself—or help someone else—here are a few things I’d actually recommend:

  • Be cautious with “Continue with Google/Apple/Discord” prompts when the site looks suspicious. Familiar buttons can hide sketchy behavior.
  • Think twice before uploading any photo of yourself or other people, even “for testing” or “to see what it does.” Once it’s out there, you lose control.
  • Talk about AI misuse with teens and younger kids. It’s not enough to say “don’t post.” You have to explain that images can be manipulated and weaponized.
  • Check account permissions for third-party apps tied to your login. If something looks off, remove access.
  • Document evidence if you’re targeted. Screenshots, URLs, and timestamps can matter if you need to report abuse.

And if you’re a parent or educator? I’d treat this like digital safety training, not a tech curiosity. Kids don’t always connect “AI filters” with consent violations. They should.

Where this goes next

AI is moving fast, and undressing-style tools are a clear example of how quickly harmful use cases can spread. Platforms shutting down developer accounts helps, but it doesn’t solve the underlying problem: bad actors can still build and distribute tools faster than oversight can catch up.

What I’d like to see is stronger enforcement around SSO-linked apps—basically, more responsibility from the platforms that enable easy access. If a login system is being used to reach harmful services, that’s not a “third-party” issue anymore. It’s part of the ecosystem.

Because if we keep shrugging at privacy and consent concerns, more people will get pulled into these tools—whether they meant to or not.

Stefan

Stefan is the founder of Automateed. A content creator at heart, swimming through SaaS waters, and trying to make new AI apps available to fellow entrepreneurs.
