Apple and Google do their best to convince us that their app stores are the safest places on Earth, impregnable fortresses that admit only those who meet strict ethical and technical conditions. But it seems that "Artificial Intelligence" has taken a dark path, and the big companies, surprisingly, are not far from the scene. A new report revealed that apps specializing in "nudifying" photos and creating fake pornographic images are alive and thriving on the Apple App Store and Google Play. Even worse, the stores' algorithms sometimes steer users straight to them through ads and search results.

Shocking Numbers and Public Presence
The Tech Transparency Project (TTP) report didn't come with guesses, but with numbers that make your hair stand on end. It identified 18 apps on the Apple App Store and 20 on Google Play specializing in what are known as "nudify" apps. These were not failed experiments lurking in the dark corners of the stores: collectively, they generated about $122 million in revenue and were downloaded 483 million times. Yes, you read that correctly; we are talking about nearly half a billion downloads!

The ironic thing here is that some of these apps carried an “E” for Everyone rating, meaning any child with a device could simply access them. While parents struggle to prevent their children from spending too much time on screens, the “official” stores were offering them tools that could turn photos of real people into pornographic images or place their faces into inappropriate videos using deepfake technology.
Search Algorithms: The Tour Guide for Bad Content
The problem doesn't stop at the existence of these apps, but extends to how users reach them. The report alleged that searching for certain keywords provides easy access to these tools. More surprisingly, the platforms themselves were displaying ads for similar apps within search results, making Apple and Google partners, even if unintentionally, in spreading tools that turn photos of real people into sexual images.

Some apps promote themselves by showing an actress's face superimposed on another actress's body, and allow users to place real faces on the bodies of semi-nude women, yet they carried an age rating suitable for everyone. It seems the stores' reviewers needed AI glasses to see what is obvious to everyone else!
Reactions and Government Moves

After the scandal was revealed, the companies rushed, as usual, to hide behind official statements. Apple told Bloomberg that it had removed 15 of the identified apps, while Google said it had suspended a number of them, confirming that its policies prohibit explicit sexual content and that investigation and enforcement are ongoing. But the question remains: why do we always need an external report to do the work that multi-billion-dollar review systems are supposed to do?

This negligence has pushed governments to act. In the UK, the Children's Commissioner called for a ban on deepfake apps that create sexual images of children, and the US and other countries have proposed laws against explicit deepfakes. Even the California Attorney General recently sent a directive to Elon Musk's "X" to stop producing explicit deepfake images via the "Grok" AI model.