
Apple and Google Under the Microscope: AI “Nudify” Apps Still Thriving in Official App Stores

Apple and Google do their utmost to convince us that their app stores are the safest places on Earth, something like an impregnable fortress that only those who meet strict ethical and technical conditions can enter. But it seems that “Artificial Intelligence” has decided to take a dark path, and the tech giants, to our surprise, are not far from the scene. A new report has revealed that apps specializing in “nudifying” people and creating fake pornographic images are still alive and thriving on the Apple App Store and Google Play. Even worse, the stores’ own algorithms sometimes steer users straight to them through ads and search results.

Misleading AI apps on app stores


Shocking Numbers and Public Presence

The Tech Transparency Project (TTP) report did not come with mere guesses, but with numbers that would make one’s hair stand on end. The report identified 18 apps on the Apple App Store and 20 apps on the Google Play Store specializing in what is known as “nudify” or stripping apps. These apps were not just failed experiments in the dark corners of the store; they collectively generated about $122 million in revenue and were downloaded 483 million times. Yes, you read that number correctly; we are talking about nearly half a billion downloads!

Image from Phonegram: the ESRB “E for Everyone” rating symbol, shown for illustration.

The irony here is that some of these apps carried an “E for Everyone” rating, meaning any child with a device could access them. While parents fight to keep their children from spending too much time on screens, the “official” stores were handing them tools that could turn photos of real people into pornographic images, or paste their faces into inappropriate videos using deepfake technology.


Search Algorithms: The Tour Guide to Bad Content

The problem does not stop at the mere existence of these apps, but extends to how users reach them. The report alleged that searching for certain keywords provides easy access to these tools. More surprising still, the platforms themselves were displaying ads for similar apps within search results, making Apple and Google partners, even if unintentionally, in spreading tools that turn real people into sexual images.

There are apps that promote themselves by showing an actress’s face superimposed on another actress’s body, and they allow users to place real faces on the bodies of semi-nude women, yet they enjoyed an age rating suitable for everyone. It seems the reviewers in the stores needed AI glasses to see what was obvious to everyone else!


Reactions and Government Moves

After this scandal was exposed, the companies rushed, as usual, to hide behind their official statements. Apple told Bloomberg that it had removed 15 of the identified apps, while Google said it had suspended a number of them and confirmed that its policies prohibit explicit sexual content and that the investigation and enforcement process is ongoing. But the question that arises is: Why do we always need an external report to do the work that multi-billion dollar review systems are supposed to do?

Image from Phonegram: the Grok logo, a stylized black mark next to the word “Grok” on a pink-to-purple gradient background.

This negligence has prompted governments to take action. In the UK, the Children’s Commissioner called for a ban on deepfake apps that create sexual images of children, and the US and other countries have proposed laws against explicit deepfakes. Even the Attorney General of California recently ordered Elon Musk’s company “X” to stop producing explicit deepfake images via its “Grok” AI model.

This article is meant as a warning, for the sake of our children: do not assume that the app store is safe. Do you think Apple and Google are truly failing to monitor AI apps, or is the technology simply outpacing their ability to control it?

Source:

engadget.com
