The same artificial intelligence tools built to help users generate creative videos, avatars and digital art are increasingly being used for something far darker: creating non-consensual sexualised images of women and girls.
These tools work on demand, triggered by plain-language prompts that can be as simple as a few keywords. At first glance, they do not look dangerous: they come disguised as playful creative apps, with taglines such as “make your cat talk in five seconds”.
Unlike traditional adult content, which requires deliberate searching, age gates or explicit opt-ins, these apps sit openly in our phone app stores, often in the same scroll as a Disney racing game or a maths app for kids.
That is the central finding of a recent investigation by the Tech Transparency Project (TTP), which examined the growing number of AI-powered ‘undressing’ apps available on Apple’s and Google’s app stores. The report shows how consumer-facing AI can quickly turn corrosive when guardrails fail and how easily such tools can fall into the hands of children.
Hidden in Plain Sight
The investigation comes amid heightened scrutiny of generative AI after users of X repeatedly prompted its chatbot Grok to digitally alter photographs of women, placing them in bikinis or underwear. In some cases, images of minors were manipulated into sexually suggestive content. That episode triggered probes by regulators, including the UK media watchdog Ofcom, the European Commission, and California’s attorney general, eventually forcing X to restrict parts of Grok’s image-generation features.
TTP’s findings suggest, however, that the issue is not confined to high-profile chatbots. According to the report, TTP identified 55 apps on the Google Play Store and 47 apps on Apple’s App Store that allow users to digitally remove clothing from images of women or render them partially or fully nude using AI. Collectively, the apps have been downloaded more than 705 million times worldwide and generated around $117 million in revenue, based on estimates from AppMagic. The investigation found that many of the tools require nothing more than uploading a photograph and entering a short text prompt. Others use face-swap technology to superimpose a person’s face onto an already nude body. In both cases, the resulting images or videos are produced in seconds, often without meaningful friction or warnings.
For testing, TTP used images of women generated by AI rather than photographs of real people.
Even so, the organisation found that in nearly every case it could generate sexualised content using only the free versions of the apps. Many platforms offered more advanced features behind paid subscriptions, including higher-resolution renders, longer videos and faster processing.
One of the most popular apps examined was DreamFace, an AI video generator that claims it can turn “photos, text, and voices into stunning HD videos”. Using the app’s free daily video allowance, TTP uploaded an image of a fully clothed woman and prompted the app to show her taking off her top. The resulting video matched the prompt precisely. DreamFace’s terms of service explicitly prohibit sexually explicit or indecent content, yet the app generated the video without any warning or moderation. At the time of testing, the app was nonetheless rated suitable for users aged 13 and above on Google Play and for users as young as nine on Apple’s App Store.
While writing this piece, I checked how some of these apps function in India and how easily accessible they are. DreamFace was available on the App Store with an 18-plus rating and did not reproduce the exact result described in the TTP test. Another app, FaceAI, restricted image generation due to insufficient credits, but its face-swap catalogue prominently featured women, mostly Asian, depicted in skimpy clothing or underwear. The app, rated 13-plus, described itself as “perfect for fashion lovers” and promoted features such as “Outfit Anyone”, face swapping and singing avatars.
Using another app, Swapify, TTP uploaded an image of a clothed woman and swapped her face onto a video of another woman undressing on a park bench. While the app required a subscription to view the full output, the preview alone demonstrated how easily such tools could be used to create non-consensual sexualised imagery.
Regulatory Reckoning
In statements cited in the report, some developers said their apps were not designed or intended to generate nudity and pointed to content moderation systems meant to prevent misuse.
The presence of these apps on mainstream app stores also raises questions about age protections. Both Apple and Google require developers to submit information used to determine age ratings, and both companies claim to shield children from inappropriate content. Apple updated its ratings framework only last year.
Yet many of the apps identified by TTP carried ratings suggesting they were suitable for minors, even though they could be used to generate explicit imagery with minimal effort. After the investigation, Apple said it had removed 28 such apps and warned other developers that they risked removal if they failed to address violations. Google declined to comment publicly but requested the list of apps identified in the report. As generative AI becomes cheaper and more powerful, the question is whether existing safeguards are remotely adequate, especially when the people most exposed may be the least equipped to understand the consequences.
