What is undress AI? Guidance for parents and carers
What is 'undress AI'?
Undress AI describes a type of tool that uses artificial intelligence to remove the clothing of individuals in images.
While each app or website works slightly differently, they all offer this same service. Although the manipulated image doesn’t show the victim’s real nude body, it can imply that it does.
Perpetrators who use undress AI tools might keep the images for themselves or might share them more widely. They could use these images for sexual coercion (sextortion), bullying/abuse or as a form of revenge porn.
Children and young people face additional harm if someone ‘undresses’ them using this technology. A report from the Internet Watch Foundation found over 11,000 potentially criminal AI-generated images of children on one dark web forum dedicated to child sexual abuse material (CSAM). It assessed around 3,000 of those images as criminal.
The IWF said that it also found “many examples of AI-generated images featuring known victims and famous children.” Generative AI can only create convincing images if it learns from accurate source material. Essentially, AI tools that generate CSAM would need to learn from real images featuring child abuse.
Risks to look out for
Undress AI tools use suggestive language to draw users in, and children are likely to follow their curiosity based on that language. Children and young people might not yet understand the law, so they might struggle to separate harmful tools from those which promote harmless fun.
Inappropriate content and behaviour
The curiosity and novelty of an undress AI tool could expose children to inappropriate content. Because it doesn’t show a ‘real’ nude image, they might then think it’s okay to use these tools. If they then share the image with their friends ‘for a laugh’, they are likely breaking the law without knowing it.
Without intervention from a parent or carer, they might continue the behaviour, even if it hurts others.
Privacy and security risks
Many legitimate generative AI tools require payment or subscription to create images. So, if a deepnude website is free, it might produce low-quality images or have lax security. If a child uploads a clothed image of themselves or a friend, the site or app might misuse it. This includes the ‘deepnude’ it creates.
Children using these tools are unlikely to read the Terms of Service or Privacy Policy, so they face risks they might not understand.
Creation of child sexual abuse material (CSAM)
The IWF also reported that cases of ‘self-generated’ CSAM circulating online increased by 417% from 2019 to 2022. Note that the term ‘self-generated’ is imperfect as, in most cases, abusers coerce children into creating these images.
However, with the use of undress AI, children might unknowingly create AI-generated CSAM. If they upload a clothed picture of themselves or another child, someone could ‘nudify’ that image and share it more widely.
Cyberbullying, abuse and harassment
Just like other types of deepfakes, people can use undress AI tools or ‘deepnudes’ to bully others. This could include claiming a peer sent a nude image of themselves when they didn’t. Or, it might include using AI to create a nude with features that bullies then mock.
How widespread is 'deepnude' technology?
Research shows that usage of these types of AI tools is increasing, especially to remove clothes from female victims.
One undress AI site says that its technology is “not intended for use with male subjects.” This is because the tool was trained using female imagery, which is true of most of these types of AI tools. Of the AI-generated CSAM that the Internet Watch Foundation investigated, 99.6% featured female children.
Research from Graphika highlighted a 2,000% increase in referral link spam for undress AI services in 2023. The report also found that 34 of these providers received over 24 million unique visitors to their websites in one month. The researchers predict “further instances of online harm,” including sextortion and CSAM.
Perpetrators will likely continue to target girls and women over boys and men, especially if these tools mainly learn from female images.
What does UK law say?
It is illegal to make, share and possess sexually explicit deepfake images of children.
However, it is not currently illegal to create such images of adults. Additionally, the nudifying tools themselves are not illegal, even though people can use them to create images of both children and adults.
As recently as 2023, people could create and share sexually explicit deepfake images of adults without breaking the law. However, the Online Safety Act made sharing intimate images without consent illegal in January 2024.
Furthermore, before the 2024 General Election was called, the Ministry of Justice announced a new law that would prosecute those creating sexually explicit deepfake images of adults without their consent. Those convicted would face an ‘unlimited’ fine.