For people interested in using AI to generate artificial nudes, YouTube has been a good place to start.
Though it doesn’t host “nudifier” tools itself, the video-sharing platform, used by 2.5 billion people, was hosting more than 100 videos with millions of views advertising how quickly these AI apps and websites can remove clothes from images of women, a review by Forbes found.
Some of the YouTube videos provided tutorials for an app that high school students in Spain and New Jersey had reportedly used to generate nudes of their female classmates. Students who have allegedly been victimized have faced bullying, public shaming and panic attacks.
Another website showcased in a number of YouTube videos was cited in court documents for a 2023 case in which a child psychiatrist was sentenced to 40 years in prison for using artificial intelligence to create child sexual abuse images and for sexual exploitation of a minor. He was accused of using the tool to remove the clothes from images of his high school girlfriend taken when she was underage. “In this digital age, it is horrifying to realize that pictures of me, innocent pictures, may be taken and twisted without my consent for purposes that are illegal and disgusting,” his former girlfriend testified in court.
It is “unthinkable” that Google is facilitating the use of these apps, Signy Arnason, associate executive director of the Canadian Centre for Child Protection, told Forbes. “On YouTube and even in Google Search results, instructional videos or services with titles blatantly promoting these types of applications are easily found,” she added. She said her organization was increasingly hearing from schools whose students have been victimized by AI-generated nudes.
Google’s AI nudifier problems don’t stop at YouTube. Forbes identified three Android apps offering to remove clothes from photos, including a “nudity scanner photo filter” with more than 10 million downloads; a Spanish-language app that allows the user to “run a finger over what you want to erase, for example, a swimsuit,” which has more than 500,000 installs; and Scanner Body Filter, offering to add a “sexy body image” to photos, also with half a million downloads.
Forbes also found 27 ads in the Google Ads Transparency Center promoting “deep nude” services. One advertised a site with the word “baby” in the URL. The National Center on Sexual Exploitation (NCOSE) provided information on another four, including a nudifier website openly offering to create Taylor Swift AI photos.
After Forbes asked whether the videos, advertisements and apps ran afoul of Google policies, the company removed all 27 ads, and YouTube took down 11 channels and over 120 videos. One of those channels, fronted by an AI-generated male deepfake, was responsible for over 90 of those videos, most of which pointed to Telegram bots that undress women. The Scanner Body Filter app was also made unavailable for download, though other Android apps remained online.
Tori Rousay, corporate advocacy program manager and analyst at NCOSE, said that Google has created “a continuous profit cycle” from nudify apps by accepting advertising money from developers and by taking a cut of ad revenue and one-off payments when the apps are hosted in the Google Play store. Rousay said that Apple, by comparison, had been quick to remove nudifier apps when NCOSE highlighted a number hosted on the App Store.
“Apple listened… Google must do the same,” Rousay added. “Google needs to create responsible practices and policies around the spread of image-based sexual abuse.”
AI-generated deepfake porn is on the rise, including of children. The National Center for Missing and Exploited Children told Forbes this week that it had received 5,000 reports of AI-generated child sexual abuse material (CSAM) over the last year. Earlier this year, a Wisconsin man was charged for allegedly using the Stable Diffusion 1.5 AI-powered image generator to produce CSAM.
In the case of the convicted child psychiatrist, other victims besides his childhood girlfriend testified in court about the ongoing trauma caused by his use of AI to undress them in their childhood photos.
“I fear that when he created child pornography using my image online, others will have access to that image as well. I fear coworkers, family, community members or other pedophiles will have access to this image,” said one of his victims. Another added, “I fear artificial intelligence because of him and when I see or hear of AI, there he is in the back of my head.”