Apple is cracking down on a category of AI image generation apps that “advertised the ability to create nonconsensual nude images.” According to a new report from 404 Media, Apple has removed multiple such AI apps from the App Store.
On Monday, the site published a report exploring how companies were using Instagram advertising to promote apps that could “undress any girl for free.” Some of these Instagram ads took users directly to Apple’s App Store, where the app in question was described as an “art generator.”
Today’s report says that Apple did not initially respond to 404 Media’s request for comment on Monday. The company did, however, reach out directly after the initial story was published to ask for more information. When provided with direct links to the specific ads and App Store pages, Apple proceeded to remove those apps from the App Store.
Burkas make a lot more sense now
They also go a long way toward protecting you from facial recognition
apps that could “undress any girl for free.”
That, and "X-Ray vision" apps were totally a thing even before AI.
But also, how is that "nonconsensual nudity"? The nude body does not exist, and does not belong to the person depicted. For me, there's barely any difference between this and photoshopping a face onto a porn image.
Lol, oh no, people who want to farm fake nudes have to use their browser now. That's not going to stop them; I guess the payment providers would be an issue now, though. I won't be surprised if these apps accept Bitcoin and then Bitcoin gets blamed for some fake nude epidemic.
Or deapple their minds for good and switch to Android phones.
Porn and military spending are the two main drivers of technological innovation. AI image generation was developed with nudes in mind.
The genie is out of the bottle, but as with other powerful technologies, a ban for the normies becomes necessary.
Apple is cracking down on a category of AI image generation apps that “advertised the ability to create nonconsensual nude images.”
...and they waited until now to do it, after letting these apps into the App Store in the first place?
🤣, even the AI is pushing back against Apple!