Apple cracks down on AI powered nude image generating apps


As per a report on 404 Media, Apple has removed three apps from the App Store that let users create non-consensual nude images of people using AI.



Apple has removed at least three apps from its App Store that claimed to be capable of generating non-consensual nude images using Artificial Intelligence (AI), a report on 404 Media said. Advertisements for these apps were spotted on Instagram.

According to 404 Media, Apple took action against these apps only after the publication shared links to them and their ads. This suggests the tech giant was unable to find apps that violated its App Store policies without external help.

The report stated that the publication came across five such ads while browsing Meta's Ad Library, where all ads on the platform are archived. Two of these ads were for web-based services, while three led to apps on the Apple App Store. Some of these apps offered face swaps onto adult images, and a few were marketed as 'undressing' apps that used AI to remove clothes from a person's ordinary photos.
It goes on to say that while Meta was quick to delete these ads, Apple initially declined to comment and asked for more details about them. The apps were removed only after the story was published last week.
This is not the first time Apple has been alerted to AI-powered deepfake apps on the App Store. In 2022, several such apps were found on both the Google Play Store and the Apple App Store, but neither tech giant removed them. Instead, they asked the apps' developers to stop advertising such capabilities on popular porn websites.
In the last few months, undressing apps have spread through schools and colleges around the world. Some of these tools are distributed as apps, while others offer such features as a subscription service.