
Apple in Trouble as it Deals with AI Porn

Altered image found in face swap ad

Apple appears unable to stem the flow of so-called “dual-use” apps that look innocent at first glance but help users create deepfakes — and at a hefty price.

Apple prides itself on regulating the App Store, and part of that control is banning porn apps altogether. But there are limits to that control, as some apps offer features that users can easily abuse — apparently without Apple knowing.

According to a report from 404 Media, Apple is grappling with a “dual-use” problem in apps that offer features like face swapping. While the feature seems innocent at first glance, users are swapping faces into pornography, sometimes using the faces of minors.

The problem became apparent when the reporter came across a paid ad on Reddit for a face-swapping app. Face-swapping apps are typically easy to find and often free, so an app paying for ads would need a business model lucrative enough to justify the spend.

They found an app offering users the ability to swap any face into a video from their “favorite website,” with an image suggesting Pornhub as an option. Apple doesn’t allow porn-related apps on the App Store, but apps built around user-provided content can serve up such images and videos as a sort of loophole.

When Apple was notified of the advertised app’s dual use, the app was pulled. However, Apple apparently wasn’t aware of the problem at all, and a direct link to the app had to be shared before it acted.

This isn’t the first time that innocent-looking apps have made it through app review and offered a service that violates Apple’s guidelines. While it’s not as egregious a violation as turning a kids’ app into a casino, the ability to generate non-consensual intimate images (NCII) clearly wasn’t on Apple’s radar.

Face swap apps are a popular category in the App Store

AI features in apps can create incredibly realistic deepfakes, and it’s important for companies like Apple to get ahead of these issues. While Apple won’t be able to stop every abuse, it can at least implement policies that can be enforced during app review, such as clear guidelines and rules about generating pornographic images. It has already stopped AI deepfake sites from using Sign in with Apple.

For example, no app should be able to source videos from Pornhub. Apple could also implement specific policies for potential dual-use apps, such as zero-tolerance bans on apps that attempt to create such content.

Apple has gone to great lengths to ensure that Apple Intelligence doesn’t create nude images, but that shouldn’t be the end of its oversight. Given that Apple claims to be the ultimate arbiter of the App Store, it needs to address apps that promote NCII generation in their ads.

Face swap apps aren’t the only apps with a problem. Apps that openly promote infidelity, intimate video chats, adult chats, or services hidden behind similar euphemisms also make it through app review.

Reports have long suggested that app review is broken, and regulators are tired of Apple’s platitudes. Apple needs to rein in the App Store or risk losing control of it.