
Deepfake ads featuring Jenna Ortega ran on Meta platforms. Big Tech needs to fight this.

The crisis of deepfakes continues. Meta platforms, including Instagram, Facebook, and Messenger, reportedly hosted AI-generated ads depicting Wednesday actor Jenna Ortega undressed.

As reported by NBC News, the ads used a blurred picture of Ortega taken when she was 16 years old, and showed users how they could change the celebrity’s outfit, including an option for removing all clothes. The images were reportedly manipulated by an app called Perky AI, listed as developed by the company RichAds, which is described in Apple’s App Store as a platform that uses AI to “create super-realistic or fantasy-like persons” from prompts. This includes “NSFW” (not safe for work) images, which are typically sexually explicit.

Following NBC’s report, the publisher says, Meta suspended the Perky AI app’s page. The app had run 260 unique ads on the company’s platforms since September; those featuring Ortega’s image reportedly ran throughout February. The news outlet says 30 of those ads had already been suspended for not meeting Meta’s advertising standards, but the ads featuring Ortega were not among them.

In a statement to Mashable, Meta spokesperson Ryan Daniels said, “Meta strictly prohibits child nudity, content that sexualizes children, and services offering AI-generated non-consensual nude images. While this app remains widely available on various app stores, we’ve removed these ads and the accounts behind them.”

Perky AI also appears to have been removed from Apple’s App Store and Google Play (Mashable checked, and it’s not available on either). Apple told NBC the app had been taken down on Feb. 16, having already been under investigation by the company for violating its policies around “overtly sexual or pornographic material.”

Mashable has reached out to Apple and Google for further comment.

This incident is the latest in a slew of nonconsensual, sexually explicit deepfakes being circulated on the internet. In the first two months of 2024, pictures of celebrities like Taylor Swift and podcast host Bobbi Althoff have spread across major social media platforms, including X, formerly known as Twitter. Deepfakes have also infiltrated schools, with fake nudes of students recently making their way around a Beverly Hills middle school and a high school in suburban Seattle.

The issue is at a critical point, with experts warning that legal and societal change is urgently needed. Sumsub, an identity verification platform, found that the number of deepfakes it detected increased tenfold between 2022 and 2023. Many social media platforms have struggled to contain this type of content: in the past few months alone, Google, X, and Meta have all been called out for allowing deepfake material to circulate on their platforms.

If users see an ad on any platform that they believe should be reported, company guides offer directions on how to do so. Meta lets users report ads on Facebook or Instagram, Apple hosts community support threads explaining how to report in-app ads, and Google provides a form for reporting inappropriate ads.

But these steps alone aren’t enough to stop the proliferation of AI-generated content. Big Tech needs to take significant action to tackle what is fast becoming an epidemic, one that most often targets girls, women, and marginalized people.

If you have experienced sexual abuse and are based in the U.S., call the free, confidential National Sexual Assault hotline at 1-800-656-HOPE (4673), or access 24-7 help online by visiting online.rainn.org. If you have had intimate images shared without your consent, call the Cyber Civil Rights Initiative’s 24/7 hotline at 844-878-2274 for free, confidential support. The CCRI website also includes helpful information as well as a list of international resources.

If you are based in the UK and have experienced intimate image abuse (aka revenge porn), you can contact the Revenge Porn Helpline on 0345 6000 459. If you have experienced sexual violence and are based in the UK, call the Rape Crisis helpline on 0808 802 9999.
