Our Faces No Longer Belong to Us

Your likeness is now fair game for AI. Anyone is a click away from creating a digital version of you.

A family member sent me an image of a baby dressed in an elf outfit. My baby. It was adorable—and unnerving.

My cousin took a photo from our group chat and uploaded it to Meta AI. While his heart was in the right place, my mind was spinning. What could Meta do with my son’s face? The privacy policy of Meta Platforms says the company can use these images to “improve AIs and related technology.”

In the age of AI, I can’t control where my—or my son’s—face ends up. Our likenesses are no longer our own.

Tools such as OpenAI’s new Sora video app, Google’s Nano Banana image generator and even Apple’s Image Playground can take a genuine video or photo and spin up a virtual knockoff in minutes. While there are safeguards, anyone from a well-meaning loved one to a malicious scam artist can create fake versions of you that would have been inconceivable a few years ago.

After trying several of the tools, I’ve learned that some models are quite good and getting better—enough to fool people into thinking they’re real. As you’ll read below, that can have devastating effects.

The AI puppet machines

I recently co-hosted a birthday party with a friend and used Gemini for the invitation. My vision: our younger selves blowing out candles together. I gave Gemini a couple of childhood photos (with my friend’s permission), and it placed us together in front of the same cake. It got our outfits exactly right, and the faces could have fooled our own moms.

To prevent Google from using the images to improve its AI, I turned my activity history off.

The Sora app (currently invite-only) is part video generator, part social network—like an upside-down version of TikTok where every clip is fake. You only need to pose for a few seconds for it to generate your digital doppelgänger, aka “cameo.” You can use your cameo—and your friends’ cameos—to star in any video you dream up.

In the name of research, I uploaded my likeness. I looked into my phone’s camera, moved my head around and spoke aloud the three numbers that appeared on screen. Within minutes, AI me was escaping from a pit of USB cords, and accepting an Oscar for going to bed early.

My avatar wasn’t perfect—my teeth looked off, and so did some of my expressions—but the resemblance was impressive.

I restricted the use of my cameo to a few friends. You can also restrict the types of videos your cameo can appear in (none featuring your rival football team, for example) and how you appear in them (no buzz cuts, please).

People also can’t secretly make videos of you: They can’t download or screen-record videos with multiple cameos, and anything they make with your cameo appears in your account, too. You can delete your cameo appearances across all accounts if you want.

Despite the protections, I was creeped out that my digital twin now resided somewhere inside OpenAI.

Sora blocked videos about Taylor Swift and other living celebrities. But I spotted plenty of dead legends, including Tupac, Einstein, Abraham Lincoln, Bob Ross and John F. Kennedy. In one eerily realistic clip, an AI version of Martin Luther King Jr. said, “I have a dream that one day Sora will change its content-violation policy.”

OpenAI says it takes steps to protect user privacy. In the Sora app’s settings, you can prevent the company from using your content to train its models. (News Corp, owner of The Wall Street Journal, has a content-licensing partnership with OpenAI.)

There are other AI video apps, including Meta’s Vibes and Google’s Veo, and likely more to come. And while most AI clips I’ve seen have been lighthearted, there’s a sinister side to the technology, where deepfakes are used to impersonate real people and steal money.

The cost of a stolen face

For years, meteorologist Bree Smith appeared on Nashville TV to share the forecast. On Instagram, she shared bits of her life with more than 23,000 followers. Last year, Smith received an odd email from a man who said he was a longtime viewer. Someone was posing as Smith online, offering “pictures only a husband would have,” he wrote.

The mother of two panicked. According to the viewer, a fake Smith had used the Signal chat app to send images of her face on a seminude body, along with a video of her saying, “It is me, it is really me.” Except it wasn’t her. She believes the images and videos were manipulated using AI.

The scammers used the deepfakes to try to persuade victims to pay $200 for access to a phony Smith fan account. In another instance, a fake Smith offered a hotel stay and promised intimate acts.

“It’s degrading and traumatizing to experience your own identity being distorted without your consent,” said Smith, who is now 43. This year, she helped pass a Tennessee law making it illegal to share or threaten to share such deepfakes.

Smith tracked around 70 fake Facebook accounts. Vermillio—a service that searches for your likeness around the web—found thousands more fake Smith pics and videos and hundreds of accounts on YouTube, TikTok and elsewhere.

Vermillio Chief Executive Dan Neely said the company can monitor the outputs of public generative-AI models like Sora, as well as social-media forums, for misuse of its customers’ names and likenesses. The service offers a free tool to assess your risk, with paid plans from $10 to $99 a month depending on the number of requests to take down unauthorized content.

There are other companies in the business of AI-likeness protection, including Loti, which charges up to $2,500 a month for public figures and has a free program in beta.

Fake nudes are a frightening new reality for teens. A 2024 study by the Center for Democracy & Technology found that 15% of U.S. high-school students say they’ve heard about nonconsensual intimate deepfakes depicting their classmates.

Neely said AI needs better safeguards to protect people’s identities and reputations. That would be for tech companies’ own good, too. “Many people, once they feel safe, will feel better about engaging with the technology,” he said.

Until then, our faces are up for grabs.

Write to Nicole Nguyen at nicole.nguyen@wsj.com
