Their kids died in shootings. Now they’re voices of gun reform

AI deepfakes are being used to recreate the voices of children who died in shootings
© Motortion | Dreamstime.com

When I read how some grieving parents are using AI to grab the emotions of lawmakers, I thought you’d want to know about it, too. I will warn you this is not one of my usual happy stories.

I’m sure you remember six years ago when a gunman opened fire at Marjory Stoneman Douglas High School in Parkland, Florida, killing 14 students and three staff members. One of the students who died that day was 17-year-old Joaquin Oliver. 

Now, his parents are using his voice, along with those of six other kids who died in shootings, to advocate for changes to firearm laws. Their website, The Shotline, lets anyone send a call directly to their representatives, with the kids doing the talking. So far, more than 40,000 calls have gone out.

All the parents involved in the project recreated their children’s voices with AI using audio taken from home movies, voicemails and social media. Take a listen here.

Wait, aren’t AI robocalls illegal? 

I warn you a lot about AI scammers using family members’ or employers’ voices to fool you into handing over money or info. And you probably heard the faked voice of President Biden encouraging people not to vote. 

Both of those call types are now illegal. The FCC recently ruled that AI-generated voices in robocalls count as “artificial” under the Telephone Consumer Protection Act, which already restricts prerecorded calls and messages, automated dialers and calls made without your consent. 

But calling lawmakers is different from calling consumers. For now, at least, AI-voiced messages sent to lawmakers are legal.

But are they ethical? 

Whenever we talk about AI voice cloning, we have to discuss ethics, both for the “owners” of the original voice and the “listeners” of the cloned version.

Most of the time with AI deepfakes, the ethics are pretty obvious. A scammer has stolen someone’s voice to dupe people. Both the mark and the person whose voice was stolen are victims. Case closed! 

The calls created by the Shotline all clearly state they were made using AI. They aren’t trying to dupe lawmakers, just get their attention. That feels OK to me. Lawmakers can handle it. 

What about the kids?

Do parents have a right to create AI versions of their kids’ voices without consent? Joaquin’s parents, Manny and Patricia, say they feel strongly he’d get behind what they’re doing. 

The high school senior was vocal on social media about protecting children from guns, and his parents feel crystal clear that their use of their son’s image and likeness is true to his wishes. 

For now, it seems like the law backs them up, too. ElevenLabs, the AI company they used to recreate Joaquin’s voice, requires the estate to hold the rights to his visual and audio likeness. Right now, the assumption is that parents do hold those rights for their minor children. 

This is all still a legal gray area. There just haven’t been enough cases for it to be set in stone. Even I’m not sure, honestly. It’s painful to think about, isn’t it? 

Planning ahead 

AI now needs just a small sample of audio to recreate a voice. Consider including your wishes about the use of your voice and likeness in your estate planning. If you don’t want your family to be able to recreate your voice or image with AI, say so in your will. And if you do, say that, too. 

I know this is a heavy topic. You can read the entire story here (WSJ, paywall link). I’d like to know what you think. Reply to this email, respond when you rate this issue or drop me a DM on social media. I’m @kimkomando everywhere.

🙏🏻 Before we leave this topic and move on, a prayer: Dear God, We ask for Your comfort to envelop the families, friends, and entire communities affected by senseless acts of violence. Grant them the strength to endure the pain of loss and the courage to continue their lives with purpose and hope. Amen.

Tags: family, home