In the News

Citizens Need Better Legal Protection Against Abusive, Nonconsensual Deepfake Content: Jake Morabito on NTD News

Plenty of everyday students and even minors fall victim to this AI deepfake technology. It's an evolution of an existing problem.

Jake Morabito, ALEC Communications and Technology Task Force Director, spoke with David Lam of NTD News about the damage explicit AI-generated deepfake images inflict on everyday people, after nonconsensual and abusive images of pop singer Taylor Swift went viral across social media platforms.

Plenty of everyday students and even minors fall victim to this AI deepfake technology, but this isn't a new problem, per se; it's an evolution of an existing problem. Ten states have already taken measures to update their existing statutes to clarify that AI-generated nonconsensual intimate images are illegal, but in the remaining states it's a bit unclear. That's why ALEC's new model legislation is necessary: to update existing statutes to reflect that these malicious deepfakes should be illegal and that victims should have a path to justice.

Morabito also explains how these laws are enforced in the 10 states that have already implemented them.

It's divided into two categories. You have some situations, like Taylor Swift's or those of others over 18, where it's essentially explicit or abusive images of adults being circulated without their permission. There's also the case of minors and child sexual abuse material, also known as CSAM, which brings a whole other level of criminal penalties, and those abusers really need to be held accountable to the full extent of the law. ALEC has a great framework that legislators can take today to their capitols. It's bipartisan. It's not controversial. I think people understand that this is a problem that needs to be fixed, and it's a way to do that while protecting AI innovation and allowing this technology to flourish in a responsible way.

Watch the full interview here.