In the News

Talking AI & Digital Rules: Jake Morabito on FM Talk 1065

It's our private sector leaders, who are developing these AI systems, that have the expertise. The government should work hand-in-hand with them, not punish them with stifling regulations.

ALEC Communications and Technology Task Force Director Jake Morabito joined FM Talk 1065 to discuss the latest developments in AI policy, including California Governor Newsom’s recent veto of a strict AI regulation bill, Alabama’s approach of updating existing laws to cover AI-related crimes, and why states should avoid overregulating AI, which risks stifling innovation, while still emphasizing the responsible use of the technology.

One thing I want to start the conversation with is that it’s been about two years since ChatGPT first came on the scene and introduced the world to the current form of generative artificial intelligence. Many people have been fascinated and even a little horrified by how advanced the technology has become. Now, two years in, some of the initial hype from ChatGPT’s launch has subsided. The apocalypse hasn’t happened, and Skynet isn’t here. But to your point, April, several states—including the usual suspects like California, New York, and Illinois—are eager to stifle AI and regulate it. Governor Newsom actually vetoed a bill that would have amounted to the strictest regulations on AI in the country, so even for him, it was too much.

The bill was really focused on advanced artificial intelligence systems that humanity could potentially lose control of, leading to catastrophic harm. They’re concerned AI could be used to create something like a bioweapon or carry out cyberattacks that would harm society. To prevent that, the bill aimed to put technical limitations on how powerful an AI can be and to impose regulations, enforced by the California government, dictating who can create AI models and how they can be used. While the intent is noble, and I agree that I don’t want AI systems causing cyberattacks either, government isn’t usually best equipped to make those decisions. It’s our private sector leaders, who are developing these AI systems, that have the expertise. The government should work hand-in-hand with them, not punish them with stifling regulations.

At ALEC, we’re all about giving parents complete choice over how their children should be raised. Whether it’s school choice or, in this case, how kids use technology, parents know what’s best for their children, and state actors shouldn’t dictate that. There are already tools on Apple smartphones and social media platforms that give parents controls to monitor or restrict their children’s access. We should let parents be the parents and keep the government out of it.

Listen to the full interview.