CISOs to grapple with a thicket of emerging regulations after Newsom vetoes California’s AI bill

The vetoed bill defined "critical harms" to include:
  • creating or using certain weapons of mass destruction to cause mass casualties,
  • causing mass casualties or at least $500 million in damages through cyberattacks on critical infrastructure,
  • causing mass casualties or at least $500 million in damages by acting with only limited human oversight in a manner that would be a crime if committed by a human, resulting in death, bodily injury, or property damage,
  • and other comparable harms.

It also required developers to implement a kill switch, or "shutdown capabilities," in the event of disruptions to critical infrastructure. The bill further stipulated that developers of covered models implement extensive cybersecurity and safety protocols subject to rigorous testing, assessment, reporting, and audit obligations.

Some AI experts say these and other bill provisions were overkill. David Brauchler, head of AI and machine learning for North America at NCC Group, tells CSO the bill was “addressing a risk that’s been brought up by a culture of alarmism, where people are afraid that these models are going to go haywire and begin acting out in ways that they weren’t designed to behave. In the space where we’re hands-on with these systems, we haven’t observed that that’s anywhere near an immediate or a near-term risk for systems.”

Critical harms burdens were possibly too heavy for even big players

Moreover, the critical harms burdens of the bill might have been too heavy for even the most prominent players to bear. "The critical harm definition is so broad that developers will be required to make assurances and make guarantees that span a huge number of potential risk areas and make guarantees that are very difficult to do if you're releasing that model publicly and openly," Benjamin Brooks, a fellow at the Berkman Klein Center for Internet & Society at Harvard University and former head of public policy for Stability AI, tells CSO.
