Draft
Model State Artificial Intelligence Act
Section 1. {Statement of Purpose}
(1) The [LEGISLATIVE BODY] finds the following:
(a) WHEREAS, artificial intelligence (AI) represents the next great tool for human flourishing, artistic creativity, increased productivity, and economic growth;
(b) WHEREAS, AI also represents a major area of competition between American innovators and foreign adversaries and cyber criminals;
(c) WHEREAS, the major advancements in AI have been driven by private sector capital, ingenuity, and effort across a wide range of business models and other arrangements, including academic research and open source collaboration;
(d) WHEREAS, throughout history, major technological advances have been met by panicked and misguided responses from government regulators;
(e) WHEREAS, AI innovation is supported by the principles of a market-driven approach to policy creation; competition and technological neutrality; constitutional limits and protections against government overreach; self-governance as the preferred approach to addressing novel challenges, as opposed to increased government regulation; continuous deregulation; and, when absolutely necessary, implementing simple regulations with guardrails;
(f) WHEREAS, AI is an expansive and rapidly developing category of computer technology and applications with no consensus definition and unclear boundaries that may include a large portion of existing computer software, and any regulatory body or law should take into account the broad potential effects of regulation, with the goal of creating a sectoral approach for specific AI uses;
(g) WHEREAS, the category of computer technology labeled “AI” is rapidly changing, and any regulatory body or law should take into account that any such regulation may soon be outdated;
(h) WHEREAS, specialized regulatory and government agencies can serve a vital role in protecting consumers, but they are also prone to regulatory capture by special interests;
(i) BE IT RESOLVED, the [LEGISLATIVE BODY] supports efforts by state and federal functional regulators to enforce existing anti-discrimination and other laws against regulated entities regardless of AI use;
(j) BE IT FURTHER RESOLVED, the [LEGISLATIVE BODY] rejects any attempt by the federal, state, or municipal governments to ban the use of AI, or to significantly curtail the use and advancement of AI by undermining the above principles;
(k) BE IT FURTHER RESOLVED, the [LEGISLATIVE BODY] believes that any new federal regulations governing the use of AI technology must emerge from clear statutory directives of the U.S. Congress, and not be promulgated absent such directives by federal agencies.
Section 2. {Definitions}
(1) “Applicant” means a person that applies for participation in the Learning Laboratory.
(2) “Learning Agenda” means the areas of artificial intelligence applications, risks, and policy considerations selected by the Office for focus by the Learning Laboratory.
(3) “Learning Laboratory” means the artificial intelligence analysis and research program created in Section 6 of this Act.
(4) “Office” means “The Office of Artificial Intelligence Policy” as created in Section 3 of this Act.
(5) “Participant” means a person that is accepted to participate in the Learning Laboratory.
(6) “Regulatory mitigation” means:
(a) When restitution to users may be required;
(b) Terms and conditions related to any cure period before penalties may be assessed;
(c) Any reduced civil fines during the participation term; and
(d) Other terms tailored to identified issues of the artificial intelligence technology.
(7) “Regulatory mitigation agreement” means an agreement between a participant, the Office, and relevant state agencies described in Section 8 of this Act.
(8) “State agency” means:
(a) An authority, a board, a branch, a commission, a committee, a department, a division, or another instrumentality of:
(i) The executive branch of state government, including the administration; or
(ii) The legislative branch of state government.
(b) “State agency” does not include the judicial branch of state government.
Section 3. {Creation of Office of Artificial Intelligence Policy – Director Appointed – Duties and Authority}
(1) There is created in the Office of the Attorney General the Office of Artificial Intelligence Policy.
(2) The purposes of the Office are to:
(a) Identify regulatory barriers to AI development, deployment, and use;
(b) Recommend regulatory proposals to remove or avoid such barriers;
(c) Identify regulatory gaps where existing law is insufficient to prevent or redress substantial, non-speculative, concrete, and redressable harm; and
(d) Recommend regulatory proposals to fill such gaps.
(3) The Attorney General shall appoint a director to oversee the management and operations of the Office.
(4) The Office shall:
(a) Conduct the Inventory of Existing State Regulation of Artificial Intelligence Technology described in Section 5 of this Act;
(b) Create and administer an Artificial Intelligence Learning Laboratory program;
(c) Consult with businesses and other stakeholders in the state about potential regulatory proposals;
(d) Consult with offices, learning laboratories, or similar bodies in other states;
(e) Make rules in accordance with [STATE ADMINISTRATIVE PROCEDURE ACT], establishing:
(i) Procedures, requirements, and fees for applying to participate in the Learning Laboratory program;
(ii) Criteria for invitation, acceptance, denial, or removal of participants;
(iii) Data usage limitations and cybersecurity criteria for participants;
(iv) Required participant disclosures to consumers;
(v) Reporting requirements for participants to the Office;
(vi) Criteria for limited extension of the participation period; and
(vii) Other requirements as necessary to administer the Learning Laboratory; and
(f) Report annually, before [DATE], to the [LEGISLATIVE COMMITTEES OF JURISDICTION] regarding:
(i) The proposed learning agenda for the Learning Laboratory;
(ii) The findings, participation, and outcomes of the Learning Laboratory; and
(iii) Recommended legislation based on findings from the Inventories and the Learning Laboratory.
Section 4. {Inventory of Artificial Intelligence Technology Use by State Agencies}
(1) Not later than [_____ MONTHS AFTER DATE OF ENACTMENT], each state agency may:
(a) Compile, in a form specified by the Office, an inventory of all artificial intelligence technologies that are:
(i) In use by the state agency; or
(ii) Being developed or considered by the state agency for use.
(b) Submit the inventory to:
(i) The Office;
(ii) The Attorney General;
(iii) The Governor;
(iv) The Speaker of the House of Representatives, the President of the Senate, and the Chairs and Ranking Minority Members of the [LEGISLATIVE COMMITTEES OF JURISDICTION].
(2) A state agency’s inventory under Section 4 (1) shall include the following information for each artificial intelligence technology included in the inventory:
(a) The vendor of the artificial intelligence technology.
(b) A description of the function and capabilities of the artificial intelligence technology.
(c) A description of:
(i) The purpose or purposes for which the state agency uses the artificial intelligence technology;
(ii) Any purpose for which the state agency contemplates using the artificial intelligence technology in the future; and
(iii) Examples of the data or information produced by the artificial intelligence technology for each purpose.
(d) Whether the artificial intelligence technology provides:
(i) The state agency with information or data that is used by the state agency to inform decisions made by the state agency; or
(ii) Decisions, without human intervention, that are implemented by the state agency.
(e) The types of information or data used by the artificial intelligence technology and the source of the information used by the artificial intelligence technology.
(f) The manner in which the state agency secures the following from unauthorized access:
(i) Artificial intelligence technology;
(ii) Information or data used by the artificial intelligence technology; and
(iii) Information or data produced by the artificial intelligence technology.
(g) Any person with which the state agency shares the information or data produced by the artificial intelligence technology and the purpose for which the state agency shares the information or data with the person.
(h) The documented or anticipated benefits and risks of the state agency’s use of the artificial intelligence technology for both the state agency and state residents served by the state agency.
(i) Any information or data used by the state agency to assess the benefits and risks of the state agency’s use of the artificial intelligence technology.
(j) The fiscal effect of the state agency’s use of the artificial intelligence technology, including the following:
(i) Costs associated with the artificial intelligence technology, including:
(A) Initial acquisition or development costs; and
(B) Ongoing operating costs, including costs of licensing, maintenance, legal compliance, and data storage and security.
(ii) Any funding source that is used, or could be used, by the state agency to defray the costs described;
(iii) An estimate of the degree to which the costs described are offset by a reduction in the state agency’s operating costs attributable to the state agency’s use of the artificial intelligence technology.
(k) Whether the artificial intelligence technology has been tested or evaluated by an independent third party.
(l) Whether the data or information produced by the artificial intelligence technology has been:
(i) Evaluated for bias against constitutionally protected classes of individuals;
(ii) Found to exhibit bias; and
(iii) Adjusted to mitigate any such bias.
(3) Not later than [_____ MONTHS AFTER DATE OF ENACTMENT], the Office may prescribe a form for use by state agencies for compilation and submission of the inventory described in this Section.
Section 5. {Inventory of Existing State Regulation of Artificial Intelligence Technology}
(1) Inventory Required. Not later than [_____ MONTHS AFTER DATE OF ENACTMENT OF THIS ACT], the Office, in consultation with relevant state agencies, shall conduct a comprehensive analysis of the existing regulatory governance of artificial intelligence technology in the State.
(a) Contents of Inventory. The analysis conducted under subsection (1) shall include:
(i) A review of existing laws, regulations, executive orders, and state agency rulemaking that pertain to the regulation of the development and use of artificial intelligence technology within the State;
(ii) An assessment of the specific category of artificial intelligence use governed by each existing law, regulation, executive order, and state agency rulemaking, and whether each achieves its purpose without impeding the development and use of artificial intelligence technology;
(iii) An identification of any gaps where existing laws, regulations, executive orders, or state agency rulemaking are insufficient to prevent or redress substantial, non-speculative, concrete, and redressable harm from a specific use of artificial intelligence technology; and
(iv) An identification of state agencies that possess statutory authority to regulate development and use of artificial intelligence technology.
(b) Submission of Analysis. Upon completion of the analysis under subsection (1), the Office shall submit the findings to:
(i) The Attorney General;
(ii) The Governor;
(iii) The Speaker of the House of Representatives, the President of the Senate, and the Chairs and Ranking Minority Members of the [LEGISLATIVE COMMITTEES OF JURISDICTION].
Section 6. {Artificial Intelligence Learning Laboratory Program}
(1) There is established the Artificial Intelligence Learning Laboratory Program, to be administered by the Office.
(2) The purpose of the Learning Laboratory is to:
(a) Analyze and research the benefits, risks, impacts, and policy implications of artificial intelligence technologies to inform the state regulatory framework;
(b) Encourage development of artificial intelligence technologies in the state;
(c) Evaluate the effectiveness and viability of current, potential, or proposed regulation on artificial intelligence technologies in cooperation with artificial intelligence developers; and
(d) Produce findings and recommendations for legislation and regulation of specific artificial intelligence uses.
(3) The Office shall periodically set a learning agenda for the Learning Laboratory that establishes the specific areas of artificial intelligence policy the Office intends to study.
(4) The initial learning agenda shall include identifying specific categories of artificial intelligence uses with similar profiles of benefits, risks, impacts, and associated regulatory bodies.
(5) In establishing the learning agenda, the Office may consult with:
(a) Relevant agencies;
(b) Industry leaders;
(c) Academic institutions in the State; and
(d) Key stakeholders with relevant knowledge, experience, or expertise in the area.
(6) The Office may invite and receive an application from a person to participate in the Learning Laboratory.
(7) The Office shall establish the procedures and requirements for issuing invitations and receiving applications to participate in the Learning Laboratory in accordance with the purposes of the Learning Laboratory.
(8) Open-source projects shall be eligible for participation in the Learning Laboratory.
(9) In selecting participants for the Learning Laboratory, the Office shall consider:
(a) The relevance and utility of an invitee or applicant’s artificial intelligence technology to the learning agenda;
(b) The invitee or applicant’s expertise and knowledge specific to the learning agenda; and
(c) Other factors identified by the Office as relevant to participation in the Learning Laboratory.
(10) The Office shall work with participants to establish benchmarks and assess outcomes of participation in the Learning Laboratory.
Section 7. {Participation in the Artificial Intelligence Learning Laboratory}
(1) The Office may approve an applicant to participate in the program.
(2) An approved applicant becomes a participant by entering into a participation agreement with the Office and relevant state agencies.
(3) A participant shall:
(a) Provide required information to state agencies in accordance with the terms of the participation agreement; and
(b) Report to the Office as required in the participation agreement.
(4) A participant shall retain records as required by Office rule or the participation agreement.
(5) A participant shall immediately report to the Office any incidents resulting in consumer harm, privacy breach, or unauthorized data usage, which may result in removal of the participant from the Learning Laboratory.
Section 8. {Artificial Intelligence Regulatory Mitigation Agreements}
(1) A participant who uses or intends to use an artificial intelligence technology in the state may apply for regulatory mitigation according to criteria and procedures outlined by the Office by rule made under Section 3 of this Act.
(2) The Office, together with relevant agencies, may grant regulatory mitigation to a participant on a temporary basis by entering into a regulatory mitigation agreement with the participant.
(3) To receive regulatory mitigation, a participant must demonstrate that it meets the eligibility criteria established in Section 9 of this Act.
(4) A regulatory mitigation agreement between a participant, the Office, and relevant agencies shall specify:
(a) Limitations on scope of the use of the participant’s artificial intelligence technology, including:
(i) The number and types of users;
(ii) Geographic limitations; and
(iii) Other limitations to implementation;
(b) Safeguards to be implemented; and
(c) Any regulatory mitigation granted to the participant.
(5) The Office shall consult with relevant agencies regarding appropriate terms in a regulatory mitigation agreement.
(6) A participant remains subject to all legal and regulatory requirements not expressly waived or modified by the terms of the regulatory mitigation agreement.
(7) The Office may remove a participant at any time and for any reason, and the participant does not have an expectation of a property right or license to participate in the Learning Laboratory.
(8) A participant demonstrating an artificial intelligence technology that violates legal or regulatory requirements (taking into account any regulatory mitigation agreement), or that violates the terms of the participation agreement, may be immediately removed from further participation and subject to all applicable civil and criminal penalties.
(9) Participation in the Learning Laboratory does not constitute an endorsement or approval from the State.
(10) Participation or non-participation in the Learning Laboratory does not constitute a legally cognizable factor for any tort claim, civil law violation, or criminal law violation.
(11) The State shall not be responsible for any claims, liabilities, damages, losses, or expenses arising out of a participant’s involvement in the Learning Laboratory.
Section 9. {Regulatory Mitigation Eligibility Requirements}
(1) To be eligible for regulatory mitigation, a participant shall demonstrate to the Office that:
(a) The participant has the technical expertise and capability to responsibly develop and test the proposed artificial intelligence technology;
(b) The participant has sufficient financial resources to meet obligations during testing;
(c) The artificial intelligence technology offers substantial potential consumer benefits that may outweigh the identified risks arising from mitigated enforcement of regulations;
(d) The participant has an effective plan to monitor and minimize identified risks from testing; and
(e) The scale, scope, and duration of the proposed testing are appropriately limited based on risk assessments.
(2) To evaluate whether an applicant meets eligibility criteria to receive regulatory mitigation, the Office may consult with relevant agencies and outside experts regarding the application.
Section 10. {Program Extension}
(1) An initial regulatory mitigation agreement shall be in force for no longer than 12 months.
(2) A participant may request a single 12-month extension of the Learning Laboratory participation period no later than 30 days before the end of the initial 12-month period.
(3) The Office shall grant or deny an extension request before expiration of the initial 12-month period.
Section 11. {Effective Date}
Section 12. {Sunset Clause}
The provisions of this Act are repealed on [ONE YEAR AFTER ENACTMENT], unless renewed by the [LEGISLATIVE BODY] for an additional one-year period.