Technology‑Neutral Anti‑Discrimination Clarification Act

Summary

This act reaffirms that discrimination based on protected characteristics is unlawful whether it occurs in housing, employment, public accommodations, education, credit, or any other context covered by state civil‑rights statutes. Advances in software, automation, algorithmic decision‑making, artificial intelligence, and other digital technologies do not create a lawful avenue to engage in conduct that existing law already prohibits. Clarifying that discriminatory conduct remains unlawful regardless of the tools used will remove any doubt that digital implementation does not confer legal immunity, keep state law neutral toward future technologies, and avoid imposing vicarious liability on technology developers who neither intend nor knowingly facilitate discriminatory conduct.

Technology‑Neutral Anti‑Discrimination Clarification Act

Section 1. Short Title 

This Act may be cited as the “Technology‑Neutral Anti‑Discrimination Clarification Act of 20XX.”

Section 2. Legislative Purpose and Findings 

  1. The legislature reaffirms that discrimination based on protected characteristics is unlawful in [State] whether it occurs in housing, employment, public accommodations, education, credit, or any other context covered by state civil‑rights statutes.
  2. Advances in software, automation, algorithmic decision‑making, artificial intelligence, and other digital technologies do not create a lawful avenue to engage in conduct that existing law already prohibits.
  3. Clarifying that discriminatory conduct remains unlawful regardless of the tools used will:
    • Remove any doubt that digital implementation does not confer legal immunity;
    • Keep state law neutral toward future technologies; and
    • Avoid imposing vicarious liability on technology developers who neither intend nor knowingly facilitate discriminatory conduct. 

Section 3. Omnibus Clarification of Anti‑Discrimination Law 

[Use this section if the state prefers a single, catch‑all provision. If the legislature instead wants explicit amendments in each civil‑rights chapter, skip to Section 4.]

§ 3‑1. Digital‑Assisted Discrimination Prohibited

All prohibitions on discrimination contained in the [State Code] apply with full force and effect regardless of whether the challenged conduct is executed, facilitated, or scaled, in whole or in part, by means of any digital, automated, algorithmic, artificial‑intelligence, machine‑learning, or other technological process. 

Section 4. Targeted Amendment to Civil‑Rights Definitions 

[Use this approach if the legislature prefers to embed the clarification inside each substantive chapter.]

  1. In [Code cite & section title—e.g., “§ xx‑xxx, ‘Definitions’ ], add a new subdivision:“( # ) ‘Discrimination’ or ‘discriminate’ includes any act, practice, or policy that would violate this chapter if carried out by a human actor alone, even when such act, practice, or policy is effected, facilitated, or scaled through any digital, automated, algorithmic, artificial‑intelligence, machine‑learning, or other technological process.Developer Safe Harbor. — A person that designs, licenses, or distributes a technology is liable under this chapter only if that person intentionally engages in, directs, or knowingly assists the discriminatory conduct.
  2. Replicate the subdivision added in paragraph (1) in every civil‑rights chapter that maintains its own definition section (e.g., fair housing, fair employment, public accommodations), unless those chapters already incorporate the master definition by reference.

Section 5. Rules of Construction 

  1. Technology‑Neutrality. — The phrases “digital, automated, algorithmic, artificial‑intelligence, machine‑learning, or other technological process” are illustrative and non‑exhaustive; they encompass present and future means of computation, communication, or decision‑making.
  2. No Expansion of Protected Classes. — Nothing in this Act alters the list of protected characteristics enumerated elsewhere in state law.
  3. Developer Safe Harbor. — A creator, publisher, or distributor of technology is liable under this Act only if that party intentionally engages in, directs, or knowingly aids the underlying discriminatory conduct.
  4. Continuity of Legal Standards. — Existing burdens of proof, defenses, and remedies applicable to discrimination claims remain unchanged. 

Section 6. Severability 

If any provision of this Act or its application to any person or circumstance is held invalid, the remainder of the Act—and the application of its provisions to other persons or circumstances—shall not be affected. 

Section 7. Effective Date 

This Act takes effect on [January 1 of the calendar year following enactment] and applies to conduct occurring on or after that date.