AI CHATBOTS - LICENSING/SAFETY/PRIVACY.

2025-2026 Session
Senate Bill 624 (Public) Filed Tuesday, March 25, 2025
AN ACT REGULATING ARTIFICIAL INTELLIGENCE CHATBOT LICENSING, SAFETY, AND PRIVACY IN NORTH CAROLINA.
Intro. by Burgin.

Status: Ref To Com On Rules and Operations of the Senate (Senate action) (Mar 26 2025)
S 624

Bill Summaries:

  • Summary date: Mar 31 2025

    Enacts new GS Chapter 114B, licensing of chatbots, as follows.

    Defines chatbot as a generative artificial intelligence system with which users can interact by or through an interface that approximates or simulates conversation through a text, audio, or visual medium. Defines generative artificial intelligence system as one that uses artificial intelligence, as defined in the specified federal act, to generate or substantially modify image, video, audio, multimedia, or text content. Defines additional terms used in the Chapter.

    Requires obtaining a health information chatbot license before operating or distributing a chatbot that deals substantially with health information. Sets out information that must be included in the license application, including quality control and testing procedures, proof of insurance coverage, and documentation of the chatbot’s security measures and protocols. Requires the Department of Justice (DOJ) to review applications based on six criteria, including public safety considerations and risk management procedures. Requires DOJ to adopt rules to carry out this Chapter. Requires licensees to maintain professional liability insurance. Requires licensees to: (1) implement industry-standard encryption for data in transit and at rest, maintain detailed access logs, and conduct regular security audits no less than once every six months; (2) report any data breaches within 24 hours to DOJ and within 48 hours to affected consumers; (3) obtain explicit user consent for data collection and use; (4) provide users with access to their personal data; and (5) provide users with the ability to delete their data upon request. Requires licensees to clearly disclose six pieces of information, including the artificial nature of the chatbot, data collection and use practices, and emergency resources when applicable. Requires a licensee to: (1) demonstrate effectiveness through peer-reviewed, controlled trials with appropriate validation studies done on appropriate sample sizes with real-world performance data; (2) demonstrate effectiveness in a comparative analysis to human expert performance; and (3) meet minimum domain benchmarks as established by the DOJ.
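
    The breach-notification and audit windows in this paragraph are fixed time offsets, so a licensee's compliance tooling could compute the relevant deadlines directly. The Python sketch below is illustrative only; the function and field names are assumptions, not terms from the bill, and "six months" is approximated as 182 days.

        from datetime import datetime, timedelta

        # Windows stated in the summary: security audits at least once every six
        # months; breach reports within 24 hours to DOJ and within 48 hours to
        # affected consumers, measured here from discovery of the breach.
        AUDIT_INTERVAL = timedelta(days=182)   # rough stand-in for six months
        DOJ_WINDOW = timedelta(hours=24)
        CONSUMER_WINDOW = timedelta(hours=48)

        def breach_deadlines(discovered_at: datetime) -> dict:
            """Latest permissible notification times for a breach discovered at
            discovered_at (illustrative helper, not statutory text)."""
            return {
                "notify_doj_by": discovered_at + DOJ_WINDOW,
                "notify_consumers_by": discovered_at + CONSUMER_WINDOW,
            }

        def next_audit_due(last_audit: datetime) -> datetime:
            """Latest time the next security audit can occur while keeping to an
            at-least-every-six-months cadence."""
            return last_audit + AUDIT_INTERVAL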

    Under new GS 114B-5, requires DOJ to enforce this Chapter and rules adopted under this Chapter. Requires the Attorney General to designate staff for the oversight and enforcement of this Chapter; designated staff may enter, at reasonable times, any factory, warehouse, or establishment in which licensed chatbots are manufactured, processed, or held, and may inspect those facilities in a reasonable manner, within reasonable limits, and within a reasonable time. Also allows DOJ to conduct digital inspections of licensed chatbots. Requires treating information that is a trade secret or confidential commercial information as confidential. Requires providing the manufacturer or importer with a detailed report on identified deficiencies and required corrective actions. Includes recordkeeping and reporting requirements for manufacturers or importers.

    Makes it illegal to: (1) introduce or deliver for introduction into state commerce any chatbot that deals substantially with health information without complying with these licensing requirements; (2) fail to comply with any requirement of this Chapter or any rule adopted hereunder; (3) refuse to permit access to or copying of any record as required by this Chapter; or (4) fail to report adverse events as required under this Chapter. Violations of GS 114B-5 are subject to civil penalties of $50,000, with the proceeds remitted to the Civil Penalty and Forfeiture Fund.

    Includes a severability clause.

    Effective January 1, 2026.

    Part II.

    Enacts new GS Chapter 170, Chatbot Safety and Privacy Act, as follows.

    Defines covered platform as any person that provides chatbot services to users in this state, if the person (1) has annual gross revenues exceeding $100,000 in the last calendar year or either of the two preceding calendar years or (2) has more than 5,000 monthly active users in the US for half or more of the months during the last 12 months. Excludes from the term any person that provides chatbot services solely for educational or research purposes and does not monetize such services through advertising or commercial uses, as well as any government entity providing chatbot services for official purposes. Defines legitimate purpose as a purpose that is lawful and in line with the stated objectives, functionalities, core services, and reasonable expectations of users on a platform. Defines other terms used in this Chapter.
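
    Read as a predicate, the coverage definition combines a revenue test, a monthly-active-user test, and two exclusions. The sketch below applies that reading; the data fields and function name are illustrative assumptions, not definitions from the bill.

        from dataclasses import dataclass
        from typing import List

        @dataclass
        class ChatbotProvider:
            # Illustrative fields for applying the coverage thresholds.
            annual_gross_revenues: List[int]     # latest calendar year first, then prior years
            monthly_us_active_users: List[int]   # US monthly active users, last 12 months
            educational_or_research_only: bool = False
            monetized: bool = False
            government_entity: bool = False

        def is_covered_platform(p: ChatbotProvider) -> bool:
            """Apply the summary's test: revenues over $100,000 in the last calendar
            year or either of the two preceding years, or more than 5,000 US monthly
            active users in at least half of the last 12 months, less the exclusions."""
            if p.government_entity:
                return False
            if p.educational_or_research_only and not p.monetized:
                return False
            revenue_test = any(r > 100_000 for r in p.annual_gross_revenues[:3])
            user_test = sum(1 for m in p.monthly_us_active_users[-12:] if m > 5_000) >= 6
            return revenue_test or user_test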

    Prohibits a covered platform from processing data or designing chatbot systems and tools in ways that significantly conflict with trusting parties’ best interests, as implicated by their interactions with chatbots. Sets out requirements for covered platforms, in carrying out their duty of loyalty in emergency situations, duty of loyalty in preventing emotional dependence on a chatbot, duty of loyalty in chatbot identity disclosure, duty of loyalty in influence, duty of loyalty in collection, duty of loyalty when personalizing content, and duty of loyalty in gatekeeping of personal information.

    Specifies that the duties between a covered platform and an end-user are to be established through a terms of service agreement that is presented to the end-user in clear, conspicuous, and easily understandable language. Requires the terms of service agreement to (1) explicitly outline the online service provider's obligations, (2) describe the rights and protections afforded to the end-user under this relationship, and (3) require affirmative consent from the end-user before the agreement takes effect. Requires notification of material changes to the terms of service agreement and renewed consent for such changes, and requires the terms to be easily accessible to users at all times through the covered platform's application or website.

    Requires the chatbot’s identification process to include the covered platform informing users that the chatbot has four specified features, including that it is not human or sentient, and is without personal preference or feelings. Requires users to give explicit and informed consent to interact with the chatbot, as described. Prohibits using deceptive design elements that manipulate or coerce users into providing consent or that obscure the nature of the chatbot or consent process. Requires repeating the chatbot identity communication and opt-in consent at the start of each new session with a user.
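
    Because the identity disclosure and opt-in consent must be repeated at the start of every new session, a platform would in practice gate each conversation behind an affirmative acknowledgment. The sketch below assumes a plain text interface; the disclosure wording and names are placeholders, and it covers only the disclosures named in this summary rather than all four required items.

        # Hypothetical session gate: present the identity disclosure and require an
        # explicit opt-in before any chatbot interaction begins in the new session.
        DISCLOSURE = (
            "You are about to interact with an automated chatbot. It is not human, "
            "is not sentient, and has no personal preferences or feelings."
        )

        def start_session(read_input=input, write_output=print) -> bool:
            """Return True only if the user gives explicit, affirmative consent.
            Intended to run at the start of every new session."""
            write_output(DISCLOSURE)
            answer = read_input("Type YES to consent to interacting with the chatbot: ")
            return answer.strip().upper() == "YES"

        if __name__ == "__main__":
            if start_session():
                print("Consent recorded; the chatbot session may begin.")
            else:
                print("No consent given; the chatbot session is not started.")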

    Requires covered platforms to: (1) ensure that all user-related data disclosed or collected through conversations between users and chatbots, or through third-party cookies, undergoes a process of de-identification prior to storage and analysis; (2) take reasonable care to prohibit the incorporation or inclusion of any sensitive personal information derived from a user during the use of a chatbot into an aggregate dataset used to train any chatbot or generative artificial intelligence system; and (3) store all chatbot conversations that do not include sensitive personal information for at least 60 days. Sets out further requirements related to these data privacy provisions.
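
    In practice, item (1) implies a storage pipeline in which de-identification runs before any record is written or analyzed. The sketch below shows that ordering only; the redaction rules are crude placeholders (a real deployment would use a vetted de-identification method), and the record fields are assumptions rather than terms from the bill.

        import re
        from datetime import datetime, timedelta

        RETENTION_FLOOR = timedelta(days=60)  # minimum retention period from item (3)

        def deidentify(text: str) -> str:
            """Placeholder de-identification: mask email addresses and long digit runs.
            A production pipeline would apply a vetted de-identification standard."""
            text = re.sub(r"[\w.+-]+@[\w-]+\.[\w.-]+", "[EMAIL]", text)
            return re.sub(r"\d{7,}", "[NUMBER]", text)

        def store_conversation(raw_text: str, created_at: datetime) -> dict:
            """De-identify before storage and analysis, and record the earliest date
            the conversation may be deleted (illustrative record layout)."""
            return {
                "text": deidentify(raw_text),
                "created_at": created_at,
                "retain_until": created_at + RETENTION_FLOOR,
                # Conservative default reflecting item (2): keep conversation data out
                # of aggregate training datasets unless separately cleared.
                "eligible_for_training": False,
            }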

    Allows the Attorney General to bring a civil action when he has reason to believe that a covered platform has committed a violation, on behalf of the State’s residents, to: (1) enjoin any practice violating this Chapter and enforce compliance with the pertinent section or sections on behalf of residents of the State; (2) obtain damages, restitution, or other compensation, each of which must be distributed in accordance with State law; or (3) obtain such other relief as the court may consider appropriate.

    Allows persons suffering an injury due to a violation of the Chapter to sue the covered platform to enjoin further violations, recover damages in an amount equal to the greater of actual damages or $1,000 per violation, obtain reasonable attorneys’ fees and litigation costs, and obtain any other appropriate relief. Requires actions to be brought within two years after the person first discovered or reasonably should have discovered the violation. Prohibits a person from bringing more than one action against the same covered platform for the same alleged violation.

    Includes a severability clause for the Chapter.

    Effective January 1, 2026.