Bill Summaries: H301 SOCIAL MEDIA & AI SAFETY. (NEW)

Tracking:
  • Summary date: Apr 29 2026

    Senate committee substitute to the 3rd edition makes the following changes. Divides the act into parts and makes other organizational changes. Makes conforming changes to act’s titles.

    Part I.

    Makes the previous content effective October 1, 2026, instead of 2025.

    Adds the following new content.

    Part II.

    Requires the State Board of Education (SBE) to adopt age-appropriate standards for instruction on artificial intelligence (AI) literacy for grades kindergarten through 12, including the five topics described in GS 115C-81.90. Tasks SBE with reviewing and updating these standards every two years to keep pace with advancements in AI. Starting in the 2028-29 school year, requires SBE to revise the standard course of study for computer science for grades kindergarten through 12 to include AI literacy in accordance with GS 115C-81.90(a1), as enacted by the act, above. Tasks SBE, in consultation with the Department of Public Instruction (DPI), with updating the lists of approved courses to reflect course alignment with the revised computer science standards, beginning with the 2028-29 school year. Requires DPI to report to the specified NCGA committee on the three specified matters relating to the adoption of revised computer science standards by December 15, 2028.

    Part III.

    Adds GS 115C-102.13, requiring DPI, by no later than December 31, 2026, to adopt a model AI policy, including the four described topics, to serve as guidance to public school units when developing their AI policies. Requires local boards of education (GS 115C-47), the Board of Trustees of the NC schools for the deaf and blind (GS 115C-150.12C), charter schools (GS 115C-218.33), regional schools (GS 115C-238.66), and the Chancellor of the UNC Laboratory Schools (GS 116-239.8) to adopt a policy on AI use by students and staff for educational purposes after reviewing DPI’s model policy. Tasks the Superintendent with ensuring that public school units have access to DPI’s model policy by no later than January 15, 2027. Requires public school units to adopt the required policies by no later than June 30, 2027.

    Part IV.

    Requires DPI to establish and maintain an evaluation framework, as described, that provides criteria and guiding considerations for evaluating generative artificial intelligence-powered educational tools (AI tools) in new GS 115C-102.14. Tasks DPI with reviewing and updating the framework every two years to keep pace with changes in technology, evidence, or educational practice. Provides for procurement guidance, qualified vendor lists, and other tools to support and incentivize the adoption of AI tools that have been reviewed under the framework. Requires DPI to maintain a publicly available list of all AI tools reviewed under the framework and all AI tools being used in public schools.

    Part V.

    Directs DPI to partner with the Friday Institute for Educational Innovation at North Carolina State University (Friday Institute) to design, produce, and support implementation of a suite of tool-agnostic online training modules and related training resources addressing the eight described topics. Requires the Friday Institute to produce a suite of self-paced modules that require at least 10 hours of seat time to complete, a facilitator guide, model classroom resources, and “train the trainer” materials. Requires these resources to be made available to public schools by June 30, 2027. Requires all teachers employed by local school administrative units, charter schools, or laboratory schools to complete the professional development by June 30, 2028.

    Requires DPI to report to the specified NCGA committee on the Friday Institute’s modules, including any recommendations for updates or additional support needed, by December 15, 2028.

    Part VI.

    Effective when it becomes law, except as otherwise provided.


  • Summary date: Apr 15 2025

    House committee substitute to the 2nd edition makes the following changes. 

    Increases the exceptions to the definition of a social media platform under GS Chapter 114B, “Social Media Protections for Minors Act” by now also exempting an online service, website, or application that consists primarily of news, sports, entertainment, or other information or content that is not user generated but preselected by the provider, and for which any chat, comments, or interactive functionality is incidental to, directly related to, or dependent on the provision of such content. Now exempts all interactive video game services equipped with parental controls (was, just those that must be deactivated for minors to use). 


  • Summary date: Apr 1 2025

    House committee substitute to the 1st edition makes the following changes. Makes conforming changes, including to act’s long title.

    Modifies new GS Chapter 114B, “Social Media Protections for Minors Act” (Act), as follows. Changes the definition of account holder so that it means a person who opens an account or creates a profile or is identified by the social media platform by a unique identifier while using or accessing a social media platform when the social media platform knows or has reason to believe the person is a resident of this State (was, a resident of the State that the social media platform had reason to believe was located in the State). Adds new terms anonymous age verification and standard age verification. Makes technical change to social media platform or platform and adds a community forum where the primary purpose of the forum is for customer self-service support, an interactive video game service equipped with parental controls that must be deactivated for a minor to use, online shopping, and e-commerce to the types of digital platforms that are excluded from the term.

    Modifies the remedies provisions of new GS 114B-2 (following the social media protections for minors) so they all apply to any violation of GS 114B-2 (previously, they applied only to violations pertaining to minors under 14 years of age). Changes the effective date of new GS Chapter 114B from March 1, 2025, to October 1, 2025.

    Substantially rewrites new GS 114B-3, now concerning age verification for social media platforms, as follows. Now requires a social media platform to use either anonymous age verification or standard age verification to verify that an account holder is 16 years of age or older and, except as provided in GS 114B-2(b), to prevent creation of an account by a person younger than 16 years of age. Requires the social media platform to offer both anonymous age verification and standard age verification, and allows a person attempting to create an account to select which method will be used to verify the person’s age. (Previously, directed a commercial entity that knowingly and intentionally publishes or distributes material harmful to minors on a website or application, if the website or application contains a substantial portion (more than 33.3%) of material harmful to minors, to use either anonymous age verification or standard age verification to verify that the age of a person attempting to access the material is 16 years of age or older and to prevent access to the material by a person younger than 16 years of age.) Makes organizational and conforming changes, including to the statute’s title. Removes the defined terms anonymous age verification, commercial entity, distribute, news-gathering organization, publish, standard age verification, and substantial portion. Removes provisions containing exemptions to GS 114B-3.


  • Summary date: Mar 5 2025

    Enacts new GS Chapter 114B, “Social Media Protections for Minors Act” (Act), effective March 1, 2025. Defines account holder, daily active users, Department (Department of Justice), minor (person under 16 years of age), and resident. Defines social media platform or platform as an online forum, website, or application that satisfies the four listed criteria, including that it allows users to upload content or view the content or activity of other users, that it employs algorithms that analyze user data or information to select user content, and that it has certain addictive features and meets specified user criteria. Excludes from the definition of social media platform an online service, website, or application whose exclusive function is email or direct messaging consisting of text, photographs, pictures, images, or videos shared only between the sender and the recipients, without displaying or posting publicly or to other users not specifically identified as the recipients by the sender.

    Adds new GS 114B-2, outlining the following social media protections for minors. Requires social media platforms to bar minors under the age of 14 from those platforms and to only allow minors aged 14 and 15 on such platforms with parental consent. Specifies that if a social media platform allows an account holder to use the platform, the parties have entered into a contract. 

    For any account holders that are younger than 14, requires the platform to:

    1. Terminate any account upon 30 days' notice to the minor account holder. Termination must be effective upon the expiration of the 30 days if the account holder fails to effectively dispute the termination.
    2. Permanently delete all personal information held by the social media platform relating to the terminated account, unless there are legal requirements to maintain the information.

    For any account holders that are aged 14 or 15, requires the platform to:

    1. Terminate any account held by an account holder if the account holder's parent or guardian has not provided consent for the minor to create or maintain the account. The social media platform must provide 30 days for an account holder to dispute the termination.
    2. Allow the parent or guardian of an account holder to request that the minor's account be terminated. Termination must be effective within 10 business days after the request.
    3. Permanently delete all personal information held by the social media platform relating to the terminated account unless there are legal requirements to maintain the information.

    Provides for civil enforcement against a platform in violation of the Act as an unfair trade practice by the Department. Allows the Department to impose a civil penalty of up to $50,000 per violation and reasonable attorneys’ fees and court costs. Allows for punitive damages if the platform’s failure to comply with the Act is a consistent pattern of knowing or reckless conduct. Provides for liability by the platform to the minor account holder for social media platforms that knowingly or recklessly violate the Act, including court costs, reasonable attorneys’ fees, and up to $10,000 in damages. Provides for standing, personal jurisdiction, and statute of limitations. Specifies that the Act does not preclude any other available remedies at law or in equity.

    Gives the Department subpoena power in investigating violations of the Act. Provides processes for (1) a party to object to a subpoena issued by the Department; (2) out-of-State responses to a subpoena; and (3) enforcement of the subpoena by the Department in a court of law. Provides for a $5,000 weekly penalty, reasonable attorneys’ fees, and costs for any entity or person that fails to appear with the intent to avoid, evade, or prevent compliance in whole or in part with any investigation under the Act, or who removes from any place, conceals, withholds, mutilates, alters, destroys, or by any other means falsifies any documentary material in the possession, custody, or control of any entity or person subject to any such subpoena, or knowingly conceals any relevant information with the intent to avoid, evade, or prevent compliance.

    Requires that the clear proceeds of any civil penalties be remitted to the Civil Penalty and Forfeiture Fund. Allows the Department to adopt rules to implement the Act.

    Adds new GS 114B-3, concerning age verification for online access to materials harmful to minors, as follows. Defines anonymous age verification, commercial entity, distribute, news-gathering organization, publish, standard age verification, and substantial portion.

    Incorporates the definition of harmful to minors found in GS 14-190.13:

    That quality of any material or performance that depicts sexually explicit nudity or sexual activity and that, taken as a whole, has the following characteristics:

    • The average adult person applying contemporary community standards would find that the material or performance has a predominant tendency to appeal to a prurient interest of minors in sex; and
    • The average adult person applying contemporary community standards would find that the depiction of sexually explicit nudity or sexual activity in the material or performance is patently offensive to prevailing standards in the adult community concerning what is suitable for minors; and
    • The material or performance lacks serious literary, artistic, political, or scientific value for minors.

    Adds that the term also includes any material that the average person applying contemporary community standards would find, taken as a whole, appeals to the prurient interest or depicts or describes, in a patently offensive way, sexual conduct and when taken as a whole, lacks serious literary, artistic, political, or scientific value for minors.

    Directs a commercial entity that knowingly and intentionally publishes or distributes material harmful to minors on a website or application, if the website or application contains a substantial portion (more than 33.3%) of material harmful to minors, to use either anonymous age verification or standard age verification to verify that the age of a person attempting to access the material is 16 years of age or older and prevent access to the material by a person younger than 16 years of age. Requires the commercial entity to offer anonymous age verification and standard age verification. Specifies that a person attempting to access the material may select which method will be used to verify the person's age. Requires a commercial entity to ensure that the third party conducting anonymous age verification complies with four listed requirements, including not retaining personal identifying information used to verify age once the account holder’s/account applicant’s age has been verified. Deems violation of the section pertaining to age verification and third party age verification an unfair trade practice actionable solely by the Department on behalf of a resident minor against a commercial entity. Allows for a civil penalty of up to $50,000 per violation and reasonable attorneys' fees and court costs. Allows for punitive damages if the entity’s failure to comply with the Act is a consistent pattern of knowing or reckless conduct.

    Clarifies that an internet service provider or its affiliates or subsidiaries, a search engine, or a cloud service provider does not violate GS 114B-2 solely for providing access or connection to or from a website or other information or content on the internet, or a facility, system, or network not under the provider’s control, including transmission, downloading, intermediate storage, or access software, to the extent the provider is not responsible for the creation of the content of the communication that constitutes material harmful to minors. Exempts a bona fide news or public interest broadcast, website, video, report, or event, and does not affect the rights of a news-gathering organization.

    Sets forth the General Assembly’s intent that the Act be liberally construed. Contains severability clause for the Act.