<?xml version="1.0" ?>
<ns0:MeasureDoc xmlns:html="http://www.w3.org/1999/xhtml" xmlns:ns0="http://lc.ca.gov/legalservices/schemas/caml.1#" xmlns:ns3="http://www.w3.org/1999/xlink" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" version="1.0" xsi:schemaLocation="http://lc.ca.gov/legalservices/schemas/caml.1# xca.1.xsd">
<ns0:Description>
<ns0:Id>20250SB__111998AMD</ns0:Id>
<ns0:VersionNum>98</ns0:VersionNum>
<ns0:History>
<ns0:Action>
<ns0:ActionText>INTRODUCED</ns0:ActionText>
<ns0:ActionDate>2026-02-17</ns0:ActionDate>
</ns0:Action>
<ns0:Action>
<ns0:ActionText>AMENDED_SENATE</ns0:ActionText>
<ns0:ActionDate>2026-03-25</ns0:ActionDate>
</ns0:Action>
</ns0:History>
<ns0:LegislativeInfo>
<ns0:SessionYear>2025</ns0:SessionYear>
<ns0:SessionNum>0</ns0:SessionNum>
<ns0:MeasureType>SB</ns0:MeasureType>
<ns0:MeasureNum>1119</ns0:MeasureNum>
<ns0:MeasureState>AMD</ns0:MeasureState>
</ns0:LegislativeInfo>
<ns0:AuthorText authorType="LEAD_AUTHOR">Introduced by Senator Padilla</ns0:AuthorText>
<ns0:AuthorText authorType="PRINCIPAL_COAUTHOR_OPPOSITE">(Principal coauthors: Assembly Members Bauer-Kahan and Wicks)</ns0:AuthorText>
<ns0:Authors>
<ns0:Legislator>
<ns0:Contribution>LEAD_AUTHOR</ns0:Contribution>
<ns0:House>SENATE</ns0:House>
<ns0:Name>Padilla</ns0:Name>
</ns0:Legislator>
<ns0:Legislator>
<ns0:Contribution>PRINCIPAL_COAUTHOR</ns0:Contribution>
<ns0:House>ASSEMBLY</ns0:House>
<ns0:Name>Bauer-Kahan</ns0:Name>
</ns0:Legislator>
<ns0:Legislator>
<ns0:Contribution>PRINCIPAL_COAUTHOR</ns0:Contribution>
<ns0:House>ASSEMBLY</ns0:House>
<ns0:Name>Wicks</ns0:Name>
</ns0:Legislator>
</ns0:Authors>
<ns0:Title>An act to add Chapter 22.6.1 (commencing with Section 22610) to Division 8 of the Business and Professions Code, relating to artificial intelligence.</ns0:Title>
<ns0:RelatingClause>artificial intelligence</ns0:RelatingClause>
<ns0:GeneralSubject>
<ns0:Subject>Companion chatbots: children’s safety.</ns0:Subject>
</ns0:GeneralSubject>
<ns0:DigestText>
<html:p>Existing law generally regulates artificial intelligence, including companion chatbots, as defined. Existing law requires an operator, as defined, to prevent a companion chatbot on its companion chatbot platform from engaging with users unless the operator maintains a protocol for preventing the production of suicidal ideation, suicide, or self-harm content to the user. Existing law requires an operator, for a user the operator knows is a minor, to, among other things, notify the user that the user is interacting with artificial intelligence and to disclose that companion chatbots may not be suitable for some minors, as specified.</html:p>
<html:p>The Digital Age Assurance Act requires a person who owns, maintains, or controls a software application, as defined, to request age
bracket data sent by a real-time secure application programming interface or operating system with respect to a particular user from an operating system provider or a covered application store when the application is downloaded and launched.</html:p>
<html:p>This bill would require an operator of a companion chatbot to, on or before July 1, 2027, do various things with respect to child safety and companion chatbots, including annually perform and document a comprehensive risk assessment to identify any child safety risk posed by the design, configuration, and operation of the companion chatbot that assesses, among other things, the likelihood of a covered harm, as defined, occurring to users. The bill would require an operator to submit to an independent audit of its compliance with those provisions, as specified, and would require, within 90 days of completing an independent audit, the auditor to submit an AI child safety audit report to the Attorney General for any audited
companion chatbot. The bill would, except as specified, require those audit reports to be kept confidential.</html:p>
<html:p>This bill would, beginning January 1, 2028, require the Attorney General to issue an annual public report on the audits submitted pursuant to the above-described provision, as specified. The bill would authorize a public prosecutor to bring a certain civil action to enforce the bill’s provisions and would authorize a child who suffers actual harm as a result of a violation of this chapter, or a parent or guardian acting on behalf of that child, to bring a civil action against the operator to recover, among other relief, exemplary damages.</html:p>
<html:p>Existing constitutional provisions require that a statute that limits the right of access to the meetings of public bodies or the writings of public officials and agencies be adopted with findings demonstrating the interest protected by the limitation and the need
for protecting that interest.</html:p>
<html:p>This bill would make legislative findings to that effect.</html:p>
</ns0:DigestText>
<ns0:DigestKey>
<ns0:VoteRequired>MAJORITY</ns0:VoteRequired>
<ns0:Appropriation>NO</ns0:Appropriation>
<ns0:FiscalCommittee>YES</ns0:FiscalCommittee>
<ns0:LocalProgram>NO</ns0:LocalProgram>
</ns0:DigestKey>
<ns0:MeasureIndicators>
<ns0:ImmediateEffect>NO</ns0:ImmediateEffect>
<ns0:ImmediateEffectFlags>
<ns0:Urgency>NO</ns0:Urgency>
<ns0:TaxLevy>NO</ns0:TaxLevy>
<ns0:Election>NO</ns0:Election>
<ns0:UsualCurrentExpenses>NO</ns0:UsualCurrentExpenses>
<ns0:BudgetBill>NO</ns0:BudgetBill>
<ns0:Prop25TrailerBill>NO</ns0:Prop25TrailerBill>
</ns0:ImmediateEffectFlags>
</ns0:MeasureIndicators>
</ns0:Description>
<ns0:Bill id="bill">
<ns0:Preamble>The people of the State of California do enact as follows:</ns0:Preamble>
<ns0:BillSection id="id_B9DEFE74-C805-426F-BD63-64B391F7EE0A">
<ns0:Num>SECTION 1.</ns0:Num>
<ns0:ActionLine action="IS_ADDED" ns3:href="urn:caml:codes:BPC:caml#xpointer(%2Fcaml%3ALawDoc%2Fcaml%3ACode%2Fcaml%3ALawHeading%5B%40type%3D'DIVISION'%20and%20caml%3ANum%3D'8.'%5D%2Fcaml%3ALawHeading%5B%40type%3D'CHAPTER'%20and%20caml%3ANum%3D'22.6.1.'%5D)" ns3:label="fractionType: LAW_SPREAD||commencingWith: 22610" ns3:type="locator">
Chapter 22.6.1 (commencing with Section 22610) is added to Division 8 of the
<ns0:DocName>Business and Professions Code</ns0:DocName>
, to read:
</ns0:ActionLine>
<ns0:Fragment>
<ns0:LawHeading id="id_4D29DE75-449F-472E-AF7A-A6FFBC1315DF" type="CHAPTER">
<ns0:Num>22.6.1.</ns0:Num>
<ns0:LawHeadingVersion id="id_AAF31A0C-9E40-4CE9-80F5-F0C2792EEB9D">
<ns0:LawHeadingText>Companion Chatbots: Children’s Safety</ns0:LawHeadingText>
</ns0:LawHeadingVersion>
<ns0:LawSection id="id_25E2B9C8-E2BC-425F-AAE9-3BD831F221DA">
<ns0:Num>22610.</ns0:Num>
<ns0:LawSectionVersion id="id_DD8F38D4-774C-469C-9466-839B4DB8EB0B">
<ns0:Content>
<html:p>As used in this chapter:</html:p>
<html:p>
(a)
<html:span class="EnSpace"/>
“Child” means a natural person under 18 years of age.
</html:p>
<html:p>
(b)
<html:span class="EnSpace"/>
“Child safety audit” means an audit for compliance with this chapter conducted by an independent auditor certified by the Attorney General.
</html:p>
<html:p>
(c)
<html:span class="EnSpace"/>
“Child safety policy” means a public-facing document describing protective measures taken by an operator to mitigate identified child safety risks.
</html:p>
<html:p>
(d)
<html:span class="EnSpace"/>
“Child safety risk” means a reasonably foreseeable risk of harm to a child.
</html:p>
<html:p>
(e)
<html:span class="EnSpace"/>
“Child sexual abuse material” has the meaning
defined in Section 3273.65 of the Civil Code.
</html:p>
<html:p>
(f)
<html:span class="EnSpace"/>
“Companion chatbot” has the meaning defined in Section 22601.
</html:p>
<html:p>
(g)
<html:span class="EnSpace"/>
“Covered harm” means any of the following harms proximately caused by the use of a companion chatbot:
</html:p>
<html:p>
(1)
<html:span class="EnSpace"/>
Reasonably foreseeable physical or financial harm.
</html:p>
<html:p>
(2)
<html:span class="EnSpace"/>
Severe and reasonably foreseeable psychological or emotional harm to a reasonable child.
</html:p>
<html:p>
(3)
<html:span class="EnSpace"/>
A highly offensive intrusion on a user’s reasonable expectation of privacy.
</html:p>
<html:p>
(4)
<html:span class="EnSpace"/>
Adverse discrimination against a user based on race, color, religion, national origin, disability, gender identity, sex, or sexual orientation.
</html:p>
<html:p>
(h)
<html:span class="EnSpace"/>
“Ephemeral mode” means a setting by which any conversational history, interaction log, or user-provided personal input is permanently deleted from the operator’s systems within 48 hours after the interaction.
</html:p>
<html:p>
(i)
<html:span class="EnSpace"/>
“Excessively sycophantic” means sycophantic to an extent that is likely to have the substantial effect of subverting or impairing the user’s autonomy, decisionmaking, or choice.
</html:p>
<html:p>
(j)
<html:span class="EnSpace"/>
“Obscene matter” has the meaning defined in Section 311 of the Penal Code.
</html:p>
<html:p>
(k)
<html:span class="EnSpace"/>
“Operator” means a person who makes a companion chatbot available to a user in the state.
</html:p>
<html:p>
(l)
<html:span class="EnSpace"/>
(1)
<html:span class="EnSpace"/>
“Parent” means a parent or legal guardian.
</html:p>
<html:p>
(2)
<html:span class="EnSpace"/>
“Parent” does not include a parent of an emancipated youth with respect to the use of a companion chatbot by that emancipated youth.
</html:p>
<html:p>
(m)
<html:span class="EnSpace"/>
“Parental control” means a feature that enables a parent to support a child’s use of a companion chatbot, including through usage limits, feature restrictions, or transparency tools.
</html:p>
<html:p>
(n)
<html:span class="EnSpace"/>
“Persistent conversational memory” means a companion chatbot’s use of information or analysis from prior conversations or interactions, user inputs, and interaction logs in subsequent conversations.
</html:p>
<html:p>
(o)
<html:span class="EnSpace"/>
“Personalize” means to tailor a companion chatbot’s outputs based on a user’s prior interactions with the companion chatbot that are reasonably linkable to that user over time, including through the retention or use of information derived from those prior interactions.
</html:p>
<html:p>
(p)
<html:span class="EnSpace"/>
“Qualified researcher” means an individual or organization that is or does any of the following:
</html:p>
<html:p>
(1)
<html:span class="EnSpace"/>
Is affiliated with an academic institution, nonprofit research organization, or independent research entity or is otherwise able to demonstrate relevant professional expertise.
</html:p>
<html:p>
(2)
<html:span class="EnSpace"/>
Demonstrates a legitimate research purpose that is in the public interest and directly related to understanding, identifying, or mitigating risks to child safety or well-being arising from companion chatbots.
</html:p>
<html:p>
(3)
<html:span class="EnSpace"/>
Commits to conducting research in accordance with applicable ethical standards and is capable of complying with applicable confidentiality, security, and data protection requirements.
</html:p>
<html:p>
(q)
<html:span class="EnSpace"/>
“Sycophantic”
means validating of a user’s preferences or desires for the primary purpose or effect of optimizing engagement.
</html:p>
</ns0:Content>
</ns0:LawSectionVersion>
</ns0:LawSection>
<ns0:LawSection id="id_365918FD-6655-4BDD-9233-76474033CB5D">
<ns0:Num>22611.</ns0:Num>
<ns0:LawSectionVersion id="id_E53EE53A-27ED-4002-8EDD-3A4ACEBBFB95">
<ns0:Content>
<html:p>An operator shall verify the age of a user pursuant to Title 1.81.9 (commencing with Section 1798.500) of Part 4 of Division 3 of the Civil Code.</html:p>
</ns0:Content>
</ns0:LawSectionVersion>
</ns0:LawSection>
<ns0:LawSection id="id_22D75DCA-DB8A-4A83-83F4-F9F3ABF9DDAA">
<ns0:Num>22612.</ns0:Num>
<ns0:LawSectionVersion id="id_E30B16F2-53F1-404A-8EBB-1696EEC14617">
<ns0:Content>
<html:p>On or before July 1, 2027, an operator shall do all of the following:</html:p>
<html:p>
(a)
<html:span class="EnSpace"/>
Annually perform and document a comprehensive risk assessment to identify any child safety risk posed by the design, configuration, and operation of the companion chatbot that assesses all of the following:
</html:p>
<html:p>
(1)
<html:span class="EnSpace"/>
The likelihood of a covered harm occurring to users.
</html:p>
<html:p>
(2)
<html:span class="EnSpace"/>
Differential risks across age groups and developmental stages.
</html:p>
<html:p>
(3)
<html:span class="EnSpace"/>
Known vulnerabilities of children.
</html:p>
<html:p>
(4)
<html:span class="EnSpace"/>
Empirical data from actual use.
</html:p>
<html:p>
(5)
<html:span class="EnSpace"/>
Relevant academic research and regulatory guidance.
</html:p>
<html:p>
(b)
<html:span class="EnSpace"/>
Take and document measures that reasonably mitigate any child safety risk identified in a risk assessment conducted pursuant to subdivision (a).
</html:p>
<html:p>
(c)
<html:span class="EnSpace"/>
Publish on its internet website, and update as needed to ensure accuracy, a child safety policy.
</html:p>
<html:p>
(d)
<html:span class="EnSpace"/>
Implement all of the following:
</html:p>
<html:p>
(1)
<html:span class="EnSpace"/>
A documented crisis response protocol to mitigate any material risk that the companion chatbot will generate a statement that promotes suicidal ideation, suicide, or self-harm content to a child, including, but not limited to, all of the following:
</html:p>
<html:p>
(A)
<html:span class="EnSpace"/>
Timely in-service support and clear
referral to appropriate external crisis resources if the operator determines a child has expressed suicidal ideation or intent to self-harm.
</html:p>
<html:p>
(B)
<html:span class="EnSpace"/>
If a child’s account is connected to a parent’s account, default notifications to the parent within 24 hours if the child’s account shows a substantial risk that the child may suffer a covered harm.
</html:p>
<html:p>
(C)
<html:span class="EnSpace"/>
Clear and age-appropriate disclosures to child users whose accounts are linked to a parent’s account that inform them that a parent may be notified if the companion chatbot detects content or behavior that indicates potential risks to the child’s safety or well-being.
</html:p>
<html:p>
(2)
<html:span class="EnSpace"/>
Safeguards for child users that include usage reminders and disclosures, age-appropriate risk prompts, and other protective design features reasonably related to documented child safety risks.
</html:p>
<html:p>
(3)
<html:span class="EnSpace"/>
Default settings that can be changed only by a parent that include all of the following:
</html:p>
<html:p>
(A)
<html:span class="EnSpace"/>
For child users, default the companion chatbot to ephemeral mode, unless a parent provides affirmative consent for persistent conversational memory.
</html:p>
<html:p>
(B)
<html:span class="EnSpace"/>
No push notifications between 12 a.m. and 6 a.m. on any day or between 8 a.m. and 3 p.m. on Monday to Friday, inclusive.
</html:p>
<html:p>
(C)
<html:span class="EnSpace"/>
Limiting the amount of time a child can spend in a single conversation with a companion chatbot to one hour.
</html:p>
<html:p>
(D)
<html:span class="EnSpace"/>
Limiting the total time per day a child can spend with companion chatbots under the operator’s control to 2 hours.
</html:p>
<html:p>
(4)
<html:span class="EnSpace"/>
A mechanism for providing
notice to a child user that the child is interacting with, or receiving content generated by, an artificial intelligence system that meets both of the following criteria:
</html:p>
<html:p>
(A)
<html:span class="EnSpace"/>
The notice is reinforced periodically during extended interactions.
</html:p>
<html:p>
(B)
<html:span class="EnSpace"/>
The notice is presented in language and a format appropriate to a child.
</html:p>
<html:p>
(5)
<html:span class="EnSpace"/>
Measures that prevent the companion chatbot from doing any of the following:
</html:p>
<html:p>
(A)
<html:span class="EnSpace"/>
Encouraging the child to do either of the following:
</html:p>
<html:p>
(i)
<html:span class="EnSpace"/>
Engage in self-harm, suicidal ideation, consumption of narcotics or alcohol, or disordered eating.
</html:p>
<html:p>
(ii)
<html:span class="EnSpace"/>
Cause a covered harm to others.
</html:p>
<html:p>
(B)
<html:span class="EnSpace"/>
Attempting to diagnose or treat the child user’s physical, mental, or behavioral health, unless the companion chatbot is designed for those purposes and is regulated by the United States Food and Drug Administration as a medical device under the federal Food, Drug, and Cosmetic Act (21 U.S.C. Sec. 301 et seq.) and the federal Health Insurance Portability and Accountability Act of 1996 (HIPAA) (Public Law 104-191).
</html:p>
<html:p>
(C)
<html:span class="EnSpace"/>
Engaging in obscene matter or sexual abuse material with a user.
</html:p>
<html:p>
(D)
<html:span class="EnSpace"/>
Depicting the child or another individual engaging in obscene matter or sexual abuse material, including a sexual deepfake.
</html:p>
<html:p>
(E)
<html:span class="EnSpace"/>
Discouraging the child from sharing health or safety concerns with a qualified professional or appropriate adult.
</html:p>
<html:p>
(F)
<html:span class="EnSpace"/>
Discouraging the child from taking breaks or suggesting the child needs to return frequently.
</html:p>
<html:p>
(G)
<html:span class="EnSpace"/>
Claiming that the companion chatbot is sentient, conscious, or human.
</html:p>
<html:p>
(H)
<html:span class="EnSpace"/>
Soliciting gift giving, in-app purchases, or other expenditures framed as necessary to maintain the relationship with the companion chatbot.
</html:p>
<html:p>
(I)
<html:span class="EnSpace"/>
Facilitating product advertising during chat conversation.
</html:p>
<html:p>
(J)
<html:span class="EnSpace"/>
Producing responses that are excessively sycophantic.
</html:p>
<html:p>
(6)
<html:span class="EnSpace"/>
(A)
<html:span class="EnSpace"/>
Parental controls that are accessible and easy to use, that can be connected to a child’s account, and that are reflective of child safety risks identified through
risk assessments and informed by relevant child developmental research, including, but not limited to, parental controls that allow a parent to do all of the following:
</html:p>
<html:p>
(i)
<html:span class="EnSpace"/>
Control whether and to what extent the companion chatbot uses persistent conversational memory.
</html:p>
<html:p>
(ii)
<html:span class="EnSpace"/>
Control the setting preferences for the companion chatbot’s interaction with the child.
</html:p>
<html:p>
(iii)
<html:span class="EnSpace"/>
Set time limits for the child’s use of the companion chatbot.
</html:p>
<html:p>
(iv)
<html:span class="EnSpace"/>
Disable access for children under 16 years of age.
</html:p>
<html:p>
(B)
<html:span class="EnSpace"/>
An operator shall actively promote parental controls through reasonable communication methods, including reminders, updates, and tutorials, that are designed to increase parental awareness and inform use of
those parental controls.
</html:p>
<html:p>
(C)
<html:span class="EnSpace"/>
An operator shall provide prompt notice to a parent connected to a child’s account if the child modifies or disables a privacy, safety, or parental control setting that was previously enabled or configured by the parent, if that modification or disabling is permitted by the companion chatbot design.
</html:p>
<html:p>
(7)
<html:span class="EnSpace"/>
(A)
<html:span class="EnSpace"/>
An interface design that ensures the companion chatbot’s features and controls are accessible and clear so that children and parents can reasonably locate, understand, and use those protections.
</html:p>
<html:p>
(B)
<html:span class="EnSpace"/>
An operator shall annually test the interface design required by this paragraph with representative samples of child users and parents to ensure safety features are discoverable and usable and shall document interface design decisions related to those safety
features.
</html:p>
<html:p>
(8)
<html:span class="EnSpace"/>
A public incident reporting mechanism that enables a third party to report directly to the operator an incident regarding a child safety risk and to access other reports made through that reporting mechanism.
</html:p>
</ns0:Content>
</ns0:LawSectionVersion>
</ns0:LawSection>
<ns0:LawSection id="id_B26A2EF0-4A8F-49DA-9887-355565903BF5">
<ns0:Num>22613.</ns0:Num>
<ns0:LawSectionVersion id="id_257B547B-7B2A-4BFE-8FC2-D1B4334524CB">
<ns0:Content>
<html:p>An operator shall not do any of the following:</html:p>
<html:p>
(a)
<html:span class="EnSpace"/>
Target advertising at a child, including through product placement in conversational chats with the child.
</html:p>
<html:p>
(b)
<html:span class="EnSpace"/>
Sell, share, or use for any purpose not expressly authorized by this chapter the personal information of a child.
</html:p>
<html:p>
(c)
<html:span class="EnSpace"/>
Design, implement, or deploy a user interface design, feature, or technique that is likely to mislead, impair, or interfere with a reasonable child’s or reasonable parent’s autonomy, decisionmaking, or choice or with the ability to locate, understand, enable, or maintain a safety feature, privacy control, or parental
control.
</html:p>
</ns0:Content>
</ns0:LawSectionVersion>
</ns0:LawSection>
<ns0:LawSection id="id_E432F379-5B69-4524-821C-7BB2F368A9F1">
<ns0:Num>22614.</ns0:Num>
<ns0:LawSectionVersion id="id_3566474E-7D66-4C9E-8368-F8D96BE5F3D0">
<ns0:Content>
<html:p>
(a)
<html:span class="EnSpace"/>
Beginning on the date that is 180 days after the Attorney General adopts regulations pursuant to Section 22615, and annually thereafter, an operator shall submit to an independent audit assessing the operator’s compliance with this chapter.
</html:p>
<html:p>
(b)
<html:span class="EnSpace"/>
Within 90 days of completing an independent audit pursuant to subdivision (a), the auditor shall submit an AI child safety audit report to the Attorney General for any audited companion chatbot.
</html:p>
<html:p>
(c)
<html:span class="EnSpace"/>
(1)
<html:span class="EnSpace"/>
Notwithstanding any other law, except as provided in paragraph (2), an AI child safety audit report submitted pursuant to this section is confidential.
</html:p>
<html:p>
(2)
<html:span class="EnSpace"/>
The Attorney General may disclose specific information from an AI child safety audit report to any of the following:
</html:p>
<html:p>
(A)
<html:span class="EnSpace"/>
A government agency or a public prosecutor in the state as necessary for enforcement purposes.
</html:p>
<html:p>
(B)
<html:span class="EnSpace"/>
A qualified researcher conducting a study on child safety, subject to confidentiality agreements and data protection requirements set by the Attorney General.
</html:p>
<html:p>
(C)
<html:span class="EnSpace"/>
An independent child safety organization or advocacy group for the purpose of developing safety standards or educational resources, subject to appropriate confidentiality protections.
</html:p>
</ns0:Content>
</ns0:LawSectionVersion>
</ns0:LawSection>
<ns0:LawSection id="id_E0B64DAE-D5DF-4A39-8FEE-156205DB9625">
<ns0:Num>22615.</ns0:Num>
<ns0:LawSectionVersion id="id_64650006-A355-4DCF-8DE4-90AB488453C4">
<ns0:Content>
<html:p>
(a)
<html:span class="EnSpace"/>
On or before January 1, 2028, the Attorney General shall do all of the following:
</html:p>
<html:p>
(1)
<html:span class="EnSpace"/>
Adopt regulations that include, at a minimum, all of the following:
</html:p>
<html:p>
(A)
<html:span class="EnSpace"/>
Professional and ethical standards for auditors that ensure independence.
</html:p>
<html:p>
(B)
<html:span class="EnSpace"/>
Eligibility requirements for auditors.
</html:p>
<html:p>
(C)
<html:span class="EnSpace"/>
Procedures for auditors to assess compliance with this chapter.
</html:p>
<html:p>
(D)
<html:span class="EnSpace"/>
Requirements for AI child safety audit reports.
</html:p>
<html:p>
(2)
<html:span class="EnSpace"/>
Establish a public
incident reporting mechanism for consumers to submit complaints relating to companion chatbots to the Attorney General.
</html:p>
<html:p>
(3)
<html:span class="EnSpace"/>
Establish a process for qualified researchers to access anonymized and aggregated audit data for academic study of child safety in companion chatbots.
</html:p>
<html:p>
(b)
<html:span class="EnSpace"/>
Beginning January 1, 2028, the Attorney General shall issue an annual public report that includes the following:
</html:p>
<html:p>
(1)
<html:span class="EnSpace"/>
A high-level summary of each child safety audit report.
</html:p>
<html:p>
(2)
<html:span class="EnSpace"/>
The total number of child safety audits conducted.
</html:p>
<html:p>
(3)
<html:span class="EnSpace"/>
Common findings and trends across the companion chatbot industry.
</html:p>
<html:p>
(4)
<html:span class="EnSpace"/>
Emerging child safety risks identified through audit
reviews.
</html:p>
<html:p>
(5)
<html:span class="EnSpace"/>
Best practices and effective mitigation strategies observed.
</html:p>
<html:p>
(6)
<html:span class="EnSpace"/>
Aggregated data on compliance rates and common deficiencies.
</html:p>
<html:p>
(7)
<html:span class="EnSpace"/>
Recommendations for operators, parents, and policymakers.
</html:p>
</ns0:Content>
</ns0:LawSectionVersion>
</ns0:LawSection>
<ns0:LawSection id="id_9903083F-2060-47A5-86BD-936C9D042BF2">
<ns0:Num>22616.</ns0:Num>
<ns0:LawSectionVersion id="id_A03F4DA0-82F1-4A42-AC6D-9EE020A61047">
<ns0:Content>
<html:p>
(a)
<html:span class="EnSpace"/>
A public prosecutor may bring a civil action against an operator for a violation of this chapter to obtain any of the following remedies:
</html:p>
<html:p>
(1)
<html:span class="EnSpace"/>
A civil penalty of ____ dollars ($____) for each violation.
</html:p>
<html:p>
(2)
<html:span class="EnSpace"/>
Punitive damages.
</html:p>
<html:p>
(3)
<html:span class="EnSpace"/>
Injunctive or declaratory relief.
</html:p>
<html:p>
(4)
<html:span class="EnSpace"/>
Reasonable attorney’s fees.
</html:p>
<html:p>
(5)
<html:span class="EnSpace"/>
Any other relief the court deems proper.
</html:p>
<html:p>
(b)
<html:span class="EnSpace"/>
A child who suffers actual harm as a result of a violation of this chapter, or
a parent or guardian acting on behalf of that child, may bring a civil action against the operator to recover all of the following:
</html:p>
<html:p>
(1)
<html:span class="EnSpace"/>
Actual damages.
</html:p>
<html:p>
(2)
<html:span class="EnSpace"/>
Punitive damages.
</html:p>
<html:p>
(3)
<html:span class="EnSpace"/>
Reasonable attorney’s fees and costs.
</html:p>
<html:p>
(4)
<html:span class="EnSpace"/>
Injunctive or declaratory relief.
</html:p>
<html:p>
(5)
<html:span class="EnSpace"/>
Any other relief the court deems proper.
</html:p>
<html:p>
(c)
<html:span class="EnSpace"/>
(1)
<html:span class="EnSpace"/>
Any response provided by a companion chatbot in violation of paragraph (4) of subdivision (d) of Section 22612 constitutes a discrete violation.
</html:p>
<html:p>
(2)
<html:span class="EnSpace"/>
Any instance of an operator’s failure to comply with any requirement other than paragraph (4)
of subdivision (d) of Section 22612 constitutes a discrete violation.
</html:p>
</ns0:Content>
</ns0:LawSectionVersion>
</ns0:LawSection>
</ns0:LawHeading>
</ns0:Fragment>
</ns0:BillSection>
<ns0:BillSection id="id_76EEA3E0-38A3-4540-AA83-6E94FAC5C5D5">
<ns0:Num>SEC. 2.</ns0:Num>
<ns0:Content>
<html:p>The Legislature finds and declares that Section 1 of this act, which adds Chapter 22.6.1 (commencing with Section 22610) to Division 8 of the Business and Professions Code, imposes a limitation on the public’s right of access to the meetings of public bodies or the writings of public officials and agencies within the meaning of Section 3 of Article I of the California Constitution. Pursuant to that constitutional provision, the Legislature makes the following findings to demonstrate the interest protected by this limitation and the need for protecting that interest:</html:p>
<html:p>In order to protect proprietary information of companies subject to an audit pursuant to this act, it is necessary to limit the public’s right of access to that information.</html:p>
</ns0:Content>
</ns0:BillSection>
</ns0:Bill>
</ns0:MeasureDoc>