California Governor Gavin Newsom has taken another step to protect children on social media by signing a new bill into law. The law, Assembly Bill 1394, will hold social media platforms such as Instagram and TikTok liable for failing to protect children on their services and for failing to curb the spread of child sexual abuse material.
This latest legislative move follows the California Consumer Privacy Act of 2018, which granted consumers the right to know what personal information businesses collect about them and the right to request that those businesses delete it.
Assembly Bill 1394 will bar social media platforms from “knowingly facilitating, aiding, or abetting commercial sexual exploitation.” The bill defines facilitating, aiding, or abetting as deploying “a system, design, feature, or affordance that is a substantial factor in causing minor users to be victims of commercial sexual exploitation.”
The law will go into effect in January 2025. It will require a court to award damages ranging from $1 million to $4 million for each act of exploitation that a platform facilitated, aided, or abetted.
By making social media platforms liable for their actions or inactions regarding the creation and sharing of child sexual abuse materials, the state aims to create a more responsible online environment. The bill advises platforms that they can avoid liability by conducting biannual audits of their designs, algorithms, practices, affordances, and features to detect any that could cause violations of these provisions.
What exactly does Assembly Bill 1394 do?
This bill requires these platforms to establish a user-friendly reporting mechanism, allowing individuals depicted in such materials to report their presence. A critical aspect of this legislation is the time-bound response requirement, demanding that platforms acknowledge and act upon these reports within a strict 36-hour window.
Furthermore, the bill imposes a dual responsibility on these platforms. It not only mandates the permanent blocking of reported material but also stresses the importance of a proactive approach. Platforms are called upon to exert ‘reasonable efforts’ to remove and block any other instances of the same reported material throughout their digital ecosystem.
This proactive stance aims to ensure that such harmful content is swiftly eradicated and remains inaccessible to users.
The legislation underscores the gravity of non-compliance by introducing liability for platforms that fail to meet these stringent requirements. In such cases, platforms may be held accountable for further damages. This legal framework emphasizes the importance of effectively combating child sexual abuse materials on social media platforms, protecting both the victims and the broader online community.
What are we talking about when we say Child Sexual Abuse Materials?
Child Sexual Abuse Material, often abbreviated as CSAM, is a term encompassing visual, written, or digital content that sexually depicts individuals under the legal age of consent. This highly sensitive content category is intrinsically linked to the disturbing issues of child exploitation, child pornography, and child sexual abuse.
CSAM has emerged as an enduring and critical challenge for numerous social media platforms. Under federal law, these platforms are obligated to report such content to the National Center for Missing and Exploited Children.
Facebook reported taking action against a staggering 7.2 million instances of CSAM during the second quarter of 2023, demonstrating the issue’s magnitude. Instagram likewise reported interventions against 1.7 million pieces of CSAM. These statistics underscore the significant effort required to combat the pervasive issue of child sexual abuse material on online platforms and the imperative need to safeguard minors from exploitation.
According to a 2022 FBI press release, over 3,000 minors were targeted in “financial sextortion” schemes. Sextortion is a relatively new term describing instances where adults coerce children and teens into sending explicit images online and then extort them for money. These schemes primarily occur on online platforms that are widely used and trusted by youth, such as social media, gaming, or video chat platforms. Many of the victims are boys between the ages of 14 and 17, and these schemes have tragically led to more than a dozen deaths by suicide.
In a report released by the Stanford Cyber Policy Center in June 2023, Instagram and Twitter were directly named as being involved in the advertising and trading of CSAM and SG-CSAM (Self-Generated Child Sexual Abuse Material). Features built into social media platforms, like recommendation algorithms and direct messaging options, help facilitate the dissemination of these harmful materials.
Praise for Assembly Bill 1394
In a statement on Monday, October 9, the bill’s author, Assemblymember Buffy Wicks (D-Oakland), said, “This law underscores our state’s dedication to defending the most vulnerable among us and sends a resounding message to other states and tech platforms that using the internet to exploit children will no longer go unchecked.”
“Requiring social media platforms to implement a reasonably accessible mechanism for users to report CSAM is a good start to combatting the widespread distribution of CSAM online. Hopefully, this requirement will prevent social media platforms from continuing to turn a blind eye to the rampant proliferation of CSAM on their platforms,” said Mary Liu, a partner at Aylstock, Witkin, Kreis & Overholtz PLLC and lead attorney on the CSAM litigation.
“We have more work to do to hold social media platforms accountable for harms they cause to kids and teens and their families, but today’s signing into law of AB 1394 is a major step in the right direction,” said Jim Steyer, founder and chief executive of Common Sense Media, a nonprofit that advocates for online child safety. The organization has repeatedly argued that social media companies’ self-policing of child sexual abuse material on their platforms is inadequate.
In the previously mentioned report by the Stanford Cyber Policy Center, researchers noted that while Twitter uses image hashes to identify and remove known CSAM from its platform, they were still able to find multiple instances of CSAM on public profiles that had been missed. “The failure highlights the need for platforms to prioritize user safety and the importance of collaborative research efforts to mitigate and proactively counter online child abuse and exploitation,” the report urges.
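The hash-matching approach the report describes can be illustrated with a deliberately simplified sketch. The example below uses exact SHA-256 hashes against a placeholder blocklist; this is an assumption for illustration only — real systems rely on hash lists maintained by organizations such as NCMEC and on perceptual hashing (e.g., Microsoft’s PhotoDNA), which tolerates re-encoding and resizing, precisely because exact hashing misses any altered copy:

```python
import hashlib

# Hypothetical hash list: real lists are maintained by organizations
# such as NCMEC and are not public. For this sketch, we seed it from a
# placeholder payload standing in for a known prohibited image.
KNOWN_BAD = b"placeholder bytes for a known prohibited image"
KNOWN_HASHES = {hashlib.sha256(KNOWN_BAD).hexdigest()}

def hash_image(data: bytes) -> str:
    """Return the SHA-256 hex digest of raw image bytes."""
    return hashlib.sha256(data).hexdigest()

def should_block(data: bytes) -> bool:
    """True if an upload's digest matches a known-bad hash."""
    return hash_image(data) in KNOWN_HASHES

# An exact byte-for-byte copy matches, but even a one-byte change
# produces a completely different digest and slips through — the gap
# that perceptual hashing is designed to close.
print(should_block(KNOWN_BAD))         # exact copy -> True
print(should_block(KNOWN_BAD + b"x"))  # altered copy -> False
```

This gap between exact matching and altered re-uploads is one reason researchers were still able to find missed instances despite hash-based screening.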
The signing of Assembly Bill 1394 into law in California is a pivotal moment in the ongoing battle against the alarming proliferation of Child Sexual Abuse Material (CSAM) and Self-Generated Child Sexual Abuse Material (SG-CSAM). This legislation sets a precedent, holding online platforms accountable for their inaction and complicity in the distribution of such harmful materials, reflecting a crucial shift in legal responsibility.
Experts in this field concur that while this legislative development is noteworthy, further measures remain imperative to guarantee the safety of children in the online sphere. The Stanford Cyber Policy Center underscores the need for a comprehensive, industry-wide initiative to eradicate the persistent abuse of minors, highlighting the pressing necessity for collaborative efforts among tech giants and relevant stakeholders.
Even as safety protocols are continually updated and self-policing mechanisms strengthened, there is a recognition that much work still lies ahead. The battle against CSAM and SG-CSAM demands ongoing vigilance and collective action to fortify defenses and shield children from the pernicious impacts of online exploitation.