Two members of the Senate Judiciary Committee are preparing to reintroduce a bipartisan bill that would require tech companies to report and remove child sexual abuse material hosted on their platforms more quickly, but critics warn it could lead to the weakening or elimination of the encrypted messaging services that many Americans rely on.
The Stop CSAM Act, first introduced in 2023 by Sens. Josh Hawley, R-Mo., and Dick Durbin, D-Ill., would impose several new requirements on companies to prevent the hosting and distribution of child sexual abuse material on their platforms.
The bill would expand companies’ obligations to report instances of the material to the National Center for Missing and Exploited Children, enhance privacy protections for children who testify in court, and create a Child Online Protection Board at the Federal Trade Commission that could enforce the removal of such content or fine companies for violations.
Most controversially, it would also seek to alter tech companies’ immunity under Section 230 of the Communications Decency Act, allowing victims to file civil lawsuits against companies that fail to remove CSAM from their platforms in a timely fashion.
At a Senate Judiciary hearing Tuesday, Hawley criticized the way online platforms handle victim requests to take down material, alleging that tech companies “fob [victims] off on some complicated procedure that never bears any fruit, or just flat out refuse to do anything” when victims contact them.
Michelle DeLaune, CEO of the National Center for Missing and Exploited Children, expressed support for the Stop CSAM Act, calling the bill an opportunity “to require meaningful and enhanced reporting by online platforms, to create meaningful incentives for these companies to ensure their platforms protect children from predators and to support survivors.”
DeLaune also expressed concern about a significant drop in child sexual abuse material reports the organization has received through its CyberTipline. In 2023, the organization received more than 36 million reports, compared with just 20 million last year.
She attributed the decline to multiple factors, including an internal change at her organization that allowed companies to bundle related reports. Even accounting for that change, she said, the center still received about 7 million fewer reports from companies about child sexual abuse material on their platforms, including year-over-year decreases in reporting of 20% or more from companies like Google, X, Discord and Microsoft.
She also said the quality of reporting has degraded in recent years, noting that “companies are still choosing what they wish to include in their report and what they don’t.”
Hawley and Durbin have yet to formally reintroduce the bill this year. Critics of past versions say they are waiting to see whether the new text updates its language in response to earlier feedback.
When CyberScoop requested the text of the latest version of the bill, a Hawley staffer referred the reporter to a Feb. 19 press release indicating that Hawley and Durbin planned to reintroduce the legislation this year. A Durbin staffer referred CyberScoop to Hawley’s office for the most up-to-date version.
Digital rights groups are opposed
Digital rights groups have come out hard against the bill, arguing that tech companies are already legally required to address known instances of CSAM on their platforms, and that there is no technical or policy solution that would give companies greater access to the encrypted communications of users sharing CSAM without weakening or eliminating encrypted services for everyone.
In 2023, the Stop CSAM Act passed the Senate Judiciary Committee unanimously, but Sen. Ron Wyden, D-Ore., blocked a unanimous consent request for a floor vote, arguing at the time that the bill would push tech companies to weaken or remove their encrypted services for all users in an effort to avoid legal liability.
Jenna Leventoff, senior policy counsel at the American Civil Liberties Union, told CyberScoop her organization is still waiting to see language from this year’s bill, but reiterated broad concerns about the impact on encryption for all users.
“We’re very concerned that if this bill passes, the platforms’ reaction will be, ‘if we’re going to be held liable for content we don’t know about, we can’t offer encrypted services, because it’s not worth the risk for us,’” Leventoff said.
Any company going that route, she said, would diminish access to encrypted messaging for other vulnerable groups, such as political dissidents, domestic abuse survivors, and others who need to communicate privately.
“I think there are a lot of people in this political climate who rely on encrypted messages,” Leventoff added.
The fear that companies may respond to laws requiring greater access to or surveillance of encrypted services by eliminating those services for all users is not hypothetical. Last month, Apple responded to the British government’s demand for law enforcement access to encrypted iCloud accounts by withdrawing Advanced Data Protection, its opt-in end-to-end encryption for iCloud data, from users in the United Kingdom.
The Electronic Frontier Foundation (EFF) sent a letter to the Senate Judiciary Committee reiterating its opposition to the bill, saying that cracking down on the spread of child sexual abuse material is a noble endeavor, but “laudable goals do not always make good law.”
Language in previous versions of the Stop CSAM Act has sought to hold tech companies liable not only for knowingly hosting child sexual abuse material, but also for instances where they were “negligent” about such material.
India McKinney, director of federal affairs at EFF, wrote that “due to the nature of their services, encrypted communications providers who receive a takedown notice under a separate section of this bill may be deemed to have ‘knowledge’ under the criminal law even if they cannot verify and act on that takedown notice.”