Take It Down Act Expected To Become Law Despite Concerns
U.S. legislation criminalizing the non-consensual sharing of intimate images, videos and deepfakes has passed Congress with overwhelming bipartisan support, and even social media companies have voiced support for the bill.
The Take It Down Act – short for the bill’s full title, “Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act” – also creates processes and requirements for removing non-consensual intimate imagery (NCII) within 48 hours of notification by victims.
But some critics say the legislation, while well intended, doesn’t do enough to ensure that it won’t be misused to suppress lawful speech.
The bill is awaiting President Donald Trump’s signature, but as both he and First Lady Melania Trump have voiced support for it, it is expected to become law.
Take It Down Act Provisions
The bill, which takes aim at revenge porn and other malicious or harmful uses of intimate images, would make it a federal crime to knowingly share – or threaten to share – non-consensual intimate images, including deepfakes generated by AI.
Penalties include fines and imprisonment of up to two years for offenses involving adults, and imprisonment of up to three years for those involving minors. Online platforms would be required to remove NCII within 48 hours of notification by victims.
To limit potential misuse of the law, it excludes content that is a “matter of public concern,” commercial pornography, and material used for legitimate purposes such as medical care, law enforcement, national security and legal proceedings.
Some Say Law Needs More Protections Against Misuse
Some advocacy groups fear the law as written could be abused to remove lawful speech, among other concerns.
The Electronic Frontier Foundation (EFF) said the law gives “the powerful a dangerous new route to manipulate platforms into removing lawful speech that they simply don’t like.”
“The takedown provision in TAKE IT DOWN applies to a much broader category of content—potentially any images involving intimate or sexual content—than the narrower NCII definitions found elsewhere in the bill,” EFF said in a statement. “The takedown provision also lacks critical safeguards against frivolous or bad-faith takedown requests. Services will rely on automated filters, which are infamously blunt tools. They frequently flag legal content, from fair-use commentary to news reporting. The law’s tight time frame requires that apps and websites remove speech within 48 hours, rarely enough time to verify whether the speech is actually illegal. As a result, online service providers, particularly smaller ones, will likely choose to avoid the onerous legal risk by simply depublishing the speech rather than even attempting to verify it.”
EFF said the law “pressures platforms to actively monitor speech, including speech that is presently encrypted. The law thus presents a huge threat to security and privacy online.”
The Cyber Civil Rights Initiative (CCRI) welcomed the criminalization of non-consensual distribution of intimate images (NDII), but echoed EFF’s concerns about the takedown provisions.
“While we welcome the long-overdue federal criminalization of NDII, we regret that it is combined with a takedown provision that is highly susceptible to misuse and will likely be counter-productive for victims,” CCRI said.
CCRI also took exception to a provision “that would seemingly allow a person to disclose intimate images without consent” if the disclosing person also appears in the image.
The group said it has “serious concerns about the constitutionality, efficacy, and potential misuse” of the Act’s notice and removal provision:
“While we wholeheartedly support the expeditious removal of nonconsensual intimate content and have long called for increased legal accountability for tech platforms that choose to distribute unlawful content, CCRI objects to the notice and removal provision because it is (1) unlikely to accomplish these goals and (2) likely to be selectively and improperly misused for political or ideological purposes that endanger the very communities most affected by image-based sexual abuse.”
Unlike the Digital Millennium Copyright Act (DMCA), the Take It Down Act fails to include safeguards against false reports, CCRI said.