Members of Sampsa, a four-person political art collective based in Helsinki, Finland, are both defiant and a bit worried. The pseudonymous group, formed by self-described “renegade architects” fifteen years ago, fears that the United States’ Take It Down Act, passed by Congress and signed into law by President Donald Trump on May 19, could make them a target due to the overtly violent and sexual imagery in their work. The group is known for depictions of Russian President Vladimir Putin, Israeli Prime Minister Benjamin Netanyahu, and Trump, which poke not-so-gentle fun at the leaders’ imperialist aggression, warmongering, and indifference toward those living in poverty.
“People who look at our art might say, ‘Wow, this is going too far,’” a Sampsa spokesperson tells The Progressive. “But when we see what is happening in Ukraine and in Gaza, where thousands of people, including children and people over the age of fifty-five, are being murdered, we ask which is more uncomfortable—our depiction or the reality of the situation politicians have created? For Sampsa, human suffering, despair, and the struggle for day-to-day survival can’t be ignored.”
Many agree. But The Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act, popularly known as Take It Down, is purportedly aimed at preventing intimate images of an individual—including real photographs, digital manipulations, and images created using artificial intelligence—from being shared online without the subject’s consent.
The law amends the Communications Act of 1934 to add new prohibitions on the publication of what’s known as Non-Consensual Intimate Imagery (NCII), or “revenge porn,” making it illegal “to knowingly publish intimate visual depictions of an identifiable individual” without consent, regardless of whether the person is a minor or an adult. As enacted, the bill makes it a federal crime to publish NCII, punishable by imprisonment for up to two years (up to three if the victim is a minor), a fine, or both. It also mandates that social media platforms and websites remove contested images within forty-eight hours of a complaint being lodged. Enforcement of compliance by Internet service providers will fall to the Federal Trade Commission and the Department of Justice.
The bipartisan bill was co-sponsored by Ted Cruz, Republican of Texas, and Amy Klobuchar, Democrat of Minnesota, and garnered the support of the National Organization for Women and the National Center for Missing & Exploited Children. Take It Down has also been championed by both Donald and Melania Trump, the latter of whom has focused on anti-cyberbullying and revenge porn awareness campaigns during her tenure as First Lady. But Trump has added a personal dimension to his support: “I’m going to use the bill for myself, if you don’t mind,” the President said at the bill signing, “because nobody gets treated worse than I do online. Nobody.”
That’s debatable, of course.
But the bill drew deep opposition from more than twenty advocacy groups, including the ACLU, the Freedom of the Press Foundation, The Internet Society, LGBT Tech, and the Electronic Frontier Foundation, which say the legislation poses a danger to free expression and due process.
At the same time, experts acknowledge that there has been an explosion of NCII online. “[When] NCII is disseminated without consent, it is a privacy violation,” Aaron Mackey, director of Free Speech and Transparency Litigation at the Electronic Frontier Foundation (EFF), tells The Progressive. “The First Amendment does not protect the violator.”
Mackey, whose work involves litigating free speech and privacy cases, says an effective law for this problem would “narrowly help victims and make sure that the removal of objectionable online content takes place after a court hearing determines that the image is actually NCII. Law enforcers would then go after the people who perpetrated the harm.”
But that, he says, is not what Take It Down does. Instead, Mackey says the law opens a gateway to potential abuse by requiring online sources to remove non-copyrighted speech. The bill contains two distinct parts, each of which Mackey believes poses a unique danger to free speech precedent. The first part of the law creates new criminal prohibitions for deepfake imagery and allows the Department of Justice to prosecute its creators. Mackey says this part of the law is already in effect. The second part of the act, set to go into effect in April 2026, requires contested images to be removed by social media platforms within forty-eight hours of a complaint being received.
“If the image is not taken down within this timeframe, the Federal Trade Commission can initiate an investigation of the provider,” Mackey says. “The problem here is that it does not require the complainant to prove that the image is NCII. A depiction may be unflattering or derogatory, but the procedural hooks of Take It Down do not require an investigation to ensure that it is a non-consensual sexual image. Worse, there are no penalties for false complaints. There is also no chance for accused parties to defend themselves under fair use protection laws.”
These protections, he explains, allow artists, journalists, researchers, and writers to use “limited portions” of an existing work—the permissible amount has never been quantified—for commentary, criticism, news reporting, scholarly reports, or visual representations that are meant to provoke “new insights and understandings” in viewers.
Moreover, Mackey argues that the Violence Against Women Act (VAWA), reauthorized in 2022, already allows individuals victimized by unauthorized depictions to sue the disclosing party for financial damages and injunctive relief, making Take It Down unnecessary.
The Internet Society, which advocates for expanded internet access in every region of the world, warns that the law also poses a danger to encrypted materials. “Encryption is a best practice in data privacy and security, protecting all Americans from undue surveillance and censorship,” the organization said in an open letter opposing the legislation. It further warned that providers of encrypted services will be forced to use online content-monitoring technologies to shield themselves from liability.
What’s more, the Internet Society and the Electronic Frontier Foundation, among others, worry that Take It Down will be used to remove protected speech that complainants simply don’t like, flooding Internet providers with frivolous complaints and takedown demands. While Christian F. Nunes, president of the National Organization for Women, tells The Progressive the act is a “way to provide a more holistic approach to combating violence against women,” civil libertarians and free speech advocates disagree.
And while no one can predict exactly what will happen when Take It Down goes into full effect next April, artists like the Sampsa collective are committed to making provocative art despite personal fears and potential risk.
“Artists have always been the tip of the spear,” the Sampsa spokesperson says. “We can educate the general population about the troubles we’re facing in society through artwork and use art to push back against the white, Christian nationalist movement that is propping Trump up. Take It Down is a call to arms and a call to defend artistic freedom and the right of artists to create works that reflect the times we live in.” Unflattering or satirical depictions of political leaders, they say, are essential to both free expression and artistic freedom.