Trump Promises To Abuse Take It Down Act For Censorship, Just As We Warned

DATE POSTED: March 6, 2025

During his address to Congress this week, Donald Trump endorsed the Take It Down Act while openly declaring his plans to abuse it: “And I’m going to use that bill for myself too, if you don’t mind, because nobody gets treated worse than I do online, nobody.”

(You might think a sitting president openly declaring his intent to abuse a content moderation law would be big news. The media, apparently swamped with other Trump outbursts, didn’t even seem to notice.)

This is, of course, exactly what we (and many others) warned about in December when discussing the Take It Down Act. The bill aims to address a legitimate problem — non-consensual intimate imagery — but does so with a censorship mechanism so obviously prone to abuse that the president couldn’t even wait until it passed to announce his plans to misuse it.

And Congress laughed. Literally.

Let’s talk about non-consensual intimate imagery (NCII) for a minute. (People used to call it “revenge porn,” but that’s a terrible name — it’s not porn, it’s abuse.) The tech industry, after a fairly slow start, has actually been reasonably good more recently at trying to address this problem. You’ve got NCMEC’s Take It Down system helping kids get abusive images removed. You’ve got StopNCII.org doing clever things with hashes that let platforms identify and remove bad content without anyone having to look at it. These aren’t perfect solutions, but they show what happens when smart people try to solve hard problems thoughtfully.
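For the curious, here is a minimal sketch of the hash-matching idea behind systems like StopNCII.org: the victim’s device computes a fingerprint of the image locally and shares only that fingerprint, so platforms can block matching uploads without anyone ever transmitting or viewing the image. Real deployments reportedly use perceptual hashes (such as Meta’s PDQ) so slightly altered copies still match; the SHA-256 stand-in and every function and variable name below are purely illustrative, not any real API.

import hashlib

def image_hash(image_bytes: bytes) -> str:
    # The victim's device computes this fingerprint locally; the image itself
    # never has to be uploaded or viewed by anyone.
    return hashlib.sha256(image_bytes).hexdigest()

# Fingerprints submitted by victims (hashes only, never the images themselves).
blocked_hashes = {image_hash(b"victim-submitted image bytes")}

def should_remove(uploaded_image: bytes) -> bool:
    # Platform-side check at upload time: compare fingerprints, not pixels.
    return image_hash(uploaded_image) in blocked_hashes

print(should_remove(b"victim-submitted image bytes"))  # True: flag for removal
print(should_remove(b"some unrelated upload"))         # False: leave it alone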

But Congress (specifically Senators Ted Cruz and Amy Klobuchar) looked at all this work and said “nah, let’s just make websites legally liable if they don’t take down anything someone claims is NCII within 48 hours.” It’s the “nerd harder or we fine you” approach to tech regulation.

You can’t just write a law that says “take down the bad stuff.” I mean, you can, but it will be a disaster. You have to think about how people might abuse it. The DMCA’s notice-and-takedown system for copyright at least tried to include some safeguards — there’s a counternotice process, there are (theoretical) penalties for false notices. But TAKE IT DOWN? Nothing. Zero. Nada.

We already see thousands of bogus DMCA notices attempting to remove content with no basis in the law, even with those safeguards in place. What do you think will happen with a law that has no safeguards at all? (Spoiler alert: The president just told us exactly what will happen.)

Even given the seriousness of the topic, and the president’s support, you might think that Congress would care about the fact that the bill almost certainly violates the First Amendment, and thus would stand a high likelihood of being tossed out as unconstitutional. CDT tried to warn them, explaining that forcing websites to take down content without any court review creates some thorny constitutional problems. (Who knew that requiring private companies to censor speech based on unverified complaints might raise First Amendment concerns? Well, everyone who’s ever taken a constitutional law class, but apparently not Congress.)

Congress could have fixed those problems. It chose not to. Here’s what CDT warned:

As currently drafted, however, the TAKE IT DOWN Act raises complex questions implicating the First Amendment that must be addressed before final passage. As a general matter, a government mandate for a platform to take down constitutionally protected speech after receiving notice would be subject to close First Amendment scrutiny. The question is whether a narrowly drawn mandate focused on NDII with appropriate protections could pass muster. Although some NDII falls within a category of speech outside of First Amendment protection such as obscenity or defamation, at least some NDII that would be subject to the Act’s takedown provisions, even though unquestionably harmful, is likely protected by the First Amendment. For example, unlike the proposed Act’s criminal provisions, the takedown provision would apply to NDII even when it was a matter of public concern. Moreover, the takedown obligation would apply to all reported content upon receipt of notice, before any court has adjudicated whether the reported image constitutes NDII or violates federal law, let alone whether and how the First Amendment may apply. Legally requiring such take-down without a court order implicates the First Amendment.

Even if you think the concerns about fake takedown notices are overblown, shouldn’t you want to make sure that the law would pass First Amendment scrutiny when it goes to court? It seems important.

Unfortunately, it does not appear that Congress paid attention. The Senate recently passed the Act by unanimous consent, and it’s now headed to the House with strong support. Earlier this week, Melania Trump endorsed the bill, and Donald Trump mentioned it briefly during his address to Congress, where, as noted above, he explicitly revealed his plans to abuse it:

And Elliston Berry, who became a victim of an illicit deepfake image produced by a peer. With Ellison’s help, the Senate just passed the Take It Down Act and this is so important. Thank you very much, John. John Thune. Thank you. Stand up, John. [Applause] Thank you, John. Thank you all very much. Thank you and thank you to John Thune and the Senate.

Great job. To criminalize the publication of such images online is terrible, terrible thing. And once it passes the House, I look forward to signing that bill into law. Thank you. And I’m going to use that bill for myself too, if you don’t mind, because nobody gets treated worse than I do online, nobody.

There it is — a sitting president openly declaring his intent to abuse a content moderation law to remove speech he doesn’t like. This isn’t speculation or paranoia about potential misuse — it’s an explicit promise, made in front of both houses of Congress, as well as multiple Supreme Court Justices, of his intent to weaponize the law against protected speech.

So here we are. Civil liberties groups have been jumping up and down and waving their arms about how this bill needs basic safeguards against abuse. The media, apparently suffering from Trump-crazy-statement-fatigue, has mostly yawned. Congress, eager to show they’re “doing something” about online abuse, doesn’t seem interested in the details.

And why would they be? The bill is framed as protecting people from having compromising imagery posted online. Who could be against that? It’s like being against puppies or ice cream.

But here’s the thing: When someone tells you they plan to abuse a law, maybe… listen? When that someone is the President of the United States, and he’s saying it in front of Congress and multiple Supreme Court Justices, maybe pay extra attention?

The good folks at EFF have set up an action alert asking people to contact their representatives about the bill. But realistically, the bill has a strong likelihood of becoming law at this point.

Look, I can already hear the counterargument: “NCII is so harmful that we need strong measures, even if there’s some collateral damage to free speech.” And yes, NCII is genuinely harmful. But here’s the problem — a law designed with giant, exploitable holes doesn’t actually solve the problem. If it becomes primarily a tool for the powerful to suppress criticism (as Trump just promised), victims of actual NCII will be left with a discredited law that courts may eventually strike down entirely. The real goal should be a targeted, constitutional solution — not a censorship free-for-all that the president openly plans to weaponize against his critics. That serves no one except those who want to silence opposition.

We’ve spent the last two decades watching the DMCA’s takedown system be abused to silence legitimate speech, even with its (admittedly weak) safeguards. Now we’re about to create a similar system with no safeguards at all, precisely when the president has announced — to laughter and applause — his plans to weaponize it against critics.

Congress is building a censorship machine and handing the controls to someone who just promised to abuse it. That’s not fighting abuse — that’s enabling it.