A new piece of legislation could spell big trouble for the Internet. Last week, Sens. Rob Portman (R-OH) and Claire McCaskill (D-MO) introduced the “Stop Enabling Sex Traffickers Act of 2017,” or SESTA. While devised with the best intentions, as the name suggests, this bill would effectively undermine the intermediary liability protections afforded by Section 230 of the Communications Decency Act (CDA). Eroding those protections could have a crippling effect on the future of the Internet, which was precisely the reason that Congress passed them over 20 years ago.

In 1996, Section 230 opened the doors to a new wave of online innovation. The carefully constructed balance of its provisions has had a long-lasting impact on the growth of the digital landscape. As David Post noted in a 2015 Washington Post article:

[I]t is impossible to imagine what the Internet ecosystem would look like today without it. Virtually every successful online venture that emerged after 1996—including all the usual suspects, viz. Google, Facebook, Tumblr, Twitter, Reddit, Craigslist, YouTube, Instagram, eBay, Amazon—relies in large part (or entirely) on content provided by their users, who number in the hundreds of millions, or billions.

Simply put, Section 230 is one of a handful of laws that make the Internet work. It’s an integral reason why the digital economy has flourished—a centerpiece of our vital online ecosystem.

SESTA would put all of that in jeopardy, while doing little or nothing to address the issue of online sex trafficking. In particular, the bill would blur the CDA’s central distinction between “interactive computer services” and “information content providers.” Under 47 U.S.C. § 230(f)(2), an “interactive computer service” is defined as:

Any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server, including specifically a service or system that provides access to the Internet and such systems operated or services offered by libraries or educational institutions.

And 47 U.S.C. § 230(f)(3) defines “information content provider” as:

Any person or entity that is responsible, in whole or in part, for the creation or development of information provided through the Internet or any other interactive computer service.

This is an important distinction. It means that websites (interactive computer services) that provide a pass-through for content produced by users (information content providers) are treated as distributors of content, not publishers. (Put simply, it means you’re generally not responsible for content somebody else used your service to publish—just as AT&T and Verizon aren’t responsible for the things people use their cell-phone services to say.) Congress knew back in 1996 that it made sense to provide limited protections under Section 230 to ensure that Internet services were not presumptively held liable due to the actions or comments of their users.

One subsection of Section 230 makes this point expressly: 47 U.S.C. § 230(c)(1) states flatly that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” The protection is stated broadly so as to protect even services that exercise some degree of editorial curation over content hosted on their site—we don’t want providers to fear legal liability so much that they dare not edit content to serve their users. (Before Section 230 was enacted, service providers worried that courts would assume that if they removed anything they’d be responsible for everything, including content they hadn’t seen.)

Under the current reading of Section 230, online service providers aren’t entirely off the hook. If providers themselves are actively engaged in criminal activity, liability protections don’t apply, as is made clear under 47 U.S.C. § 230(e)(1), which states:

Nothing in this section shall be construed to impair the enforcement of section 223 or 231 of this title, chapter 71 (relating to obscenity) or 110 (relating to sexual exploitation of children) of title 18, or any other Federal criminal statute.

SESTA, however, would expand 47 U.S.C. § 230(e)(1) to read as follows (the reference to section 1591 and the new clause (B), covering state criminal prosecutions and civil enforcement actions, are SESTA’s additions):

Nothing in this section shall be construed to impair (A) the enforcement of section 223 or 231 of this title, chapter 71 (relating to obscenity) or 110 (relating to sexual exploitation of children) of title 18, Section 1591 (relating to sex trafficking) of that title, or any other Federal criminal statute or (B) any State criminal prosecution or civil enforcement action targeting conduct that violates a Federal criminal law prohibiting (i) sex trafficking of children; or (ii) sex trafficking by force, threats of force, fraud, or coercion.

This would effectively expand the statute to permit both state criminal prosecutions and state civil enforcement actions, opening the door to 50 separate approaches to interpreting what has hitherto been an exclusively federal matter. That alone would be a concerning development, but the reference to 18 U.S.C. § 1591 (the section of the Federal Code that governs sex trafficking of children or by force, fraud, or coercion) is even more unsettling.

Under 18 U.S.C. § 1591, an individual who “knowingly benefits, financially or by receiving anything of value from participation in a venture” engaged in the sex trafficking of children, or in sex trafficking by force, fraud, or coercion, violates a Federal criminal statute. Under SESTA, the definition of “participation in a venture” would be expanded to mean “knowing conduct by an individual or entity, by any means, that assists, supports, or facilitates a violation” of the sex trafficking statute. That would lower the threshold for what actually constitutes “participation in a venture”: online service providers would no longer need to possess intent, merely knowledge (or perhaps an obligation to know, however that may be defined) of human trafficking.

At that point, an “interactive computer service” provider is no longer simply a distributor for “information content providers”; it becomes complicit in a crime, for which it can be punished according to the penalties laid out under 18 U.S.C. § 1591(b).

These changes alone would considerably erode the legal certainty Section 230 provides to the Internet landscape. But SESTA goes one step further. In its coup de grâce to the Internet, SESTA adds a fifth paragraph to 47 U.S.C. § 230(e), which would read: “Nothing in this section shall be construed to impair the enforcement or limit the application of section 1595 of title 18, United States Code.”

This is where SESTA’s true threat blossoms. By itself, 18 U.S.C. § 1595 is a perfectly reasonable mechanism by which victims may seek restitution for the crimes perpetrated against them. However, SESTA’s expanded definition of “participation in a venture,” coupled with the explicit inclusion of both 18 U.S.C. § 1591 and 18 U.S.C. § 1595 under 47 U.S.C. § 230(e), opens the door to potentially vast amounts of litigation against online service providers.

In a recent Slate article, Mike Godwin distills the practical implications of these changes into a more digestible example. Passing SESTA, he argues, would be the equivalent of Congress deciding:

that FedEx was legally liable for anything illegal it ever carries, even where it’s ignorant of the infraction and acts in good faith. That would be a crazy notion in itself, but rather than applying only to FedEx’s tech equivalents—the giants like Google and Facebook—it also would apply to smaller, less well-moneyed services like Wikipedia. Even if the larger [I]nternet companies can bear the burden of defending against a vastly increased number of prosecutions and lawsuits—and that’s by no means certain—it would be fatal for smaller companies and startups.

And indeed, it’s not at all certain that companies like Google and Facebook could afford the tsunami of litigation that would likely follow in SESTA’s wake. Take Facebook, for example: its users share an enormous amount of content. In fact, almost 5 billion such “pieces of content” are shared on the platform every day, with 510,000 comments written, 293,000 statuses updated, and 136,000 photos uploaded every minute. Under SESTA, each one of those is a potential multi-million-dollar lawsuit waiting to happen.
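
To put those per-minute figures in perspective, a rough back-of-the-envelope calculation using only the three categories cited above already puts the daily volume above a billion items:

$$
(510{,}000 + 293{,}000 + 136{,}000)\ \tfrac{\text{items}}{\text{minute}} \times 1{,}440\ \tfrac{\text{minutes}}{\text{day}} \approx 1.35\ \text{billion items per day}
$$

And that counts only comments, statuses, and photos; across all content types the total approaches the 5 billion daily figure. Even a tiny fraction of that volume drawing litigation would mean an enormous number of potential claims.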

While that might be a big boon for trial lawyers, it’s a huge loss for individual Internet users, who are likely to encounter more stringent content controls on their favorite websites. Some sites may even end up applying blanket prohibitions to user-uploaded content, rather than risk running afoul of the narrowed liability protections. That would be one of the worst possible outcomes, not just for the digital economy, but for an emerging and maturing cyber culture and society.

Imagine an Internet where Yelp and Amazon reviews are no longer free-wheeling descriptive manifestos, but simple tiered ratings that give would-be consumers far less insight into others’ experiences. Is that the kind of experience we expect from the Internet? Or what about the comment threads on online news and magazine websites? What kind of Internet would we have if those websites decided, out of necessity, to prohibit users from commenting—from expressing their opinions—on the articles du jour? Are we better off in such a world? It doesn’t seem likely. That type of world—one in which YouTube, Facebook, and almost every other online service provider is suddenly subjected to an endless barrage of lawsuits and legal threats—wouldn’t have produced the modern digital age we now inhabit.

If SESTA becomes law, we might not have to use our imagination to consider such a world. We’ll just have to prepare for it.

Ryan Hagemann is the Director of Technology Policy at the Niskanen Center.