POLICY BRIEF: How to Reform Big Tech: A Primer on Repealing Section 230

Introduction
Section 230 of the Communications Decency Act stands at the center of technology debates—and that is not surprising.  The provision creates the liability regime for the internet, and its protections are vital to the business models of social media and other dominant technology firms.  Unfortunately, overly expansive judicial opinions have transformed Section 230 into a statute that sanctions online lawlessness, allowing the dominant internet firms to facilitate the most heinous crimes, ignore court orders, and squelch free speech.  Reform requires congressional action to repeal Section 230 and replace it with a liability framework that corrects misguided court decisions and curbs Big Tech’s ability to silence voices and censor robust democratic debate.

Publisher and Distributor Liability & Section 230
Under the common law, anyone who publishes, reproduces, or presents material to the public bears legal liability for its unlawful content, whether or not he or she first wrote the material.  For example, bookstores bear liability for books they sell that contain libel, defamation, or fraud—even though they do not write the books.  Similarly, newspapers bear liability for letters to the editor that they publish but do not themselves pen.

The common law typically recognizes a difference between those who, in fact, write an unlawful statement and those, such as bookstores or telegraph or telephone companies, that simply distribute content.  For the most part, actual or “primary” speakers are strictly liable, while distributors face more limited liability—they are liable only if they “know” of the content’s unlawfulness.

Section 230 applies this basic publisher/distributor liability framework to internet platforms but—thanks not to Congress but to certain expansive judicial opinions—fails to incorporate two vital limitations on this protection:

  • Under the common law, if the bookstore owner or telegraph owner knew about the unlawfulness of the content they distributed, they would be liable for it.  Thus, a bookstore knowingly distributing child pornography has no legal protection.  Under the Zeran opinion, however, Facebook claims Section 230 protection for knowingly facilitating human sex trafficking and rape.  
  • Distributor liability covers only liability resulting from the primary author’s editorial decisions, i.e., a bookstore’s liability for editorial decisions made by the authors of the books it distributes.  In contrast, numerous courts have bizarrely ruled that Section 230 protects the platforms’ own editorial decisions.  This expansion of legal immunity gives the platforms the power to adopt content moderation policies that commit consumer fraud, de-platform minority voices in violation of the civil rights laws, or refuse to connect with competitors in violation of the antitrust laws. 

Finally, Section 230 was written when the internet consisted of AOL and Prodigy dial-up services.  It did not contemplate dominant social media companies that control national news and information.  All prior dominant communications platforms—from telegraphs to cable systems—received distributor liability in return for non-discriminatory service.  Section 230, on the other hand, is all quid without the quo, which is odd given the power social media has over democratic deliberation, a power Justice Clarence Thomas recognized in his concurrence in Biden v. Knight First Amendment Institute.

The way forward is to repeal Section 230 and replace it with a framework with two emphases.  First, Congress must correct bad precedent and return online behavior to the rule of law by restoring the original understanding of Section 230.  Second, Congress should move forward with the anti-discrimination rules that Justice Thomas outlined.

What does Section 230 do—and is it doing what Congress intended it to do?
Given the discussion above, one might reason that Section 230 was an effort to impose a liability regime on internet platforms.  And that was its result, though certainly not its intention.

Section 230 was part of the Communications Decency Act, a 1996 effort to control pornography on the internet.  Early dial-up internet platforms claimed they could not offer porn-free environments because of a New York State case, Stratton Oakmont v. Prodigy.  Applying the existing law of publication, that case ruled that Prodigy was a “publisher” of all statements on its bulletin board because it content moderated posts to render its forum “family friendly.”

Under Stratton Oakmont, when Prodigy content moderated its bulletin board, it “published” every post on the board and was legally accountable for their content.  This legal conclusion created a Hobson’s choice: either content moderate and face liability for every post on your bulletin board, or don’t moderate at all and let the forum fill with obscenity and naked pictures.  That legal rule was hardly an incentive to continue content moderating obscenity and nudity.

Congress, eager to clean up the internet in 1996, came to the rescue with Section 230(c)(2).  It states that Prodigy—and other platforms, such as Facebook—“shall not be held liable” for editing to remove content that is “obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable.”  Congress thereby eliminated the Hobson’s choice: when platforms content moderate for these specified reasons, they are no longer held liable for everything on their sites.

But Section 230 went beyond fixing the Hobson’s choice
In its brief legislative history, the legislators speaking on Section 230 clearly thought its purpose was to cure the Hobson’s choice and allow the free market to produce more content moderation for the reasons specified in Section 230(c)(2).  Congress thought freeing platforms from liability would encourage them to content moderate for porn.  The legislative history, however, hardly discussed Section 230(c)(1)—perhaps an addition from a helpful lobbyist—which proved to be the much more important provision.

Section 230(c)(1) eliminates internet platforms’ “publisher or speaker” liability for the content that third-party users post.  It treats internet platforms like bookstores or libraries or telephones or telegraphs: they are not responsible for the unlawful content that third parties place on them.  In short, it is a statutory enactment of the common law’s distributor liability.

And Section 230(c)(1), though clearly not the stated purpose of Section 230, makes good sense as written.  Early platforms, such as AOL and Prodigy, would have been crushed by the legal liability of having to review every post.  Section 230(c)(1) said they were not liable for third-party content—and Section 230(c)(2) said they would not become so even if they edited it for certain, enumerated reasons.

What went wrong, Part 1?  Courts’ Overbroad Reading of Section 230(c)(1)’s Protections:  Knowingly Facilitating Crime
So far so good.  Congress created a sound liability approach—but then courts got involved.  The problem emerged early with Zeran v. AOL.  That case involved a posting on AOL that falsely claimed Mr. Zeran sold t-shirts mocking the Oklahoma City bombing and included his phone number.  Zeran, bombarded with hostile calls and even death threats, asked AOL to remove the posting immediately.  It did not.

Zeran, who suffered emotionally and financially from the ordeal, sued AOL for libel.  If AOL had been a telegraph company, the court would have ruled for Zeran because AOL knowingly published false statements.  Similarly, if the court had simply followed Section 230(c)(1)’s text, it would have ruled that AOL could not knowingly distribute unlawful content.  While AOL would not have a duty to review every post, it would have to act if unlawful content were brought to its attention.

Instead, the Zeran decision created a new rule in communications law—a distributor of speech bears no liability even for knowingly carrying unlawful content.  Never in our history has any entity enjoyed immunity from a basic duty of citizenship and corporate accountability: the duty to refrain from knowingly facilitating crime.  With this immunity, courts have absolved the major platforms of responsibility for the roles they have played in encouraging and facilitating rape and other heinous and violent crimes.

Action Item 1:  Overturn Zeran’s blanket immunity for knowingly distributing unlawful content.

What went wrong, Part 2?  Courts’ Overbroad Reading of Section 230(c)(1)’s Protections:  Editorial Functions

Many courts, mostly in California, have expanded Section 230(c)(1) immunity from its proper sphere (unlawful content resulting from the editorial functions of third parties) to a totally new zone:  unlawful content resulting from the editorial functions of the platforms themselves.  Under Section 230’s text and its common law antecedents, distributor liability protection covers only liability resulting from the editorial decisions of the writers of the distributed content, not the distributor’s own editorial decisions.  For instance, distributor liability protects a bookseller against libel and defamation found in books it sells, and a telephone company is not liable for criminal threats it carries.  However, the distributor’s own decisions concerning content, e.g., a telephone company’s discriminatory refusal to serve Jewish Americans or a bookseller’s fraudulent claims about the books it sells, receive no legal protection.

Unfortunately, many courts have blurred this distinction.  They have ruled that Section 230(c)(1) protects the platforms’ own editorial decisions and functions.  This means that Facebook’s decision to ban women or African Americans could not be challenged under the civil rights laws; that Twitter’s false statements to users concerning its content moderation have no remedy under contract or consumer fraud statutes; and that any platform’s decision to no longer link to competitors, such as Parler, cannot create an antitrust violation.

Action Item 2:  Clarify, contrary to recent court rulings, that immunity does not extend to platforms’ own editorial decisions.

And what about Big Tech’s control of political discourse?
 
Section 230 was written at the beginning of the internet era.  Its stated purpose was to encourage new internet industries on a flourishing web.  It did not face, or even contemplate, a world where a few companies hold extraordinary power over what is said and communicated.  And, as the 2020 elections showed, these companies are not shy about wielding this power against conservatives.

From the telegraph to cable systems, dominant communications systems have enjoyed legal privileges—such as distributor liability, which limits publisher liability—but, in return, have accepted certain public obligations, such as the duty under communications regulation to provide non-discriminatory service to all.  Similarly, public accommodation law forbids businesses from offering their services in a discriminatory way.  Both types of legal obligation promote the fullest participation in society by all members.  Justice Thomas recognized this problem in his recent concurrence in Biden v. Knight First Amendment Institute.  He points to common carriage and public accommodation laws as a remedy to the problem of politically active dominant communications firms.

Congress could pass a short statute declaring large online social media companies to be public accommodations and forbidding discrimination by them on the basis of religion, creed, or political affiliation or belief.  

Action Item 3: Pass a law declaring large social media platforms to be places of public accommodation.


Conclusion

Any serious effort to address the inordinate and concentrated power of the Big Tech firms in American society must include repeal of Section 230.  As a result of both misguided judicial extrapolations and the development of the market, Big Tech firms are currently, and inappropriately, shielded from liability without the corresponding duties that normally arise under the law.  Congress should repeal Section 230 and replace it with a framework, as laid out above, that restores the original purpose of its drafters.