This is the property of the Daily Journal Corporation and fully protected by copyright. It is made available only to Daily Journal subscribers for personal or collaborative purposes and may not be distributed, reproduced, modified, stored or transferred without written permission.

self-study / Internet Law

Maintain immunity under the Communications Decency Act

Matthew Lubniewski


business law

55 2nd St Ste 1700
San Francisco, CA 94105-3493

Phone: (415) 296-1675


Univ of Pittsburgh School of Law

Matthew Lubniewski is an attorney in Buchalter's San Francisco office.

Samantha Beatty


real property

55 2nd St Ste 1700
San Francisco, CA 94105-3493

Phone: (415) 227-3616


UC Davis King Hall

Samantha Beatty is an attorney in Buchalter's San Francisco office.

What should you do if users start using your client's website to post hateful, obscene or defamatory comments and images? User-posted content can damage your business image and, with the wrong set of facts, result in legal liability.

One of the key laws that operators of websites and online services that allow the upload of user content should be aware of is Section 230 of the Communications Decency Act (CDA) of 1996. Section 230 of the CDA is a critically important federal law that can serve as a shield from liability for online service providers (like Twitter, forums and bloggers). It provides that "[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." This means that social networks, for example, can offer platforms for discussion and expression without the concern that they will be sued for user-generated content. This is true even for providers that host controversial, hateful and offensive content, such as unfavorable reviews of services, allegations of fraud perpetrated on consumers, reports of unsanitary conditions, and even defamatory comments on social gossip sites.

However, in order to enjoy this protection from liability, service providers must not edit the user-generated content in such a way as to become the "information content provider" themselves. For this reason, care must be taken when developing a new online service not to overly edit or manage user content in a way that makes the operator more than a mere conduit for expression.

A key case that defines the limits of CDA immunity is a 2008 decision by the 9th U.S. Circuit Court of Appeals, Fair Housing Council v. Roommates.com, LLC, 521 F.3d 1157 (9th Cir. 2008). In Roommates, the court found that the operator of a website that discriminated against housing applicants in violation of the Fair Housing Act was not protected by the CDA, because the website operator "materially contributed" to the unlawful nature of the content. The court noted that a website operator can only enjoy the immunity of the CDA if that operator is not also an "information content provider" as described in Section 230(f)(3) of the CDA. By building a website that seemed, in part, purpose-built to enable housing discrimination (e.g., by allowing users to filter potential roommates by sexual orientation), the website operator went from being a passive conduit for information to being the content provider.

In 2014, the 6th U.S. Circuit Court of Appeals considered whether the CDA protected the operator of a website that encouraged users to upload "dirt" on everyday people, which often took the form of embarrassing gossip and photographs. Jones v. Dirty World Ent. Recordings LLC, 755 F.3d 398 (6th Cir. 2014). Jones adopted the material contribution test described in Roommates to determine whether a defendant is responsible for the creation or development of the offending online content, which would trigger the exception from immunity under the CDA. In Jones, although the website operator selected which user-submitted information to publish and wrote its own satirical comments along with the posts, the court found that the operator's curation of and commentary on the offensive user content did not materially contribute to the defamatory nature of the content and did not trigger the exception to the immunity of the CDA.

In 2016, a California appellate court considered a case involving a website operator who ran one website that encouraged users to upload nude photographs of subjects without their consent, along with other personal identifying information, and a diabolically complementary website that extorted money from the subjects in exchange for taking down the embarrassing posts. People v. Bollaert, 248 Cal. App. 4th 699 (Cal. App. 4th Dist. 2016). In Bollaert, the court found that Section 230 of the CDA did not protect the operator from liability for violating a California criminal law that prohibited unlawful use of personal identifying information. The court found that Bollaert's websites were "designed to solicit" the unlawful content and that his actions "materially contributed to the illegality of the content and the privacy invasions suffered by the victims," which transformed Bollaert into an "information content provider" and destroyed the protections of Section 230 of the CDA. In reaching this decision, the court in Bollaert relied on the material contribution test from Roommates.

In another 2016 case, the 1st U.S. Circuit Court of Appeals upheld a website operator's immunity under the CDA where human trafficking victims sued the website operator for displaying online advertising from prostitution rings that allegedly violated the Trafficking Victims Protection Reauthorization Act of 2008. Doe v. Backpage.com, LLC, 817 F.3d 12 (1st Cir. 2016). The court in Backpage found that the CDA will protect a website operator when it is exercising the "traditional editorial functions" of a "publisher," "such as deciding whether to publish, withdraw, postpone or alter content." The court held that the overall website design choices that the plaintiffs believed made the website appealing for use in human trafficking (such as anonymizing email addresses and stripping metadata from photographs) "are editorial choices that fall within the purview of traditional publisher functions." The court held that even if these design features were intended to make sex trafficking easier, they have no effect on the operator's immunity from liability for third-party user content (such as unlawful advertising) under the CDA. In early 2017, the U.S. Supreme Court denied certiorari in the Backpage case.

There are numerous other recent rulings that contribute to the kaleidoscopic body of law that online service providers must keep in mind when hosting services that allow users to upload content. As is illustrated by the examples in this article, online service providers' ability to rely on immunity under the CDA is dependent on the facts of each case and how close each provider gets to materially contributing to the unlawful nature of its users' content.

