
In 1996, Congress passed the Communications Decency Act to provide incentives and protections for “interactive computer service providers” to block and screen offensive material. Under Section 230 of the Act, service providers are shielded from civil liability when they remove some objectionable third-party content from their platforms but fail to remove all of it. Section 230 also provides that service providers cannot be treated as the publishers or speakers of third-party content. For more than 25 years, service providers have prevailed in lawsuits challenging user account terminations and content removals based on the harmful nature of user posts.

Section 230 does not immunize service providers from all forms of content publication. For example, the Biden administration has argued that algorithms should constitute the service providers’ own speech. While what falls outside the current definitional bounds of Section 230 remains up for debate, a new approach seeks to limit this immunity using common carrier laws. Given recent conflicting rulings from the Fifth and Eleventh Circuit Courts of Appeals, the U.S. Supreme Court will likely be asked to consider whether certain service providers can be classified as “common carriers.”

Recent laws passed in Texas and Florida attempt to regulate what each state defines as “social media platforms” as common carriers. Governments may impose additional restrictions on common carriers because of the public’s general need for access to their services and the monopolistic control common carriers hold over their operations. By treating the platforms as common carriers, the Texas and Florida laws seek to regulate the platforms’ content moderation as censorship of users’ online speech.

The application of common carrier laws to modern platforms is not settled. Common carrier laws were originally created to regulate businesses that carried products and goods. Texas and Florida argue that speech is being “carried” by modern platforms, and Texas further argues there is a modern-day business need to access them. For example, it may be necessary for some businesses to post content on a specific platform like YouTube or Twitter to reach their customer base. Because these platforms are not interchangeable, Texas and Florida argue that each is a near monopoly within its segment of the communications industry, akin to a common carrier. Case law at the intersection of common carrier regulation and the First Amendment is limited, though precedent suggests common carriers retain protections for their own speech.

The question then becomes: what type of service provider could be a common carrier, and what type of speech remains a common carrier’s own expression under the First Amendment?

If the Supreme Court grants a writ of certiorari and holds that “social media platforms” are common carriers, service providers would be severely limited in their ability to remove third-party content after it is posted, because content moderation would not fall within the First Amendment protection for a platform’s “editorial discretion.” Editorial discretion would instead apply only to decisions not to post content before it is made public on the platform. Under this common carrier doctrine, state laws could target service providers as publishers of content. Consequently, service providers would need to distinguish legal from illegal content before publication rather than moderating content after publication. Providers may face difficult choices when content is protected speech that must be carried in one state but is illegal to publish in another. Ultimately, platforms might change their policies to publish only paid content subject to editorial discretion and end access for user-generated content altogether to avoid this legal minefield.

If it is determined that service providers are not common carriers, Section 230 protections will continue to apply to social media platforms. There may still be debates over whether removing content is censorship of users’ speech or a service provider’s own expression. As the Biden administration has suggested, courts may take another look at which types of conduct are protected when the service provider is treated as a publisher or speaker. The existing carve-outs from Section 230 may also be reinterpreted more broadly, narrowing its scope. In the meantime, however, service providers should remain protected under Section 230 because they are not treated as publishers and because good-faith actions to restrict offensive content are shielded.

If you are a service provider that moderates third-party content, your business may be affected by these upcoming decisions. Watch for the rulings to determine whether you should post a disclaimer on your site separating your message from that of your users, and whether user-generated content on your platform must be published or must be removed. In the meantime, it remains in your best interest to maintain content moderation policies for your business. This is an actively developing landscape that could sweep many different forms of internet activity into the definition of “social media.”

For more information, please contact any attorney on Frost Brown Todd’s Privacy and Data Security and Media & First Amendment teams.