Publication 27 Jun 2025 · Brazil

STF and the Brazilian Civil Rights Framework for the Internet: Final Ruling Issued

Since its enactment in 2014, Article 19 of Law No. 12,965/2014 (the Brazilian Civil Rights Framework for the Internet – Marco Civil da Internet, or MCI) has established a model under which internet platforms could be held civilly liable for damages arising from third-party content only if, after receiving a specific court order, they failed to remove the infringing material. This rule allowed application providers (commonly known as platforms) to operate without the risk of being held liable for their users' exercise of freedom of expression, thereby avoiding prior censorship.

In recent years, the model has been subject to growing criticism, prompting the Judiciary to reconsider its boundaries in light of new technological and social realities.

In this context, the Brazilian Supreme Federal Court (STF) delivered on 26 June 2025 one of the most significant rulings of the decade on the regulation of digital platforms in Brazil. In a decision recognizing the partial and progressive unconstitutionality of Article 19, the Court, by majority vote, established a new interpretative framework for the civil liability of internet platforms for user-generated content.

As a result of the ruling, until new legislation is enacted, Article 19 must be interpreted in accordance with the Constitution, allowing for liability in exceptional circumstances – even in the absence of a court order.

The new interpretation sets out a tiered and more protective regime, with emphasis on the following scenarios:

- Crimes and unlawful acts in general, subject to the notice-and-takedown regime under Article 21 of the MCI
- Fake or automated accounts (bots)
- Crimes against honor, which may be removed upon extrajudicial notice
- Successive reposts of content already declared unlawful by a court, with no need for a new ruling

Additionally, the decision establishes a presumption of liability for platforms in cases involving paid ads or artificially boosted content (such as via bot networks), even in the absence of prior notice – unless the platform demonstrates that it acted diligently and within a reasonable timeframe.

The ruling also outlines a closed list of serious illegal content, the widespread dissemination of which imposes an immediate removal duty on platforms, under penalty of liability. The listed content includes:

- Anti-democratic acts and crimes against the Democratic Rule of Law
- Terrorism and preparatory acts
- Inducement or assistance to suicide or self-harm
- Discrimination based on race, ethnicity, religion, color, gender identity or sexuality, including homophobia and transphobia
- Crimes against women and misogynistic hate content
- Child pornography, sexual violence against vulnerable persons, and serious crimes against children and adolescents
- Human trafficking

In such cases, systemic failure – defined as the failure to adopt adequate preventive measures in accordance with the state of the art – triggers platform liability.

Even when content is removed, users may seek judicial reinstatement, provided they demonstrate the absence of illegality. If a court orders reinstatement, the platform will not be required to pay damages, and Article 19 remains applicable.

The Court also clarified that strict liability does not apply – civil liability must be grounded in proven fault, omission, or bad faith.

Certain services remain fully protected under the original version of Article 19: email providers, closed video-conferencing platforms, and private messaging services, where the secrecy of communications prevails (pursuant to Article 5, XII, of the Federal Constitution). A similar rule applies to marketplace platforms, which remain subject to consumer protection rules under the Brazilian Consumer Protection Code.

Beyond the issue of liability, the decision imposes new obligations on platforms, which must implement self-regulatory measures covering:

- Notification systems and due process mechanisms
- Transparency reports on removals, ad boosting, and content moderation
- Accessible user support channels

These rules must be published and periodically reviewed in a transparent and accessible manner.

Platforms operating in Brazil must also establish a local office and legal representative in the country, with authority to:

- Act before administrative and judicial authorities
- Comply with court orders and respond to any penalties, fines, or financial sanctions, especially for failure to meet legal or judicial obligations
- Provide information regarding:
         - content moderation procedures and internal complaint mechanisms
         - transparency reports, risk monitoring, and systemic risk management
         - profiling practices (where applicable)
         - ad delivery and paid content boosting.

Finally, the Court modulated the effects of the decision, which will apply only to future cases, preserving legal certainty and upholding final and unappealable judgments. The ruling closes with a call for Congress to legislate on the matter, addressing current regulatory gaps and ensuring more effective protection of fundamental rights in the face of complex emerging technologies.

As expected, the STF ruling reshapes the framework of intermediary liability for online platforms in Brazil.

To facilitate understanding of this highly relevant case, we have developed an interactive website featuring a timeline and case history, summaries of the legal arguments under debate, highlights of the main regulatory impacts, international comparisons, and easy access to key documents and technical analysis.

Access now: Brazil’s Civil Rights Framework for the Internet: rights, responsibilities and Brazil’s digital future

Questions? Reach out to us at [email protected].
