Publication 23 Apr 2025 · Brazil

Brazil’s Civil Rights Framework for the Internet: rights, responsibilities, and the country’s digital future

Discussion on platform liability and the rulings of the Federal Supreme Court that will shape online freedom.

17 min read


What is the Civil Rights Framework for the Internet?

Federal Law No. 12,965, dated April 3, 2014, better known as the “Civil Rights Framework for the Internet” (MCI), constitutes the primary Brazilian statute governing the use of the internet within the country. Dubbed the "Internet Constitution," it establishes principles, guarantees, rights, and obligations for users, service providers, and the State, with a focus on safeguarding freedom of expression, privacy, and net neutrality.

The MCI represented a milestone not only due to its content but also because of its innovative and highly participatory drafting process. Unlike the traditional legislative process, which follows a more restricted procedure, the Marco Civil was born from a broad public debate, involving experts, technology companies, academics, civil society organizations, and citizens.

Between 2009 and 2011, the draft text was collaboratively developed through public consultations conducted online, with the creation of a digital platform that enabled any individual to provide input on the topics under discussion.

Prior to the MCI, there was no comprehensive legislation governing the internet in Brazil. Provisions concerning the rights and obligations of users and service providers were interpreted on a case-by-case basis by the Judiciary, leading to legal uncertainty and conflicting rulings.

The new law introduced clear and balanced provisions, fostering the growth of the internet in Brazil by ensuring:

  • Net Neutrality: Internet service providers may not discriminate against or block access to content based on commercial interests.
     
  • Privacy Protection: Established privacy as a guiding principle of the online environment, restricting certain uses of personal data by companies without the data subject's consent.
     
  • Freedom of Expression: Established a framework safeguarding freedom of expression online by preventing platforms from being compelled to remove content absent a judicial order, thereby avoiding private censorship.

Understanding Article 19 of the Brazilian Civil Rights Framework for the Internet

Article 19 of the Brazilian Civil Rights Framework for the Internet is one of the pillars of freedom of expression on the internet in Brazil. The text as approved and currently in force reads as follows:

Article 19. In order to ensure freedom of expression and prevent censorship, the internet application provider may only be held civilly liable for damages arising from content generated by third parties if, following a specific court order, it fails to take measures, within the scope and technical limitations of its service and within the prescribed timeframe, to make the identified infringing content unavailable, except as otherwise provided by law.

§ 1 - The court order referred to in the caput must contain, under penalty of being deemed null and void, clear and specific identification of the content pointed out as infringing, which allows the unequivocal location of the material.

§ 2 - The application of the provisions of this article to infringements of copyright or related rights shall depend on a specific statutory provision, which must respect freedom of expression and other guarantees provided for in article 5 of the Federal Constitution.

§ 3 - Cases that deal with compensation for damages arising from content made available on the internet related to honor, reputation, or personality rights, as well as the unavailability of such content by internet application providers, may be brought before the Small Claims Courts.

§ 4 - The judge, including in the procedure provided for in paragraph 3, may grant, in whole or in part, anticipatory relief sought in the initial petition, upon unequivocal evidence of the facts and considering the public interest in maintaining the content online, provided that the requirements of plausibility and well-founded fear of irreparable harm or harm difficult to remedy are met.

By establishing that internet providers can only be held civilly liable for damages arising from content published by third parties if, after a specific court order, they do not take steps to remove the infringing content, this provision enabled application providers to carry out their activities without the risk of liability arising from their users’ exercise of freedom of expression, thereby preventing prior censorship.

What is at stake?
The judgment addresses the constitutionality—or lack thereof—of Article 19, particularly in light of the growing impact of digital platforms on society.
The decision will have profound implications for how companies—particularly technology companies—must address potentially harmful content and define the boundaries of civil liability within the digital environment.

The Case That Gave Rise to the Proceedings Before the Federal Supreme Court

The plaintiff claims that, even without being a user of the social network, she had her image used in a fake profile.

An individual allegedly created an account on the platform using her name and photo without authorization, and through this fake profile, offensive messages were sent to third parties.

Distressed and fearing damage to her reputation, the plaintiff stated that she requested the removal of the fake profile via the reporting tool provided by the platform, but the company remained unresponsive.

In view of the platform’s alleged omission, the plaintiff lodged a claim before the Small Claims Court (JEC), requesting three measures:

  • Immediate removal of the fake profile
  • Payment of compensation for moral damages
  • Disclosure of data for the future identification of the creator of the fake account

The court of first instance upheld the requests for (i) removal of the profile and (ii) disclosure of data to identify the offender, but dismissed (iii) the claim for compensation for moral damages against the platform.

Both parties appealed to the appellate court: the plaintiff sought compensation, while the social network argued that, pursuant to Article 15 of the Brazilian Civil Rights Framework for the Internet, the obligation to retain identifiers (such as IP addresses) was limited to six months, making it impossible to provide the requested data.

The Appeals Panel partially granted both appeals: it awarded moral damages to the plaintiff but exempted the platform from providing the user’s data, applying the limitation of the Civil Rights Framework for the Internet.

The Reporting Judge’s reasoning for holding the platform liable for damages was based primarily on the Consumer Protection Code. According to the judge’s opinion, “conditioning the removal of the fake profile only ‘upon specific court order’, as stated in that article, would mean exempting application providers, as in the case of the defendant, from any and all indemnity liability, thereby rendering the protective system established under the Consumer Protection Code ineffective, a circumstance that would also violate a constitutional provision (article 5, item XXXII, of the Federal Constitution).”

Also in the words of the Reporting Judge, the indemnity would be owed by the platform due to the “defective provision of services by the defendant”, with the amount set at R$10,000.00.

In light of this decision, the platform filed an Extraordinary Appeal with the Federal Supreme Court (STF), arguing that the award of moral damages violated Article 19 of the Civil Rights Framework for the Internet, which exempts providers from liability for content posted by third parties, except when a specific court order has not been complied with.

The STF then accepted the appeal and recognized the general repercussion of the issue, understanding that the decision in the case would not be limited to the dispute between the plaintiff and the platform in question, but would have an impact on all digital platforms and internet users in Brazil.

Note: Two cases are under discussion—one prior to (Topic 533) and another following the enactment of the Civil Rights Framework for the Internet (Topic 987). For the purposes of this document, only Topic 987 will be analyzed.

The recognition of general repercussion: the importance of the case for the Federal Supreme Court (STF) and the future of the internet in Brazil

The judgment on the constitutionality of Article 19 of the Civil Rights Framework for the Internet (MCI) by the Federal Supreme Court (STF) is not limited to the parties involved in the specific case.

The recognition of general repercussion means that the decision rendered by the STF will serve as a mandatory reference for all Brazilian courts in similar cases, establishing a binding precedent on the civil liability of application providers, such as digital platforms, for content generated by third parties.

General repercussion is a procedural mechanism designed to ensure that the STF adjudicates only cases of social, political, economic, or legal relevance, thereby preventing an overload of cases that do not extend beyond the interests of the parties involved.

In practice, when an Extraordinary Appeal (Recurso Extraordinário - RE) reaches the STF, the Justices must evaluate whether the case transcends individual interests and affects society as a whole.

For general repercussion to be recognized, a majority of the Justices must vote in favor of its admission.

Once recognized, the case becomes a precedential case (leading case), serving as a model for resolving similar disputes in lower courts. The STF’s decision shall be binding upon all courts across Brazil.

The STF recognized general repercussion in the judgment of Extraordinary Appeal (RE) 1037396 (Topic 987), understanding that the matter extended beyond the specific dispute and addressed a structural issue concerning internet regulation in Brazil.

The central question is whether Article 19 of the MCI, which conditions platform liability on the existence of a specific court order, is compatible with the Federal Constitution.

With general repercussion acknowledged, the STF signaled that its ruling will shape national jurisprudence on key topics such as freedom of expression, content moderation, and the responsibility of digital platforms, affecting, for example:

  •  Digital platforms, which may face increased liability for user-generated content and will need to revise their content removal policies.
     
  •  Internet users, who may gain greater protection but also face stricter limitations on their posts.
     
  •  Technology companies, which must adapt to new standards of content governance and transparency.
     
  •  The Judiciary, which will adopt a unified interpretation of the matter.

Possible impacts of the STF's decision


With general repercussion established, the STF’s decision on the constitutionality of Article 19 will serve as a regulatory benchmark for Brazil’s digital future. Three potential outcomes are under consideration:

  • Upholding Article 19 as Currently Drafted: The STF may declare the provision constitutional, maintaining that platforms are liable only when failing to comply with a specific court order. 
     
  • Complete Unconstitutionality of Article 19: Platforms could become directly liable for failing to remove offensive content, regardless of prior judicial intervention.
     
  • Partial Unconstitutionality or Text Modification: The STF could implement a hybrid system, establishing varying liability depending on the circumstances, thus creating an intermediate legal framework.

If deemed unconstitutional (in whole or in part), Brazil may adopt a model aligned with the European Union’s Digital Services Act, introducing graduated liability based on the severity of the content. In such a scenario, a modulation of effects (modulação de efeitos) is expected, potentially limiting the decision’s applicability or setting its effectiveness to begin at a defined procedural milestone, such as the final and unappealable judgment (trânsito em julgado).

Conversely, if the STF upholds the constitutionality of Article 19, Brazil will maintain a model akin to the United States’ Section 230 of the Communications Decency Act, which shields providers from direct liability for third-party content.

Timeline of the Brazilian Civil Rights Framework for the Internet (MCI)

Case Information

Case Number: RE 1037396. See the case on the STF website.
Unique Number: 0006017-80.2014.8.26.0125
Type of Appeal: Extraordinary Appeal (Recurso Extraordinário)
Rapporteur: Justice Dias Toffoli
Leading Case: Topic 987


Title: Discussion on the constitutionality of Article 19 of Law No. 12,965/2014 (Civil Rights Framework for the Internet), which establishes the requirement of a prior and specific court order for content removal as a condition for the civil liability of internet providers, website operators, and social network application managers for damages arising from unlawful acts committed by third parties.


Description: Extraordinary Appeal addressing the constitutionality of Article 19 of Law No. 12,965/2014 (Brazilian Civil Rights Framework for the Internet), in light of Articles 5, items II, IV, IX, XIV, and XXXVI, and Article 220, caput, §§ 1 and 2, of the Federal Constitution. The provision imposes conditions on the civil liability of internet providers, website operators, and managers of social network applications for damages resulting from unlawful acts perpetrated by third parties.

Justices’ Votes

Each entry below lists the Justice, the vote, the proposed framework, and the details of the opinion.

1. Justice Dias Toffoli (Rapporteur)
Vote: Unconstitutionality of Article 19 of the MCI
Proposal:
General rule: sufficient extrajudicial notification; subsidiary liability.
Exceptional cases: strict liability regardless of notification.

The Rapporteur, Justice Dias Toffoli, proposes to replace the current model with stricter and more specific rules regarding platform liability, establishing the logic of Article 21 of the MCI (notice and take down) as the general rule. Highlights include:

Proposes the complete repeal of Article 19 of the Civil Rights Framework for the Internet.

Argues that platforms should be held responsible when they fail to remove offensive content after victim notification, even without a court decision.

Establishes situations in which liability is automatic (strict liability), such as:

  • When there is recommendation or promotion of content (whether paid or unpaid).
     
  • In cases involving fake accounts, automated or anonymous profiles (bots).
     
  • Copyright infringements (in which case the platform would be jointly liable with infringing users).
     
  • Publications involving serious crimes, such as racism, violence against children and women, terrorism, or disinformation likely to incite violence or affect elections.

Email providers, closed meeting services, and private messaging services for interpersonal communications are excluded from the proposed rules.

Additionally, the Justice proposes the creation of duties for providers, including:

  • The requirement for platform transparency.
  • The establishment of clear content moderation rules.
  • The obligation of legal representation in Brazil for foreign platforms.

Finally, he calls upon the Executive and Legislative branches to develop, within 18 months, a public policy to combat disinformation and digital violence, and proposes the creation, within the National Council of Justice (CNJ), of an Internet Oversight Department (DAI) to monitor and safeguard fundamental rights in the digital environment.

Download the preliminary vote (available in Portuguese only).

2. Justice Luiz Fux
Vote: Unconstitutionality of Article 19 of the MCI
Proposal:
General rule: sufficient extrajudicial notification; subsidiary liability.
Exceptional cases: strict liability, regardless of notification.

Justice Luiz Fux concurs with Justice Dias Toffoli's vote, adopting the logic of Article 21 of the MCI (notice and take down) as the rule. However, he advocates for a less detailed model, recognizing platform liability in serious or evident situations, while preserving the notion of prior notification in more subjective cases. Highlights include:

  • Platforms may be held civilly liable, even without a court order, when they have clear and unequivocal knowledge of the illegality of the content—whether due to its obvious nature or through proper notification.
     
  • Content considered inherently illegal, such as:
    • Hate speech
    • Racism
    • Pedophilia
    • Incitement to violence
    • Advocacy of a coup d'état or against the Democratic Rule of Law

In such cases, platforms have an active duty to monitor content.

For cases involving personality rights (e.g., offenses against honor, image, or privacy), liability arises only after a substantiated notification by the victim, made through any appropriate means.

When content is promoted through payment, it is presumed that the platform has full knowledge of its illegality.

Download the preliminary vote (available in Portuguese only).

3. Justice Luís Roberto Barroso (President)
Vote: Partial unconstitutionality of Article 19 of the MCI
Proposal:
General rule: fault-based liability for offenses and crimes against honor, civil torts, and residual content.
Exceptional cases:

  • Extrajudicial notification for other crimes (except those against honor) will follow a simple notification system with subsidiary liability.
     
  • Liability regardless of notification when content is advertised or promoted, as knowledge will be presumed.

Justice Barroso seeks a balanced approach between freedom of expression and digital liability, with rules proportionate to the severity of the content. He advocates for the creation of an independent legal framework. Highlights include:

  • Article 19 remains valid for cases involving crimes against honor and common civil torts.
     
  • The system established by Article 21 of the MCI (notice and take down) should be extended to other crimes, except those against honor.
     
  • When platforms display advertisements or allow content promotion, it is presumed they had knowledge, making them liable without notification, unless they act diligently and promptly.

In all cases, liability should remain “subjective” (fault-based), except in cases of clear systemic failure.

He proposes that platforms have a duty to mitigate systemic risks, especially concerning extraordinarily harmful content (e.g., child pornography, human trafficking, advocacy of coups).

He emphasizes the duty of care, requiring platforms to proactively prevent extraordinarily harmful content, such as child pornography, incitement to suicide, terrorism, or attacks against the Democratic Rule of Law. Platforms must implement structured notification systems, internal due process, and annual public transparency reports.

Finally, he urges Congress to enact specific legislation regulating digital risk mitigation, including the establishment of an independent regulatory authority.

Download the preliminary vote (available in Portuguese only).

4. Justice André Mendonça
Vote: Constitutionality of Article 19 of the MCI
Proposal: Defense of the current model, with procedural improvements and preservation of the subjective liability standard.

Justice André Mendonça advocates for the maintenance of Article 19 of the MCI as constitutional, reaffirming the framework of subjective liability and the requirement of a court order for platform accountability. His opinion outlines specific measures to ensure due process in content moderation procedures. Key elements include:

Upholds the current model under Article 19, which conditions platform liability for failure to remove offensive content posted by third parties upon a prior judicial order.

Differentiates private messaging services from social media platforms, emphasizing the protection of privacy, confidentiality of communications, and personal data.

Allows for extrajudicial removals only when expressly provided by law or by the platform’s Terms of Service, provided that due process guarantees are observed, including: 

  • access to the reasons for the removal; 
  • preference for human decision-making (automated systems, including AI, to be used only in exceptional cases); 
  • right to appeal the moderation decision; 
  • appropriate and timely response from the platform.

Recognizes the possibility of platform liability in case of breach of procedural duties, such as: 

  • failure to apply Terms of Service equally and without discrimination; 
  • insufficient digital security measures that enable or facilitate unlawful conduct. 

Affirms that platforms are not liable for failing to remove content that is later judicially deemed offensive, unless a specific legal obligation imposes such duty. 

Asserts that court orders must be duly reasoned and made accessible to the platform, even when issued in sealed proceedings.

Concludes with an appeal to the Legislative and Executive branches to adopt public policies grounded in a regulated self-regulation model for internet governance in Brazil.

Download the full text (available in Portuguese only).

5. Justice Flávio Dino
Vote: Partial unconstitutionality of Article 19 of the MCI
Proposal:
Restricts the scope of Article 19 to honor-related offenses.
Expands platform liability in other circumstances.

Justice Flávio Dino proposes limiting the applicability of Article 19 of the MCI exclusively to cases involving offenses against honor. For all other situations, he applies Article 21 of the MCI (notice and take down), allowing platforms to be held liable without prior judicial authorization in specific circumstances.

Key elements include:

  • Strict liability for content disseminated via anonymous or fake profiles;
  • Liability for unlawful content in paid or sponsored advertisements;
  • Civil liability for systemic failure in cases involving serious crimes (e.g., offenses against children, suicide incitement, terrorism, or threats to democratic institutions);
  • Systemic failure arises from the lack of adequate preventive technical measures;
  • Platforms must implement notification systems, procedural safeguards, and transparency reporting, under the supervision of the Office of the Prosecutor-General, until a specific law is enacted.

Download the full text (available in Portuguese only).

6. Justice Cristiano Zanin
Vote: Partial unconstitutionality of Article 19 of the MCI
Proposal:
Constitutional interpretation to restrict platform immunity.
Civil liability based on active role and manifest illegality of content.

Justice Zanin proposes a constitutional interpretation of Article 19, recognizing its partial unconstitutionality due to legislative omission. In his view, the general immunity clause fails to adequately protect fundamental rights and democratic values. 

Key elements include:

  • Article 19 applies only to: (i) neutral application providers (i.e., mere hosts or conduits), (ii) active providers when the content is not manifestly illegal, or (iii) journalistic or media entities acting as users;
  •  Platforms with an active role (e.g., algorithmic curation) must follow Article 21 notice-and-take-down procedures when notified of manifestly illegal content;
  • Presumption of liability for boosted advertisements and bot-generated content;
  •  Duty of care and risk prevention regarding systemic harms, with use of AI and human oversight, especially in cases involving child sexual abuse material, terrorism, suicide, trafficking, and democratic threats;
  • Platforms must adopt notification protocols, due process guarantees, and publish transparency reports;
  • The decision applies only prospectively, preserving the original immunity regime until the judgment becomes final.

Download the full text (available in Portuguese only).

7. Justice Gilmar Mendes (Dean of the Court)
Vote: Partial unconstitutionality of Article 19 of the MCI
Proposal: From immunity logic to a regulatory model with tiered liability and structural obligations.

Justice Gilmar Mendes argues that Article 19 should be replaced by a model based on graduated liability and regulatory duties, as digital platforms exert an active and opaque influence on content dissemination, shaping public discourse.

Key elements include:

  • Platforms are not neutral: algorithms prioritize polarizing content to increase engagement and profits;
  • The current regime fosters systemic irresponsibility, even in cases of clearly unlawful content;
  • Proposes a progressive liability model, focused on:
    • Transparency in moderation and ad delivery;
    • Obligation to remove illicit sponsored or boosted content;
    • Mandatory takedown of manifestly criminal material (e.g., terrorism, hate speech, attacks on democracy);
    • Robust notification and appeal mechanisms;
    • Periodic public transparency reports and local legal representation;
    • Oversight by a specialized regulatory authority, preferably the Brazilian Data Protection Authority (ANPD);
    • Safe harbor exception where content is not immediately removed due to a reasonable legal interpretation of the applicable rules and materials;
  • Argues that regulating platforms enhances freedom of expression by ensuring pluralism and institutional accountability.

Download the full text (available in Portuguese only).

8. Justice Alexandre de Moraes
Vote: Unconstitutionality of Article 19 of the MCI
Proposal: Follows the vote of the Rapporteur, Justice Dias Toffoli.

Justice Alexandre de Moraes concurred with the opinion of Rapporteur Justice Dias Toffoli, endorsing the view that Article 19 of the MCI is unconstitutional in its current form.

9. Justice Nunes Marques
Vote: Pending

10. Justice Edson Fachin (Vice-President)
Vote: Pending

11. Justice Cármen Lúcia
Vote: Pending