Digital ECA: What Changes for Companies Under Law No. 15,211/2025

10/04/2026

Authors

Dante Machado - Associate

The Digital Statute of the Child and Adolescent (Digital ECA) entered into force on March 17, 2026. As a result, companies operating in the digital environment are now subject to new product, governance, and operational requirements in Brazil.

The new law updates, for the digital environment, the protective framework already present in the Statute of the Child and Adolescent. Its focus is on incorporating risk prevention, privacy, and security into the design of digital products and services, with the best interests of children and adolescents in mind.

The regulation also significantly expands the responsibility of digital product and service providers, which now assume an active role in risk mitigation and in protecting users under the age of 18.

To Whom the Digital ECA Applies

The Digital ECA applies to providers of information technology products or services targeted at children and adolescents or likely to be accessed by this audience in Brazil.

The concept of “probable access” is central to the law’s scope and encompasses services that, although not targeted at minors, are likely to be used by them. To assess this criterion, the following factors are considered:

(i) the attractiveness of the service to children and adolescents;
(ii) ease of access and use; and
(iii) the potential exposure to privacy, security, or biopsychosocial development risks.

Foreign companies are also subject to the law whenever their services are available within Brazilian territory.

Some obligations vary depending on the characteristics and functionalities of the service, the degree of control exercised by the provider over content, the number of users, and the size of the company.

The scope of the law goes beyond big tech companies and social networks. In practice, it may affect providers of:

  • internet applications;
  • operating systems;
  • app stores;
  • electronic games;
  • educational apps;
  • video and streaming platforms;
  • marketplaces with interaction or advertising features;
  • platforms with user-generated content;
  • manufacturers or importers of internet-connected devices; and
  • conversational AI services accessible to minors.

Key Obligations Under the Digital ECA for Platforms and Digital Services

For companies, the Digital ECA has practical implications for decisions regarding registrations, advertising, recommendation systems, parental controls, moderation, and interface design. Below is a summary of the key issues most likely to require attention:

  • Age Verification:
    The law requires reliable age verification mechanisms and expressly prohibits self-declaration. For content, products, or services that are inappropriate, unsuitable, or prohibited for minors, effective measures must be adopted to prevent access, with any data collected used strictly for age verification purposes.

For app stores and operating systems, the regulation also requires proportionate, auditable, and technically secure measures to verify age or age group.
Implementation will occur in two phases, according to a timetable released by the Brazilian Data Protection Authority (ANPD) in March 2026. The first phase focuses on the immediate monitoring of app stores and operating systems. The second phase, scheduled for August 2026, will extend the guidelines to other sectors after consolidation of the Authority’s preliminary guidance.

  • Parental Supervision:
    The law requires the availability of tools accessible to parents or legal guardians, such as usage time control, geolocation restrictions, limitations on interactions with other users, limits on financial transactions, and privacy management. On social networks, at a minimum, accounts of children and adolescents up to the age of 16 must be linked to an account held by a legal guardian.
  • Protective Default Settings and Prevention of Compulsive Use:
    The law mandates settings at the highest level of privacy and data protection by default, consistent with the user’s age group and associated risks. It also requires, from the design stage, measures to prevent excessive or compulsive use, including manipulative interface practices such as autoplay, infinite scrolling with no natural stopping points, rewards linked to usage time, and excessive notifications.
  • Prohibition of Loot Boxes and Safeguards in Electronic Games:
    The law prohibits the availability of loot boxes in games targeted at, or with probable access by, children and adolescents. In addition, gaming platforms that allow user interaction must adopt additional layers of security, such as active content moderation, protection against potentially harmful contact, and parental consent tools for communication features.
  • Content Moderation and Removal:
    The law imposes enhanced duties for detecting, handling, and responding to content that violates the rights of children and adolescents. In cases involving apparent sexual exploitation or abuse, kidnapping, or grooming, providers must adopt immediate removal measures and notify the competent authorities. For other violations of children’s and adolescents’ rights, the regulation requires prioritized handling of notifications submitted by authorized parties, with prompt assessment and adoption of appropriate measures.
  • Advertising and Monetization:
    The law prohibits the use of profiling techniques to target commercial advertising to children and adolescents, as well as the use of emotional analysis and immersive technologies for this purpose.
  • Age Rating and Transparency:
    Providers must clearly disclose content age ratings (that is, the appropriate or recommended age group) and ensure consistency between content and its classification.
  • Transparency Reports:
    Providers with more than 1 million registered users under the age of 18 must publish semiannual transparency reports containing data on moderation activities, complaints, risks, and measures adopted.
  • Legal Representation in Brazil:
    Foreign companies subject to the law must maintain a legal representative in Brazil with powers to receive service of process, summons, and notifications in judicial and administrative proceedings.

Failure to comply with these obligations may result in significant sanctions, including warnings and administrative fines of up to 10% of the economic group’s revenue in Brazil in the previous fiscal year, capped at BRL 50 million per violation. Where there is no revenue, fines may range from BRL 10 to BRL 1,000 per registered user, subject to the same maximum cap. Temporary suspension of activities and prohibition from operating may also be imposed.

In practice, the Digital ECA requires a broader compliance approach. For many companies, compliance will go beyond document review and will require changes to products, user flows, control mechanisms, and operational processes.

From an operational standpoint, it will be necessary to implement or enhance mechanisms for age verification, parental supervision, content moderation, and risk management through impact assessments. These assessments should consider exposure to inappropriate content, compulsive use, excessive data collection, recommendation systems, and user interactions. For AI systems that generate content or interact through natural language, transparency regarding automation, algorithmic risk analysis, and safeguards against behavioral manipulation become particularly relevant.

What to Do to Ensure Compliance with the Digital ECA

Compliance with the Digital ECA requires integrating the protection of children and adolescents into products and services from the design stage. In general, the most sensitive issues arise in registrations, advertising, content recommendation, parental controls, moderation, and interface design.

To ensure compliance, the following actions are recommended:

  1. Map the products and services offered or accessible in Brazil, as well as their functionalities, and document their inclusion within the scope of the Digital ECA.
  2. Analyze user journeys, characteristics, and functionalities of such products and services, including access, registration, login, content, interaction, advertising, recommendation systems, transactions, geolocation, and AI.
  3. Document identified gaps, relevant risks, existing controls, and those that still need to be implemented.
  4. Structure an action plan with assigned responsibilities, deadlines, dependencies, and implementation phases.
  5. Draft or update policies, matrices, operational procedures, classification criteria, and control rules.
  6. Implement prioritized technical and operational measures, focusing on age verification, protective default settings, parental supervision, moderation, advertising, design, gaming features, and AI.
  7. Record and organize evidence of adopted measures, preserving tests, logs, approvals, reports, and implementation records.
  8. Establish and maintain continuous monitoring and periodic review, with indicators, reassessment criteria, and control updates.

Anticipating this work helps reduce regulatory exposure while making implementation more organized and efficient.
