The European Union (EU) has taken a significant step against Meta (formerly Facebook) by opening a formal investigation into child safety practices on Facebook and Instagram. Announced on Thursday, the probe addresses concerns that the two platforms may be harming children's mental and physical well-being.
Focus of the Investigation
The EU Commission will scrutinize whether Meta is in breach of the Digital Services Act (DSA). The main areas of focus are:
- Addiction and Algorithmic Manipulation: There is concern that the design and algorithms of Facebook and Instagram foster "behavioral addictions" in children, trapping them in continuous loops of recommended content.
- Inappropriate Content and Age Verification: The investigation will evaluate if Meta is effectively protecting minors from harmful content and whether its age verification tools are sufficient to prevent underage access.
- Privacy and Safety for Minors: Another key aspect of the probe will be to assess if Meta's content recommendation systems and default privacy settings are adequately prioritizing the privacy, safety, and security of children on its platforms.
Meta's Recent Efforts
Despite Meta's recent initiatives to enhance child safety—such as limiting access to harmful content and curbing interactions with suspicious adult accounts—the opening of a formal investigation signals that the Commission considers these measures potentially insufficient.
Possible Outcomes and Consequences
The Commission will now gather further evidence to determine the appropriate course of action. Although there is no set deadline for the investigation, the EU can impose interim enforcement measures against Meta while it is ongoing. If Meta is found to be in breach of the DSA, it could face substantial fines of up to 6% of its global annual turnover. EU Commissioner Thierry Breton underscored the bloc's commitment, stating, "we are sparing no effort to protect youth."