Meta is turning to advanced AI to solve a decades-old problem: kids lying about their age to get onto Facebook and Instagram. In a major update announced Tuesday, the company revealed it is now using AI to analyze physical traits like bone structure and height to identify and remove users under the age of 13.
How the Visual Analysis Works
Last week, Meta also introduced a feature that lets parents monitor their children's chats. In this update, Meta's new AI scans photos and videos for physical indicators of age, rather than relying solely on the birthdate listed on a profile.
- Not Facial Recognition: Meta explicitly states this is not facial recognition and does not identify specific individuals.
- Physical Cues: The system looks for broad physical indicators, such as height and bone development, to estimate whether a user is a child or an adult.
- Contextual Scanning: Beyond visuals, the AI also hunts for text-based context clues in bios, captions, and comments, such as mentions of school grades or birthday celebrations.
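Meta has not published how its contextual scanning works. Purely as an illustration of the idea described above, here is a minimal sketch of flagging text-based age cues (like grade levels or birthday mentions) in bios, captions, and comments; the patterns, thresholds, and function names are all assumptions, not Meta's actual system.

```python
import re

# Illustrative only: Meta's real models and signals are not public.
# Each pattern captures a number that hints at the user's age.
AGE_CUE_PATTERNS = [
    (re.compile(r"\b([1-9])(?:st|nd|rd|th) grade\b", re.IGNORECASE), "grade"),
    (re.compile(r"\bturn(?:ed|ing)? (\d{1,2})\b", re.IGNORECASE), "birthday"),
]

def estimated_under_13(texts):
    """Return True if any bio/caption/comment contains a cue implying age < 13."""
    for text in texts:
        for pattern, kind in AGE_CUE_PATTERNS:
            match = pattern.search(text)
            if not match:
                continue
            value = int(match.group(1))
            # US grades 1-6 roughly map to ages 6-12; a stated age is direct.
            if kind == "grade" and value <= 6:
                return True
            if kind == "birthday" and value < 13:
                return True
    return False
```

A production system would rely on trained classifiers rather than keyword rules, but the principle is the same: textual signals supplement the visual estimate.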
If the AI flags an account as belonging to a user under 13, the profile is deactivated. To get back online, the user must provide proof of age; otherwise, the account is permanently deleted.
Automatic Teen Account Protections
Meta isn’t just looking for kids under 13; it is also proactively moving older children into safer environments.
- Proactive Enrollment: The company is expanding technology that identifies users who might be teens even if they claim to be adults and automatically places them into Teen Accounts.
- Expanded Reach: This technology is launching on Facebook in the US for the first time and expanding to Instagram across the 27 EU countries and Brazil.
- Built-in Safety: Teen Accounts automatically restrict who can contact the user, limit sensitive content, and (on Facebook) prevent users under 16 from livestreaming.
Legal Pressure and the Push for App Store Laws
This sudden surge in safety measures follows a massive legal blow for Meta. A New Mexico jury recently ordered the company to pay $375 million for misleading the public about child safety and failing to protect minors from predators.
Perhaps because of this pressure, Meta is doubling down on its argument that individual apps shouldn't be the age police. The company continues to push for legislation that would require app stores (like Apple's and Google's) to verify a user's age at the device level, creating a single, consistent safety standard for every app a teen downloads.