Instagram Strengthens Privacy Measures for Accounts Featuring Children

New controls aim to safeguard children's presence on Instagram

Shalom Ihuoma

Instagram is introducing a new set of protections aimed at accounts that primarily feature children. The update is part of Meta’s ongoing efforts to improve platform safety and prevent potential risks related to child exploitation, exposure, and unwanted interactions.

What’s changing?

According to Meta and as reported by TechCrunch, Instagram will now:

  • Detect accounts that regularly post children’s content: Using signals like captions, tagged birthdays, and other patterns, Instagram’s systems can identify accounts that frequently feature minors.
  • Default these accounts to private: If the platform believes an account centers around children but is public, it will recommend switching to private. For new accounts, privacy will be the default.
  • Limit who can interact: These accounts will have restricted direct messaging capabilities; only approved followers will be able to send DMs, minimizing unwanted or unsolicited contact.
  • Notify and guide the adult account owners: Instagram will provide notifications and educational prompts for adults managing such accounts, helping them make informed safety decisions.
  • Expand parental supervision tools: Guardians can oversee child activity on Instagram via Meta’s supervision controls. While this isn’t new, it is now more emphasized for accounts involving children.

Meta says the system isn’t scanning faces or using biometric data; instead, it relies on contextual cues and metadata. That’s in line with its previous safety initiatives, such as age verification tools, which include video selfie analysis and ID submission.

What prompted this update?

The new changes are partly a response to growing scrutiny from child protection organizations and lawmakers. As noted by The Wall Street Journal, platforms like Instagram face mounting pressure to introduce stronger safeguards for minors, especially as content involving children continues to attract widespread public attention, and sometimes the wrong kind. Earlier this year, Meta also introduced “restricted” teen accounts on Facebook and Messenger, as reported by TechCrunch, giving younger users limited visibility and contact options.


These latest steps aim to extend that same level of protection to even younger audiences, particularly children who may not be operating their own accounts but are frequently featured by parents, schools, or creators.
