Meta has announced the rollout of its “Teen Accounts” feature for Facebook and Messenger users in Pakistan this week, expanding protections already available on Instagram.
The initiative is aimed at giving teenagers a safer, more age-appropriate online experience while offering parents greater peace of mind.
What are Teen Accounts?
According to Meta, Teen Accounts automatically place restrictions on who teenagers can interact with, filter the content they see, and introduce features that encourage healthier screen time.
The feature was designed after feedback from parents who voiced concerns about teens’ online safety, inappropriate content, and excessive usage.
Protections and restrictions
Meta said that new users under the age of 18 will automatically be placed into Teen Accounts, while existing teenage users will gradually be transitioned into the feature.
The protections include:
- Automatic limits on who teens can connect with online.
- Filters to prevent exposure to unsafe or harmful content.
- Measures to help manage screen time more effectively.
Parents’ concerns addressed
In its statement, Meta emphasized that Teen Accounts are helping address parents’ top concerns, from online predators to exposure to harmful material. The company said these changes reflect its commitment to giving parents “peace of mind” regarding their children’s online experiences.
Adam Mosseri, Head of Instagram, added: “We want parents to feel good about their teens using social media. They should be able to connect with friends and explore their interests without worrying about unsafe experiences.”
Expanding beyond Instagram
Teen Accounts were first introduced on Instagram a year ago, where Meta added limits on teens going Live, increased restrictions on direct messages, and refined its systems to ensure age-appropriate experiences.
The expansion to Facebook and Messenger in Pakistan reflects Meta’s wider global rollout, which already covers hundreds of millions of teenage users across its platforms.