Meta, the company behind Instagram and Facebook, has introduced expanded parental controls across its platforms in a bid to make the online experience safer for teenagers. The new features are designed to give parents more visibility and control over their children’s social media activity, while also helping teens form healthier digital habits. Originally launched on Instagram, the tools are now being extended to Facebook and Messenger.
Stronger Controls for Safer Sharing
One of the most significant changes is that teens under 16 will now need parental approval before they can go live or turn off the automatic filters that blur potentially explicit images in direct messages. The requirement aims to reduce exposure to harmful or inappropriate content while giving parents a say in what their children can access or share. In addition, all teen accounts will continue to be private by default, meaning their content is visible only to approved followers, adding another layer of protection.
Helping Teens Manage Screen Time
Recognising the growing concern around screen addiction and its impact on mental health, Meta is introducing new features to encourage more mindful use of social media. Teens will now receive prompts to take breaks if they’ve been scrolling for extended periods, and notifications will be automatically muted during sleep hours. These features are designed not just to limit screen time, but to foster better habits around how and when young people engage with social platforms.
More Transparency for Parents
To support parental involvement, Meta has updated its supervision tools so that caregivers can monitor who their teens follow and interact with, and even set daily time limits on usage. These updates are part of a wider initiative by Meta to address mounting criticism over teen safety on its platforms. For marketers and business owners, they are a clear signal that user wellbeing, especially among younger audiences, is becoming a critical part of platform design and brand trust.