Meta, the company behind Instagram, has announced a new age-rating system modelled on the film industry’s PG-13 classification. The idea is simple: make Instagram safer for young people and easier for parents to understand.
What the New System Does
From now on, accounts belonging to users under 18 will automatically default to a “13+” experience. This means Instagram will limit or hide posts that feature adult themes, risky behaviour, or strong language.
The platform will also block certain search terms — such as “alcohol” or “gore” — and even catch creative misspellings of those words. Meta says this helps reduce exposure to harmful or age-inappropriate material, while still letting teens enjoy a social experience that feels natural.
Why Meta Is Making This Change
The update comes after a critical review by former Meta engineer Arturo Béjar, who found that many of the company’s earlier safety tools weren’t working well enough. Meta disagreed with some of the findings but acknowledged that clearer systems were needed.
UK media regulator Ofcom has also been pushing social media companies to prioritise user safety — especially for children and teenagers — or face tougher regulation. By using a familiar rating model like PG-13, Meta hopes to make these rules easier for families to understand.
What It Means for Everyday Users
For most teens, Instagram will now feel a little more filtered and focused on well-being. Parents will have a clearer sense of what their children see, and brands or influencers will need to make sure their content fits within the new boundaries.
This move is part of a wider industry trend: social platforms are trying to prove they can be both engaging and responsible. Meta’s new age-rating system could set the tone for how social media handles teen safety in the future.