Instagram has introduced improvements to its Teen Accounts system, aimed at strengthening safety measures for younger users. The move comes after 97% of teens aged 13-15 opted to stay within the built-in protections, which regulate who can contact them and what content they can see. Meta is now extending these protections to Facebook and Messenger, acknowledging parents' concerns about their teens' safety across its platforms.
Instagram's Teen Accounts are designed with multiple safety features: they automatically restrict inappropriate contact, and users under 16 can change settings only with a parent's approval. Under the new rules, teens will also need parental consent to host an Instagram Live or to turn off the feature that blurs images suspected of containing nudity in direct messages.
Meta is expanding the Teen Accounts framework to Facebook and Messenger. The upgrade is set to provide consistent safety features across these platforms, starting with users in the US, UK, Australia, and Canada, with other countries to follow.
Since launching in September, Teen Accounts have grown to more than 54 million active users globally, supported by features such as private account defaults, restricted contact permissions, and usage reminders. Meta has worked with parents, incorporating their feedback and concerns into these protective measures. A survey conducted by Ipsos found positive sentiment among parents: 94% of respondents said Teen Accounts help their children have positive online experiences, and about 90% found the protections satisfactory for their teens on Instagram.
Meta says it remains committed to evolving with the digital safety landscape, continuing its efforts to maintain secure environments for teen users across all of its platforms.