Instagram moves to protect teens with default private accounts amid criticism

Meta rolls out new safety measures for teens on Instagram as legal and public pressure mounts over social media’s impact on youth mental health.

By Storyboard18 | September 18, 2024, 9:56 am
Known as the “adult classifier,” this AI system will analyze various elements within a user’s profile—including their follower list, interactions, and “happy birthday” posts from friends—to estimate the user’s real age.
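In practice, such a system would be a learned model over profile signals. The sketch below is purely illustrative, assuming hypothetical features (follower count, ages mentioned in friends' birthday posts, account tenure) and a simple logistic regression; it is not Meta's actual classifier or feature set.

```python
# Hypothetical sketch of a feature-based "adult classifier".
# Feature names and values are illustrative assumptions, not Meta's real signals.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy training data: one row per account.
# Columns: [followers, avg age mentioned in "happy birthday" posts, account age in years]
X_train = np.array([
    [120,  14.0, 1.0],   # likely a minor
    [300,  15.5, 2.0],   # likely a minor
    [900,  24.0, 6.0],   # likely an adult
    [1500, 31.0, 9.0],   # likely an adult
])
y_train = np.array([0, 0, 1, 1])  # 0 = minor, 1 = adult

model = LogisticRegression()
model.fit(X_train, y_train)

# Score a new profile: estimated probability the account belongs to an adult.
new_profile = np.array([[250, 16.0, 1.5]])
p_adult = model.predict_proba(new_profile)[0, 1]
print(f"Estimated probability of being an adult: {p_adult:.2f}")
```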

Instagram is set to make teen accounts private by default in an effort to safeguard younger users from online dangers. Starting this week, new accounts for anyone under 18 in the U.S., U.K., Canada, and Australia will be private, with existing teen accounts migrating over the next two months.

This change, aimed at reducing unwanted contacts and exposure to harmful content, is part of a broader push by parent company Meta to address growing concerns over how social media impacts the mental health of young users.

Under the new rules, teens will only be able to receive direct messages from people they follow. In addition, “sensitive content” – like violent videos or posts promoting cosmetic procedures – will be limited. Teens will also receive reminders to log off after 60 minutes and can activate a “sleep mode” that silences notifications between 10 p.m. and 7 a.m. While these safety measures will apply to all teens, 16- and 17-year-olds will have the option to disable them, while users under 16 will need parental approval for any changes.

Meta’s Naomi Gleit, head of product, said the move targets three primary concerns from parents: inappropriate content, unwanted contact, and excessive time spent on the app. However, critics argue that these steps are not enough to combat the underlying risks.

U.S. Surgeon General Vivek Murthy and New York Attorney General Letitia James both called for more aggressive measures, with James labelling Meta’s changes “an important first step” but insufficient.

The announcement follows a wave of lawsuits against Meta by multiple U.S. states, accusing the company of contributing to the youth mental health crisis by designing addictive features on its platforms. Although Meta hinted at potential short-term declines in teen engagement, analysts like Jasmine Enberg from Emarketer believe the revenue impact will be minimal. She added that while the changes might curb some harmful behaviours, teenagers are likely to bypass the restrictions, and the limits may even encourage them to seek out workarounds.

Meta is also expanding parental control options, allowing parents to monitor their child’s online activity through its Family Center. With these changes, parents will be able to view who is messaging their teens, potentially aiding conversations about online safety and harassment.

Yet, as Murthy pointed out, the burden of monitoring remains largely on parents, who may struggle to keep up with the rapidly evolving digital landscape.
