‘It’s not a solution for teen girls like me’: Instagram’s new under-18 rules met with skepticism


Sevey Morton first got an Instagram account when she was 10 years old. She used it to keep up with friends, but also to follow pop culture trends. Now 16, the San Diego high schooler says all the airbrushed perfection and slickly edited selfies from celebrities and influencers made her hyper-focused on her appearance, causing anxiety and body image issues.

“Being exposed to that at a very young age impacted the way I grew into myself,” Morton said. “There is a huge part of me that wishes social media did not exist.”

Morton’s struggles inspired her film-maker mother, Laura, to direct Anxious Nation, a documentary on America’s so-called anxiety epidemic among adolescents. When Morton heard last week that Meta had set new rules for teen accounts, she thought it was a good start – but not a solution.

Meta, which owns Instagram, rolled out changes that give parents the ability to set daily time limits on the app and block teens from using Instagram at night. Parents can also see the accounts their children message, along with the content categories they view. Teen accounts are now private by default, and Meta said “sensitive content” – which could range from violence to influencers hawking plastic surgery – will be “limited”.

The rules will take effect for existing teen accounts within 60 days. Children under the age of 16 who want to nix or alter these settings need parental permission; 16- and 17-year-olds can change the features without an adult. (One very easy loophole for teens: lying about their age. Meta said it is working on improved age verification measures to prevent teens from circumventing the restrictions.)

“I feel these changes are very positive in a lot of ways, especially because they’re restricting sensitive content, but I don’t think it’s a solution,” Morton said. “Especially for teen girls, if you ask them what the main problem with Instagram is, they would say body image stuff.”

The issue of teen safety has dogged Meta since its start as Facebook, and these new rules come amid revived backlash from parents and watchdog groups. Instagram has come under fire for not protecting children from child predators and feeding them self-harm content. While testifying at a Senate hearing on online child safety in January, Meta CEO Mark Zuckerberg apologized to parents in the audience holding signs with pictures of children lost to suicide or exploited on the app.

And according to a 2021 Wall Street Journal investigation, researchers at Instagram have been studying how the app harms young users, especially young girls, for years. One internal slide from a 2019 company meeting said: “We make body image issues worse for one in three teen girls.” Until recently, executives at the company such as Zuckerberg and Adam Mosseri, the head of Instagram, minimized these concerns.

The Kids Online Safety Act, a bill that passed in the Senate this summer, would establish guidelines aimed at protecting minors from harmful social media content, including disabling “addictive” features on platforms. A House panel advanced the bill last week.

Jim Steyer, founder and CEO of Common Sense Media, an organization that promotes safe technology for children, called the timing of Meta’s announcement “transparent”.

“This is basically another attempt to make a splashy announcement when the company’s feeling the heat politically, period,” Steyer said. “Meta has always had these capabilities and the ability to develop new features, and they could have done this to protect young people for the last 10 years. Now that we’re in the middle of a mental health crisis among young people that’s been significantly brought on by social media platforms like Instagram, they’re acting now under pressure from lawmakers and advocates.”

This summer, Vivek H Murthy, the US surgeon general, called on Congress to require a warning label on social media, not unlike the ones found on cigarettes or alcohol. Describing the mental health crisis among young people as an “emergency”, Murthy cited the fact that teens who spend more than three hours a day on social media face double the risk of anxiety and depression symptoms, and that nearly half of all adolescents say these apps make them feel worse about their bodies.


For Jon-Patrick Allem, an associate professor at Rutgers School of Public Health who researches social media’s effects on teens, Instagram’s new rules do not seem radical. “I read a line in the New York Times that described these rules as ‘a sweeping overhaul’,” he said. “I can’t think of a worse way of describing this. I think instead, these are slight modifications on one app that will probably do some good, but not enough good.”

Stephen Balkam, founder of the Family Online Safety Institute, is concerned that regulators and researchers will not see internal data from Instagram’s new teen rules. “Without a requirement that [Meta] tell us the data on child safety, I don’t know if this will move the needle or not,” he said.

A recent Harris poll of 1,006 gen Z adults (age 18 to 27) published in the New York Times found that 34% of respondents wish Instagram had never been invented. Even more wished the same for TikTok and X: 47% and 50%, respectively.

Morton, the 16-year-old, says that among her classmates, TikTok and Snapchat are the most popular social media apps, but she still checks Instagram a few times a day. “I have a tendency to open the app, refresh my, like, feed, close the app and then reopen the app, like, five minutes later,” she said.

Morton added that she would “love” to have a phone with no social media, only her contacts, iMessage app and camera. “That would be a dream,” she said. “People ask me, ‘why can’t you just delete social media?’ But it’s not that easy. It’s where all my friends are. I’d miss out on parties and hangouts. If I deleted it, I guarantee I would have it back within 24 hours.”


