There has been increasing concern about the adverse effects of social media on teenagers. As a result, platforms such as Snapchat, TikTok, and Instagram have added new features that are supposed to make their services safer and more age-appropriate. However, these changes often do not address the root issue: the algorithms powering these platforms constantly push vast amounts of content that can lead anyone, not just teenagers, down harmful or destructive paths.
While some of the tools offered by social media platforms may provide a measure of protection, such as blocking strangers from messaging children, they also have significant limitations. One is that teenagers can easily bypass age restrictions by simply lying about their age. Another is that these tools often shift the responsibility for enforcement onto parents rather than the platforms themselves. Most importantly, they do little to stop algorithms from serving up harmful and inappropriate material, which can damage teenagers’ mental and physical well-being.
According to Irene Ly, privacy counsel at the non-profit organization Common Sense Media, “These platforms know that their algorithms can sometimes amplify harmful content, but they are not taking steps to stop it.” Ly also pointed out that the more time teenagers spend scrolling through social media, the more engaged they become with the content, making them more profitable for the platforms. She stated, “I don’t think they have too much incentive to be changing that.”
For example, Snapchat recently introduced new parental controls in a feature called the “Family Center,” which allows parents to see who their teenagers are messaging but not the content of the messages themselves. However, both parents and their children must opt into this service in order to use it.
Nona Farahnik Yadegar, Snap’s Director of Platform Policy and Social Impact, compared the use of the Family Center to parents wanting to know who their children are going out with.
Farahnik Yadegar stated that the Family Center is similar to a parent asking their child, “Hey, who are you going to meet up with? How do you know them?” when their child is going out. She said the tool is intended to give parents “the insight they really want to have in order to have these conversations with their teen while preserving teen privacy and autonomy.”
Experts agree that parents must have regular, honest conversations with their children about social media and the risks and pitfalls of the online world. In an ideal situation, these conversations would take place frequently and cover various topics related to social media use.
However, it can be difficult for parents to keep up with the numerous social media platforms their children may be using and the constant changes and updates to these platforms. This makes it challenging for parents to effectively master and monitor the controls on multiple platforms, according to Josh Golin, Executive Director of the children’s digital advocacy group Fairplay.
Golin argued that it would be more effective for social media companies to make their platforms safer by design and default rather than burdening already overwhelmed parents with trying to keep up with the controls and settings on multiple platforms. He stated, “Far better to require platforms to make their platforms safer by design and default instead of increasing the workload on already overburdened parents.”
Golin pointed out that the new controls do not address several issues that continue to plague Snapchat: children lying about their ages, the app’s Snapstreak feature encouraging compulsive use, and disappearing messages making it easier for users to engage in cyberbullying.
Farahnik Yadegar stated that Snapchat has “strong measures” in place to prevent children from falsely claiming to be over 13 years old. She said that accounts of users who are caught lying about their age are immediately deleted and that teenagers who are over 13 but pretend to be older are given one chance to correct their age.
While it is not always possible to detect when a user is lying about their age, social media platforms have various ways of determining the truth. For example, if a user’s friends are primarily teenagers, the user is likely also a teenager, even if they entered a different birth year when signing up. Companies may also use artificial intelligence to identify age mismatches and look for clues about a user’s age in their interests and activities. Farahnik Yadegar also mentioned that parents may discover that their children have lied about their age when they try to enable parental controls and find that their teens are ineligible.
Child safety and teenagers’ mental health are critical issues raised by both Democratic and Republican politicians in relation to tech companies. In addition, individual states, which have been more proactive in regulating technology companies compared to the federal government, have also begun to address these issues. For example, in March of this year, a group of state attorneys general launched a nationwide investigation into TikTok and its potential adverse effects on the mental health of young users.
According to a recent report from the Pew Research Center, TikTok is the most popular social media app used by teenagers in the United States, with 67% of respondents stating that they use the Chinese-owned video-sharing platform. TikTok has claimed that it prioritizes age-appropriate experiences and has pointed out that certain features, such as direct messaging, are unavailable to younger users. The company also claims that features such as a screen-time management tool help young people and parents regulate the amount of time children spend on the app and the content they are exposed to. However, critics have noted that such controls are not always adequate.
According to Ly of Common Sense Media, “It’s really easy for kids to try to get past these features and just go off on their own.” The Pew Research Center’s report also found that Instagram, owned by Facebook’s parent company Meta, is the second most popular app with teenagers, with 62% saying that they use it. Snapchat came in third, with 59% of teens reporting that they use it. The report also noted that only 32% of teens reported using Facebook, a steep drop from the 71% who reported using the platform in 2014 and 2015.
Last year, Frances Haugen, a former Facebook employee, revealed internal company research indicating that Instagram’s engagement-driven algorithms may contribute to mental health and emotional problems among teenage users, particularly girls. This revelation led to some changes, such as Meta shelving plans for an Instagram version aimed at children under 13. The company has also introduced new parental controls and teen well-being features, such as prompting teenagers to take a break from the app if they spend too much time scrolling.
Ly commented that while these solutions may address some aspects of the problem, they do not address the root cause and are merely “going around it.”