by Sally Jamrog
Opinion
November 23, 2021
Facebook’s launch in 2004 plunged the world into a new era of constant engagement, and as social media platforms of all kinds have continued to rise and thrive, they have become, by capitalist societal standards, the embodiment of twenty-first-century commercial success. The real success of these platforms, however, should instead be measured by how they affect their users, especially younger generations, since Meta Platforms, Inc. and other social media companies have, amid their commercial triumphs, turned a blind eye to social media’s more toxic effects. With American teens spending, on average, more than a third of their day outside of school-related activities on screens absorbing media, these dangers have become crucial to address.1
Logging onto social media is no longer a fully conscious choice. According to Tristan Harris, co-founder and president of the Center for Humane Technology and former design ethicist at Google, “Social media isn’t a tool that’s just waiting to be used; it has its own goals and it has its own means of pursuing them by using your own psychology against you.”2 Social media has become a serious addiction, ingrained in our daily habits and whims to the extent that it steals our time and interferes with our schedules. To keep users engaged and scrolling, social media platforms use algorithms and artificial intelligence to determine which types of content will best hold a user’s attention and thus bring the most economic success. In and of itself, this tactic is to be expected within the bounds of American capitalism, but when people are treated exclusively as products, with little regulation or concern for mental health, it seems morally questionable to continue to advertise these platforms as ways to foster human connection. Even more questionable is these companies’ continued targeting of a younger, more vulnerable population: children and young adults whose cognitive development is still underway. Ultimately, Facebook, Instagram, Snapchat, and other social media platforms are for-profit organizations that were not designed to protect kids. It might even be in these organizations’ best commercial interest to exploit their younger users as a way to grow their user base.
In recent years, companies like Meta Platforms, Inc. have worked on creating social media platforms more suitable for kids aged thirteen or younger, such as Instagram Kids. These platforms may create a safer social media environment for kids, as many of them do regulate their content accordingly. But since large-scale content moderation on these platforms is usually handled by an algorithm rather than a human, it is often impossible to fully screen out harmful content. “To be honest, I don’t think there’s a way to create a completely ‘safe’ version of anything online. I think you can put in restrictions and try your best to make a safe space, but there will always be people who bypass that,” says Sarah Emmert ‘24. According to a study by Common Sense Media, an organization dedicated to informing families about media, 27% of the videos watched on YouTube Kids by children ages eight and younger contain depictions of violence and other graphic content.3 To keep turning a profit, these companies also still serve ads to YouTube Kids users who have not purchased the ad-free YouTube Premium.
Similarly, while parental controls can work to mitigate social media access for kids, they are not a solution for all families. “I think parental controls are effective only if the relationship between parent and child is one that sets boundaries in a healthy way and there is complete trust on both sides. Parental controls are only truly effective at keeping kids away from social media if they are backed up by understanding on both sides,” says Therese Askarbek ‘24. In the same vein, general age restrictions on potentially inappropriate social media platforms such as YouTube or Instagram often fail to deter kids from interacting with these platforms. “The age limit for most social media platforms is thirteen because of the Children’s Online Privacy Protection Act (COPPA), which was passed in 1998,” says Rohan Biju ‘23, leader of BUA’s YouTube Tech Review club. “COPPA restricts websites from tracking data on children under thirteen, which is why most apps do not want kids younger than thirteen to join.” Unfortunately, a recent report conducted by Thorn, an organization that works to combat international child abuse, found that of one thousand children ages eight to seventeen surveyed internationally, 40% of those under thirteen already had access to either Facebook or Instagram.4 Because kids are being exposed to these platforms at increasingly younger ages, they are ever more likely to form habits of frequent social media use that will follow them through their teenage years and into adulthood.
Researchers have also observed that some social media platforms affect mental health more than others, particularly when it comes to teens and preteens. Meta Platforms’ own internal research on Instagram, made public as part of the Facebook Papers, states that “social comparison is worse on Instagram,” a platform whose primary focus, unlike apps such as TikTok that center on video sharing, is a person’s physical appearance and way of life.5 With posts on Instagram reflecting everyone’s personal “highlight reel,” an unhealthy culture of comparison has emerged that has proved detrimental to teenage mental health, exacerbating depression and anxiety by setting impossible standards for beauty and lifestyle. Suicide rates and hospital admissions for non-fatal self-harm among teenage girls have risen substantially since 2009, which coincides with the point when social media first became widely available on mobile devices. Compared with the average rate from 2001 to 2010, suicide rates among girls aged fifteen to nineteen rose 70% after 2009, while among girls aged ten to fourteen they rose 151%.2
Despite these negative effects, social media platforms are not entirely without positive value. Vicky Rideout, an advocate for children and families concerning social media, recently released a study on the ways social media can affect teens, which shows that these platforms can have both positive and negative consequences for mental health. In her study, she surveyed a sample of teens: 43% reported that social media increased their positive emotions, while 17% reported that it had the opposite effect and 40% remained neutral.6 Unfortunately, the positive aspects of social media have been overshadowed by the negative health consequences of social media companies’ profit-oriented agenda. If these platforms instead defined their success based on user happiness, social media could have a more worthwhile and positive influence overall. The issue then becomes reconciling the commercial motives of these companies with more ethical behavior. In his article for the Harvard Business Review, Andy Wu, a professor of business administration, illustrates this issue with what he calls the “Facebook Trap,” arguing that the same networking strategies that made Meta Platforms, Inc. (formerly Facebook) incredibly successful now threaten to be the cause of its downfall.7 A more ethical approach, in combination with educating young users about the effects of social media, would be a step in a healthier direction. As Alvin Lu ‘23, co-leader of BUA’s computer science club, says, “I find the most effective way [to do this] is to teach children how to not let social media impact their mental health negatively. […] Proper education will definitely become more important as children use [social media] more often.” Educating children and teens to be more aware of these platforms’ motives and side effects could help check social media’s negative ramifications. For social media to live up to its best purpose, that of uniting communities and strengthening global relationships, Meta Platforms, Inc. and similar companies need to balance making money with preserving human sanity.
1 Hayley Tsukayama, “Teens spend nearly nine hours every day consuming media,” The Washington Post, November 3, 2015,
https://www.washingtonpost.com/news/the-switch/wp/2015/11/03/teens-spend-nearly-nine-hours-every-day-consuming-media/.
2 Jeff Orlowski, dir., The Social Dilemma, The Space Program, Argent Pictures, and Exposure Labs, Netflix, 2020,
https://www.netflix.com/title/81254224.
3 Caroline Knorr, “Parents’ Ultimate Guide to YouTube Kids,” Common Sense Media, March 12, 2021,
https://www.commonsensemedia.org/blog/parents-ultimate-guide-to-youtube-kids.
4 Katie Canales, “40% of kids under 13 already use Instagram and some are experiencing abuse and sexual solicitation, a report finds, as the tech giant considers building an Instagram app for kids,” Insider, May 13, 2021,
https://www.businessinsider.com/kids-under-13-use-facebook-instagram-2021-5.
5 Bill Chappell, “The Facebook Papers: What you need to know,” NPR, October 25, 2021,
https://www.npr.org/2021/10/25/1049015366/the-facebook-papers-what-you-need-to-know.
6 Anya Kamenetz, “Facebook’s own data is not as conclusive as you think about teens and mental health,” NPR, October 6, 2021,
https://www.npr.org/2021/10/06/1043138622/facebook-instagram-teens-mental-health.
7 Andy Wu, “The Facebook Trap,” Harvard Business Review, October 19, 2021,
https://hbr.org/2021/10/the-facebook-trap.