
‘Generation Swipe’: What the new era of social media regulation means for young people

If you checked social media in the last 24 hours, you're among the more than half of Americans who say they regularly use it, according to a study released in January by the Pew Research Center.

Whichever platforms you choose, social media is woven into the fabric of how Americans work, date, think and live. Young people, whose brains are not yet fully developed, are no exception. Ninety-five percent of teenagers have social media accounts, and one in three say they use them "almost constantly," according to the Pew Research Center.

However, although 13 is the minimum age to register on most social media platforms, a 2023 advisory issued by U.S. Surgeon General Vivek Murthy found that 40% of children aged 8 to 12 are active social media users.

The report cited data showing that when teenagers spend more than three hours a day on social media, they are at greater risk of sleep disorders, anxiety and depression.

“A lot of my social interactions are on social media… like my whole life revolves around it,” Juniper Galvani, a 17-year-old student from Vermont advocating for Vermont Kids Code, told ABC News.

The Vermont Kids Code is described on its website as a "consumer protection bill that would require online products that are likely to be used by children under 18 to be age appropriate, privacy by design and by default, and designed with the best interests of children in mind."

Vermont is one of many states working to pass similar legislation intended to make social media safer for children by holding technology companies accountable for age verification, privacy protections and the removal of content that could harm minors.

Earlier this month, Gov. Wes Moore signed a bill imposing similar rules on big tech, known as the Maryland Kids Code. It requires social media companies to review their data privacy policies and, if children are found likely to be using a site or platform, to implement parental safeguards and stricter child privacy protections by October 2024.

At the signing ceremony, Todd and Mia Minor, who lost their son to a social media challenge, stood next to Maryland's governor and witnessed the moment they had been fighting for for the past five years: legislation to protect children online.

It was March 7, 2019, when Todd and Mia's son, Matthew Minor, asked to play on his computer after dinner. "You have an hour," his father, Todd, told him, before he had to get back to his homework.

That hour would change Todd and Mia's lives forever. Matthew's older brother found him unresponsive.

"He was screaming… come upstairs quickly," Todd Minor told ABC News' Elizabeth Schulze. "We saw that Matthew had something around his neck. We didn't know what was going on, but we found that he had something around his neck."

“When I was doing CPR, all I could think about was that I was asking God to take me instead of my baby,” added the Maryland father and U.S. Army veteran.

“He had the biggest dimples. He was always smiling,” Mia Minor said. “We didn’t do anything else in his room. I mean, it’s exactly the same.”

Matthew was known at school for his charisma, energy and for standing up against child bullying, his parents say. He was also an ambassador for his school, welcoming new students and showing them how and where to find classes.

After reviewing Matthew’s devices, police detectives told the Minor family about some challenges circulating on social media.

Children and teenagers online were calling it the “choking challenge” or the “choking game” – a dangerous viral trend in which people on social media deliberately try to choke themselves to enter a brief state of euphoria.

"It was very cartoony, fun and entertaining. From a kid's point of view, it was just, 'Why not?'" Todd Minor said of his reaction to searching for these videos after Matthew's death. "They have electronics in their pockets. Messages regularly appear on social media: 'Try it, try it. You have to try it. You have to try it.'"

It was at Matthew's funeral that many of his classmates revealed to Todd and Mia that they, too, had tried the challenge.

“That’s when we started thinking something had to be done,” Todd said.

Turning their pain into purpose, Todd and Mia founded the Matthew E. Minor Awareness Foundation to highlight the risks associated with viral challenges on apps and sites like TikTok, YouTube and Instagram.

“You know, because Matthew can’t speak for himself and other kids can’t speak for themselves, and the kids who were reporting things to the big tech companies didn’t do anything about it,” Todd shared about the reason for starting the foundation.

Todd and Mia Minor have made it their life mission to share Matthew’s story whenever and wherever they can, from churches to classrooms to the Capitol. In January, they sat right behind the CEOs of the five largest technology companies during a Senate hearing on children’s online safety.

Midway through the hearing, Meta CEO Mark Zuckerberg – whose company owns the social media platforms Facebook, Instagram, WhatsApp and Threads – stood up, turned around and apologized directly to the Minors and other parents who have lost children to social media.

“I’m sorry for everything you’ve been through. This is terrible. No one should have to go through what your family experienced,” Zuckerberg said.

Todd Minor told ABC News he’s “happy that” the tech giant’s CEO is “saying something.”

"I started seeing it through Matthew's eyes. And honestly, I started to feel sorry for Mark Zuckerberg because I saw how rattled he was," Minor added.

Snapchat, Meta, YouTube and TikTok sent statements to ABC News, all highlighting features they have added to the platforms that they believe make use safer for children.

"If you had a car seat and 50 kids broke their arm because of a manufacturing defect in that seat, we would recall that car seat," Frances Haugen told ABC News. "And yet, every year, thousands of children in the United States alone are seriously harmed by social media, and we are doing nothing about it."

Haugen had worked at Facebook for about two years when she left the company in 2021 and subsequently leaked tens of thousands of internal documents from her former employer showing that Facebook and Instagram were ignoring risks to young users and prioritizing profits instead.

"Facebook had the opportunity to make its products safer for children. And yet, it repeatedly failed to do so. It lied to the public all the time," Haugen said.

Her testimony on Capitol Hill in October 2021 sparked a quick response from Zuckerberg, who wrote on Facebook: “I’ve spent a lot of time thinking about the types of experiences I want my children and others to have online, and it’s very important to me that everything we build is safe and good for children.”

After Haugen leaked the documents, and in response to the surgeon general's warning last May, a bipartisan group of 33 state attorneys general filed a major lawsuit against Meta in October 2023. The suit cites an internal company email estimating that "the lifetime value of a 13-year-old teenager is approximately $270" in revenue, and accuses the tech giant of intentionally designing algorithms and notification features that encourage compulsive use.

In response to the lawsuit, Meta said: “We share the Attorney General’s commitment to providing teens with safe, positive experiences online and have already introduced over 50 tools to support teens and their families.”

Dr. Aliza Pressman, director of the Mount Sinai Parenting Center, emphasized to ABC News the timeline of teen brain development, noting that teens "will not have all the skills that are really useful for being effective on social media."

As mentioned earlier, states across the country are working to pass more stringent laws aimed at making social media more transparent, accountable, and safe for young users.

Florida Governor Ron DeSantis signed one of the strictest social media bills in March, prohibiting people under 14 from opening or owning social media accounts. Teenagers aged 14 and 15 will need parental consent to use social media. Tech companies could face fines for failing to remove underage users, and parents could sue them for up to $10,000.

“We see adults who can’t regulate themselves on social media. But, you know, we expect that kids who are 13, 14, 15, even younger, will be able to regulate,” Michele Rayner, a Democratic representative from Florida who led the legislation, told ABC News. “We are putting the obligation back on platforms to comply with Florida law.”

Critics say the Florida bill, scheduled to go into effect in January 2025, would disadvantage Florida children and teens who have created successful businesses and communities using social media.

"There are two sides to this matter. I know there are kids who just like to scroll for hours, even all day. But there are other kids like me who want to promote themselves, for example," said Marley Desinord, a 13-year-old DJ for the Miami Heat. "It's like there's a gray area there."

Desinord started posting her mixes on social media at the age of 7. They took off, making her the youngest DJ in the NBA, a success she says "wouldn't have been possible without social media."