
'Addictive' social media feeds that keep children online targeted by New York lawmakers


New York Gov. Kathy Hochul speaks during a news conference in New York, Wednesday, Oct. 11, 2023. New York is bidding to put new controls on social media platforms that state leaders say will protect the mental health of younger users. (AP Photo/Seth Wenig)

New York would restrict the way online platforms like Instagram and YouTube can collect and share children’s personal information and let parents keep their kids from being bombarded by “addictive” feeds from accounts they don’t follow, under legislation proposed Wednesday.

The bills offered by state leaders are aimed at protecting young people from features designed to keep them endlessly scrolling, endangering their mental health and development, Attorney General Letitia James said.

“Young New Yorkers are struggling with record levels of anxiety and depression, and social media companies that use addictive features to keep minors on their platforms longer are largely to blame,” James said. “This legislation will help tackle the risks of social media affecting our children and protect their privacy.”

The regulations sought by James and Gov. Kathy Hochul, both Democrats, are similar to rules already in place in Europe, where violations could incur fines worth a percentage of revenue, which could run into the billions of dollars for wealthy tech companies.

One of the bills, the Stop Addictive Feeds Exploitation (SAFE) for Kids Act, would allow parents to opt their kids out of getting feeds curated by an algorithm. Instead, kids would get a chronological feed of content from users they already follow. Algorithms are the automated systems that social media platforms use to keep users engaged by suggesting content based on the groups, friends, topics and headlines a user has clicked on in the past.

Middle school teacher Kathleen Spence said some of her students come to class half asleep after spending nights immersed in the social media content teed up by their smartphones. But it’s her own daughter’s eating disorder and near suicide that pushed her to speak in support of the legislation.

Spence attributes her now-21-year-old daughter's past mental health struggles to the thousands of inappropriate posts and images that peppered her social media feed after she made her first account at age 11 with an interest in Webkinz plush toys.

“I don’t want even one more family to experience what my daughter and our family had gone through,” Spence said.

The legislation also would let users block access to social media platforms from midnight to 6 a.m. and limit the hours a child spends on a site.

The second bill, the New York Child Data Protection Act, would prohibit all online sites from collecting, using, sharing, or selling personal data of anyone under 18 years old, unless they receive informed consent or it's otherwise necessary.

California-based Meta, which owns Facebook and Instagram, said parental supervision tools and other measures already are in place to ensure teens have age-appropriate experiences online, adding that algorithms also are used to filter out harmful content.

“We refer to research, feedback from parents, teens, experts, and academics to inform our approach,” Antigone Davis, Meta’s head of global safety, said in a statement, “and we’ll continue evaluating proposed legislation and working with policymakers on developing simple, easy solutions for parents on these important industrywide issues.”

Companies could be fined $5,000 per violation of either law.

Under new digital rules that came into force this year across the 27-nation European Union, platforms have to give users an alternative to automated systems that recommend videos and posts based on their profiles. Thus Meta, for example, now also allows European users to see chronological Facebook and Instagram posts only from people they follow.

The rules, known as the Digital Services Act, also prohibit platforms from using children’s data and online activity to target them with personalized ads.

Another set of rules, the General Data Protection Regulation, or GDPR, provides beefed-up data safeguards and rights for EU residents. Regulators slapped TikTok with a $366 million fine last month for breaching GDPR by failing to protect children’s privacy.

The legislation in New York also follows actions taken by other U.S. states this year to curb social media use among children. In March, Utah became the first state to pass laws that require minors to get parental consent before using social media. The laws also compel companies to verify the ages of all their Utah users, impose a digital curfew for people under 18 and ban ads from being promoted to minors. But experts have noted the new rules, which are set to take effect next year, could be difficult to enforce.

Meanwhile, another state law in Arkansas that would have also required parental consent for children to create social media accounts was put on hold by a federal judge in August.

The New York proposals drew swift opposition from a tech industry trade group, which urged the state to consider an alternative approach to what it termed an “unconstitutional, wasteful effort.”

“It’s unfortunate for New Yorkers that the state is stripping parents of their right to raise their children as they deem appropriate, all while ignoring the simple steps of working with schools and community leaders to educate students and adults how to use social media in a safe and responsible manner,” said Carl Szabo, vice president and general counsel for NetChoice, whose members include Meta and TikTok.

TikTok, in a statement, did not address the legislation directly but pointed to an increase in safety features announced earlier this year, including a requirement for teens to enter a passcode if they want to keep watching after 60 minutes on the site.

James said she believes that the New York legislation's narrow focus on “the addictive features that keep kids online longer” would allow it to withstand any potential legal challenges.

Aside from issuing new laws, some states also have been taking social media companies to court over a host of issues, including their algorithms and data collection practices. This week, Utah filed a lawsuit against TikTok, alleging the app’s addictive algorithm is damaging minors. Arkansas is also suing TikTok and Meta, which owns Facebook and Instagram. Indiana sued TikTok last year claiming the Chinese-owned app misleads users about the level of inappropriate content on the app and the security of their information, but the challenge doesn’t appear to be going in the state's favor.

The U.S. Supreme Court is preparing to decide whether state attempts to regulate social media platforms violate the Constitution. The justices will review two laws from Florida and Texas that mostly aim to prevent social platforms from censoring users based on their viewpoints.

___

Associated Press writers Maysoon Khan in Albany and Kelvin Chan in London contributed to this report.

