Editorial: A safer 'net for kids: The Kids Online Safety Act is worth passing
Whether it’s TikTok or Instagram or Snapchat or some yet-to-be-released app sure to enrapture its users, it’s pretty clear — to us, at least — that too many young people are now spending too much of their time falling all the way down shallow-yet-deep online rabbit holes designed by profit-hungry companies to draw them in.
You don’t have to subscribe wholesale to the research presented in Jonathan Haidt’s bestseller “The Anxious Generation” to be worried that the voracious use of new technology, while conditioning kids to seek approval from friends and strangers alike, has thrown impressionable young minds out of balance, and that corrective measures are long overdue.
A Gallup survey of nearly 1,500 adolescents last year found that the average teen spends nearly five hours per day on the seven biggest social media platforms. The clinical term for that amount of time is a helluva lot.
We supported the Kids Online Safety Act before it passed the U.S. Senate in late July, and we support it now as it struggles to clear the House. It would create a “duty of care” for online platforms to protect minors, requiring them to take steps to “prevent and mitigate” harms ranging from bullying to the promotion of suicide, eating disorders and more.
The platforms would have to give children more control over their personal information and, for minors, disable or limit some of the features that make their products especially addictive, like video autoplay or platform rewards. When users are vulnerable young people, the less exploitative settings would be on by default, unless kids, with their parents, choose to override them.
As with any complicated legislation that touches on the First Amendment, which provides robust protection for almost all types of expression, we have reservations. There is the possibility that a well-intentioned bill could, if it becomes law, wind up arbitrarily preventing youngsters from accessing important information. There’s a risk that platforms will be so worried about their legal liability under the new standard that they’ll enforce their rules too rigidly.
It’s all worth following carefully, and recalibrating if and when problems surface. But given how profoundly social media apps have changed American childhood, and given widespread evidence that boys and girls alike are struggling in this perilous new landscape, it would be irresponsible for government not to take strong steps soon.
Fortunately, the scrutiny and impending legislation already seem to be prompting folks in Silicon Valley’s C-suites to develop and implement new safety features, much as car companies deployed antilock brakes and collision-avoidance systems. This week it was Instagram, owned by Meta (also the parent company of Facebook), rolling out “teen accounts” that give parents more control: over the time their offspring spend on the site, what content they see, who can find them via search, and the like. Parents can also see who their youngsters are messaging.
“Messenger Kids,” Facebook’s person-to-person chat app for children, came out way back in 2017 with some of the same controls, albeit aimed at younger kids.
Parents need to exercise more control, to be sure. But given how ubiquitous these platforms are, and how easy it is for youngsters to find workarounds, there’s a role for government, too. Pass the bill.
___
©2024 New York Daily News. Visit at nydailynews.com. Distributed by Tribune Content Agency, LLC.