A proposed Senate bill aims to protect Vermont's children from big tech companies. Photo courtesy of Centers for Disease Control and Prevention.
by Brooke Burns, Community News Service

Documents from Vermont Attorney General Charity Clark's lawsuit against Meta show that in 2020, approximately 80% of Vermont teens used Instagram, one of the highest rates in the country. Even so, the documents make clear that the company was still looking for ways to get those teens to spend more hours a day on the app.
Amid these revelations, a new Senate bill aims to protect Vermont's children from predatory data collection and from online content that exploits their vulnerabilities.
The bill, S.289, colloquially known as the “Vermont Kids Code,” would prohibit large technology companies from collecting data about children, or designing products, in ways that create a reasonable risk of physical harm, severe emotional distress, economic harm, a highly objectionable intrusion on expectations of privacy, or discrimination on the basis of a protected class.
The ban applies to individuals and companies active in Vermont that collect, or cause to be collected, data about people and that meet at least one of three criteria: annual gross revenue exceeding $25 million; buying, receiving, selling, or sharing data on more than 50,000 households, devices, or people per year; or deriving more than 50% of annual revenue from selling user data.
Under the bill, however, companies like Meta and TikTok would be responsible for evaluating their own data protection policies and determining whether they comply with the law.
Lead sponsor Sen. Kesha Ram Hinsdale, D-Chittenden, said in an interview that the bill is modeled on measures in the European Union and the United Kingdom that have proven successful.
“Most of the bill is modeled on UK policy, not because we assume these companies are essentially trying to take our children, but because they already meet many of these standards in other countries, in the EU, even the UK,” Ram Hinsdale said. “So as much as they're fighting design and code principles here, they're already responding to them elsewhere, so they know exactly what we want.”
Ram Hinsdale also said that while there could be delays in evaluating the safety of new features companies add to their products, it would not be difficult for the state attorney general to find violations.
“It has become very clear what it looks like to have a platform that's healthy and safe for kids,” she said. “Even when a new feature comes out, it may take some time for researchers and scientists to determine that it is not in the best interest of children. There are many pediatricians and pediatric researchers across the country who are highlighting some of the most harmful features and techniques currently in use.”
Ram Hinsdale introduced the bill at the Feb. 15 meeting of the Senate Economic Development, Housing, and General Affairs Committee.
“We were told this genie is out of the bottle, there's nothing you can do. We can't accept that answer anymore,” Ram Hinsdale said at the meeting. “We need to understand what we're up against, and we especially need to protect our children, and protect their privacy and mental health.”
Ram Hinsdale emphasized that the bill's language is tailored to go after the large Silicon Valley companies that affect Vermont's children, not local tech companies.
“There's some new draft language that will really help reassure them,” Ram Hinsdale said of local businesses.
In addition to the required assessments, covered companies must submit a plan to ensure that existing and future products are designed with the best interests of children in mind.
Heidi Schumacher, an assistant professor of medicine at the University of Vermont and a pediatrician, testified in support of the bill at the Feb. 15 meeting and said she was speaking on behalf of the profession.
“There is no doubt that social media can play a really positive role in our lives, expanding our social networks and supporting the development of young people, particularly those who are marginalized,” Schumacher said. “But in many cases, young people themselves feel they are spending too much time on social media and find they cannot pull away from features intentionally designed to keep them connected to it.”
“While spending so much time online, they are regularly exposed to dangerous content and unhealthy habits that pose a direct risk to their health and well-being,” she said.
The bill follows a similar law enacted in California in 2022. That state's age-appropriate design standards were blocked by a federal judge in September 2023 following a lawsuit filed by the technology trade group NetChoice. The judge agreed with NetChoice's argument that California's law violates the First Amendment by targeting certain speakers, namely commercial entities. The ruling has been controversial among child advocacy groups.
“These companies are not having a principled or nuanced discussion about the First Amendment,” Meetali Jain, founder of the Tech Justice Law Project, a Washington, D.C., organization that advocates for policy frameworks suited to the digital age, said at the Feb. 15 meeting. “They are making wild arguments to avoid accountability. And at the heart of their strategy is an attempt to manipulate the First Amendment by claiming that everything these companies do, including how they collect data about children and how they design their products, is First Amendment speech that cannot be regulated.”
But Marisa Shea, senior policy manager at 5Rights, a London-based nonprofit focused on children's digital safety, said at the Feb. 15 meeting that Vermont's bill could prove more legally airtight.
“This bill contains a narrowly tailored definition of best interests, and it does not require platforms to provide children with the best possible online experience, because we agree that would be too subjective,” Shea said. “Rather, it is carefully defined to prevent established harms that have been recognized by our legal system for decades.”
Community News Service is a program where students work with professional editors to provide content for free to local news organizations.