April 11, 2024
As Minnesota kids report mounting mental health side effects stemming from social media, lawmakers are looking to address what they view as the root cause: the tech companies that design the applications that children are often hooked on.
Without federal guidance on how tech companies should interact with kids, many states are taking their own stabs at curbing adverse impacts of apps, games or sites for their youngest users.
While the approach has varied by state, several have landed in court over laws that require parental consent for minors to access apps or websites, or that prohibit the sale or sharing of young users’ information.
And tech industry groups have said that a similar bill in Minnesota could face a constitutional challenge.
Despite that, families whose children have experienced adverse effects of the sites, along with mental health providers, said it’s worth taking a stand.
Erich Mische, executive director of Suicide Awareness Voices of Education, pointed to a surge in suicide deaths among young people after social media companies went public around 2007.
“You could call it a coincidence. Or you could say, ‘Where there’s smoke, there’s fire,’” Mische said. “The reality is there is a corresponding increase to suicidal ideation, mental health, harm and suicide among today’s youth because of the unregulated nature of big tech, social media platforms. It’s time to fight fire with fire.”
Supporters of the Minnesota Age-Appropriate Design Code said the bill would do just that. The proposal would require large tech companies to automatically implement the highest privacy protection levels as a default for minors.
It would also bar tech companies from sharing or selling kids’ private information, tracking their location without notice or including design features aimed at keeping users on the platform longer, such as autoplay or push notifications.
“It compels companies to design better, safer products through the lens of child safety, like any other product on the market,” bill author Rep. Kristin Bahner, DFL-Maple Grove, said at a news conference recently. “We must hold tech accountable and responsible for their products.”
Under the bill, the attorney general could file a lawsuit against companies that fall short of the requirements, and those that run afoul of the rules could face fines.
Representatives for tech companies said that, while well-intended, the legislation is too vague.
“An unconstitutional law protects no one, including kids,” Amy Bos, who works with NetChoice, a trade group that represents Meta, TikTok, Google, X and others, told lawmakers at one of several hearings on the proposals.
Bos pointed to injunctions against a similar law enacted in California, as well as to laws in Ohio and Arkansas that require parental consent for minors to access social media sites. She warned that efforts to restrict what kids see online could face similar constitutional challenges in Minnesota.
“Many platforms will default to taking down all content on entire subjects, which is likely to remove beneficial, constitutionally protected material, along with anything harmful,” Bos said.
Rep. Anne Neu Brindley, R-North Branch, echoed those concerns during a House hearing on the bill.
“I actually am in no way concerned about how difficult it would be for businesses to implement this. That is frankly the least of my concerns,” Neu Brindley said. “I am however very concerned about passing a bill that can withstand scrutiny, that is clearly defined and that even is able to be implemented.”
Bahner said her proposal gives companies an opportunity, if users raise concerns, to show the steps they are taking to keep data private. And they would have a 90-day window to resolve any issues if they fall short of the standards.
“I don’t know of very many places in the law, where you get a veritable do over, right?” Bahner said. “The point of this bill is to build better safer products and protect kids, it is not to punish companies.”
‘No more’
Sixteen-year-old Shamail Henderson and her mom Shama Tolbert spoke to dozens of young people at the Capitol last week about the proposal. They said it could help prevent dangerous situations like what Shamail experienced on social media.
The pair told the crowd about how, starting at 12 years old, Shamail went to the library with her friends to log onto Facebook.
“We thought they were studying,” Tolbert said. “But they were online meeting with adults, meeting strangers who knew how to target them through social media, because of the security issue not being in place.”
Tolbert said the app suggested that Shamail add older men who posed as peers on the site. From there, the men lured and kidnapped Shamail and subjected her to sex trafficking.
“I held off on allowing my daughter to have a phone. I monitor social media, I did everything I was supposed to. And she was still not safe,” Tolbert said.
Shamail, along with pediatricians and families whose children experienced bullying and dangerous interactions on social media, called on legislators to impose additional restrictions for games, streaming sites and social media apps. Shamail told her peers that none of them should experience what she went through.
“I am here declaring with all my might: no more,” she said. “This should not — cannot — be the reality for kids.”
A report from the Minnesota Attorney General’s Office suggests that Shamail is not alone. The assessment found that social media notifications, scrolling options and algorithms spurred increased rates of harassment, cyberbullying and unwanted content for minors on the sites.
Compensating ‘kidfluencers’
A separate bill moving through the Capitol would require influencers who feature minor family members in more than 30 percent of their content to keep records of when the children appear and how much they earn. It would also require that a portion of the pay be kept in a trust fund for the child.
Minors featured in paid influencer accounts could also request that work they appeared in be deleted later. The bill’s backers said that while there are legal protections now for minors who work in acting, modeling or other professions, those don’t apply to kid influencers.
“This new and exploding industry, this multibillion-dollar industry has no guardrails to ensure that minors are protected, and compensated,” said Sen. Erin Maye Quade, DFL-Apple Valley. “Regardless of their age, people deserve to be compensated for their work and children deserve a chance to consent to how their image is kept online.”
Illinois has enacted a similar proposal setting aside earnings for kid influencers. A handful of other states are weighing similar bills this year.
The bills are part of a broader push for online safety at the Capitol. Both are moving through committees ahead of a deadline next week.
Another, which would prompt tech companies to update privacy terms for all Minnesota users, is on the cusp of a House vote.