No more making money off videos of kids

The sponsors of bills to regulate what are sometimes called “mommy-run accounts” on social media already thought they were on the leading edge of legislation nationally.

Rep. Zack Stephenson’s House File 3488 and Sen. Erin Maye Quade’s Senate File 3496 sought to make Minnesota just the second state after Illinois to require parents who profit from videos that feature their children to share money with their kids.

Now, both realize they didn’t go far enough after a series of stories in The New York Times that revealed how some of the accounts exploit children — sometimes sexually.

“Thousands of accounts examined by The Times offer disturbing insights into how social media is reshaping childhood, especially for girls, with direct parental encouragement and involvement,” the series stated. “Some parents are the driving force behind the sale of photos, exclusive chat sessions and even the girls’ worn leotards and cheer outfits to mostly unknown followers.”


“I need to spend more time thinking about a better solution to that problem than my bill,” Stephenson, DFL-Coon Rapids, said in a text message after the Times story posted. “My bill, while a good idea, is not a solution to that problem.”

Maye Quade, DFL-Apple Valley, said in an interview that she and Stephenson have been struggling to find a way to address “the sexual exploitation and grooming issue that we’re seeing pop up.” 

The solution came with an amendment: The first version said parents who make money off their kids would have to set aside 30% of the money in a trust fund; the amended version says no Minnesota-based social media account could make money from videos featuring children under age 14. It would classify taking part in social media content creation as one of the many jobs that children are not allowed to perform under state law.



“I encourage you to read the whole thing,” Stephenson said after distributing the Times series to members of the House Labor Committee, “but not over the lunch hour. It will turn your stomach.”

“Members, this is vile and we can’t allow it to continue,” he said. Social media postings that do not make money from platforms or advertisers could continue to include photos or videos of children. Children between 14 and 18 could be included in a parental account as long as 30% of the earnings are set aside in a trust. After age 14, children are allowed to have their own social media accounts and could continue to profit from them under the bill.

But the bill also creates a mechanism for children featured in photos and videos posted on social media accounts to demand once they reach 14 that the images be taken down.

“Children have a right to have childhoods free of working, just like they do for pretty much every industry,” Maye Quade told the Senate Judiciary Committee Monday. She said that the bill would not impact all videos or photos that show children.

“This is targeting content creation that generates compensation at a really high level: your super influencers, your family vlogging channels that have 12 million followers,” she said. “The kid that does streaming videos and makes 50 bucks, that’s not included.”

Generally the accounts are slice-of-life serials on TikTok, YouTube or Instagram that let viewers watch how families lead their lives. The phenomenon is sometimes dubbed “sharenting.” Money comes from advertising or, more often, product placement by brands. But the accounts are facing increased scrutiny from the media (as in this Cosmopolitan magazine article) and from other states. The issue is whether the accounts exploit children, expose them to risks, or take away their privacy.

The accounts have also spawned organized campaigns urging social media users to avoid them because they can exploit children. One of those is called Quit Clicking Kids.

The trust fund idea comes from laws first adopted in California to prevent child actors from being exploited by parents. Called the Coogan Law, it was named for child star Jackie Coogan, who saw little of the money movie studios paid to his parents.

Some members of the House Labor Committee asked why the bill would include innocent videos that happen to become popular enough to attract payments. Rep. Shane Mekeland, R-Clear Lake, said he has a family member who posts motocross and snowmobiling videos that have a following.

“If they monetize it, do they get caught into this?” he asked.

But Stephenson said he could not find a way to separate non-sexualized content from the types of exploitation shown in the Times series. So any social media content that includes children under age 14 for a significant amount of time could not make money, he said.

“Children should be children, and they should not be engaged in this kind of behavior. And we have made that decision for a whole host of jobs under the age of 14,” he said. “The cost of that dark side is so immense that we have to take action against it.

“I understand the instinct to find a more tailored approach,” he said. “I can’t find one.”

Exceptions under state child labor laws include modeling and acting. The proposed legislation doesn’t change those exceptions but regulates content creation separately as “content shared on an online platform in exchange for compensation.”

While Illinois has a law requiring compensation, the creation of trust funds and the right for minors to have images removed, Minnesota would be the first state to ban social media accounts featuring children from making money.

“This is an emerging problem,” Stephenson said. “We are behind on it as a society. We have not done nearly enough to put guardrails on social media, in terms of its impact, particularly on young people.”

Two other major changes to the original bills would require social media sites to remove content if requested by children and would assign the state attorney general the duty of enforcing the provisions around trust accounts.

The original version said children would have to make requests of the content creator to remove content. Since that often would mean their own parents, the sponsors have now decided to require the platforms to have some responsibility as well.

It is that aspect of the bill that has raised concerns from some in the tech industry.

“Although the bill aims to protect the privacy of a young person, it could also result in the opposite effect by creating an implicit requirement for platforms to collect sensitive, personally identifiable information to authenticate identity, age, and parental relationship in order to prove they are the subject requesting the deletion,” wrote Tyler Diers, the Midwest executive director of TechNet, a trade group.

But Maye Quade said she has “full confidence in tech companies’ ability to do a lot of things. It’s whether they want to or not.”

The original versions also did not assign any enforcement to the state, opting instead for private lawsuits by children once they reach 14. While that is still allowed, the bill sponsors wanted additional enforcement of the provisions.

Both the House and Senate versions of the bill are expected to meet Friday’s deadline for legislation hoping to advance this session.



About the Author: Rayne Chancer