Editor’s note: Kara Alaimo, an associate professor of communication at Fairleigh Dickinson University, writes about issues affecting women and social media. Her book, “This Feed Is on Fire: Why Social Media Is Toxic for Women and Girls — And How We Can Reclaim It,” will be published by Alcove Press in 2024. The views expressed in this commentary are her own.
Tech executives could face time behind bars in the UK if they deliberately ignore the Online Safety Bill, proposed legislation designed to protect children online.
As currently written, the bill would require social media companies to identify and remove content promoting self-harm, including content that glorifies suicide, and to prevent children under 13 from using their platforms. In a written statement to Parliament, Secretary of State for Digital, Culture, Media and Sport Michelle Donelan said tech leaders who act in “good faith” will not be affected, but those who “consent or acquiesce” in flouting the new rules could face jail time.
Let’s hope this bill gets passed. For too long, tech leaders have shied away from responsibility for the harmful effects their products have on users. And while it’s unlikely that a provision like the UK’s would ever pass in the US — given its strongly pro-business environment, broad constitutional protections for free speech, and regulations that limit liability for internet platforms — other countries should consider similar penalties for tech executives over what their users post online.
The tech industry certainly disagrees. TechUK, an industry trade association in the country, said jail time would not make social networks safer for children but would discourage investment in the country. But I think this law would do just the opposite: act as a wake-up call for tech leaders to be accountable for the products they build.
One of the reasons tech executives have avoided personal responsibility for their impact on society for so long is the way we think about social media. We talk about what happens “in real life” as distinct from what happens online. But the effects social networks have on users — especially children — are all too often felt in “real” life.
For example, in September, a British coroner ruled that “the negative effects of online content” were partly responsible for the suicide of 14-year-old Molly Russell. In the six months before she took her own life in 2017, according to data from Meta reported by The Guardian, Molly saw 2,100 pieces of content on Instagram related to self-harm, depression and suicide.
Meta, Instagram’s parent company, has acknowledged that Molly saw content that violated its community standards, and in 2019 it added new policies against graphic images of self-harm. It also began offering links to resources to users viewing content related to depression.
But in 2021, the staff of US Sen. Richard Blumenthal created an account pretending to be a 13-year-old girl and followed accounts promoting eating disorders. Instagram then promoted related accounts with names like “Always Hungry.” Instagram told CNN that it had removed the accounts and that they should not have been allowed in the first place because they violated the platform’s rules against content promoting eating disorders.
And a damning report released last month by the Center for Countering Digital Hate described what happened when researchers set up TikTok accounts posing as 13-year-olds who paused on and liked mental health and body image content. Within 2.6 minutes, TikTok was showing suicide content.
Within eight minutes, the platform was recommending content about eating disorders. When an account used a name suggesting the user suffered from an eating disorder, TikTok served up even more of this kind of alarming content. TikTok has said that the content the researchers saw does not reflect what other users see, because of the study’s limited sample size and time constraints, and that it removes content that violates its standards and provides resources to those who need them.
And former Facebook employee turned whistleblower Frances Haugen revealed in 2021 that Meta was well aware of the harmful effects Instagram had on some young users. But Haugen said the company prioritized making money over protecting children. Meta has said it is developing parental controls and features to help teens manage their Instagram use, and CEO Mark Zuckerberg disputed Haugen’s characterization of the company.
Congress in the United States has passed only two laws regulating how companies interact with children online over the past 25 years — one requiring sites to obtain parental consent before collecting data about minors under 13, and one holding sites liable for facilitating human trafficking and prostitution.
There’s no reason tech leaders should be exempt from responsibility for what their products can do to users. The proposed measure in Britain should also be a wake-up call to parents and other social media users about the dangers we and our children may face online.
If prison sounds harsh, it’s nothing compared to the price Molly Russell and her family have paid. But five years after her suicide, social platforms are still serving up the same kind of toxic content to vulnerable young people. It must stop — even if it means putting tech executives behind bars.