In a recent statement, Ofcom, the UK’s communications regulator, has emphasized the need for social media companies to intensify their efforts in preventing algorithms from suggesting harmful content to minors.
Ofcom has outlined new proposals in its draft children’s online safety code, a key part of the government’s Online Safety Act, setting out what tech companies will be expected to do to safeguard young users.
Two mothers, whose sons died in incidents they believe were linked to dangerous online challenges, said they feel “belittled” by Ofcom’s perceived inaction and lack of engagement with bereaved parents.
Speaking to Sky News, the mothers of Archie Battersbee, a 12-year-old who died in an accident at home, and Isaac Kenevan, a 13-year-old thought to have died attempting a choking challenge he found online, shared their experiences.
Isaac’s mother, Lisa, demanded more attentiveness: “They should be listening to us as bereaved parents. We feel like we’ve been belittled… there is no action at the moment.”
Hollie, Archie’s mother, voiced her grief at seeing other families endure similar tragedies: “It’s heartbreaking… this should not happen in a civilized society.”
The passing of the Online Safety Act last October granted Ofcom new powers to enforce online safety regulations.
Both Hollie and Lisa worked hard to advance the bill and are distressed by the sluggish progress in implementation.
Lisa Kenevan implored Ofcom to act promptly to stop the spread of dangerous content, sharing her frustration that despite the introduction of the law, little appears to have changed.
“It is too late, our boys have gone… but Ofcom needs to step up and enforce… step in fast,” she said.
Controlling harmful algorithms
Among its various measures, Ofcom’s draft code requires strict age verification, improved complaints procedures, and a duty on social media companies to change algorithms that push dangerous content to children.
Companies that fail to comply could be fined up to 10% of their global turnover.
Dame Melanie Dawes, CEO of Ofcom, has urged the tech industry to not delay action, hinting at vigorous enforcement once the regulations become operational.
Creating a significant shift in the industry
Ofcom rejects claims that it excluded bereaved families from its consultation process, noting that it has spoken with more than 15,000 children and 7,000 parents, including those affected by online tragedies.
Dame Melanie encouraged continued collaboration: “Those families who’ve lost children to tragic online incidents, we invite you to continue working with us.”
She stressed the importance of the new proposals, which are poised to significantly transform industry practices, and the value of ongoing dialogue to achieve the best outcomes.
Fears for children’s safety persist
For Archie and Isaac’s parents, the concern for the well-being of young internet users remains high.
Lisa Kenevan highlighted the urgency of the situation, noting that children are still dying while the laws are being finalised: “We don’t want anyone else joining our club.”
The consensus is clear: for the sake of every parent’s peace of mind, change must come now.
FAQs about Children’s Safety Online and the Role of Social Media Platforms
- What is Ofcom’s role in online safety for children?
Ofcom is the UK’s communications regulator and is responsible for ensuring online platforms protect children by adhering to codes of practice and regulations such as the Online Safety Act.
- What consequences do social media platforms face for non-compliance with the new standards?
Social media companies can be fined up to 10% of their global turnover if they fail to meet the requirements set out in Ofcom’s draft code.
- Have bereaved families been involved in the consultation process?
Ofcom says it has involved more than 22,000 individuals in its consultation process, including children, parents, and those affected by online tragedies.
- When will the new Ofcom regulations be enforced?
Ofcom expects to enforce the new codes of practice within a year, and they are encouraging the tech industry to begin making changes ahead of this timeline.
- Why are strict age checks important?
Strict age checks are crucial to prevent children from accessing content that is inappropriate or harmful to their age group, thereby reducing the risk of exposure to dangerous challenges or information.
Conclusion
As digital platforms continue to influence the lives of young users, the call for stricter regulations to protect children from harmful online content is more pressing than ever. The tragic stories of Archie Battersbee and Isaac Kenevan serve as heart-wrenching reminders of the grave consequences that can result from unregulated social media algorithms. Ofcom’s proposed code of conduct is a step towards addressing these concerns, although for some parents, the measures and their implementation cannot come swiftly enough. It is essential that social media firms take immediate action to prevent further harm and ensure a safe online environment for all children.