
Tech Secretary Condemns Ofcom Delays in Online Safety Implementation


Liz Kendall has written a strongly worded letter to Ofcom expressing her deep concern and frustration over delays in implementing its online safety duties.

The Technology Secretary criticized Ofcom for its slow progress, stating that families nationwide have been eagerly awaiting the safeguards outlined in the Online Safety Act (OSA).

In particular, she highlighted the proliferation of antisemitic content online, emphasizing to Ofcom's chief executive Dame Melanie Dawes the government's focus on tackling antisemitism.

Ofcom has delayed implementing new duties covering harmful but legal content, such as discriminatory and offensive material relating to race, religion, sexual orientation, gender identity or disability.

These duties would require social media platforms to let adult users choose whether such content appears in their feeds; platforms are already required to shield children from it.

In its recent roadmap, Ofcom said it does not intend to publish the register of categorized services, or to consult on the additional duties for those services, until around July 2026.

Although the OSA became law in October 2023, Ofcom only began using some of its new powers this year. The regulator has been criticized for the length of the implementation process, which has involved extensive consultations to update its guidance.

In her letter, Ms. Kendall said she was disappointed by the delays to the additional duties for categorized services set out in Ofcom's roadmap, and stressed the importance of maintaining momentum so the remaining work is completed quickly.

She warned that delays to duties such as user empowerment risk holding back protections for women and girls and for users exposed to harmful content and antisemitism. Ms. Kendall reiterated the government's support for Ofcom in prioritizing user safety and urged the regulator to move faster, particularly on the user empowerment duties.

In a dedicated section on antisemitism, the Cabinet minister stressed the government's commitment to combating the spread of antisemitic content, describing it as a top priority.

An Ofcom spokesperson acknowledged that external factors, including a legal challenge against the government, have affected the categorization timeline. Despite this, they said, progress has been made: sites and apps are now obliged to protect users, especially children, and investigations have been opened into more than 70 services.

Ofcom's children's safety codes, in force since July, require online platforms to use robust age checks, such as facial scans and ID verification, to prevent minors from accessing pornography and other inappropriate content.

Platforms have also been told to act quickly against harmful content, including material relating to self-harm, suicide, eating disorders, extreme violence and dangerous online trends.

During a parliamentary session, MPs urged AI minister Kanishka Narayan to tackle chatbots that encourage self-harm and suicide among children. Mr. Narayan indicated that AI-based search tools are covered by the Online Safety Act, which should help guide young users away from illicit content.

Conservative MP Bob Blackman raised concerns about chatbots inciting self-harm and suicide among young people, and Mr. Narayan described such incidents as tragic. He said efforts were under way to enforce the regulations effectively and to close any legislative gaps so that users are protected.

