Wake-up Call to the Tech Industry: Online Safety Act Demands Action Against Illegal Content

  • Ofcom defines over 40 safety measures to protect children and users from harmful content.
  • The Online Safety Act obliges platforms to tackle illegal content from March 2025.

Eulerpool News

The clock is ticking for online platforms: from March 16, 2025, they must review their services for potentially illegal content or face hefty fines. The new Online Safety Act (OSA), enforced by the regulator Ofcom, has now come into effect, and Ofcom has recently published its final codes of practice for combating illegal online content. Platforms have just three months to conduct risk assessments and identify potential dangers on their services. Non-compliance may result in fines of up to 10 percent of global revenue.

Ofcom has defined over 40 safety measures that must be implemented from March 2025. Ofcom chief executive Dame Melanie Dawes stated: "For too long, websites and apps have been unregulated and insufficiently concerned with the well-being of their users. This is now changing fundamentally." Dawes emphasized that the safety standards of tech companies will be strictly monitored in the future, and the regulator intends to act decisively against violations.

Critics, however, argue that the OSA does not sufficiently address the risks to children. Andy Burrows of the Molly Rose Foundation, among others, expressed concern over the lack of concrete measures against content that encourages suicide and depictions of self-harm.

In its codes, Ofcom makes clear that platforms must determine how illegal content can appear on their services and find ways to keep it away from users. This includes material related to child sexual abuse, depictions of extreme violence, and the glorification of self-harm. Since consultations on illegal content began in November 2023, Ofcom has tightened its guidelines for tech firms, which now also address intimate image abuse and coerced sex work.

To keep children safe, social networks should stop suggesting friendships between adult users and child accounts. Additionally, some platforms must use hash-matching technology to detect child sexual abuse material (CSAM), a requirement now extended to smaller file-hosting services.