UK's Online Safety Act: New Guidelines Impact Global Tech Firms

On Monday, Ofcom, the U.K.'s internet regulator, published its first set of final guidelines for online service providers under the Online Safety Act. This marks the start of the law's compliance deadlines for curbing online harms, with the first deadline set for March 16, 2025.

Ofcom has faced pressure to expedite the implementation of this online safety regime, particularly after social media's role in the summer riots. The new guidelines compel over 100,000 tech firms globally to assess and mitigate risks associated with illegal content, including terrorism, hate speech, and child exploitation.

Failure to comply could result in fines reaching up to 10% of a company's global annual turnover. The Act applies to all service providers with ties to the U.K., regardless of their location, and mandates that even smaller platforms adopt various safety measures.

Key obligations include content moderation systems for swift illegal content removal, user complaint mechanisms, and privacy settings for minors. Notably, the law introduces potential criminal liability for senior executives, emphasizing accountability at the highest levels.

Ofcom CEO Melanie Dawes said significant changes in how tech firms operate are expected by 2025, as companies must adjust their algorithms to prevent illegal content from surfacing. Future directives will focus on strengthening protections for children, including age verification measures and restrictions on harmful content.

As the digital landscape evolves, Ofcom plans to adapt its guidelines to address new challenges, such as the rise of generative AI, ensuring that online safety remains a priority in an increasingly complex environment.
