UK Online Safety Act 2023: what platforms must do now
The Online Safety Act 2023 establishes a statutory duty of care for user-to-user and search services accessible from the UK, creating one of the most detailed digital safety regimes internationally. Ofcom has begun phased implementation, with information notices, supervisory monitoring and early investigative steps already underway. For operators, the immediate priorities are understanding service classification, completing compliant risk assessments and preparing evidence packs capable of withstanding regulatory scrutiny. This article sets out what has changed, why it matters and the operational actions platforms need to take now.
Regulatory background
The Online Safety Act 2023 (OSA) introduces legally binding duties to manage the risks of illegal content and, for services likely to be accessed by children, harms to children. The regulator, Ofcom, is granted investigation and enforcement powers, including information notices, audits, penalties of up to the greater of £18 million and 10 per cent of qualifying worldwide revenue, and the ability to compel use of accredited technology. The framework sits alongside and interacts with the Data Protection Act 2018, the UK General Data Protection Regulation, the Communications Act 2003 and relevant Ofcom statutory duties.
Implementation is staged. In November 2023, Ofcom issued its first consultation on illegal-content safety duties, followed in 2024 by draft codes, consultations on children’s safety, age assurance and pornography enforcement, and a timetable for categorisation and transparency reporting. Ofcom’s enforcement guidance explains its investigatory tools, approach to risk and expectations for governance and documentation. The regime therefore now combines statutory obligations with detailed, technical guidance that platforms must integrate into internal compliance programmes.
Analysis
The Online Safety Act requires platforms to follow a structured, evidenced and repeatable approach to online safety. Three areas are consistently attracting early regulatory attention: the sufficiency of risk assessments, children’s access determinations and responsiveness to information notices.
Illegal-content duties apply to all user-to-user and search services, requiring operators to undertake a suitable and sufficient illegal-content risk assessment, covering the design, operation and use of the service, and to implement proportionate safety measures. These measures include systems and processes for detecting and removing illegal content, content moderation, user reporting and complaints, and governance oversight. Ofcom’s draft codes emphasise practical, auditable controls such as rulebooks, content-classification tools, review cycles and staff training.
For services likely to be accessed by children, the children’s safety framework requires an additional assessment of the risk of harm to children and implementation of appropriate age-assurance measures. Ofcom is consulting on what constitutes “highly effective” age assurance. Platforms will need to reconcile these requirements with privacy obligations relating to data minimisation, DPIAs and children’s data protections under the DPA 2018 and UK GDPR.
Service categorisation introduces additional duties for larger or higher-risk platforms. Categorised services must produce transparency reports, operate certain safety features and meet heightened governance obligations. Although final Category 1, 2A and 2B thresholds will depend on future secondary legislation, platforms should evaluate their likely categorisation and prepare for corresponding obligations.
Ofcom’s investigatory approach is becoming clearer: requests for information are increasing in volume and complexity, response windows may be short, and the regulator has been explicit that poor documentation, weak governance and unsystematic decision-making will attract closer supervision. In this environment, the ability to produce an evidence pack demonstrating risk assessment methodologies, mitigation decisions, policy frameworks, KPIs, audit trails and records of action is essential.
Commercial and operational implications of the UK Online Safety Act
For product, legal, safety and engineering teams, the main operational burden is not the individual duties but the need to create a coherent compliance framework that spans product design, risk management, governance and engineering.
First, platforms must confirm service classification and complete the required risk assessments. These need to be living documents with owners, review cycles, documented methodologies and clear links to mitigation decisions. Poor or incomplete assessments are already a common cause of regulatory concern.
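By way of illustration only (the Act and Ofcom’s guidance do not prescribe any particular tooling, and every field and name below is a hypothetical assumption), a risk-assessment register of the kind described above can be modelled as structured records whose review cycles are checked programmatically rather than by ad hoc diary entries:

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class RiskAssessment:
    # Hypothetical record structure; the OSA does not mandate specific fields.
    service_area: str          # e.g. "user-to-user messaging"
    owner: str                 # named accountable individual
    methodology: str           # documented assessment methodology
    mitigations: list[str]     # linked mitigation decisions
    last_reviewed: date
    review_cycle_days: int = 365  # illustrative annual review cycle

    def review_due(self, today: date) -> bool:
        """True if the assessment has passed its scheduled review date."""
        return today > self.last_reviewed + timedelta(days=self.review_cycle_days)

ra = RiskAssessment(
    service_area="user-to-user messaging",
    owner="Head of Trust & Safety",
    methodology="documented internal risk methodology",
    mitigations=["keyword detection", "user reporting flow"],
    last_reviewed=date(2024, 1, 15),
)
print(ra.review_due(date(2025, 6, 1)))  # overdue: prints True
```

A register of such records gives each assessment an owner and a review cycle by construction, which is the “living document” property the paragraph above describes.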
Second, platforms should implement governance frameworks that include senior accountability, reporting lines, escalation processes, KPIs and monitoring triggers. Ofcom expects structured decision-making and the ability to justify how safety mitigations were selected as “proportionate” to the service’s nature and risk.
Third, organisations must prepare an Ofcom-ready evidence pack. This should include risk assessments, DPIAs, policy frameworks, moderation data, terms and conditions, enforcement statistics, user-reporting processes, age-assurance documentation and change logs. Operators that have experience with financial or telecoms regulatory audits will recognise the importance of consistent documentation and decision trails.
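As a purely illustrative sketch, the document categories listed above can be held as a manifest and checked for completeness before any regulatory submission. The categories are taken from the article; the checking logic and all identifiers are assumptions, not anything prescribed by Ofcom:

```python
# Illustrative only: required categories mirror the evidence-pack list above;
# all function and variable names are hypothetical.
REQUIRED_DOCUMENTS = {
    "risk_assessments", "dpias", "policy_frameworks", "moderation_data",
    "terms_and_conditions", "enforcement_statistics",
    "user_reporting_processes", "age_assurance_documentation", "change_logs",
}

def missing_documents(pack: set[str]) -> set[str]:
    """Return the required categories absent from an evidence pack."""
    return REQUIRED_DOCUMENTS - pack

current_pack = {"risk_assessments", "dpias", "policy_frameworks",
                "terms_and_conditions", "change_logs"}
print(sorted(missing_documents(current_pack)))
```

Even a simple check of this kind turns the evidence pack from a one-off exercise into a maintained artefact with visible gaps.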
Finally, platforms should ensure alignment between OSA compliance and existing workstreams, including data protection, incident response, security, records management and internal audit. Fragmented governance or duplicated processes create operational inefficiency and increase regulatory risk.
Viewpoint
The Online Safety Act represents a significant regulatory shift. Its scale is considerable, the guidance is technical, and the compliance burden is material for any operator with UK-facing users. However, the core duty—structured risk assessment, proportionate mitigation and auditable governance—is consistent with regulatory practice in other digital and infrastructure sectors. Platforms that approach compliance as a cross-functional governance programme rather than a content-moderation issue will be better placed to satisfy Ofcom. As enforcement matures, the differentiator will be documentation quality, governance discipline and responsiveness under scrutiny.
For operators assessing their compliance position or preparing for engagement with Ofcom, Bratby Law can provide structured assessments, governance frameworks and practical implementation support.
