Singapore Introduces a Landmark New Online Safety (Relief and Accountability) Law

By Ronald JJ Wong

Introduction

Singapore has introduced significant new legislation aimed at enhancing online safety and providing better recourse for victims of harmful online activity.

The Online Safety (Relief and Accountability) Bill 2025 (“OSRA”), passed in Parliament on 5 November 2025, adopts a dual-pronged approach:

  1. Establishing an administrative route for timely relief through a new Commissioner of Online Safety / Online Safety Commission (“OSC”); and

  2. Creating specific statutory torts to allow victims to pursue civil proceedings to seek judicial remedies in court against perpetrators and intermediaries.

This Bill aims to provide timely redress, promote online safety, deter harmful activity, and foster accountability in the online environment. Public agencies are generally excluded from making reports (except on behalf of others) or being subject to directions, orders, or tort liability under this Act.

The Bill complements the Broadcasting Act's Online Safety Code (which sets platform duties regarding online safety), and also amends the Protection from Harassment Act 2014 (“POHA”) and defamation law.

It is a necessary law, providing real legal powers to address a severe problem faced by many individuals, young and old, rich and poor, who are otherwise often helpless in the digital world.

The proposed law is also significant and interesting for other reasons:

  1. It is probably the first comprehensive Singapore legislation to explicitly deal with the harms of generative artificial intelligence (“AI”) like deepfake content.

  2. It alludes to introducing opt-in class action suits for certain statutory torts.

The OSC and Administrative Intervention

Victims (or their representatives) meeting eligibility criteria (e.g., citizens, PRs) can report online harmful activities to a Commissioner of Online Safety or the OSC.

They must first have reported to the relevant Online Service Provider (“OSP”) (save for certain severe harms requiring urgent relief). An OSP is a provider of an online service (through which online activity can occur) but excluding internet access services and app distribution services.

After assessing a report, if the Commissioner has “reason to suspect” an online harmful activity occurred, they may issue one or more Part 5 Directions to communicators, administrators, or OSPs (even those overseas). The Commissioner considers factors like harm degree, likelihood of recurrence, and public interest.

An “administrator” is a person who develops, maintains, organises, manages, supervises, regulates access to, or exercises editorial control over an online location (e.g., website, forum, chat group). This excludes entities like online service providers or internet access providers merely hosting the location.

A “communicator” is a person who makes material available online to users in Singapore. This excludes administrators or service providers solely for hosting content communicated by others.

Part 5 Directions which the OSC may issue include:

  • Stop Communication Direction (Sections 29 and 30): Requires removal or cessation of specific harmful material or a class of harmful material, or even termination of an online location. A class of harmful material may be identified by specific identifiers (e.g., username, keyword).

  • Access Disabling Directions (Sections 32 and 33): Requires an OSP to block access for Singapore users to specific material or locations, or a class thereof, on their service.

  • Account Restriction Directions (Administrator/Online Service) (Sections 38 and 39): Requires disallowing/restricting a specific account's access or interaction, or disabling the account. Can apply to new accounts created by the same owner for up to 2 years for prescribed OSPs.

  • Restraining Direction (Section 31): Requires the recipient to refrain from communicating similar material, conducting similar activity, or administering a location facilitating such activity, for a period or indefinitely.

  • Right-of-Reply Directions (User/Online Service) (Sections 34 and 35): Requires communication of a reply notice alongside relevant material (mainly for false statements, reputational harm, disproportionate harm instigation). OSPs must ensure the notice is easily perceived.

  • Labelling Direction (Section 37): Requires an administrator to publish a notice at a location indicating that Part 5 Directions have been issued concerning it.

  • Engagement Reduction Direction (Section 40): Requires a prescribed OSP to take steps to reduce user engagement with a class of harmful material for up to 3 months.

Failure to comply with a Part 5 Direction (except Right-of-Reply) can lead the Commissioner to issue Orders Following Non-Compliance (Section 43), such as an Access Blocking Order to internet access service providers or an App Removal Order to app distribution services. Further, it will be a criminal offence to unreasonably fail to comply with directions or orders following non-compliance (Sections 71-72).

Information Gathering and Accountability Powers

Part 6 equips the Commissioner with powers to investigate harms and identify perpetrators:

  • Power to Require Information (Section 49).

  • Power to Examine and Secure Attendance (Section 51).

  • End-User Identity Collection Notice (Section 52): If an end-user is reasonably suspected of harmful activity via a prescribed OSP, the Commissioner can require that OSP to take reasonable steps to obtain specified identity information and provide it to the Commissioner.

  • Disclosure of Identity to Victims (Section 53): The Commissioner may, upon application by a victim (or eligible representative), disclose known identity or contact details of a suspected perpetrator obtained under the Act.

These powers will enable real action to be taken against online harms. While foreign actors or servers may be technically out of reach from accountability, there will at least be the possibility of disabling access locally to protect against their general reach, likely through geo-blocking technology. Of course, technical circumvention is inevitable: e.g., use of virtual private networks (VPNs). But local actors, even those who attempt circumvention while perpetrating harms, may still be exposed and held accountable.

Certain affected persons who disagree with a decision of the Commissioner may apply to have the Commissioner reconsider their decision (Sections 56-59). A further level of appeal to an Appeal Committee (not to the Court) is available for certain eligible persons (Sections 61-65). There is no further right of appeal to the Court. It is however plausible that the Commissioner’s decision may be subject to judicial review.

What Constitutes “Online Harm”? Key Definitions

The Bill defines specific online harms (in Part 3) including:

  • Online Harassment: Adopting the POHA definition of “harassment”, but as amended to include “sexual or indecent” content and causing “humiliation”.

  • Doxxing: Publishing identity information that is intended, or likely, to cause harassment.

  • Non-consensual Disclosure of Private Info: Publishing private facts without consent, reasonably likely to cause harassment. [Query whether this should have a requirement of subjective intention. Otherwise, it could overlap with a breach of the Personal Data Protection Act 2012 (“PDPA”); any unintentional publication of private information could be subject to this law.]

  • Online Stalking: Course of online conduct reasonably likely to cause harassment (e.g., repeated unwanted contact).

  • Intimate Image Abuse: Non-consensual sharing of intimate images/recordings (real or generated, including deepfakes) reasonably likely to cause harassment.

  • Image-based Child Abuse: Sharing child abuse material (real or generated, including deepfakes). This means depictions of children subjected to torture, cruelty or abuse, or involved in a sexual pose or activity; sexual and reasonably offensive depictions of the genital or anal region; or sexual and reasonably offensive depictions of breasts.

  • Online Impersonation: Pretending to be someone else online, without consent, in a manner that leads a reasonable person to believe the activity was conducted by the victim. Exclusion: parody, satire or commentary which no reasonable person would believe was made by the person impersonated.

  • Inauthentic Material Abuse: Sharing false, deepfake or digitally altered content that is realistic and presented as real, where it is reasonably likely to cause harassment because it is false or misleading.

  • Publication of false material: Publication of false statements of fact that are reasonably likely to cause harm.

  • Publication of statement harmful to reputation: Publication of a statement reasonably likely to cause harm to reputation and any other harm. [Query the extent of overlap with general defamation law.]

  • Online Instigation of Disproportionate Harm: Communicating statements that tend to instigate increased risk of harm which is disproportionate to the victim’s conduct. Exclusion: instigating political action or activity.

  • Incitement of enmity/violence: Inciting, or reasonably likely to incite, enmity, hatred, hostility, or unlawful violence against any group.

Broadly, some of these definitions exclude legitimate purposes related to science, medicine, education or art.

Statutory Torts for Civil Recourse to Victims of Online Harms

Beyond administrative relief from the OSC’s directions and orders, the Bill creates new civil legal avenues for victims through statutory torts (Part 10). Victims can now directly sue perpetrators in court for specific harms:

  • Intimate Image Abuse (Section 83): Requires proving non-consensual communication likely causing harm. Defences include reasonableness, reasonable belief in consent, or lack of knowledge the material communicated was an intimate image or recording.

  • Image-based Child Abuse (Section 84): Requires proving communication of child abuse material. This tort is available to victims aged 16 and below. Defences include reasonableness or lack of knowledge.

  • Online Impersonation (Section 85): Requires proving non-consensual impersonation likely causing harassment. Defences include reasonableness or reasonable belief in consent.

  • Inauthentic Material Abuse (Section 86): Requires proving communication of material known/reasonably believed to be inauthentic, likely causing harassment. Defences include reasonableness or reasonable belief in consent.

  • Online Instigation of Disproportionate Harm (Section 87): Requires proving communication instigating action causing disproportionate harm. Defences include lack of causation of loss/damage, lack of foreseeability, and reasonableness.

Remedies include damages (including loss of future earnings and earning capacity), account of profits, and injunctions. Regulations may specify minimum and maximum damages for certain torts. There may be enhanced damages if the victim reasonably wrote to the perpetrator to address the harm and the latter failed, without reasonable cause, to address it within a reasonable time.

For incitement of violence against a group, damages may be determined based on harm to members or to the group as a whole, and the court may order that damages be applied to purposes that benefit the group if they cannot be distributed.

Platform and Administrator Accountability

The Bill does not just target perpetrators; it imposes duties on intermediaries. Administrators (e.g., forum/chat group managers) and Online Service Providers (OSPs) now face potential tortious liability for:

  • Facilitating/Permitting Harm (Section 90 - Admins): Developing and maintaining or administering an online location in a way that intentionally or knowingly facilitates or permits applicable online harms. E.g., an administrator of a chat group where users are invited to share intimate images of women. Among other factors, the moderation policies and practices applied at the online location will be taken into account.

  • Failing to Respond Reasonably (Section 91 - Admins; Section 94 - OSPs): Breaching a duty to take reasonable care to assess and address applicable harms after receiving a valid “online harm notice” from a victim. E.g., if an OSP receives an online harm notice of intimate image abuse by way of a user’s post, and promptly disables access to that post, that OSP has taken reasonable steps to address the harm.

Conversely, an admin or OSP has a right to sue a purported victim for sending an online harm notice that is frivolous, or that is materially false and known to be false (Sections 92, 95).

This is a significant legal development in imposing statutory duties of care in relation to online harms. There is wide-ranging impact since anyone can be deemed an “administrator” of an online location, e.g., the admin of a WhatsApp chat group, or the publisher or editor of a webpage. Thus, anyone in such a position would have to take reasonable care in responding promptly to any notice of alleged online harm. While the tort for frivolous or false notices provides some deterrence, actual exercise of such rights by non-commercial individuals may be practically limited.

For victims, the statutory torts provide civil remedies in addition to the OSC’s executive action. This allows for more options, especially where the OSC or the Appeal Committee has taken a view against taking action.

Court Remedies

As regards remedies, the court may award damages (including damages for loss of future earnings and loss of earning capacity) or an account of profits (even if no loss or harm is proved).

For incitement of violence against a group, the court may award damages to the members of the group and determine the quantum of damages having regard to loss or harm suffered by individual members or suffered by the group as a whole. Damages are to be allocated and distributed in accordance with procedures to be prescribed under the Rules of Court. Where it is not possible or practicable to allocate or distribute all or any part of the damages awarded to members of the group, the court may order that those damages be applied to purposes and activities that will fulfil the objectives of the group or otherwise for its benefit.

Crucially, Section 104 empowers the Rules Committee to make Rules of Court providing procedures for such proceedings to be brought by one or more members on behalf of the group, potentially without naming all members or requiring their consent, and allowing members to register their interest. This structure strongly suggests the introduction of an opt-in class action or representative action mechanism specifically for this tort, a significant procedural development potentially paving the way for similar mechanisms for other group-based harms in the future.

The court may also award enhanced damages to a victim if the victim had made a reasonable written request to the respondent to address the online harm and the respondent failed without reasonable cause to address the harm within a reasonable time. Such enhanced damages are in addition to general and special damages, are distinct from punitive or aggravated damages, and may be awarded in addition to them.

The court may also order removal of online material, other actions to stop making online material available, or the suspension, removal, deletion or termination of an online location.

Interaction with other laws

This Bill significantly interacts with and amends Singapore's existing legal landscape:

  • Protection from Harassment Act 2014 (POHA): The Bill extensively amends POHA (via the Schedule). It aligns definitions of harassment and harm (adding “sexual/indecent” and “humiliation”). It introduces new POHA court orders against administrators mirroring some Part 5 Directions and adds provisions for enhanced damages in POHA for online conduct. The Bill creates separate torts for some harms (like intimate image abuse) previously covered broadly under POHA, offering potentially more specific grounds for action alongside POHA's existing framework.

  • Broadcasting Act (Online Safety Amendments and Codes): This Bill complements the Broadcasting Act. While the Broadcasting Act imposes systemic duties on designated large platforms to create a safer environment (especially for children), this Bill provides reactive tools for the Commissioner and courts to address specific instances of harm impacting individuals, on any platform or location.

  • Penal Code: Many defined harms (e.g., harassment, stalking, intimate image abuse, incitement) overlap with criminal offences. This Bill provides parallel civil and administrative pathways for victims seeking redress, distinct from criminal prosecution.

  • Defamation Act 1957 and Common Law Defamation: The Bill introduces “Publication of statement harmful to reputation” (Section 18) as a basis for Commissioner action (limited to Right-of-Reply directions). This overlaps with defamation. Furthermore, the Bill amends the Defamation Act itself to introduce enhanced damages for failing to publish a requested reply statement online, mirroring the Bill's own enhanced damages mechanism. This offers potential alternative or supplementary avenues for dealing with online reputational harm.

  • Rules of Court / Pre-Action Disclosure: The Bill anticipates procedural rules (to be made via Rules of Court) to support the tort actions. This includes provisions for pre-action and non-party discovery, which could potentially interact with or supplement existing common law principles (e.g., Norwich Pharmacal orders) for identifying anonymous perpetrators before commencing a suit. The Commissioner's power to disclose user identity (Section 53) also provides a distinct administrative route to obtaining such information for prescribed purposes.

Conclusion

The Online Safety (Relief and Accountability) Bill represents a significant step towards creating a more comprehensive framework for tackling online harms in Singapore.

By establishing the Commissioner for administrative relief and introducing specific statutory torts for judicial recourse, it aims to provide victims with more effective tools and hold perpetrators and intermediaries more accountable. Its interaction with existing laws like POHA and the Broadcasting Act presents a move towards a multi-layered approach, combining systemic platform regulation with targeted interventions and victim empowerment.


Speak to our lawyers to find out more about legal protection against online harms.

Ronald JJ Wong

Deputy Managing Director
