
Online Safety Act 2021 – application to global online businesses.

Does your online business need to comply?

The Online Safety Act 2021 (Cth) (Act) was passed on June 23, 2021 and entered into force on January 23, 2022. The Act creates the world’s first adult cyber abuse scheme, in addition to strengthening existing online safety protections. If your business is an online service provider with Australian end users, the Act will apply to your service regardless of where in the world your business is located.

What does the Act do?

The Act establishes five “schemes” dealing with the removal of harmful online material relating to:

  1. children (cyberbullying scheme);
  2. adults (adult cyber abuse scheme);
  3. the non-consensual sharing of intimate images (image-based abuse scheme);
  4. abhorrent violent material (abhorrent violent material blocking scheme); and
  5. harmful online content (online content scheme).

The Act applies to a range of online service providers (OSPs) with Australian end users, including (but not limited to) providers of:

  • electronic services, which allow end users to:
    • communicate by email, instant messaging, SMS, MMS or a chat service; or
    • play online games with other end users;
  • social media services (where the sole or primary purpose of the service is to enable online social interaction between end users and end users are able to post material to the service);
  • internet service providers;
  • app distribution services (such as Apple’s App Store or Google’s Play Store); and
  • search engines.

The Act has extraterritorial application, so OSPs based outside Australia must also comply.

The eSafety Commissioner has the power to issue “removal notices” to OSPs, including providers of online games, email and chat services, and social media services, among others. An OSP that receives a removal notice must remove the offending content within 24 hours.

Industry expectations

The Act also establishes the Basic Online Safety Expectations (BOSE) framework.1 This requires OSPs to take reasonable steps to keep end users safe and to minimize the extent to which harmful material is provided on their services. In general terms, the BOSE require that an OSP:

  • take reasonable steps to ensure that end users can use the service safely;
  • consult with the eSafety Commissioner to determine what “reasonable steps” means for that OSP (which may involve following guidance issued by the eSafety Commissioner rather than a positive duty to consult);
  • take reasonable steps to minimize the extent to which certain material is provided on the service (including cyberbullying material, cyber abuse material, non-consensual intimate images, and material that promotes or depicts abhorrent violent conduct); and
  • ensure that the service has clear and readily identifiable mechanisms for end users to report and make complaints about objectionable material.

To monitor compliance with the BOSE, the eSafety Commissioner may require OSPs – individually or as a class – to provide specific information (such as their response to terrorist or abhorrent violent material) and to report on the measures they have taken to comply with the BOSE.

Online content scheme

The online content scheme, which regulates offensive content2 through a complaints-based mechanism under the Act, requires OSPs to develop mandatory industry codes and standards. Once registered, these codes and standards will be enforceable through fines and injunctions.

The eSafety Commissioner has also made it mandatory for OSPs to implement a “restricted access system”3, which requires users to declare that they are 18 or older before accessing R 18+ or Category 1 restricted content, including films and computer games. Such restricted content typically involves sex, high-impact nudity, drug use or violence.

The restricted access system must also notify the user of the nature of the content.
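By way of illustration only, the sketch below shows one way an OSP might structure an age-declaration gate of this kind. It is a minimal TypeScript example; the type and function names (RestrictedItem, checkRestrictedAccess) and the classification values are assumptions for the sketch, not requirements drawn from the Act or the Commissioner’s restricted access system declaration, which set the actual standard a compliant system must meet.

```typescript
// Minimal sketch of an age-declaration gate for restricted content.
// Names and structure are illustrative assumptions, not the Act's requirements.

interface RestrictedItem {
  id: string;
  classification: "R 18+" | "Category 1";
  contentWarning: string; // description of the nature of the content, shown to the user
}

interface AccessDecision {
  allowed: boolean;
  warning?: string; // notice about the nature of the content, shown before access
  reason?: string;  // why access was refused
}

// Gate access on the user's explicit declaration that they are 18 or older,
// and surface a warning about the nature of the content when access is allowed.
function checkRestrictedAccess(
  item: RestrictedItem,
  declaredEighteenOrOlder: boolean
): AccessDecision {
  if (!declaredEighteenOrOlder) {
    return {
      allowed: false,
      reason: `Access to ${item.classification} content requires a declaration that you are 18 or older.`,
    };
  }
  return { allowed: true, warning: item.contentWarning };
}

// Example usage
const film: RestrictedItem = {
  id: "film-042",
  classification: "R 18+",
  contentWarning: "This film contains high-impact violence and drug use.",
};

console.log(checkRestrictedAccess(film, false)); // refused: no age declaration
console.log(checkRestrictedAccess(film, true));  // allowed, with a content warning
```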

Adult cyber abuse scheme

The adult cyber abuse scheme is new and the first of its kind, designed to help adult victims of cyber abuse. It provides for the removal of material that seriously harms Australian adults. The eSafety Commissioner has the power to issue removal notices to “end users” as well as OSPs, requiring them to remove the abusive material within 24 hours. A removal notice can be issued to an OSP or end user anywhere in the world, as long as the alleged victim is an Australian resident.

The test for what constitutes cyber abuse directed at an adult requires that the material be intended to have the effect of causing “serious harm”, defined as serious physical harm or serious harm to a person’s mental health, which includes serious psychological harm and serious distress. The threshold for adults is higher than that for children, in recognition of adults’ greater resilience and the need to balance protection against freedom of expression.

Individuals should first lodge a complaint with the relevant service provider. If the OSP does not resolve the complaint, they can ask the eSafety Commissioner to issue a removal notice.

Penalties

Failure to comply with the obligations set out in the Act can result in civil penalties of up to $111,000 for individuals or $555,000 for corporations. The eSafety Commissioner can also seek enforceable undertakings and issue formal warnings.

What else has changed?

The Act updates and expands existing laws dealing with harmful online content. Key changes include:

  • the time limit for complying with a removal notice (which previously applied only to child cyberbullying material) has been reduced from 48 hours to 24 hours; and
  • removal notices may be issued not only to social media platforms and websites, but also to online gaming platforms and messaging services, expanding the scope of the previous child cyberbullying regime.

The eSafety Commissioner now also has specific powers to:

  • obtain information about the identity of “trolls”, meaning people who use fake accounts or anonymity to abuse others or share illegal content;
  • block websites in response to online crisis events (such as violent incidents) by requiring internet service providers to block access to the material for a limited period. This mirrors the abhorrent violent material provisions of the Criminal Code, which were enacted in response to the Christchurch terrorist attack; and
  • require search engines and app stores to remove access to websites or apps that have failed to comply with previous removal notices.

Key points to remember

The Act applies not only to social media platforms and search engines but also, for example, to providers of online games and email services. Although the eSafety Commissioner has said that she will prioritize investigating complaints about the most harmful material (such as child sexual exploitation material, material advocating the commission of a terrorist act, and material that promotes, instructs in or incites crime and violence),4 we recommend that OSPs with Australian end users review and update their online safety procedures to ensure compliance with the BOSE, the restricted access system regime and the Act generally. OSPs will also need to put mechanisms in place to respond quickly to removal notices.