This chapter deals with the law from the perspective of an individual who wants to use the internet to find information, publish material, engage in e-commerce or communicate using social media. It also offers advice on managing children’s internet access.

Contributor

John Leung

Barrister

Managing access to internet content

Last updated

17 May 2021

Many people are concerned about children encountering inappropriate material on the internet. This section outlines how internet material is regulated in Australia, how children can be kept safe, and where to find more information.

Australia’s internet content regulation scheme

Australia’s scheme for regulating internet content is administered by the federal government. It is co-regulatory, meaning that the internet industry and the community are also involved. The scheme is guided by industry practicalities and the principle that what is restricted offline should also be restricted online.

Internet content is regulated by a public complaints procedure, laws, and industry codes of practice.

What material can be complained about?

Anyone can complain about internet content they feel is objectionable. The specific procedure and solutions vary, depending on the nature and source of the material. For more information, contact ACMA or the Australian eSafety Commissioner.

Classifications

Internet content is classified using the same categories as films and computer games, as follows.

RC (Refused Classification) content cannot be legally hosted on an internet site in Australia, just as an RC film cannot legally be brought into the country. Material is refused classification if it deals with matters such as sex, drug misuse, crime and violence in a way that offends against the standards of reasonable adults, or if it offensively depicts a person who is, or appears to be, under 16.

X-rated material (i.e. depictions of actual sexual activity) is also prohibited on the internet, just as X-rated films are prohibited in every state (though not in the ACT or the Northern Territory). Content is classified X if it contains real depictions of actual sexual activity between consenting adults, is unsuitable for a minor to see, and does not fall into the RC category. However, some films can be exempt from classification; for example, films screened at a particular film festival, or films made for scientific purposes. Other types of content may be illegal only if children can easily gain access to them.

R content is material that is not RC or X but is still unsuitable for a minor to see. Accordingly, a restricted access system must be in place to prevent people under 18 from accessing the content. If there is not, this material can also be the subject of a complaint.

The Classification (Publications, Films and Computer Games) Act 1995 (Cth) was amended in 2012, bringing the classification system for computer games into line with the existing system for films and online content, and with international standards. The new R18+ classification for computer games was introduced on 1 January 2013, allowing adult gamers in Australia to access the full range of games with adult content.

Offensive content

Illegal and offensive online content is regulated by the Online Content Scheme under schedules 5 and 7 of the Broadcasting Services Act 1992 (Cth) through a complaints-based mechanism. The restrictions focus primarily on child pornography, sexual violence and other illegal activities.

Under the current National Classification Scheme, RC-rated material includes any material that depicts child sexual abuse, bestiality or sexual violence, or that gives detailed instruction in crime.

Internet service providers (ISPs) and internet content hosts (ICHs) that become aware that their service can be used to access child pornography or material related to child abuse must refer the material to the Australian Federal Police.

It is illegal under the National Classification Scheme and related legislation to distribute, sell or make available for hire RC-rated films, computer games and publications. 

However, such measures are only effective when content is hosted in Australia.

In relation to content hosted overseas that would be prohibited if it were classified in Australia, ISPs have a responsibility to follow the procedures set out in an industry Code of Practice (or, in the absence of a code, an industry standard). This could involve blocking access or providing a notification system.

Where to complain

Since July 2015, complaints about objectionable internet content can be made to the Australian eSafety Commissioner (formerly, complaints were made to ACMA). The Australian eSafety Commissioner is an independent statutory office created by the Enhancing Online Safety Act 2015 (Cth) (‘EOS Act’).

Under the EOS Act, the commissioner administers the Online Content Scheme and also has the power to investigate serious cyber-bullying material aimed at a child.

What happens to complaints?

If the content is hosted in Australia and is prohibited or likely to be prohibited, the ICH is directed to remove the content from their service. Prohibited content is content that is or would be classified RC or X. In serious cases (e.g. involving child pornography), state or territory police are notified.

If the illegal content is hosted outside Australia, the Australian Federal Police are notified and can liaise with Interpol. All investigated overseas-hosted content that is prohibited or potentially prohibited is also referred to accredited providers of optional end-user (PC-based) family-friendly filters, in accordance with the industry codes of practice.

What else can be done?

Besides the complaints system, the co-regulatory scheme includes Codes of Practice developed by the Internet Industry Association (now managed by the Communications Alliance; see ‘More information’, below). While the codes are largely voluntary and self-regulatory, ISPs and ICHs can be directed to comply with their obligations under the codes.

Other information and advice sites are listed under ‘More information about the internet and the law‘.

Filters, labels and safe zones

Email and internet content delivered in real time (e.g. chat rooms, live audio or video streaming) are generally not covered by the classification procedures or the industry codes (Victoria is a partial exception: racially or religiously vilifying email is illegal there).

Filters are programs that in some way block access to inappropriate material from websites, newsgroups, chat rooms and email. Filters can also restrict the results from search engines.

Labelling tools help filters by creating lists of sites. ‘Black’ lists name sites with offensive content so that access to them can be blocked. ‘White’ lists block everything except listed, approved sites. Content-based filters block access to sites on the basis of key offensive words, or of photographic content that might be unsuitable for children. The different types of filter can be used in combination, depending on what is required.
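The three filtering approaches described above can be sketched in a few lines of code. The following is an illustrative example only; the site names and keywords are hypothetical, and real filtering products use far larger lists and more sophisticated content analysis.

```python
# Illustrative sketch of blacklist, whitelist and content-based filtering.
# All site names and keywords are hypothetical examples.

BLACKLIST = {"badsite.example"}                   # known offensive sites: always blocked
WHITELIST = {"kids.example", "school.example"}    # approved sites: the only ones allowed in whitelist mode
BLOCKED_KEYWORDS = {"offensiveword"}              # flagged words for content-based blocking

def allowed(host: str, page_text: str = "", whitelist_mode: bool = False) -> bool:
    """Return True if access to the page should be permitted."""
    if whitelist_mode:
        # 'White' list filtering: block everything except approved sites.
        return host in WHITELIST
    if host in BLACKLIST:
        # 'Black' list filtering: block known offensive sites.
        return False
    # Content-based filtering: block pages containing flagged keywords.
    words = set(page_text.lower().split())
    return not (words & BLOCKED_KEYWORDS)

print(allowed("kids.example", whitelist_mode=True))       # permitted: on the white list
print(allowed("badsite.example"))                         # blocked: on the black list
print(allowed("news.example", "an offensiveword here"))   # blocked: flagged keyword in content
```

In practice, the combination of approaches matters: a whitelist-only filter is the most restrictive (suitable for young children), while blacklist and keyword filtering allow broader access at the cost of occasionally letting unsuitable material through.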

Filter programs can operate on a home computer or via an ISP. Your ISP is obliged to provide information about filtering software and the filters they offer. ISPs must provide a filter approved in the Internet Industry Association Codes of Practice. The NetAlert, Communications Alliance and Internet Content Rating Association sites give more background information (see ‘More information about the internet and the law‘).

Safe zones are networks suitable for young children and are separated from the rest of the internet. They are available via subscription or through some ISPs. Specific children’s zones may also be hosted on commercial sites or supported by advertising.

It is important to remember that no tool is completely infallible. Consumer advice websites can help parents and guardians to choose the best strategy (see ‘More information about the internet and the law‘).

Chat rooms

Chat rooms are places where real-time written conversations take place. They are usually public, although private chat rooms are offered on some sites. Most people, including children, use pseudonyms in chat rooms so that a person’s real identity is not apparent. This means that sometimes a child may believe they are chatting to another 12-year-old, when it may in fact be a much older person. There have been instances where adults have attempted to exploit children by contacting them in chat rooms.

The current regulatory approach emphasises education and guided information for children. It is important that children know what personal details they can give out when they are online, for their safety and for the security of the household as a whole.

For useful websites, see ‘More information about the internet and the law‘.
