Chapter 17 Proposal for a mandatory filtering system
A significant amount of attention in this Inquiry focused on a proposed
national, mandatory filtering scheme under which internet service providers
(ISPs) would block access to Refused Classification material online. Other ways
of restricting access will also be outlined. Refused Classification material
includes child sexual abuse, bestiality, extreme violence including rape,
detailed instructions on crime or drug use, and advocacy of terrorist acts. The
Government has stated that Refused Classification content has no place in our
society and therefore should not be available on the internet.
Significantly, three of Australia’s largest ISPs, Telstra, Optus and Primus,
have agreed to voluntarily block child abuse material at the server level.
Webshield and ItXtreme have also volunteered to block this content.
The role of the Australian Communications and Media Authority (ACMA) in
regulating online content is to administer the co-regulatory scheme established
under the Broadcasting Services Act 1992 (the Act). Complaints about online
content can be made to ACMA and, if the material is found to be prohibited or
potentially prohibited, it must either:
issue an interim or final take-down notice (for content hosted in
Australia); or
refer the content to the makers of industry-accredited Family Friendly Filters
(for content hosted overseas) under a recognised alternative access-prevention
arrangement outlined within a registered Code of Practice.
The online content co-regulatory scheme is underpinned by the National
Classification Scheme (NCS), applicable to films, computer games and certain
publications. Determinations about prohibited/potentially prohibited material
are made by reference to classification categories established under the NCS.
ACMA must refer Australian-hosted content that is potentially prohibited
to the Classification Board for classification before it can take action.
Content hosted overseas may be referred to the Board.
Prohibited or potentially prohibited content is assessed against the
following classification categories:
Refused Classification, including offensive depictions of
children and material advocating terrorist acts;
R18+ items not subject to restricted access systems; and
certain limited content classified MA15+ that is provided for
profit or on payment of a fee and does not consist solely of text and/or one or
more still images.
There are no technical issues preventing the adoption of filtering a
list of URLs, and many ISPs around the world have been doing so voluntarily
’for many years’.
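The URL-list approach described above is straightforward in principle: each requested URL is checked against the list before access is allowed. The sketch below illustrates only that matching step, using hypothetical blocklist entries and URLs (not real list contents), and ignores real-world complications such as encrypted traffic, redirects and proxies.

```python
# Minimal sketch of URL blocklist matching, as one illustration of
# ISP-level filtering against a list of URLs. All entries below are
# hypothetical examples, not contents of any real blocklist.
from urllib.parse import urlsplit

BLOCKLIST = {
    "example-blocked.test/abuse",  # hypothetical entry: host plus path
    "another-blocked.test",        # hypothetical entry: entire host
}

def is_blocked(url: str) -> bool:
    """Return True if the URL's host, or host plus path, is listed."""
    parts = urlsplit(url)
    host = parts.hostname or ""
    host_and_path = host + parts.path.rstrip("/")
    return host in BLOCKLIST or host_and_path in BLOCKLIST

print(is_blocked("http://example-blocked.test/abuse"))  # True
print(is_blocked("http://example-blocked.test/other"))  # False
print(is_blocked("https://another-blocked.test/any"))   # True
```

In practice an ISP would apply such a check inside its network equipment rather than in application code, but the decision itself is a simple set lookup, which is why the list-based approach is considered technically straightforward.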
Late in 2010, Telstra Corporation, Optus and Primus agreed to introduce
voluntary filtering of child abuse URLs on ACMA’s list of prohibited sites.
These ISPs cover about 70 percent of all Internet users in Australia. About 30
percent of the sites on ACMA’s blacklist included depictions of child abuse and
child sexual abuse material. Recently, Webshield and
ItXtreme have also volunteered to block child abuse material at the ISP level.
The Government will continue to encourage other Australian ISPs to follow the
example of these ISPs.
ACMA is working to develop measures to enable its list of prohibited sites to
be transmitted to participating ISPs on an automated and secure basis. It awaits
responses to its invitations to these three ISPs to begin trialling that
arrangement.
The Department of Broadband, Communications and the Digital Economy was
hopeful of getting the cooperation of other ISPs to filter voluntarily material
on ACMA’s blacklist, by working with the Internet Industry Association. That
body has announced that it will assist in encouraging a wider range of ISPs to
adopt voluntary filtering. Until recently, ISPs had
refused to take action on blocking Refused Classification material.
There is no evidence of reluctance by ISPs to take down Refused
Classification material, and it is not clear that legislation would be any more
effective than a voluntary arrangement. The user policies of large
multi-national websites are ‘very broad’ and cover a ’much wider range’ of
material they can take down, compared to what is described as ‘inappropriate’
in the Act.
Under its powers in the Act, ACMA also issues industry codes to ISPs,
and these co-regulatory instruments are enforceable as soon as they are
registered. Compliance is ‘close to universal’ and probably as high as would be
registered. Compliance is ‘close to universal’ and probably as high as would be
achieved by legislation.
Mr Mark Newton made the point that about two-thirds of Australian
households do not have school-age children, and that applying restrictions to
these households would be poor targeting.
Further, according to ACMA surveys, between 40 and 50 percent of parents
use filtering devices at home. Considerable evidence was presented to this
Inquiry on the range of such devices. These devices filter more
material than Refused Classification content.
There are many commercial and free filtering options available at several
levels:
search engines, such as Bing, Yahoo! and Google;
browser level, including Microsoft; and
software applications, such as a product of a US company, Blue
However, there is a lack of awareness of these options among parents.
While most participants concentrated on expressing views on the
filtering of Refused Classification material, Symantec Corporation noted that
fewer than 50 percent of small to medium businesses in Australia had security
systems installed and operating. Only when they became victims of fraud or
identity theft did such businesses seek out educational resources or assistance
from government agencies, or the police.
Support for the proposal
BraveHearts saw ISP filtering as part of a ‘holistic’ approach to online
threats. It argued that material such as child pornography, already blacklisted
by ACMA, breached Australian laws and it was illegal to produce, own and
distribute it. It should not be available online. This organisation supported a
second tier of filtering that would allow families, organisations or businesses
to request optional filtering of other objectionable material, such as
promotions of terrorism, suicide, drug use or adult pornography. It was aware
that no filtering systems were foolproof, and that they can be circumvented.
The Victorian and Tasmanian Synod of the Uniting Church gave four
reasons for requiring ISPs to block Refused Classification material:
The sale and distribution of this category of material is already banned in all
other media, including internet content hosted in Australia;
ISPs have a ‘crucial role’ in preventing the domestic consumer
from accessing it by accident, and in preventing those who do not know how to
access it but are curious, as well as those who are at an early stage of
developing or feeding a sexual interest in children;
It undermines the commercial trade in images of child abuse and
actively disrupts its success; and
It is reasonable to expect ISPs to accept some responsibility for
what their clients seek to view, and for the material to which they provide
access.
The Synod did not see placing such obligations on ISPs as a replacement
for education and awareness programs and law enforcement, but as a
complementary measure to a wider cyber-safety strategy. Requiring ISPs to be
socially responsible and not facilitate trans-national criminal activity would
assist in providing increased cyber-safety to young people who would otherwise
become victims of the demand for commercial child sexual abuse materials.
Family Voice Australia supported the proposal for mandatory ISP-level
filtering, noting that opponents’ arguments could be addressed because:
There would be minimal degradation to Internet performance;
The right to free access to information has always been qualified
by the need to protect the community, and there was no logical reason why the
Internet should be different;
The implementation of any filtering scheme would be protected by
scrutiny in the Parliament and in the media; and
Even if a total blockage of all Refused Classification material
could not be achieved, a significant reduction was a worthwhile goal.
It believed that including some of the following features when the
proposed scheme was implemented could improve cyber-safety:
Providing an R18+ classification for computer games;
Excluding X18+ material; and
Ensuring that ACMA’s blacklist was not simply compiled from
complaints and the supply of lists of child abuse sites from overseas.
Family Voice Australia also suggested that a tender should be sought for
a system based on a web crawler that actively seeks out URLs containing
Refused Classification material.
Concerns about the proposal
Ms Robyn Treyvaud noted that, as technology being used at schools can be
bypassed using proxy sites, if mandatory filtering was introduced there would
be no way of knowing what students were accessing.
The NSW Secondary Principals’ Council stated that consideration needed
to be given to differentiating filters for staff and students. It is difficult
for school personnel to follow up an issue when the site is blocked to staff.
While Professor Marilyn Campbell supported filtering out pornography,
she thought that filtering only worked when it actually protected children
from accidentally entering inappropriate sites.
The Northern Territory Government stated that there was a significant
role for researchers in developing filtering software that was effective.
Symantec Corporation noted that, in the past, young people had not been
stakeholders in proposals for filtering. Unless they were included, they would
find ways around the technology. Young people’s views on
Internet filtering are discussed below.
The Australian Privacy Foundation believed that the current proposal had
been developed and debated without the expected level of investigation of
issues, such as the nature of purported harms, the limits and application of
various remedies and regulatory models against current/future versions of those
harms and comparisons with other options.
The Victorian Office of the Child Safety Commissioner stated that it was
important to strike the right balance: filtering harmful material,
particularly for younger children, while still enabling older children to
access information about issues relevant to them.
The Australian Library and Information Association opposed filtering on
the basis of freedom of access to information, and would like to see a balance
struck between censoring adults and protecting children.
The Queensland Catholic Education Commission has online filtering, and
provides monthly feedback to schools about the sites accessed in each
case. It believed, however, that the major focus should be on the development
of positive e-security habits for all users, rather than on technological solutions
such as filtering. These simply present a challenge to those who are ‘computer
savvy’, and are rapidly superseded as technology advances. The Commission saw
filtering as part of a package, and emphasised giving students the skills to
develop the right attitudes. It saw putting key values in place, and giving some
specific skills and attitudes, as the most effective way of dealing with
online risks.
Referring to ‘problematic Internet use’, Netbox Blue noted that if a
filter was installed, many people would consider that their technological
problem(s) had been solved.
The Safer Internet Group reiterated that the proposed filter would give
parents/carers a false sense of security about online safety, and that it had
changed the way the world viewed Australia.
Facebook has two concerns about the proposal:
It will distract people from other things that need to be done to
make the Internet safe; and
Filtering attracts social costs, as there may be a ‘chilling
effect’ on expression. It also has economic costs, as some investment in
innovative ways to use new information in Australia will go elsewhere if there
is a government screen.
Professor Karen Vered did not think that the government needed to
dictate ‘a kind of blanket filtering’, and believed that parents/carers should
make their own decisions about purchasing, installing and learning how to use
filtering products. Filtering would be costly and would put Australia at an even
greater disadvantage internationally. It would also make Australian ISPs
responsible for problems they had not caused, as they are not responsible for
‘unsavoury material’ from foreign sites. If Australian ISPs were made
responsible for filtering, their costs would be passed on to consumers.
Moreover, technological barriers are not a solution, as they will not
help young people develop their ability to discriminate, evaluate and
act in circumstances where they are required to exercise their own
judgment.
While supportive of the Government’s initiative in proposing to filter
child pornography and extremely violent content, Symantec Corporation noted
that filtering did not solve issues such as fraud and identity theft.
The Alannah and Madeline Foundation confirmed that home filtering was
not often applied, despite the widespread availability of systems. When it was
applied, there was a risk that parents/carers were given a false sense of
security about access to inappropriate content, or the risk of their children
being contacted by strangers online. Parents/carers were then encouraged to
think that their children could be left to go online unsupervised. ‘Software
cannot replace the eyes and awareness of an engaged parent or carer.’
Feedback from young Australians
The Committee’s Are you safe? survey asked participants what they
believed could be done to make the internet safer. Though young people appear
to welcome localised internet filters installed on personal computers, they are
less receptive to an ISP-level filter.