3. Age verification for online pornography

3.1
According to the eSafety Commissioner, research conducted in 2018 found that one third of parents of children aged 2 to 17 years reported that they were concerned about their children accessing or being exposed to pornography.1
3.2
This concern was reflected in a large number of submissions to the inquiry—many from parents, carers, and others responsible for the welfare of children and young people.
3.3
This chapter considers evidence in relation to the nature and effect of children and young people’s exposure to pornography.
3.4
The chapter then considers how online pornography is currently regulated in Australia, and how age verification for online pornography has been pursued in other jurisdictions.
3.5
The chapter concludes with evidence about the possible implementation of a mandatory regime for age verification for online pornography in Australia.

Nature of children’s exposure to pornography

3.6
The Committee received a large number of submissions and other correspondence expressing concern about the ease with which children and young people are able to access online pornography.
3.7
The Committee notes that evidence on this subject was also considered by the Senate Environment and Communications References Committee in its 2016 report, Harm being done to Australian children through access to pornography on the Internet.2
3.8
eChildhood, a charity seeking to address the impacts of exposure to pornography on children and young people, submitted that children are ‘increasingly accessing or being accidentally exposed to pornography on the internet’:
Whilst exact statistics vary due to the inherent research limitations on this topic, studies have shown that high percentages of children and young persons above the age of 10 have been exposed to pornographic material, with males being at a significantly greater risk of exposure.3
3.9
A number of submissions referred to research attempting to quantify children’s exposure to online pornography at different ages. For example, eChildhood submitted:
According to recent data from an internet filtering software company used in schools, a third of students aged eight and under attempted to access online pornography in the past six months. This includes accidental access through unwanted pop-up ads and banners as well as deliberate searches for explicit material.4
3.10
Advocacy group Collective Shout referred to UK research which found that 28 per cent of children aged 11 to 12 years had seen pornography online. The same study found that 65 per cent of children aged 15 to 16 years had seen pornography online, with 94 per cent of those having first seen pornography by age 14 years.5
3.11
Collective Shout also cited Australian research which indicated that 69 per cent of males and 23 per cent of females had first viewed pornography at age 13 years or younger.6
3.12
The Victorian Aboriginal Child Care Agency (VACCA) referred to a survey undertaken in Australia in 2010 which found that nearly 1 in 4 children aged between 9 and 16 years had viewed sexually explicit images online. VACCA noted that this figure was higher than the average for the 25 other countries included in the survey as a reference point.7
3.13
The Committee heard that smartphones and other handheld devices were associated with an increase in children’s exposure to pornographic material. For example, VACCA explained:
Australian and international research and expert consensus shows that with the growing use of tablets and smartphones, children and young people are being exposed to online pornography at an ever-increasing rate. According to two studies in the UK, the average age of first exposure to online pornography is 11 years of age...8
3.14
Similarly, Collective Shout suggested that access to pornography was becoming easier for children ‘due to the proliferation of handheld devices, including smartphones, as well as the availability of unfiltered public wifi’.9 Ms Melinda Liszewski, Campaigns Manager for Collective Shout, told the Committee:
Our take-up of technology has been rapid, and our safeguarding of children who are connected to the internet at the earliest of ages has been very slow.10
3.15
This point was echoed by WA Child Safety Services (WACSS), a not-for-profit provider of child safety education:
Children and young people with access to the internet on any device - at home, at a friend’s place, at school or in any of our community spaces with Wi-Fi - are at risk of exposure. It’s now not a matter of ‘if’ a child will see pornography but ‘when’ and the when is getting younger and younger. In Australia the average age of first exposure is being reported at between 8 and 10 years of age. While pornography is not new, the nature and accessibility of today’s pornography has changed considerably.11
3.16
Ms Tamara Newlands, Executive Director of eChildhood, characterised the situation as children having ‘unfettered access to hard-core pornography at their fingertips 24/7’.12
3.17
A common theme in evidence was that children’s exposure to pornographic material could be not only deliberate but also inadvertent.13 While exposure due to ‘pop-up’ advertisements was raised, Collective Shout also submitted that ‘innocuous activities like key-stroke errors and searches for cartoon characters’ could inadvertently direct children to pornographic websites:
Our children don’t need to be looking for porn, but porn will find them. It’s simply not a fair fight.14
3.18
Many of the individual submissions received by the Committee came from parents, teachers and others recounting first-hand experiences of children in their care encountering pornography online through friends or schoolmates, unintended results of innocuous web searches, or the appearance of unsolicited ‘pop-up’ material.
3.19
Another strong theme in evidence was that the nature of pornography had changed, such that children are now exposed to more extreme material. Ms Julie Inman-Grant, the eSafety Commissioner, told the Committee that pornographic material is today much more extreme and, in some cases, more violent than in the past.15
3.20
Ms Inman-Grant went on:
Most of the parents we’ve talked to and done research with have no idea what kind of pornography is readily accessible and available online: ...it can be very violent, very confronting...16
3.21
WACSS submitted that pornography is ‘often violent, graphic and portrays distorted gender roles’.17 Similarly, Melinda Tankard-Reist told the Committee that ‘children are frequently seeing violent depictions of sex, torture, rape and incest porn’.18
3.22
VACCA also raised this issue in its submission, explaining:
A great deal of the pornography that children view – whether accidentally or intentionally – contains violent imagery and themes. In a UK study it was found that 100 per cent of 15-year-old boys and 80 per cent of 15-year-old girls in 2013 had viewed “violent, degrading online pornography, usually before they have had a sexual experience themselves.”19
3.23
In a recent analysis of pornographic videos commonly watched in New Zealand, the Office of Film and Literature Classification found that 10 per cent of videos showed physical aggression and 35 per cent showed some non-consensual behaviour.20

Effect of children’s exposure to pornography

3.24
Many submissions to the inquiry expressed concern about a range of possible consequences associated with the increased exposure of children and young people to online pornography.
3.25
As noted above, evidence on this subject was also considered in the 2016 report of the Senate Environment and Communications References Committee,21 and several submissions referred to evidence included in that report.
3.26
In its submission, eChildhood provided a summary of research into outcomes associated with children’s exposure to pornography:
Access to pornography at a young age, when the child is still developing, can have extensive negative impacts on the child. The literature reveals links between children’s access to pornography and the following non-exhaustive list of outcomes:
1. Poor mental health – including, but not limited to, being distressed and upset by the images, self-objectification and body image concerns, sexual conditioning and developing an addiction to pornography;
2. Sexism and objectification – such as reinforcing gender roles that women are ‘sex objects’ and men should be dominant while women should be submissive;
3. Sexual aggression and violence – consistently, there is a demonstrated association between regular viewing of online pornography and the perpetration of sexual harassment, sexual coercion and sexual abuse by boys;
4. Child-on-child and peer-on-peer sexual abuse; and
5. Shaping sexual behaviours, such as engaging in younger sexual behaviour, more frequent premarital and casual sexual behaviour and more ‘risky’ sexual behaviour.22
3.27
Similarly, summarising research on the effects of pornography on children and young people, the Australian Institute of Family Studies (AIFS) found:
The available studies suggest that the effects of frequent and routine viewing of pornography and other sexualised images may:
reinforce harmful gender stereotypes;
contribute to young people forming unhealthy and sexist views of women and sex; and
contribute to condoning violence against women.
There is also evidence to suggest an association between frequent viewing of online pornography and sexually coercive behaviour exhibited by young men.
Pornography consumption by young people may also normalise sexual violence and contribute to unrealistic understandings of sex and sexuality.
...Pornography consumption has also been associated with the practice of “sexting”, and young women have reported being coerced or feeling pressured to share naked images of themselves online.23
3.28
Speaking to the Committee, Ms Julie Inman-Grant, the eSafety Commissioner, outlined a range of concerns about the negative effects of pornography on young people:
This includes harmful effects of online pornography on young people’s mental health and wellbeing and negatively altering their knowledge, attitudes, beliefs and expectations about sex, gender and respectful relationships and what they should look like. There are deep and legitimate concerns about how ready access to online pornography might impact the social sexualisation of an entire generation. We are right to worry about whether this conditioning might heighten involvement in risky, violent or harmful sexual practices and behaviours. In short, I don’t think anyone here would disagree that this is a pressing and urgent social concern.24
3.29
These issues were discussed in further detail in a number of submissions to the inquiry.25
3.30
Several submissions referred to the Final Report of the Royal Commission into Institutional Responses to Child Sexual Abuse, which identified exposure to pornography as a factor among cohorts of children displaying harmful sexual behaviours.26
3.31
Similarly, some submissions referred to Prevent. Support. Believe. Queensland’s Framework to address Sexual Violence, which states that exposure to pornography is a potential driver of sexual violence and problematic sexual behaviour between young people.27
3.32
The Committee heard that Aboriginal children in care or experiencing vulnerability in their family setting are especially vulnerable to the impacts of exposure to pornography.28
3.33
Concern was also expressed that, due to real or perceived inadequacies in sex education and the limited availability of age-appropriate educational material, children and young people were turning to pornography for education.29 For example, Mr Chern Eu Kuan, Student Contributor at the University of New South Wales (UNSW) Law Society, told the Committee:
The most detrimental harm caused by exposing minors to pornography is the normalisation of unsafe sex. Since sex education in school is generally underwhelming, minors often derive their understanding of sex from online pornography. This can be detrimental to their development as pornography depicts acts of unsafe sex, misogyny, physical aggression and verbal aggression.30
3.34
Further evidence on education is discussed later in this chapter.

Social impacts

3.35
The Committee also received evidence about broader social impacts that may be associated with exposure to pornography, including anxiety about body image, broader mental health issues, reduced academic performance, erectile dysfunction, and systemic issues such as violence against women.31
3.36
For example, eChildhood noted a link between children’s access to pornography and self-objectification and concerns about body image. Mrs Liz Walker, Deputy Chair of eChildhood, told the Committee:
What we know from the literature is that there’s certainly a correlation between a hypersexualised media in general, hypersexualised advertising and then of course pornography as well in body image issues. ...we do know that an increasing number of young women are seeking out plastic surgery, labiaplasties, which is trimming of the labia, to meet some sort of porn perfection representation. There are young men who are insecure about the size of their penis.32
3.37
More broadly, the eSafety Commissioner noted that there is evidence to suggest that exposure to pornography can negatively impact the mental health and wellbeing of young people.33 Summarising research on the relationship between pornography use and mental health, eChildhood submitted that porn users experience higher incidence of depressive symptoms and lower degrees of social integration.34
3.38
eChildhood cited research that found a link between use of pornography and declining academic performance among 12- to 15-year-old boys:
... the more boys used sexually explicit Internet content, the poorer their school grades were 6 months later. Boys’ use of sexually explicit websites significantly predicted their school performance.35
3.39
eChildhood also cited a study of college-aged students with similar findings, along with research on the impact on a range of cognitive abilities associated with frequent use of online pornography.36
3.40
The Committee also heard about the association between pornography use and erectile dysfunction. Mrs Walker told the Committee:
This is something that the average teen boy wouldn’t consider. He’d be thinking, ‘Well, if I watch pornography I’m going to become a great lover.’ In actual fact, by the time he’s 20 he might be discovering that he’s actually unable to be aroused by the person he’s with because he’s watched so much pornography.37
3.41
Mrs Walker went on to explain that studies show an increase in the number of young men experiencing ‘porn induced erectile dysfunction’:
The literature indicates that, pre-internet, we were seeing around three to five per cent of men under the age of 40 experiencing erectile dysfunction, and that was usually due to some sort of physiological condition... Now the literature is indicating that around one-third of men under the age of 40 are experiencing erectile dysfunction, and a large percentage of that looks like it’s due to internet pornography.38
3.42
The Committee is also aware of the findings of a survey conducted by researchers at Monash University indicating that, in many cases, young women’s experience of sex is unsatisfying and associated with feelings of guilt, embarrassment, and stress, and that this may be due to pressure from their partners to live up to unrealistic expectations of sex encouraged by online pornography.39
3.43
A number of submitters raised a link between exposure to online pornography and sexual aggression and violence.40 eChildhood cited a meta-analysis on the relationship between pornography and sexual aggression, which found that:
... on the average, individuals who consume pornography more frequently are more likely to hold attitudes conducive to sexual aggression and engage in actual acts of sexual aggression than individuals who do not consume pornography or who consume pornography less frequently.41
3.44
Ms Newlands from eChildhood elaborated on this point:
Minors who have been exposed to pornography are more likely to view women as sex objects. Minors who view pornography and other sexualised media are more accepting of sexual violence and more likely to believe rape myths. Adolescents who are exposed to pornography are more likely to engage in sexual violence. In addition, a correlation has been shown between a child being exposed to pornography and their likelihood of being a victim of sexual violence.42
3.45
In relation to violence against women, eChildhood noted that the Third Action Plan of the National Plan to Reduce Violence against Women and their Children has a focus on ‘better understanding and countering the impact of pornography given increasing evidence showing a correlation between exposure to online pornography and the sexual objectification of women and girls, the development of rape cultures and the proliferation of sexual assault’.43
3.46
Ms Liszewski from Collective Shout told the Committee that ‘harmful attitudes and behaviours’ associated with viewing pornography ‘are widely recognised as underpinning the epidemic of violence against women in Australia’.44

United Nations Convention on the Rights of the Child

3.47
Given concern about the potential harms caused to children from exposure to pornography, a number of submitters referred to Australia’s obligations under the United Nations Convention on the Rights of the Child.45
3.48
eChildhood submitted that under the Convention, Australia ‘has an obligation to protect children, and this obligation is extended to children’s use of the internet’.46

Regulation of online pornography in Australia

3.49
Online content (content accessed through the internet, mobile phones, content services, and live-streaming) is regulated under the Broadcasting Services Act 1992. Under Schedule 7 of the Act, prohibited content includes content that has been, or is likely to be, classified as:
RC (refused classification);
X18+;
R18+ unless it is subject to a restricted access system; and
MA15+, where it is provided on a commercial basis, unless it is subject to a restricted access system.47
3.50
Material that contains ‘real depictions of actual sexual activity between consenting adults’ and is ‘unsuitable for a minor to see’ falls within the X18+ classification.48
3.51
Where prohibited content is hosted in Australia, the eSafety Commissioner has the authority to direct the relevant content service provider to remove the content from its service.49
3.52
Where prohibited content is hosted overseas, the eSafety Commissioner will notify the content to the suppliers of approved filters under the Family Friendly Filter scheme, so that access to the content is blocked.50
3.53
Regardless of where content is hosted, if the content is of a sufficiently serious nature (for example, child sexual abuse material), the eSafety Commissioner can take additional measures, including referring the content to law enforcement and to the International Association of Internet Hotlines (INHOPE), a global network to facilitate the removal of child sexual abuse material from the internet.51
3.54
Mr Toby Dagg, Manager of Cyber Report at the Office of the eSafety Commissioner, confirmed to the Committee that it is prohibited for any X18+ material ‘showing explicit sexual activity between consenting adults’ to be hosted in Australia. However, Mr Dagg explained that ‘very little of that content’ was hosted in Australia, with such material ‘overwhelmingly’ hosted overseas.52

Age verification for online pornography in other jurisdictions

3.55
This section summarises evidence received on age verification for online pornography in international jurisdictions, which predominantly considered the proposed regime for mandatory age verification in the United Kingdom.

UK Digital Economy Act

3.56
The terms of reference for the inquiry required the Committee to consider the proposed regime for mandatory age verification for access to online pornography in the United Kingdom.
3.57
The regime was set out in Part 3 of the Digital Economy Act 2017 (DEA) and would have made the distribution of online pornography in the UK without age verification controls an offence.53 The DEA is discussed in further detail later in this section.
3.58
However, in October 2019, following the establishment of the present inquiry, the UK Government announced that it would not commence age verification under the DEA.54 The Government stated that it had concluded that a coherent agenda on these issues ‘will be best achieved through our wider online harms proposals’, which ‘will give the regulator discretion on the most effective means for companies to meet their duty of care’.55
3.59
Nevertheless, as outlined in this section, the Committee received a range of evidence on the DEA and the proposed regime which may inform the consideration of age verification for online pornography in Australia.

Scope and requirements

3.60
Under section 14(1) of the DEA, pornographic material may not be made available on the internet to persons in the UK on a commercial basis unless controls are in place to ensure that the material is not normally accessible by persons under the age of 18.56 In addition, section 23 of the DEA provides powers to block access to extreme pornographic material.57
3.61
Pornographic material is defined in section 15 of the DEA to include:
video works issued with an R18 certificate by the British Board of Film Classification (BBFC); and
any other material that can reasonably be assumed, from its nature, to have been produced solely or principally for the purposes of sexual arousal.58
3.62
According to the BBFC, the R18 category is ‘primarily explicit works of consenting sex or strong fetish material involving adults’.59
3.63
The circumstances in which pornographic material would be regarded as made available on a commercial basis are defined in the Online Pornography (Commercial Basis) Regulations 2018 to include:
if access to that material is available only upon payment; or
if that material is made available free of charge and the person who makes it available receives (or reasonably expects to receive) a payment, reward or other benefit in connection with making it available on the internet.60
3.64
The regulations exclude circumstances where it is reasonable for the regulator to assume that pornographic material makes up less than one-third of the content made available on a site. However, this exclusion does not apply if the site is marketed as a site by means of which pornographic material is made available.61 This definition includes websites ‘which offer pornographic content for free, but which generate revenue through advertising or premium content’, but excludes social media platforms.62
3.65
In February 2018, under section 16 of the DEA, the Secretary of State appointed the BBFC as the age-verification regulator. The BBFC is the non-governmental, not-for-profit independent regulator of film and video in the UK.63
3.66
The BBFC would be responsible for determining whether arrangements for making pornographic material available complied with section 14(1) of the DEA.64 The DEA requires the BBFC to publish guidance on compliant age-verification arrangements (the published guidance is discussed in the previous chapter of this report).65
3.67
The DEA also gives the BBFC powers to:
issue notices requiring information;66
issue enforcement notices and/or impose financial penalties where a person contravened section 14(1);67
issue notices to payment-services providers and ancillary service providers where a person contravened section 14(1);68 and
issue notices to internet service providers (ISPs) where a person contravened section 14(1), requiring them to prevent access to the offending material.69
3.68
Guidance published by the BBFC sets out its considerations in relation to enforcement action, in addition to a list of classes of ancillary service providers included under section 21 of the DEA, ‘such as social media platforms, search engines and advertising networks’.70
3.69
Speaking to the Committee via teleconference, Ms Amelia Erratt, Head of Age Verification at the BBFC, confirmed that the BBFC would have the ability to block non-compliant websites and request that service providers withdraw their services from non-compliant websites.71 Ms Erratt submitted that these powers would give the BBFC ‘international reach’:
Obviously, with ISP blocking you would block a website from being accessible in the UK, but what we had understood from the payment service provider power is that, if you cut off a website’s payment services, that can have a global impact on their business. Given that these websites are quite financially focused, the threat of having their payments cut off was enough to make them want to comply with the legislation.72
3.70
The BBFC submitted that Visa and Mastercard had ‘confirmed a willingness to cooperate with the regime’.73

Implementation status

3.71
As noted above, in October 2019 the UK Government announced that it would not commence the age-verification provisions of the DEA. This followed an earlier announcement in June 2019 that the date of entry into force of the scheme would be postponed ‘due to the failure of officials to notify the BBFC’s Guidance on Age-verification Arrangements to the European Commission as required by the Technical Standards and Regulations Directive’.74
3.72
Despite electing not to commence the age-verification scheme, the UK Government stated that it expected age verification tools ‘to continue to play a key role in protecting children online’ as part of its broader online harms regulatory regime.75 The Under-Secretary of State for Digital, Culture, Media and Sport elaborated in Parliament that the Government wanted to take more time to review the regulatory regime, including its definition of pornographic material and coverage of social media platforms. He stated that:
Age verification will be a key part of the online harms agenda. It will be a key tool in the box, but the toolbox will, through the online harms agenda, be bigger…we will be bringing it forward for pre-legislative scrutiny so that we can get it right. I hope that the BBFC will be a key part of the future of this process, because its expertise is in the classification of content…We look forward to working together with the BBFC.76
3.73
The BBFC submitted that it had been ready to implement the scheme:
The BBFC had all systems in place ready to undertake our role, to ensure all commercial pornographic websites accessible from the UK would have age-verification controls in place or face swift enforcement action. The adult industry was similarly prepared to implement age-verification, and age-verification providers were undergoing a robust certification process to ensure they too were ready for entry into force. The Government has acknowledged our preparedness...77
3.74
The BBFC submitted that it was required under the DEA to act proportionately and had prioritised the most popular pornographic websites:
As 70% of UK traffic visit just the top 50 sites - and these sites are owned by an even smaller number of companies - we were confident our efforts would have made a significant impact in a relatively brief period of time.78
3.75
The BBFC explained that its engagement with the adult industry had been positive and that it was confident that these large companies would have complied with the requirements of the DEA.79 Ms Erratt expanded on this point:
The adult industry believed the new regime would work and saw regulation as inevitable given the international consensus that children should not have unrestricted access to pornography. The BBFC engaged directly with the adult industry, and we anticipated that over 80 per cent of pornographic websites were set to comply with age verification from day one.80
3.76
Ms Erratt said there was ‘no pushback’ from the industry as far as she was aware.81
3.77
The eSafety Commissioner submitted that the UK Government had estimated that the cost to date of implementing the scheme was approximately $4.15 million, and that the UK Government had requested indemnity of up to approximately $18.84 million for the BBFC in its first year of operation.82

Lessons for Australia

3.78
Reflecting on the experience to date in the UK, and in particular criticisms of the proposed age-verification regime, a number of witnesses put forward lessons for Australia in any consideration of age verification for online pornography.
3.79
Speaking on behalf of the BBFC about the UK experience, Ms Erratt suggested that there were three ‘crucial factors’ for Australia to consider:
ensuring there is a level playing field in terms of regulation;
ensuring age verification is both robust and easy for consumers to use; and
raising public awareness of age verification, so that consumers understand that age verification is a child protection measure and so that they trust and understand how to use the age-verification systems in place.83
3.80
A common theme in evidence was the importance of raising awareness and addressing concerns among members of the public. For example, the eSafety Commissioner explained:
A number of concerns about age verification were raised, including: concerns about data security and privacy; freedom of expression; ease of circumnavigation; and difficulties of enforcement (particularly against non-UK companies). Whilst a number of these concerns were subsequently addressed by the age verification regulator, these were not necessarily adequately explained or communicated to the general public or media. It is imperative that public concerns are addressed head-on, and that there is full transparency over effectiveness and measures taken to address concerns, so that the community and media is brought along.84
3.81
Similarly, the Age Verification Providers Association (AVPA) submitted that while ‘no significant, valid criticisms’ in relation to privacy or technology were put forward by its members, there were clear lessons to be learnt in relation to ‘raising public awareness and maintaining credibility that policies will be enforced to drive adoption by content publishers’.85
3.82
Mr Iain Corby, Executive Director of the AVPA, told the Committee that there was a failure to explain to the public the rationale of the age-verification regime and the privacy protections that would be in place.86
3.83
Dr Julia Fossi, Expert Advisor at the Office of the eSafety Commissioner, suggested that a basis for criticism of the regime was the fact that data protection and privacy standards were not enshrined in the DEA:
The BBFC put [data protection and privacy] into their certification and their guidance and it’s in their standards. But the simple fact that that wasn’t put into the legislation itself was an open door to the privacy rights cohort of people.87
3.84
Another criticism raised in evidence to the inquiry was that the age-verification regime did not include social media. For example, the eSafety Commissioner explained:
One of the main criticisms of the DEA was that the legislation was limited to online commercial providers, and therefore did not address the plethora of online pornography that can be easily accessed on social media, gaming websites and search engines. As such, these services would not be required to carry age-verification.88
3.85
The eSafety Commissioner suggested that, given evidence about the exposure of young people to sexual and pornographic material on social media, this omission was a ‘grave concern for many’. The Commissioner argued that the UK Government’s decision to pursue a wider regulatory regime ‘highlights the importance of taking a broad harm-minimisation approach’.89
3.86
In its submission, the BBFC stated that the age-verification regime could have been expanded to include social media.90 Ms Erratt expanded on this point:
... there were clauses in the [DEA] that required the regulator to report to government 12 months after entry into force on the effectiveness of the regime. In that kind of report, we may have recommended that the government needed to look at social media.91
3.87
Evidence in relation to age-verification for social media is discussed in further detail later in this chapter.
3.88
Lastly, eChildhood submitted that a common criticism of the UK approach was ‘that it was an imperfect solution that determined teenagers would be able to get around, such as through the use of VPNs [virtual private networks]’.92
3.89
In response to this criticism, eChildhood, the BBFC and the AVPA argued that age-verification would prevent many children from inadvertently accessing pornography.93 Similarly, Collective Shout explained:
For children, some may be sufficiently tech-savvy to use a free VPN to bypass age verification, or use a parent’s VPN if this is not well protected. However, the goal of age verification is not the unrealistic prevention of any access by any child to online pornography but rather to create a significant barrier to access and in particular to prevent inadvertent and casual access.94
3.90
Evidence on the effectiveness of online age verification is discussed in further detail in the previous chapter.

Other international approaches

3.91
The Committee received limited evidence on approaches to restricting access to online pornography in other nations.
3.92
The eSafety Commissioner provided an overview indicating that a range of approaches were in place or under consideration. These included, for example:
labelling content for certain age groups, authenticating users, and imposing scheduling restrictions on live streaming (in Germany);
mandating default settings to block adult content, which can only be disabled with valid identification proving that the user is over 18 (in South Africa); and
national-level content filtering (in South Korea).95
3.93
New Zealand is monitoring the implementation of age verification in the UK and has indicated that it is an option for consideration.96
3.94
However, the Committee heard that no jurisdiction had implemented a mandatory scheme for age verification for online pornography.97

Evidence on implementation in Australia

3.95
Consistent with the widespread concern among submitters about the harms to children and young people due to exposure to pornography, there was strong support for the introduction of a regime of mandatory age verification for online pornography in Australia.
3.96
This section summarises evidence received during the inquiry on the implementation of such a regime. Evidence on the implementation of online age verification more generally is discussed in the previous chapter.

Possible scope

3.97
As outlined above, the proposed regime for age verification in the UK was intended to cover pornography made available on a commercial basis as defined under the DEA. In evidence to the inquiry, there was some discussion about the appropriate scope of any such scheme in Australia.
3.98
The UNSW Law Society argued that the ‘narrow focus’ of the UK scheme meant that children would be able to access pornographic material from free sites, through sharing on mobile phones, or from non-pornographic sites such as Twitter, Reddit, and Imgur.98
3.99
Again referring to the UK scheme, TrustElevate suggested there may be benefit in addressing the issue of pornographic material in a comprehensive way, rather than pursuing ‘incremental measures’ which may disperse users to other, non-commercial sites.99
3.100
The eSafety Commissioner submitted that consultation would be required on the services to be incorporated and covered in any mandatory age-verification regime for online pornography.100

Free and ad-supported websites

3.101
A clear theme in evidence was that many pornographic websites can be accessed free of charge. Many submissions referred to the website Pornhub, which offers free content supported by advertising, in addition to paid content. Pornhub reported approximately 33 billion site views in 2018.101
3.102
The Committee heard that many free pornographic websites—including Pornhub—have a commercial aspect that would ensure they are captured by the broad definition of commerciality under the DEA.102
3.103
Mr Dagg from the Office of the eSafety Commissioner explained that the ‘overwhelming rationale for operating free platforms is to drive customers into the premium platforms’. Mr Dagg noted that MindGeek, the parent company of Pornhub, also owns ‘scores’ of premium websites.103
3.104
Mr Dagg suggested that the definition of commercial in the DEA was ‘a reasonably good place to start’.104

Social media services and search engines

3.105
Concern was also raised about the availability of pornographic material on social media websites and via search engines. As noted above, there was some criticism of the UK approach for its failure to include social media services, and this omission was cited in the UK Government’s decision to pursue a wider regulatory regime.
3.106
Collective Shout expressed concern that, while working to improve their practices, social media services ‘continue to host pornographic content that is easily accessed by children’.105 Similarly, Mr Kuan told the Committee:
Simply typing a pornographic term into a search engine like Google would ...display pornographic content without restriction.106
3.107
The Synod of Victoria and Tasmania, Uniting Church in Australia, argued that search engines would need to be captured for age-verification to be effective:
Unless the Government can require search engine corporations to do age verification on users to then not return results for searches for pornography to children, requiring commercial pornography sites to provide age verification would seem to be a limited safeguard.107
3.108
The eSafety Commissioner submitted that many social media services use age screening to ascertain whether or not a user is eligible, and that this may ‘simply require users to self-declare their age’.108 However, some submitters argued these measures were not effective.109
3.109
In a submission to the inquiry, DIGI, an industry association representing companies including Google, Facebook, and Twitter, outlined a range of measures used by its members to restrict access to pornographic content. These measures include age restrictions, human moderation, and automated technology to detect and remove prohibited content.110 As an example:
... in the last quarter, Facebook removed 98.4% of adult nudity and sexual activity content, and 99.5% of child nudity and sexual exploitation content, before it was flagged by users.111
3.110
In its submission, the BBFC discussed the question of age verification for social media services:
Potentially harmful material on social media platforms needs to be addressed but a ‘blanket ban’ on users aged under 18 is unlikely to be viewed as a satisfactory solution to the problem, either by platforms or by the general public more broadly. Blocking an entire platform is unlikely to be a proportionate response. Perhaps the power to issue fines would be more appropriate.112
3.111
The BBFC went on to suggest a number of options:
These range from voluntary measures to direct statutory intervention, though different enforcement powers would be required than those set out in the DEA. For example, age-verification could be applied at account level and could be monitored by the platform.113
3.112
eChildhood submitted that age verification for online platforms, including social media services, required further investigation, and that these platforms were the target of the Office of the eSafety Commissioner’s Safety by Design initiative (see discussion later in this chapter).114

Mechanisms for enforcement

3.113
There was some discussion in evidence about the difficulty of enforcing a requirement for age verification given that—as noted above—pornographic material is generally hosted outside Australia.
3.114
eChildhood argued that any age-verification regime must be capable of restricting access to pornographic material, whether it is hosted in Australia or internationally. eChildhood also argued that there should be parity in the treatment of domestic and international websites.115
3.115
The UNSW Law Society submitted that enforcement would be ‘incredibly difficult’ due to the large number of adult websites, and that compliant sites could be at a competitive disadvantage as children may preferentially seek pornographic material from non-compliant sites.116
3.116
However, Ms Erratt told the Committee that, despite their number, adult websites are ‘owned by a relatively small number of companies’:
If you engage with those companies, you can have quite a broad impact on compliance.117
3.117
The AVPA argued that extending jurisdiction beyond a country’s borders was also a consideration in the physical world and was not necessarily a reason not to impose rules domestically:
The risk of evading protections by going to offshore, unregulated websites has to be addressed through enforcement, be that through financial blocks on income streams to those websites from the domestic market or through more direct site blocking through internet service providers. ...And a standards based approach adopted globally will provide increased opportunity for international agreements and collaboration to raise standards internationally.118
3.118
A focus of discussion was the power under the DEA for the BBFC to issue notices to ancillary service providers, including payment providers such as Visa and MasterCard. Ms Erratt, representing the BBFC, argued that this power was an effective one:
As you can imagine, the adult industry is financially driven, so the threat of losing income means that these are very effective enforcement powers for any regulator.119
3.119
Similarly, Mr Dagg told the Committee:
... that approach of targeting ancillary service providers is a good one. If you are undermining the economic viability of those sites then you are directly affecting the interests of those who run the sites and they are motivated to comply.120
3.120
Mr Dagg explained that the Office of the eSafety Commissioner had found that Visa and MasterCard had taken ‘swift and absolute’ action in instances where their services were being misused to facilitate access to child sexual abuse material:
In previous investigations we have focused on contacting MasterCard and Visa where we’ve identified the use of those payment cards to facilitate access to child sexual abuse material. And in respect of the websites that we were targeting, as soon as we notified those card providers through our inquiries with the information that their services were being misused to access child sexual abuse material, we saw an immediate impact on the websites.121
3.121
Mr Alastair MacGibbon, former eSafety Commissioner and former National Cyber Security Adviser, also suggested that restricting payments was an important enforcement power, but noted that non-compliant websites might seek to use alternative payment methods.122 However, Mr MacGibbon emphasised that any approach would involve ‘edge cases’ where it would fail.123
3.122
Mr MacGibbon said that there were other options for enforcement, including blocking non-compliant websites.124 eChildhood also discussed ‘forced blocks’ implemented at the ISP level.125
3.123
eChildhood proposed that there should be provision for members of the public to report non-compliant sites and that there should be widespread awareness of this facility.126

Privacy regulation and auditing

3.124
It was put to the Committee that any regime for age verification for online pornography should include strong protections for the privacy of users’ personal information.127 Further evidence relating to privacy in the implementation of age verification more generally is discussed in the previous chapter.
3.125
Eros Association was concerned that implementation of an age-verification model similar to that proposed for the UK would create a ‘honey pot’ of personal information that may be hacked or leaked.128
3.126
As noted above, while the Committee heard that the regulator in the UK had attempted to address this concern through data protection and privacy standards, a criticism was that these standards were not enshrined in legislation. The eSafety Commissioner explained:
The fact that the legislation itself did not contain any guidance, technical requirements or conditions for data storage expected of age verification solutions was considered inattentive, particularly given the sensitivities in relation to data security and privacy that age verification involve. Whilst the BBFC made clear in its own guidance and voluntary certification scheme of the expectations and obligations in relation to data protection and privacy, the fact that these details were not translated into the core of the DEA legislation itself was of concern.129
3.127
The Australian Christian Lobby suggested that it ‘would be unacceptable for adult sites to retain any identification of users’.130 Similarly, eChildhood submitted:
For adults who wish to access pornography online, the fact that they are accessing pornography and information regarding their personal pornography preferences, is sensitive personal information that must not be misused.131
3.128
There was general agreement on the use of third-party age verification, under which no personal information is passed between a pornographic website and an age-verification provider.132 Third-party verification is discussed in detail in the previous chapter.
3.129
eChildhood and the UNSW Law Society noted that obligations under the Privacy Act 1988 and the Australian Privacy Principles would apply to age-verification providers.133 However, eChildhood argued that, ‘due to the sensitive nature of access to and personal preferences with respect to online pornography’, these obligations should be supplemented with more stringent privacy protections.134
3.130
eChildhood also recommended including in any age-verification legislation the individual’s right to be forgotten, similar to the provision in the European Union General Data Protection Regulation (see discussion in the previous chapter).135
3.131
As discussed in the previous chapter, the BBFC established a voluntary scheme whereby age-verification providers could be audited for compliance with privacy and data security requirements.136
3.132
The eSafety Commissioner suggested that a prerequisite for the implementation of a mandatory age-verification regime would be the establishment of ‘processes and procedures to test, monitor, audit and provide oversight on age verification technical solutions and tools’.137

Community and industry consultation

3.133
eChildhood submitted that in order to ensure any age-verification legislation is effective, ‘thorough consultation with all key stakeholders and digital experts is imperative’:
This will enable a robust, flexible and researched outcome that includes safety, security and privacy.138
3.134
The eSafety Commissioner nominated several areas where consultation may be required prior to implementing a mandatory age-verification scheme for online pornography:
consultation with the public (children and young people as well as adults), adult industry, internet service providers, social media platforms, mobile phone providers, civil rights groups, human rights groups, NGOs to ascertain views on, and support for, the use of age verification to protect children and young people from online pornography
public consultations on any draft guidance to any regulatory approach to age verification, standards and classes or types of services to be incorporated and covered
broad consultation with federal and state regulators to develop a national strategy and to ensure for harmonisation and interoperability across jurisdictions.139
3.135
eChildhood nominated several matters to be determined in a consultation process, including the appointment of an appropriate regulatory body (see discussion in the next section).140
3.136
Speaking to the Committee about work done in preparation for the proposed age-verification regime in the UK, Ms Erratt said that the BBFC engaged with industry and carried out ‘full-scale public consultation’. The BBFC was also planning a public engagement program to educate consumers about age verification:
We really did understand the importance of ensuring that consumers understood why age verification was coming in, which was to protect children, and that they understood how to age verify safely.141
3.137
The eSafety Commissioner also noted the importance of awareness raising and education to inform the public about the rationale for age verification for online pornography and the safeguards in place to address privacy, security, and safety concerns.142
3.138
The Committee notes evidence from the eSafety Commissioner that additional resources may be required were her office to be tasked with carrying out further work on age verification, such as reviewing and testing appropriate technologies and negotiating with industry bodies.143

Regulatory oversight

3.139
In its submission, eChildhood considered the appointment of a body to oversee an age-verification regime in Australia, suggesting that appropriate bodies may include the Office of the eSafety Commissioner, the Australian Classification Board, and the Australian Communications and Media Authority.144
3.140
In relation to the Office of the eSafety Commissioner, eChildhood noted its existing mandate to coordinate and lead online safety efforts:
The [Office of the eSafety Commissioner] may be well-positioned to regulate and enforce an Age Verification regime for the purpose of providing online safety to children. It already has similar obligations and responsibilities, is well-known and respected within the industry and has both national and international support networks. It has a platform within schools and other online safety education arenas to promote the system and disseminate information about it.145
3.141
However, eChildhood suggested that a decision in relation to the appropriate regulator would be an important outcome of any consultation process, and that whichever body was appointed would need adequate resources to ensure the effective implementation and ongoing operation of the regime.146

Evidence on complementary measures

3.142
The Committee heard evidence about a range of technological and other measures that could complement age verification in order to minimise or address the harms associated with exposure of children and young people to online pornography.

Filtering and blocking

3.143
The Committee heard about other technologies to protect children and young people from exposure to online pornography, including network- and device-level filtering and ISP blocking.147
3.144
For example, the Family Friendly Filter scheme, introduced in 2005, requires that ISPs make available approved filters for end users to install.148 As noted above, the eSafety Commissioner refers adult and explicit overseas-hosted content to suppliers of filters under the scheme to ensure that this content is blocked.149
3.145
The eSafety Commissioner submitted that filtering services ‘can be a useful tool to support those looking to moderate children’s access to online content, particularly in relation to very young children’.150
3.146
The eSafety Commissioner explained that device-level filtering products were found to be the most effective, but that these did not offer complete protection for families and children.151
3.147
As an alternative, ISP blocking involves maintaining a blacklist of addresses for ISPs or internet infrastructure operators to block on their systems.
3.148
However, the eSafety Commissioner submitted that blocking systems suffer from a number of limitations, including over- and under-blocking, ease of circumvention, and high maintenance and administration costs. Further, blocking systems are unable to capture all internet traffic and there is ‘very little publicly available information’ on their effectiveness and accuracy.152
3.149
eChildhood discussed filtering and blocking in some detail in its submission, arguing that these solutions lack robustness and should not be relied on in isolation.153
3.150
The eSafety Commissioner recommended that an effective approach to minimising exposure to online pornography would involve a ‘combination and layering of technological solutions’.154

Safety by Design initiative

3.151
The Committee heard about the Safety by Design initiative (SbD), which seeks to ensure that user safety is considered in the design and development of online products and services. The development of SbD is being led by the eSafety Commissioner.155
3.152
The eSafety Commissioner explained:
At its core, [SbD] is about embedding the rights of users and user safety into the design, development and deployment of online and digital products and services.
... It recognises and responds to the intersectionality of risk and harm in the online world and acknowledges the potential of advancements in technology, machine-learning and artificial intelligence to radically transform user safety and our online experiences.156
3.153
The eSafety Commissioner emphasised the importance of considering online safety in a holistic manner ‘rather than addressing issues in an ad hoc manner’, and also proactively considering user safety ‘rather than retrofitting safety considerations after online harms have occurred’.157
3.154
The Committee heard that age verification was one tool that could be used in incorporating user safety in the design of online platforms.158
3.155
Following consultation with industry, parents and carers, and children, the eSafety Commissioner developed a set of three SbD principles that ‘provide online and digital interactive services with a universal and consistent set of realistic, actionable and achievable measures to better protect and safeguard citizens’ safety online’. These principles are:
service provider responsibilities;
user empowerment and autonomy; and
transparency and accountability.159
3.156
The eSafety Commissioner explained that consultations also identified that young people expect industry to be proactive in identifying and minimising exposure to threats, risks, and harmful content.160
3.157
The eSafety Commissioner is in the process of creating a framework and resources to facilitate the adoption of SbD principles.161

Education

3.158
A clear theme in evidence to the inquiry was that education to address the harms associated with exposure to online pornography would complement technological measures such as age verification.
3.159
The eSafety Commissioner submitted that it is imperative that young people are assisted to navigate their digital environment:
It is developmentally appropriate that young people are sexually curious and have an interest in what constitutes healthy sexual relationships. As such, young people will explore their sexual identities and search for information about relationships online.
... A child’s educational journey therefore presents a critical opportunity for evidence-based education and awareness raising to address children’s exposure to sexually explicit material and online pornography.162
3.160
Citing research indicating that 94 per cent of parents with pre-schoolers reported that their child was using the internet by age four, the eSafety Commissioner suggested that such education ‘needs to start early’.163
3.161
The eSafety Commissioner recommended that education about respectful relationships and online safety—including age-appropriate education on dealing with online pornography—should be embedded in the Australian Curriculum.164
3.162
Similarly, the child protection advocacy group Bravehearts recommended ‘developmentally appropriate sex education for schools, inclusive of positive and healthy relationships, consent issues and awareness of the online environment’:
... one of the most important tools we have is education and ensuring that children have access to honest, developmentally appropriate sex education and personal safety programs.165
3.163
The UNSW Law Society submitted there are ‘structural problems’ with sex education programs and that these programs could be supplemented with ‘information about how pornography can display unrealistic and harmful behaviours towards sex’.166
3.164
Ms Carol Ronken, Director of Research at Bravehearts, argued that education should aim to teach children to be ‘critical consumers’ of online material.167
3.165
This message was echoed by the eSafety Commissioner, who submitted that education should incorporate ‘broad-based relationship, critical thinking and resilience skills that enable young people to critically interpret online media and cope with potential exposure to harmful content’.168 Ms Inman-Grant explained:
I often talk about the filter between our children’s ears—their brains—and that’s where the education and critical reasoning skills are really, really important.169
3.166
Concern was expressed about a lack of consistency across the country in relation to sex education and online safety education, and about competing priorities in the curriculum.170 For example, Ms Inman-Grant told the Committee:
We know, with our fragmented education system, that online safety education and respectful relationships education is not happening consistently and comprehensively across the public, independent and Catholic schools. It is a concern.171
3.167
Mr Marshall Ballantine-Jones, a PhD candidate undertaking research on reducing the negative effects of pornography and sexualised media on adolescents, submitted that there are ‘only a handful’ of education programs that address pornography and sexualised media, and that none of these have been empirically tested for effectiveness.172
3.168
The Department of Social Services advised that it provided the charity Our Watch with $3.129 million over four years from 2016-17 to deliver an ‘evidence-based community awareness initiative to understand and counter the impact of pervasive pornography on young people’. However, the Department also advised that the campaign had yet to launch.173
3.169
Ms Elizabeth Hefren-Webb, Deputy Secretary of the Department, told the Committee:
It’s really tricky to develop a campaign about porn that doesn’t breach community standards and parents’ standards about what young people should be hearing. ... The materials that have been developed targeting parents, we think, are in good shape, but the materials targeting people under 18—we don’t think we’ve got the right pitch yet.174
3.170
The eSafety Commissioner recommended further research into what constitutes effective education on online pornography, ‘including content, pedagogy, professional learning and support for vulnerable cohorts’:
This should also include capacity building and support for educators to develop knowledge, skills, capacity and confidence to cover this content.175
3.171
As noted in the previous section, the Committee also heard about the role of education in informing the public about the rationale for age verification, and the safeguards in place, so as to allay fears about the privacy and safety of pornography users.

Integration with a wider eSafety approach

3.172
Consistent with the evidence outlined in this section, a key message of the eSafety Commissioner was that online harms should be addressed as part of a wider approach to online safety:
It is clear that there are no quick-fix solutions to any online safety issue, but that long-term, sustained social and cultural change to protect children online requires the coordinated efforts of the global community and greater collaboration and consultation between industry, government and the general public. There is no silver bullet, and age verification will only ever be one part of the solution.176
3.173
Similarly, Bravehearts argued that age verification can only be effective as part of a holistic approach to addressing online threats, including research, education, and preventive measures.177

Committee comment

3.174
The Committee recognises that there is increasingly clear evidence that children and young people are being exposed to online pornography, and that this exposure is associated with a range of risks and harms.
3.175
Evidence given to the inquiry showed very clearly that there is widespread and genuine concern among the community about the negative impact of online pornography on the welfare of children and young people.
3.176
Based on the evidence to the inquiry, it is the Committee’s strong view that age verification should be pursued as a measure to limit children and young people’s exposure to online pornography.
3.177
The Committee acknowledges that age verification is not a silver bullet—some websites containing pornographic material may not be captured, and some determined young people may find ways to circumvent the system. However, when it comes to protecting children from the very real harms associated with exposure to online pornography, the Committee’s strong view is that we should not let the perfect be the enemy of the good.
3.178
It is the Committee’s expectation that an effective age-verification regime will create a significant barrier to prevent young people—and particularly young children—from deliberately or, perhaps even more importantly, inadvertently gaining access to pornographic material. In doing so, age verification will work best to protect the most vulnerable from the harms associated with exposure to online pornography.
3.179
The Committee notes, however, the experience in the United Kingdom, where the implementation of age verification has stalled, and also the lack of precedent for a regime of mandatory age verification in any other jurisdiction. This is evidence of the challenge of implementing an effective regime, and an indication that there is still more work to do.
3.180
An effective regime will require robust standards for privacy, safety, and security; broad understanding and acceptance among the community; support from the adult industry and age-verification providers; and a well-resourced regulator with appropriate powers.
3.181
Consideration will also need to be given to the most appropriate legislative and regulatory regime, as well as to the extent to which the regime should capture social media, search engines, and other online services. All of these issues will need to be worked through, but it is the Committee’s view that these services should be captured by an age-verification regime.
3.182
Further to this, a clear message in evidence to the inquiry is that an effective response to the exposure of children and young people to online pornography will be broader than age verification. Other technical solutions, education, and a broader focus on e-safety will all contribute to minimising harms from online pornography and bringing about a safer online environment for our children.
3.183
In this context, the Committee recommends that the eSafety Commissioner lead the development of a roadmap for the implementation of age verification for online pornography.

Recommendation 3

3.184
The Committee recommends that the Australian Government direct and adequately resource the eSafety Commissioner to expeditiously develop and publish a roadmap for the implementation of a regime of mandatory age verification for online pornographic material, setting out:
a. a suitable legislative and regulatory framework;
b. a program of consultation with community, industry, and government stakeholders;
c. activities for awareness raising and education for the public; and
d. recommendations for complementary measures to ensure that age verification is part of a broader, holistic approach to address risks and harms associated with the exposure of children and young people to online pornography.
3.185
The Committee considers that, as a dedicated office with relevant expertise, industry knowledge, and an understanding of the broader issues associated with online safety, the eSafety Commissioner is best placed to ensure that the issues set out above are addressed as part of a comprehensive set of measures involving age verification.
3.186
As a basis for consideration, the Committee offers the following principles based on evidence to the inquiry:
at a minimum, the regime should seek to capture pornographic material made available on a commercial basis, consistent with the definition established in the United Kingdom;
the regime should also capture pornographic material accessed via and/or available on non-commercial websites as well as social media platforms, search engines, and other online services;
enforcement powers should include the power to direct ancillary service providers (such as payment providers) to withhold services from non-compliant websites;
appropriate standards relating to privacy, safety, and security should be incorporated in legislation; and
consultation with industry and members of the community, and education and awareness raising about the purpose of age verification, should commence as early as possible in the process.
3.187
In carrying out this work, the Committee expects that the eSafety Commissioner would consult with regulators in other jurisdictions—including the British Board of Film Classification—to leverage existing work and to ensure consistency where this is appropriate.
3.188
The Committee also expects that the eSafety Commissioner would consult with the Digital Transformation Agency in relation to the development of appropriate technical standards for age verification (see Recommendation 1) and an age-verification exchange (see Recommendation 2), to ensure that any regime for age verification for online pornography leverages these more general capabilities.
3.189
Lastly, the Committee is concerned to see this issue addressed as quickly as possible. As such, the Committee recommends that this work be completed and presented to government for decision within 12 months of the presentation of this report.
3.190
The Committee encourages the Australian Government to provide additional resources to the Office of the eSafety Commissioner as required to complete this work without compromising the important existing work carried out by the office.

1 eSafety Commissioner, Submission 191, p. 2.
2 Senate Environment and Communications References Committee, Harm being done to Australian children through access to pornography on the Internet, November 2016, pp. 7-11.
3 eChildhood, Submission 192, p. 9.
4 eChildhood, Submission 192, p. 9.
5 Collective Shout, Submission 178, p. 3.
6 Collective Shout, Submission 178, pp. 4-5.
7 Victorian Aboriginal Child Care Agency, Submission 185, p. 3.
8 Victorian Aboriginal Child Care Agency, Submission 185, p. 3.
9 Collective Shout, Submission 178, p. 3.
10 Ms Melinda Liszewski, Campaigns Manager, Collective Shout, Committee Hansard, Canberra, 6 December 2019, p. 60.
11 WA Child Safety Services, Submission 170, p. 2.
12 Ms Tamara Newlands, Executive Director, eChildhood, Committee Hansard, Canberra, 6 December 2019, p. 51.
13 For example, see: eChildhood, Submission 192, pp. 9-10; Collective Shout, Submission 178, p. 3; Dr Elizabeth Taylor, Submission 196, p. 3.
14 Collective Shout, Submission 178, pp. 1-2.
15 Ms Julie Inman-Grant, eSafety Commissioner, Office of the eSafety Commissioner, Committee Hansard, Canberra, 6 December 2019, p. 10.
16 Ms Julie Inman-Grant, eSafety Commissioner, Office of the eSafety Commissioner, Committee Hansard, Canberra, 6 December 2019, p. 11.
17 WA Child Safety Services, Submission 170, p. 2.
18 Ms Melinda Tankard Reist, Private capacity, Committee Hansard, Canberra, 6 December 2019, p. 61. See also: Ms Melinda Liszewski, Campaigns Manager, Collective Shout, Committee Hansard, Canberra, 6 December 2019, p. 60.
19 Victorian Aboriginal Child Care Agency, Submission 185, p. 3.
20 New Zealand Office of Film and Literature Classification, ‘Breaking Down Porn – An analysis of commonly viewed porn in NZ’, <https://www.classificationoffice.govt.nz/news/latest-news/breaking-down-porn-an-analysis-of-commonly-viewed-porn-in-nz/>.
21 Senate Environment and Communications References Committee, Harm being done to Australian children through access to pornography on the Internet, November 2016, pp. 12-23.
22 eChildhood, Submission 192, p. 10.
23 Australian Institute of Family Studies, Children and young people’s exposure to pornography, <https://aifs.gov.au/cfca/2016/05/04/children-and-young-peoples-exposure-pornography>. See also: Australian Institute of Family Studies, The effects of pornography on children and young people - An evidence scan, 2017.
24 Ms Julie Inman-Grant, eSafety Commissioner, Office of the eSafety Commissioner, Committee Hansard, Canberra, 6 December 2019, p. 10.
25 For example, see: University of New South Wales Law Society, Submission 58, pp. 2-3; ChildSafe, Submission 124, pp. 1-5; WA Child Safety Services, Submission 170, pp. 1-2; Marshall Ballantine-Jones, Submission 175, pp. 1-2; Melinda Tankard Reist, Submission 177, pp. 27; Collective Shout, Submission 178, pp. 5-7; Bravehearts, Submission 182, pp. 2-3; Senator Amanda Stoker, Submission 184, pp. 1-3; Victorian Aboriginal Child Care Agency, Submission 185, pp. 3-4; eChildhood, Submission 192, pp. 9-11, 43-46.
26 Royal Commission into Institutional Responses to Child Sexual Abuse, Final Report, Volume 10, Children with harmful sexual behaviours.
27 Queensland Department of Child Safety, Youth and Women, Prevent. Support. Believe. Queensland’s Framework to address Sexual Violence, 2019, p. 11.
28 Victorian Aboriginal Child Care Agency, Submission 185, p. 4.
29 For example, see: WA Child Safety Services, Submission 170, p. 2; Ms Tamara Newlands, Executive Director, eChildhood, Committee Hansard, Canberra, 6 December 2019, pp. 51, 55.
30 Mr Chern Eu Kuan, Student Contributor, UNSW Law Society, Committee Hansard, Canberra, 6 December 2019, p. 47.
31 For example, see: eChildhood, Submission 192, p. 44; eChildhood, Submission 192.1, pp. 1-4.
32 Mrs Liz Walker, Deputy Chair, eChildhood, Committee Hansard, Canberra, 6 December 2019, p. 56.
33 eSafety Commissioner, Submission 191, p. 2.
34 eChildhood, Submission 192, p. 44.
35 eChildhood, Submission 192.1, p. 2.
36 eChildhood, Submission 192.1, pp. 2-3. See also: eChildhood, Research: academic and concentration impacts of pornography, <https://www.echildhood.org/research_academic_impacts>; Marshall Ballantine-Jones, Submission 175, pp. 1-2.
37 Mrs Liz Walker, Deputy Chair, eChildhood, Committee Hansard, Canberra, 6 December 2019, p. 56.
38 Mrs Liz Walker, Deputy Chair, eChildhood, Committee Hansard, Canberra, 6 December 2019, p. 56.
39 Sue Dunlevy, ‘Women feeling guilty in bedroom’, The Courier-Mail, Brisbane, Queensland, 24 February 2020, p. 7.
40 For example, see: Melinda Tankard Reist, Submission 177, pp. 35; Dads4Kids, Submission 181, p. 7; Bravehearts, Submission 182, p. 2; eChildhood, Submission 192, p. 45.
41 eChildhood, Submission 192, p. 45.
42 Ms Tamara Newlands, Executive Director, eChildhood, Committee Hansard, Canberra, 6 December 2019, p. 52.
43 eChildhood, Submission 192, p. 10.
44 Ms Melinda Liszewski, Campaigns Manager, Collective Shout, Committee Hansard, Canberra, 6 December 2019, p. 60.
45 For example, see: Ms Jane Munro, Submission 144, p. 2; Collective Shout, Submission 178, pp. 12-13; Canberra Declaration, Submission 180, pp. 9-12; Dads4Kids, Submission 181, pp. 9-11; TrustElevate, Submission 190, p. 7; eChildhood, Submission 192, p. 18.
46 eChildhood, Submission 192, p. 18.
47 eSafety Commissioner, Submission 191, pp. 3-4; Department of Communications and the Arts, ‘Online content regulation’, <https://www.communications.gov.au/policy/policy-listing/online-content-regulation>.
48 National Classification Code (May 2005), <https://www.legislation.gov.au/Details/F2013C00006>. See also: eSafety Commissioner, Submission 191, pp. 3-4.
49 eSafety Commissioner, Submission 191, pp. 3-4.
50 eSafety Commissioner, Submission 191, pp. 3-4.
51 eSafety Commissioner, Submission 191, pp. 3-4.
52 Mr Toby Dagg, Manager, Cyber Report, Office of the eSafety Commissioner, Committee Hansard, Canberra, 6 December 2019, p. 14.
53 eChildhood, Submission 192, p. 19.
54 British Board of Film Classification, Submission 187, p. 1.
55 Secretary of State for Digital, Culture, Media and Sport, ‘ONLINE HARMS: Written statement – HCWS13’, <https://www.parliament.uk/written-questions-answers-statements/written-statement/Commons/2019-10-16/HCWS13>.
56 Digital Economy Act 2017 (United Kingdom), section 14(1).
57 Digital Economy Act 2017 (United Kingdom), section 23. See also: British Board of Film Classification, Submission 187, p. 2.
58 Digital Economy Act 2017 (United Kingdom), section 15.
59 British Board of Film Classification, ‘R18’, <https://bbfc.co.uk/about-classification/r18>. See also: British Board of Film Classification, Submission 187, p. 8.
60 Online Pornography (Commercial Basis) Regulations 2018.
61 Online Pornography (Commercial Basis) Regulations 2018.
62 British Board of Film Classification, Submission 187, p. 6.
63 British Board of Film Classification, Submission 187, p. 2.
64 British Board of Film Classification, Submission 187, pp. 2, 7.
65 Digital Economy Act 2017 (United Kingdom), section 25(1)(a); British Board of Film Classification, Submission 187, pp. 7-8.
66 Digital Economy Act 2017 (United Kingdom), section 18(1).
67 Digital Economy Act 2017 (United Kingdom), section 19(1)-(2).
68 Digital Economy Act 2017 (United Kingdom), section 21(1).
69 Digital Economy Act 2017 (United Kingdom), section 23(1)-(2).
70 British Board of Film Classification, Submission 187, p. 8.
71 Ms Amelia Erratt, Head, Age Verification, British Board of Film Classification, Committee Hansard, Canberra, 5 December 2019, pp. 1, 4.
72 Ms Amelia Erratt, Head, Age Verification, British Board of Film Classification, Committee Hansard, Canberra, 5 December 2019, p. 4.
73 British Board of Film Classification, Submission 187, p. 6.
74 British Board of Film Classification, Submission 187, p. 2.
75 Secretary of State for Digital, Culture, Media and Sport, ‘ONLINE HARMS: Written statement – HCWS13’, <https://www.parliament.uk/written-questions-answers-statements/written-statement/Commons/2019-10-16/HCWS13>.
76 United Kingdom House of Commons Hansard, 17 October 2019, Volume 666, <https://hansard.parliament.uk/commons/2019-10-17/debates/C743945F-9F9F-48E5-9064-707189D07846/OnlinePornographyAgeVerification>.
77 British Board of Film Classification, Submission 187, p. 2.
78 British Board of Film Classification, Submission 187, p. 7.
79 British Board of Film Classification, Submission 187, p. 1.
80 Ms Amelia Erratt, Head, Age Verification, British Board of Film Classification, Committee Hansard, Canberra, 5 December 2019, p. 1.
81 Ms Amelia Erratt, Head, Age Verification, British Board of Film Classification, Committee Hansard, Canberra, 5 December 2019, p. 4.
82 eSafety Commissioner, Submission 191, pp. 11-12.
83 Ms Amelia Erratt, Head, Age Verification, British Board of Film Classification, Committee Hansard, Canberra, 5 December 2019, pp. 1-2.
84 eSafety Commissioner, Submission 191, p. 14. See also: Ms Julie Inman-Grant, eSafety Commissioner, Office of the eSafety Commissioner, Committee Hansard, Canberra, 6 December 2019, p. 11; Dr Julia Fossi, Expert Advisor, Office of the eSafety Commissioner, Committee Hansard, Canberra, 6 December 2019, p. 16.
85 Age Verification Providers Association, Submission 200, p. 5.
86 Mr Iain Corby, Executive Director, Age Verification Providers Association, Committee Hansard, Canberra, 5 December 2019, p. 11.
87 Dr Julia Fossi, Expert Advisor, Office of the eSafety Commissioner, Committee Hansard, Canberra, 6 December 2019, p. 12. See also: UK Children’s Charities’ Coalition on Internet Safety, Submission 161, p. 2.
88 eSafety Commissioner, Submission 191, pp. 10-11.
89 eSafety Commissioner, Submission 191, p. 14.
90 British Board of Film Classification, Submission 187, pp. 8, 13.
91 Ms Amelia Erratt, Head, Age Verification, British Board of Film Classification, Committee Hansard, Canberra, 5 December 2019, p. 2.
92 eChildhood, Submission 192, p. 22.
93 eChildhood, Submission 192, p. 22; British Board of Film Classification, Submission 187, p. 10; Age Verification Providers Association, Submission 200, p. 5. See also: Senator Amanda Stoker, Submission 184, p. 4.
94 Collective Shout, Submission 178, p. 11.
95 eSafety Commissioner, Submission 191, pp. 23-27. See also: Communications Alliance Ltd, Submission 186, p. 4.
96 eSafety Commissioner, Submission 191, p. 25.
97 Ms Amelia Erratt, Head, Age Verification, British Board of Film Classification, Committee Hansard, Canberra, 5 December 2019, p. 3.
98 UNSW Law Society, Submission 58, p. 6.
99 TrustElevate, Submission 190, p. 8.
100 eSafety Commissioner, Submission 191, p. 5.
101 Mr Toby Dagg, Manager, Cyber Report, Office of the eSafety Commissioner, Committee Hansard, Canberra, 6 December 2019, p. 14.
102 Mr Toby Dagg, Manager, Cyber Report, Office of the eSafety Commissioner, Committee Hansard, Canberra, 6 December 2019, p. 13.
103 Mr Toby Dagg, Manager, Cyber Report, Office of the eSafety Commissioner, Committee Hansard, Canberra, 6 December 2019, p. 13.
104 Mr Toby Dagg, Manager, Cyber Report, Office of the eSafety Commissioner, Committee Hansard, Canberra, 6 December 2019, p. 13.
105 Collective Shout, Submission 178, p. 10.
106 Mr Chern Eu Kuan, Student Contributor, UNSW Law Society, Committee Hansard, Canberra, 6 December 2019, p. 47.
107 Synod of Victoria and Tasmania, Uniting Church in Australia, Submission 183, p. 6.
108 eSafety Commissioner, Submission 191, p. 6.
109 For example, see: Synod of Victoria and Tasmania, Uniting Church in Australia, Submission 183, pp. 2-3.
110 DIGI, Submission 269, pp. 1-2.
111 DIGI, Submission 269, p. 1.
112 British Board of Film Classification, Submission 187, p. 13.
113 British Board of Film Classification, Submission 187, p. 13.
114 eChildhood, Submission 192, p. 32.
115 eChildhood, Submission 192, p. 36.
116 UNSW Law Society, Submission 58, p. 6.
117 Ms Amelia Erratt, Head, Age Verification, British Board of Film Classification, Committee Hansard, Canberra, 5 December 2019, p. 4.
118 Age Verification Providers Association, Submission 200, p. 4.
119 Ms Amelia Erratt, Head, Age Verification, British Board of Film Classification, Committee Hansard, Canberra, 5 December 2019, p. 1. See also: Ms Amelia Erratt, Head, Age Verification, British Board of Film Classification, Committee Hansard, Canberra, 5 December 2019, p. 4.
120 Mr Toby Dagg, Manager, Cyber Report, Office of the eSafety Commissioner, Committee Hansard, Canberra, 6 December 2019, p. 20.
121 Mr Toby Dagg, Manager, Cyber Report, Office of the eSafety Commissioner, Committee Hansard, Canberra, 6 December 2019, p. 20.
122 Mr Alastair MacGibbon, Private capacity, Committee Hansard, Canberra, 6 December 2019, p. 24. See also: Ms Julie Inman-Grant, eSafety Commissioner, Office of the eSafety Commissioner, Committee Hansard, Canberra, 6 December 2019, p. 20.
123 Mr Alastair MacGibbon, Private capacity, Committee Hansard, Canberra, 6 December 2019, p. 24.
124 Mr Alastair MacGibbon, Private capacity, Committee Hansard, Canberra, 6 December 2019, p. 24.
125 eChildhood, Submission 192, p. 31.
126 eChildhood, Submission 192, p. 36.
127 For example, see: UNSW Law Society, Submission 58, p. 8; eSafety Commissioner, Submission 191, p. 5.
128 Eros Association, Submission 65, p. 3.
129 eSafety Commissioner, Submission 191, p. 11.
130 Australian Christian Lobby, Submission 160, p. 6.
131 eChildhood, Submission 192, p. 21.
132 For example, see: eChildhood, Submission 192, p. 21.
133 eChildhood, Submission 192, p. 21; UNSW Law Society, Submission 58, p. 7.
134 eChildhood, Submission 192, p. 21.
135 eChildhood, Submission 192, p. 21.
136 British Board of Film Classification, Submission 187, p. 11.
137 eSafety Commissioner, Submission 191, p. 5.
138 eChildhood, Submission 192, p. 33.
139 eSafety Commissioner, Submission 191, p. 5.
140 eChildhood, Submission 192, p. 28.
141 Ms Amelia Erratt, Head, Age Verification, British Board of Film Classification, Committee Hansard, Canberra, 5 December 2019, p. 3.
142 eSafety Commissioner, Submission 191, p. 5.
143 Ms Julie Inman-Grant, eSafety Commissioner, Office of the eSafety Commissioner, Committee Hansard, Canberra, 6 December 2019, pp. 16-17.
144 eChildhood, Submission 192, pp. 28-29.
145 eChildhood, Submission 192, p. 28.
146 eChildhood, Submission 192, p. 28.
147 For example, see: eSafety Commissioner, Submission 191, pp. 15-17.
148 Communications Alliance, Submission 186, p. 5; eSafety Commissioner, Submission 191, p. 15.
149 eSafety Commissioner, Submission 191, pp. 3-4, 15.
150 eSafety Commissioner, Submission 191, p. 15.
151 eSafety Commissioner, Submission 191, p. 17; Ms Julie Inman-Grant, eSafety Commissioner, Office of the eSafety Commissioner, Committee Hansard, Canberra, 6 December 2019, p. 11.
152 eSafety Commissioner, Submission 191, p. 17.
153 eChildhood, Submission 192, pp. 29-32.
154 eSafety Commissioner, Submission 191, p. 16.
155 eSafety Commissioner, Submission 191, pp. 20-22. See also: eSafety Commissioner, ‘Safety by Design’, <https://www.esafety.gov.au/key-issues/safety-by-design>.
156 eSafety Commissioner, Submission 191, p. 20.
157 eSafety Commissioner, Submission 191, p. 20.
158 Dr Julia Fossi, Expert Advisor, Office of the eSafety Commissioner, Committee Hansard, Canberra, 6 December 2019, p. 13.
159 eSafety Commissioner, Submission 191, pp. 20-21.
160 eSafety Commissioner, Submission 191, p. 21. See also: Dr Julia Fossi, Expert Advisor, Office of the eSafety Commissioner, Committee Hansard, Canberra, 6 December 2019, p. 16.
161 eSafety Commissioner, Submission 191, pp. 21-22.
162 eSafety Commissioner, Submission 191, p. 17.
163 eSafety Commissioner, Submission 191, p. 18.
164 eSafety Commissioner, Submission 191, p. 19.
165 Bravehearts, Submission 182, pp. 3-4.
166 UNSW Law Society, Submission 58, p. 9.
167 Ms Carol Ronken, Director of Research, Bravehearts, Committee Hansard, Canberra, 6 December 2019, p. 57.
168 eSafety Commissioner, Submission 191, p. 18.
169 Ms Julie Inman-Grant, eSafety Commissioner, Office of the eSafety Commissioner, Committee Hansard, Canberra, 6 December 2019, p. 18.
170 eSafety Commissioner, Submission 191, pp. 18-19; Ms Carol Ronken, Director of Research, Bravehearts, Committee Hansard, Canberra, 6 December 2019, p. 58.
171 Ms Julie Inman-Grant, eSafety Commissioner, Office of the eSafety Commissioner, Committee Hansard, Canberra, 6 December 2019, p. 12.
172 Marshall Ballantine-Jones, Submission 175, p. 2.
173 Department of Social Services, Submission 163, pp. 6-7.
174 Ms Elizabeth Hefren-Webb, Deputy Secretary, Families, Department of Social Services, Committee Hansard, Canberra, 6 December 2019, p. 7.
175 eSafety Commissioner, Submission 191, p. 19.
176 eSafety Commissioner, Submission 191, p. 22.
177 Bravehearts, Submission 182, pp. 3-4.
