2. Methods for online age verification

2.1
The terms of reference for the inquiry required the Committee to consider the potential of online age verification as a mechanism for protecting children and young people from exposure to certain forms of age-restricted content.
2.2
As such, the Committee was interested in evidence on how online age verification works in practice to verify that an internet user is aged 18 years or above, and how this can be done easily, safely, and securely.
2.3
The Committee heard that while the concept of online age verification is not new, initial methods involved simply displaying an adults-only warning, requiring a user to input their date of birth, or requiring a user to scan or mail a copy of an identity document.1
2.4
The Age Verification Providers Association (AVPA) explained that these initial methods were easily evaded, and were not adequate in circumstances where legislation imposed age restrictions.2
2.5
Similarly, the eSafety Commissioner submitted that the ‘digital ecosystem for third-party verification and in-person proofing was not sufficiently evolved’ to proceed with a pilot of age verification attempted by Microsoft in Australia in 2008.3
2.6
However, more recently, as technology has developed and as some jurisdictions have sought to strengthen age restrictions on certain forms of online content, age verification has become an area of increased interest, leading to the development of more sophisticated methods.
2.7
This chapter summarises evidence received on the attributes of an effective online age-verification model, and then reviews the current state of the art in age-verification methods. Evidence in relation to the application of age verification to online pornography and online wagering is discussed in subsequent chapters of this report.

Attributes of an effective age-verification model

2.8
While the Committee heard about a range of age-verification methods (discussed later in this chapter), the Committee also received more general evidence on attributes or features that an effective online age-verification model should possess.
2.9
This evidence, which may serve to inform the implementation of any future regime for online age-verification in Australia, is summarised in three sections addressing the following themes:
privacy and security;
accuracy and effectiveness; and
impact on business and users.

Privacy and security

2.10
A consistent theme in evidence to the inquiry was the importance of any system for online age verification having strong controls for the safety, security, and privacy of users.
2.11
For example, the eSafety Commissioner submitted:
The importance of balancing privacy, security and safety considerations is essential. Any age verification proposal in Australia that mandates the use of technology should include and make explicit reference to data protection, privacy and safety.4
2.12
Similarly, consumer credit agency Equifax submitted that ‘strong privacy controls will be critical’. Equifax offered the following points for consideration:
Minimise, or if possible, eliminate, the retention of any record of age-verification, including a prohibition on disclosure or reuse of any personal information relating to a request for age verification;
18+ sites should not know who a viewer is, only know that a person viewing or using the site has been verified as 18+;
Similarly, the entity verifying age should not know what site the person wishes to view, only that age verification has been requested;
The age verification process should be conducted using the minimum details required to achieve a match;
In a more mature identity environment, people could choose to obtain a reusable age-verification token for them to provide when needed.5
2.13
The University of New South Wales (UNSW) Law Society raised concerns in relation to the collection and management of personal information and potential breaches of privacy:
There are many questions that exist ... such as: whether users will be given notice around the storage of their information; to what extent can users consent to the manner in which their data is utilised; whether it is certain that unauthorised disclosure of personal information will not occur. ...the effectiveness of age verification is thus contingent upon the ability to safeguard [an] individual’s security details.6
2.14
The eSafety Commissioner also highlighted the importance of requirements for data storage.7
2.15
The AVPA recommended that personal data should not be retained unless required:
... where there is no need to retain an audit trail of age verifications, then personal data should not be retained, thus vastly minimising the risk of a privacy breach.
Where an audit trail is required by regulators or the law, still personal data need not be retained, rather only the pseudonymised record of the verification events themselves.8
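The approach the AVPA describes can be illustrated with a brief sketch: an audit log that stores only a salted hash of the user's credential identifier together with the outcome of each check, so the trail can demonstrate that verifications took place without retaining any personal data. The function names and record structure below are invented for illustration and do not reflect any provider's actual implementation.

```python
import hashlib
import hmac
import os
import time

# Secret salt held by the age-verification provider; discarding or
# rotating it makes the stored pseudonyms permanently unlinkable.
AUDIT_SALT = os.urandom(32)

def pseudonymise(credential_id: str) -> str:
    """Derive a stable pseudonym from a credential identifier.

    HMAC with a provider-held salt means the audit trail cannot be
    reversed to personal data without access to the salt.
    """
    return hmac.new(AUDIT_SALT, credential_id.encode(), hashlib.sha256).hexdigest()

def record_verification(audit_log: list, credential_id: str, passed: bool) -> None:
    """Append a pseudonymised verification event; no personal data is stored."""
    audit_log.append({
        "pseudonym": pseudonymise(credential_id),
        "timestamp": time.time(),
        "result": passed,
    })

audit_log = []
record_verification(audit_log, "licence-12345678", True)
# The same credential always maps to the same pseudonym, so repeat
# checks can be audited, but the raw identifier never enters the log.
assert audit_log[0]["pseudonym"] == pseudonymise("licence-12345678")
assert "licence-12345678" not in str(audit_log)
```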
2.16
Mr Alastair MacGibbon, former eSafety Commissioner and former National Cyber Security Adviser, told the Committee that while ‘there is no such thing as absolute security, safety or privacy online’, technology was sufficient to provide services in the ‘vast bulk of cases’ involving age-restricted content.9
2.17
However, Mr MacGibbon also suggested that individuals with concerns about privacy ‘will be forced into darker parts of the web to avoid verification’.10 A similar suggestion was made by the UNSW Law Society.11
2.18
Dr Julia Fossi, Expert Advisor at the Office of the eSafety Commissioner, stressed the importance of independent auditing and monitoring of age-verification technologies and raising public awareness of safeguards that may be in place.12

Identity protection and third-party verification

2.19
The Committee was made aware of the distinction between age verification and identity verification.13
2.20
Several submitters noted that age verification only requires checking of one attribute of an individual’s identity—age-related eligibility (that is, whether or not the individual is over a particular age).14
2.21
For example, Mr Iain Corby, Executive Director of the AVPA, explained:
... age verification is not identity verification. They’re very separate. What we try to do is have the minimum amount of data used in the first place and then retained going forward. For quite a lot of uses, you wouldn’t need to retain any personal data at all. All you need to know is that person X—and we only know them as ‘X’—has at some point proved, to a certain standard, that they are over a particular age or within a particular age range or they have a particular date of birth.15
2.22
Similarly, online compliance provider TrustElevate submitted:
Traditionally, to verify that an individual is, for example, 18+ years of age, the collection of a significant amount of personal data, including name, address, and date of birth, is required. In effect, age verification involves a full identity verification process. Recent technology and policy innovations in the electronic identity sector mean that it is now possible for age check services to check a single attribute of an individual’s identity (i.e. age-related eligibility).16
2.23
The Committee heard that disclosure of personal information could be minimised through the use of third-party age verification, which involves verification being carried out by an entity that is separate from the age-restricted service.
2.24
Ms Amelia Erratt, Head of Age Verification at the British Board of Film Classification (BBFC), explained:
In terms of the age verification solutions that we were looking at, for the most part age verification is provided by a third party, which means that you don’t give your personal data to [an age-restricted] website; you leave the website and carry out your age verification separately.17
2.25
Mr Peter Alexander, Chief Digital Officer at the Digital Transformation Agency (DTA), likened third-party verification to the authentication process used in some online transactions:
... sometimes when you make a purchase online, before the sale is finalised you are referred to a banking institution or credit card domain to enter a two-factor authentication code. Once you’ve successfully verified the two-factor authentication code, you are returned to the retailer’s website.18
2.26
Mr Matt Strassberg, General Manager of External Relations in Australia and New Zealand at Equifax, stated that it was his expectation that age-restricted sites would not be responsible for age verification:
You’ll be redirected by a third party, so that way there’s separation between the verification and the [age-restricted] site.
... We don’t want to know what site you’re going to, nor should the [age-restricted] site know who you are.19
2.27
Representatives of the DTA described the process of third-party verification as ‘privacy enhancing’ as it involved the age-restricted site only receiving a ‘yes/no answer’ about whether the user was aged 18 years or above rather than providing documents directly to the age-restricted site.20
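The double-blind exchange described in this evidence can be sketched as follows: the age-restricted site generates an opaque request identifier, the user completes verification separately with the third party, and the verifier returns only a signed yes/no answer bound to that identifier. The verifier never learns which site the user came from, and the site never sees identity documents. This is a simplified illustration using a shared key; real schemes would typically use public-key signatures under a trust framework, and all names here are hypothetical.

```python
import hashlib
import hmac
import os
import secrets

# Key shared between the verifier and participating sites via a trust
# framework; illustrative only.
VERIFIER_KEY = os.urandom(32)

def verifier_issue_token(request_id: str, over_18: bool) -> dict:
    """Age-verification provider: sees only an opaque request ID, never
    which site the user came from, and attests only 'over 18: yes/no'."""
    payload = f"{request_id}:{over_18}"
    sig = hmac.new(VERIFIER_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return {"request_id": request_id, "over_18": over_18, "sig": sig}

def site_check_token(request_id: str, token: dict) -> bool:
    """Age-restricted site: learns only a signed yes/no answer, never the
    user's identity or documents."""
    payload = f"{token['request_id']}:{token['over_18']}"
    expected = hmac.new(VERIFIER_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return (token["request_id"] == request_id
            and hmac.compare_digest(token["sig"], expected)
            and token["over_18"])

# 1. The site generates an opaque request ID and redirects the user.
request_id = secrets.token_hex(16)
# 2. The verifier checks age out-of-band and issues a token for that ID.
token = verifier_issue_token(request_id, over_18=True)
# 3. The user returns; the site accepts the yes/no answer alone.
assert site_check_token(request_id, token)
```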

Accuracy and effectiveness

2.28
The Committee received limited evidence in relation to the accuracy and effectiveness of online age verification. (Evidence on compliance with age-verification requirements for age-restricted content is discussed in subsequent chapters.)
2.29
However, a consistent theme in evidence was that while age verification was not a ‘silver bullet’, with appropriate standards in place, the technology could provide a barrier to prevent young people—and particularly young children—from inappropriately accessing age-restricted content.
2.30
Mr Corby suggested that online age verification could be applied in a proportionate manner:
For some things you would never want anybody to scrape through whereas for other things you might not be quite so concerned. ...You can scale up or down as appropriate, and that’s important because you want to allow the wheels of commerce to keep turning without too much extra friction, so you only want to ask the questions you really need to.21
2.31
Ms Erratt told the Committee that the BBFC was aware of age-verification solutions that were ‘very accurate’, but that it was a matter of setting robust standards:
For example, we required that the type of data be data that was only known by that person rather than broadly known. Age, name and address, for example, could not be an acceptable dataset to age verify, because that is information that could be reasonably known by another person.22
2.32
As discussed later in this chapter, the Committee heard that age-estimation software could be configured to meet different accuracy thresholds.23 The AVPA provided evidence on the accuracy of age estimation based on photos or videos:
For a jurisdiction with legal age restriction of 18, and a threshold set to 25 years, the latest technology’s current mean error rate is 0.31%. For a threshold of 23 years, the error rate is 0.75%. In other words, accuracy is over 99% if the system is set to allow only customers who its analytics conclude are over 23 based on their image. Even a solution reliant on a passport, identity card or driving licence might only offer a percentage level of accuracy, after accounting for the risks of forgery, impersonation or theft. This is no different from the offline world, where there are no systems which offer 100% verification – even passport checks at borders will miss some fake or recently stolen passports.24
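The buffer approach the AVPA describes can be illustrated with a short sketch: with a legal age of 18 and a threshold set at 25, only users whose estimated age clears the threshold pass automatically, estimated minors are refused, and borderline estimates fall back to a stronger method such as a document check. The function and its decision labels are illustrative, not any vendor's actual logic.

```python
def age_gate(estimated_age: float, legal_age: int = 18, threshold: int = 25) -> str:
    """Decide the outcome of a facial age-estimation check.

    Setting the threshold several years above the legal age absorbs the
    estimator's error margin: a user estimated at 25 or older is very
    unlikely to actually be under 18, so residual errors overwhelmingly
    inconvenience adults (who are asked for documents) rather than
    admitting minors.
    """
    if estimated_age >= threshold:
        return "pass"            # high confidence the user is over the legal age
    if estimated_age < legal_age:
        return "refuse"          # estimated underage: no automatic pass
    return "document_check"      # borderline: fall back to a stronger method

assert age_gate(31.0) == "pass"
assert age_gate(21.0) == "document_check"
assert age_gate(15.0) == "refuse"
```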
2.33
Mr Robin Toombs, Chief Executive Officer of Yoti, submitted that it was ‘quite difficult’ for children nine to 13 years of age to fool Yoti’s age-estimation system.25
2.34
The eSafety Commissioner, Ms Julie Inman-Grant, suggested that further work was required for automated facial recognition to be ‘precisely accurate’:
I think we have a long way to go in terms of [artificial intelligence] and machine learning and also recognising different skin tones and ethnicities. There are a lot of complexities there.26
2.35
Some submitters noted that online age verification may be circumvented using tools such as virtual private networks (VPNs), thereby limiting its effectiveness. For example, Mr Chern Eu Kuan, Student Contributor at the UNSW Law Society, explained:
Several factors suggest that the effectiveness of age verification is limited, the biggest concern being the various techniques that minors can easily employ to circumvent verification. ... In a real-world implementation of age verification, the use of a VPN would allow internet users to virtually relocate to a country without age verification and access a website as easily as they would be able to outside of Australia.27
2.36
Mr Kuan cited a study of 13- to 15-year-olds undertaken by Family Zone:
... they installed internet filters on children’s devices and found that nearly two-thirds of 13- to 15-year-olds were trying to use VPN services to access pornography.28
2.37
However, Ms Erratt said the BBFC’s research indicated that a smaller percentage of 11- to 13-year-olds claimed to know how to circumvent age verification:
We know that there are going to be tech-savvy teenagers who find ways to circumvent the legislation. From our research, though, we’ve found that only 14 per cent of 11- to 13-year-olds claim that they know a workaround like the dark web or like VPN.29
2.38
The Alliance for Gambling Reform suggested that further research was required to understand the extent to which children are circumventing age verification.30
2.39
Ms Erratt suggested that age verification should not be seen as a ‘silver bullet’:
We have always acknowledged that age verification is not a silver bullet, but without a doubt it can prevent young children from stumbling across commercial pornography online. With online regulation, no single solution will be perfect, but that shouldn’t prevent action being undertaken.31
2.40
This sentiment was shared by a number of other stakeholders, who argued that age verification would create a significant barrier and prevent inadvertent access to age-restricted sites.32 This evidence, particularly as it relates to online pornography, is discussed in further detail in the following chapter.
2.41
Speaking more generally, Mr MacGibbon emphasised that approaches to regulation of online activity should not be expected to be perfect:
... when it comes to solutions online, you’ll often hear where they’ll fail. Yet offline we accept the fact that there are edge cases and people who will be unintended victims ...of regulation or of activities. We accept the fact that seatbelts save lives but not every life, but we still mandate the wearing of seatbelts. When it comes to online regulation or online intervention or online behaviours, there is a prevailing philosophy that says that everything needs to be perfect or you should do nothing, and that, if you do try to do something, it will be completely ineffective anyway...I think we see the consequences of that market failure today in a whole range of things whether it is online safety, online security or privacy.33

Impact on business and users

2.42
The Committee heard a range of views on the impact on businesses and users of implementing online age verification for age-restricted content or services. Evidence highlighted the importance of minimising the burden for businesses and giving users a choice of age-verification methods.
2.43
The BBFC submitted that age verification was a ‘simple and affordable option’ for online platforms:
... in order to ensure that these solutions were not prohibitively expensive, age-verification providers have developed products which have significantly reduced the cost of age-verification. In fact, a number of age-verification providers were planning to offer age-verification free to online commercial pornographic services and most have services which are completely free to consumers. The reason that this is possible is because age-verification services online can drive uptake for services offline which can be monetised such as age-verification for restricted products such as alcohol and cigarettes or entry to nightclubs.34
2.44
Similarly, TrustElevate submitted that business could perform age verification for free or at low cost:
The commercial models that underpin an identity ecosystem can be flexible enough to enable businesses that are not generating sufficient revenue, to run checks at a lower cost, or free, which mitigates concerns around stifling innovation or the imposition of overly burdensome costs.35
2.45
However, Eros Association expressed concern about the ‘red-tape burden’ on producers of age-restricted content associated with mandatory age-verification controls.36
2.46
The Committee received further evidence in relation to the regulatory burden of identity verification for online wagering, which is discussed in Chapter 4.
2.47
In relation to users, Dr David Sandifer suggested that online age verification was ‘minimally inconvenient’:
Those adults who wish to access online [age-restricted content] are not prevented from doing so: they merely need to confirm their adult status, as they would when buying alcohol or cigarettes.37
2.48
Several witnesses stressed the importance of making available a range of options for age verification. For example, Ms Erratt told the Committee:
It’s ...important that consumers trust age verification systems. They need to be aware that they have a choice of options available to them and know how to age-verify safely.38
2.49
Similarly, Mrs Liz Walker, Deputy Chair of eChildhood, argued that user choice, as well as assurances about safety, security, and privacy, would ‘lead to less resistance’ to age verification.39
2.50
Mr Strassberg suggested that having a range of options for age verification would dissipate risks to security and empower users. Mr Strassberg also noted that there may be reluctance about using government-issued identity documents.40
2.51
Similarly, Mr Toombs told the Committee that choice was ‘very important’:
I think there will be some people who are comfortable putting a photo ID into an age verification system, whether that be a government checking system or a private age verification provider. I think there will be lots of other people who feel they would rather find another way ... to prove their age...41

Technical standards for age verification

2.52
In evidence to the inquiry, there was general support for a standards-based approach to the implementation of online age verification.
2.53
For example, the eSafety Commissioner submitted that one of the ‘main preconditions’ for implementing a mandatory age verification scheme was ‘the establishment of a trusted age verification framework for implementation, that sets out robust technical standards, requirements and conditions for age verification mechanisms that fully address privacy, data protection, security, safety, usability, and accessibility considerations’.42
2.54
The AVPA argued that there are a range of benefits to establishing a common standard for age verification, including that such a standard would:
provide a basis for a competitive, interoperable marketplace, improving choice and quality for consumers while keeping prices down;
provide a basis for educating the public about age verification and its effectiveness; and
enable a proportionate response depending on the risks associated with each age-restricted product or service, where the industry can provide standardised solutions to meet a given level of assurance and audit.43
2.55
Evidence received by the Committee relating to different approaches to establishing technical standards for online age verification is outlined in this section.

Guidance provided under the UK Digital Economy Act

2.56
The Committee heard about the approach of the BBFC as the designated age-verification regulator for online pornography under the Digital Economy Act in the United Kingdom. The Digital Economy Act is discussed in further detail in the following chapter.
2.57
Under section 14(1) of the Act:
... all providers of online commercial pornographic material accessible from the UK would have been required to carry out age-verification arrangements for UK consumers to ensure that their content is not normally accessible to children.44
2.58
As the age-verification regulator, the BBFC was required to publish ‘guidance about the types of arrangements for making pornographic material available that the regulator will treat as complying with section 14(1).’45
2.59
In its submission to the inquiry, the BBFC set out criteria against which age-verification providers would have been assessed for compliance with the Act, including:
a) an effective control mechanism at the point of registration or access to pornographic content by the end-user which verifies that the user is aged 18 or over at the point of registration or access
b) use of age-verification data that cannot be reasonably known by another person, without theft or fraudulent use of data or identification documents nor readily obtained or predicted by another person
c) a requirement that either a user age-verify each visit or access is restricted by controls, manual or electronic, such as, but not limited to, password or personal identification numbers. A consumer must be logged out by default unless they positively opt-in for their log-in information to be remembered
d) the inclusion of measures which authenticate age-verification data and measures which are effective at preventing use by non-human operators including algorithms.46
2.60
The BBFC described this as a ‘principles-based approach’:
We opted for a principle-based approach rather than specifying a finite number of “approved” solutions, to allow for and encourage technological innovation within the age-verification industry. In the years we worked on the project, we have seen significant advances in this area, notably the development of age estimation technology which had the potential to be both robust and easy to use for consumers.47

Age Verification Certificate

2.61
In addition to providing guidance to age-verification providers, the BBFC established a voluntary, non-statutory certification scheme—the Age Verification Certificate (AVC).48
2.62
The BBFC explained:
The AVC Standard was developed by the BBFC and [cyber security and risk mitigation firm] NCC Group in cooperation with industry, with the support of Government, including the National Cyber Security Centre at GCHQ and Chief Scientific Advisors, and in consultation with the [Information Commissioner’s Office]. Under the AVC, age-verification providers may choose to be independently audited by NCC, who are experts in cyber security and data protection, and then certified by the BBFC.
The third party audit by NCC includes an assessment of an age-verification provider’s compliance with strict privacy and data security requirements. These are tailored specifically to address age-verification for online pornography, for example by ensuring there is no handover of personal data used to verify an individual’s age between AV providers and pornographic websites.49
2.63
Ms Erratt explained that age-verification providers would be able to display a symbol to confirm that they had been certified. Users would be able to click on the symbol, which would take them to the BBFC website where they could view a summary report on the provider:
The purpose of the age verification certificate is really to provide that comfort to consumers and ensure that they’re confident in using age verification. It also gives age verification providers an opportunity to demonstrate that their solutions meet robust data protection standards.50
2.64
Ms Erratt told the Committee that, while the AVC was voluntary, ‘all of the major age verification providers wanted to be certified under that scheme’.51 According to the BBFC website, one provider (Yoti) achieved certification on 1 July 2019 and others are undergoing assessment.52
2.65
eChildhood noted that, under the Digital Economy Act, the BBFC would have been required to review age verification on all pornography websites accessible in the UK to ensure compliance with BBFC guidance, independent of the voluntary certification process.53

PAS 1296 Age Checking code of practice

2.66
As another example of an effort to develop technical standards for age verification, the Committee heard evidence about the Publicly Available Specification (PAS) 1296 Age Checking code of practice. PAS 1296 was published by the British Standards Institution in March 2018.54
2.67
Dr Rachel O’Connell, author of PAS 1296 and also Co-founder of TrustElevate, submitted that the standard was written to assist age-verification providers to comply with legal requirements:
[PAS 1296] provides recommendations on the due diligence businesses can exercise to ensure that age check services deliver the kind of solution that meet a business’s specific regulatory compliance needs.55
2.68
Dr O’Connell told the Committee that the challenge in developing PAS 1296 was how to ‘enable age-related eligibility checks to be conducted in a privacy-preserving manner’.56
2.69
Dr O’Connell explained that PAS 1296 uses a ‘vectors of trust’ approach, comprised of four components:
1. identity proofing (how strongly the set of identity attributes has been verified and vetted);
2. primary credential usage (how strongly the primary credential can be verified);
3. primary credential management (the use and strength of policies, practices, and security controls used in managing the credential); and
4. assertion presentation (how well the given digital identity can be communicated across the network without information leaking to unintended parties, and whether the given digital identity was actually asserted by the given identity provider and not another party posing as such).57
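The four components above can be sketched as a simple data structure: a vector of strength levels, one per component, which a relying service compares against the minimum vector its risk level requires. (A similar "vectors of trust" model is specified in IETF RFC 8485.) The class, field names, and numeric levels below are illustrative, not part of PAS 1296 itself.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TrustVector:
    """Strength level (0 = none, higher = stronger) for each of the four
    components in the 'vectors of trust' approach described by PAS 1296."""
    identity_proofing: int       # how strongly identity attributes were vetted
    credential_usage: int        # how strongly the primary credential is verified
    credential_management: int   # policies and controls managing the credential
    assertion_presentation: int  # how safely the assertion crosses the network

def meets(offered: TrustVector, required: TrustVector) -> bool:
    """An age check is acceptable only if every component meets the
    minimum the relying service requires for its level of risk."""
    return (offered.identity_proofing >= required.identity_proofing
            and offered.credential_usage >= required.credential_usage
            and offered.credential_management >= required.credential_management
            and offered.assertion_presentation >= required.assertion_presentation)

# A high-risk service can demand a stronger vector than a low-risk one,
# supporting the proportionate approach discussed earlier in this chapter.
offered = TrustVector(2, 2, 1, 2)
assert meets(offered, TrustVector(1, 1, 1, 1))
assert not meets(offered, TrustVector(2, 2, 2, 2))
```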

Trusted Digital Identity Framework

2.70
Beyond age verification, the Committee also heard about privacy and security standards associated with the Australian Government’s Digital Identity program, led by the DTA in partnership with other government agencies.58
2.71
The Digital Identity program is intended to give Australian people and businesses a single, secure way to authenticate their identity for the purpose of accessing government services online.59
2.72
The DTA explained that identity providers wishing to participate in the Digital Identity system must meet ‘strict privacy and security requirements’ set out in the DTA’s Trusted Digital Identity Framework (TDIF).60
2.73
Mr Alexander, Chief Digital Officer at the DTA, explained:
[The TDIF] has been developed in close cooperation with industry, government, academia and privacy experts. It sets rules and requirements for accreditation under the digital identity program, while participants in the digital identity program must be compliant with the TDIF, it is also freely available as a framework for all Australian businesses today as a reference for best practice and how to verify identity online.61
2.74
According to the DTA’s website, the TDIF is currently made up of a set of 19 policies, which outline rules and standards for:
how personal information is handled by participating government agencies and organisations;
the usability and accessibility of identity services;
how the identity system is secured and protected against fraud;
how identity services are managed and maintained; and
how [the TDIF] will be managed.62
2.75
eChildhood suggested that the TDIF could be modified to support an online age-verification regime.63 The Committee heard that Australia Post was already providing age-verification services for alcohol purchasing and entry to licensed venues in some jurisdictions with its Digital iD solution, which is accredited under the TDIF.64
2.76
The DTA submitted that Digital Identity could be used to verify identity attributes, including age, for the purpose of accessing age-restricted sites:
Such sites would only receive the information required to confirm the user meets the age requirements of the service. Other information could potentially be provided, but this would be consent based to ensure the [user’s] privacy is protected.65
2.77
However, Mr Alexander told the Committee that while there would be some benefits to using the Digital Identity program for online age verification, the DTA would need legislative authority for the program to connect directly to services in the private sector, and further investment would be required as age verification was not in the original scope of the program.66
2.78
The DTA recommended that if Digital Identity is used for age verification, it should be an optional choice:
We would expect that [Digital Identity] would only be one of a number of potential pathways that individuals may use to undertake age verification.67

European Union General Data Protection Regulation

2.79
Some submitters noted that age-verification providers in the United Kingdom have been required to comply with the European Union General Data Protection Regulation (GDPR).68
2.80
The eSafety Commissioner explained that under the GDPR, when collecting and processing personal information, age-verification providers must comply with a range of data protection and data minimisation requirements, including:
individuals must be told why, when, where and how their personal data is being processed, and by which organisations;
providers must process the minimum personal data necessary to achieve the intended outcome of confirming age; additional personal data should not be collected, irrespective of whether it is subsequently securely deleted;
providers must facilitate individuals’ rights (including the rights of access, erasure and rectification); and
providers must ensure that personal data is not retained for longer than is necessary to achieve the purposes for which it was originally collected (sometimes referred to as individuals’ ‘right to be forgotten’).69

Overview of methods for online age verification

2.81
As noted at the beginning of this chapter, the Committee heard that online age verification had become an area of increased interest and technological development as some jurisdictions had sought to enforce age restrictions for online content.
2.82
The eSafety Commissioner explained that age-verification technology is evolving quickly:
There has been increased investment in the development of online age verification, age-assurance, age checking and e-identification systems over the last few years, and a broad suite of technologies now currently exist.70
2.83
The eSafety Commissioner went on:
... a number of third-party information and analytics companies exist to provide identity and age verification checks on consumers, as well as credit checks and fraud assessments.71
2.84
A number of commercial age- and identity-verification providers gave evidence to the inquiry, in addition to a number of government agencies involved in developing age- and identity-verification platforms.72
2.85
This section summarises the main methods of online age verification discussed in evidence to the inquiry, which involve verification based on:
government-issued identity documents;
consumer information and other databases; and
biometric data.
2.86
Age estimation and age screening are also discussed in this section.
2.87
While these methods are discussed individually, the Committee heard that providers may use a combination of methods depending on the level of assurance required.73 For example, age verification could use a combination of biometric data and a government-issued identity document.
2.88
While not the focus of the inquiry, the Committee also notes evidence received on a number of alternative or complementary technologies, including filtering and ISP [internet-service provider] blocking.74 This evidence is discussed in more detail in the following chapter.

Verification based on identity documents

2.89
The Committee heard that age verification could involve the use of a government-issued identity document, such as a driver licence or passport. As an individual’s date of birth is verified in the initial process of obtaining the identity document, the document then provides a reference against which the individual’s age can be verified at a later time.
2.90
For example, a simple method could involve a user submitting details from an identity document (for example, a driver licence number), which could then be verified with reference to an online government register or other database to confirm that the document (and therefore the user’s date of birth) is valid.75
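As an illustration only, such a check might look like the following sketch, in which a submitted document number and date of birth are matched against a register and only an over-18 result is derived. All names and data here are invented; an actual check (for example, against the DVS discussed below) is an API call to government systems rather than a local lookup.

```python
from datetime import date

# Hypothetical stand-in for a government document register.
MOCK_REGISTER = {
    "DL-1234567": {"name": "Jane Citizen", "date_of_birth": date(1990, 5, 17)},
}

def is_over_18(document_number: str, claimed_dob: date, today: date) -> bool:
    """Match a submitted document number and date of birth against the
    register, then derive an over-18 flag from the verified date of birth."""
    record = MOCK_REGISTER.get(document_number)
    if record is None or record["date_of_birth"] != claimed_dob:
        return False  # document invalid, or details do not match the original record
    dob = record["date_of_birth"]
    age = today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))
    return age >= 18

print(is_over_18("DL-1234567", date(1990, 5, 17), date(2020, 1, 1)))  # True
```

Note that the relying site needs only the boolean result; the date of birth itself never has to leave the verification step.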
2.91
However, the Committee also heard about more sophisticated methods for validating identity documents, such as matching a photo on the document with an image submitted online in real-time by the user, or, in the case of a passport, reading the near-field-communication (or NFC) chip embedded in the document.76

Document Verification Service

2.92
As an example of a possible approach to online age verification based on government-issued identity documents, several submitters referred to the Document Verification Service (DVS), which was established by the Australian Government in partnership with state and territory governments.
2.93
The DVS enables checks of biographic information (including date of birth) against government-issued identity documents, including birth certificates, driver licences, passports, and visas.77
2.94
The Department of Home Affairs, which administers the DVS, confirmed that the service could be used for the purpose of age verification:
The Document Verification Service checks whether the personal information on an identity document matches the original record. Importantly this includes verification of the date of birth on Australian passports, driver licences and birth certificates. The Document Verification Service conducted about 48 million transactions in 2018-19 and has been available to government agencies for over 10 years, and to the private sector since 2014.78
2.95
Private-sector organisations can only use the DVS to check a person’s identity with their consent, and only where this is permitted by the Privacy Act 1988.79 The DVS does not check facial images.80 However, a service for this purpose is in development (see discussion later in this chapter).
2.96
The Department submitted that the DVS ‘makes it harder for people to use fake identity documents, which could otherwise be used to circumvent age verification processes’.81
2.97
Responsible Wagering Australia submitted that there is a ‘high level of confidence’ in the integrity of the DVS, but also noted concerns about its usability by the private sector and some technical limitations:
Some technical limitations of the DVS we have identified include a lack of consistency between data fields of the databases, queries being restricted to name and document number as opposed to a name and address, and affirmative matches requiring an absolute match (meaning that common typographical issues increase failure rates).82

Retail card

2.98
The Committee heard that online age verification could involve the use of a retail card, which would be obtained in a face-to-face transaction where the user’s age would be verified by sighting a government-issued document such as a driver licence.83
2.99
Ms Erratt, Head of Age Verification at the BBFC, explained that a user would obtain the retail card from a shop in the same way as other age-restricted goods, such as cigarettes or alcohol. The card would include an anonymous code that could be submitted online to complete the age-verification process.84
2.100
Ms Erratt explained that age-verification providers had developed a range of safeguards to mitigate the misuse of a retail card by a person under the age of 18 years, such as requiring the code associated with the retail card to be used within a certain period of time.85
2.101
Ms Erratt suggested this was an example of an age-verification method that did not involve the retention of any personal information.86

Verification based on consumer information or other databases

2.102
The Committee heard that online age verification could involve the use of consumer information or other databases that incorporate age-related information or are restricted to individuals aged 18 years or above.
2.103
For example, Equifax submitted that age verification could involve confirmation that a user is listed on the Commonwealth electoral roll or has credit reporting information retained on Equifax’s consumer credit bureau, either of which indicates that the user is aged 18 years or above.87
2.104
Equifax submitted that 96 per cent of eligible Australians are listed on the Commonwealth electoral roll and approximately 18 million of an estimated 18.9 million Australian adults are listed on the consumer credit bureau.88
2.105
According to Equifax, currently both the Commonwealth electoral roll and credit reporting information can be used for anti-money laundering and counter-terrorism financing purposes, but not for other identity- or age-verification purposes.89
2.106
Dr O’Connell gave a similar example of online age verification involving a user submitting their mobile phone number. As mobile phone contracts are restricted to individuals aged 18 years or above, age verification would then involve determining that a contract is associated with the phone number.90

Verification based on biometric data

2.107
Another possible method of online age verification involves the use of biometric data, such as a facial image of an individual user.
2.108
As noted above, this method could involve a user submitting a live photo from the camera on their phone, which is validated against a government-issued identity document, either automatically using facial-recognition technology or manually by a trained operator.91
2.109
The Committee heard that a so-called ‘liveness test’ could be used to ensure that the user is a real person presenting genuine biometric data.92 For example, Mrs Julie Dawson, representing Yoti, an age-verification provider, explained:
We want to prove that it’s actually you, so we’re going to do what we called a ‘liveness test’, which is a quick 3D scan of your face where we’re checking that you’re a real human being, not a robot, a 3D image or a hologram of some sort. Sometimes we might get you to read a couple of words off the screen, like ‘dog’, ‘cat’ or ‘mother’. That helps us detect that that is a real human.93
2.110
The eSafety Commissioner provided two other examples of this approach:
For example, individuals must blink when taking a selfie to prove they are live and not merely a static photo. To combat pre-recorded voices, the system prompts individuals to repeat randomly generated phrases or a sequence of numbers to prove that they are human and not a recording.94

Face Verification Service

2.111
The Department of Home Affairs submitted that it was developing a Face Verification Service (FVS), which it proposed could assist in age verification.95
2.112
The FVS is intended to enable checks of facial images against government-issued identity documents.96
2.113
The Department explained that the FVS would complement the DVS (see discussion earlier in this chapter):
... The Face Verification Service complements the Document Verification Service by preventing the use of stolen as well as fake identity information. This could assist in age verification, for example by preventing a minor from using their parent’s driver licence to circumvent age verification controls.97
2.114
However, the Department also explained that the service was not yet fully operational:
Whilst it is intended to be made available to private sector organisations in future, this will be subject to the passage of the Identity-matching Services Bill 2019 which is currently before Parliament. The use of driver licence images through the Face Verification Service is also subject to the agreement of the states and territories.98
2.115
The Identity-matching Services Bill 2019 was introduced into Parliament on 31 July 2019. The bill was subsequently referred to the Parliamentary Joint Committee on Intelligence and Security for review. In its report on the bill, the Committee recommended that the bill be re-drafted taking into account principles relating to privacy, transparency, oversight, and user obligations.99

Age estimation

2.116
The Committee heard about the recent development of more sophisticated approaches that involved estimating or predicting a user’s age without reference to government-issued identity documents or other databases.
2.117
The eSafety Commissioner submitted that there is growing research into the use of age prediction, particularly in jurisdictions where identity documentation is rare or non-existent.100
2.118
Yoti referred the Committee to its Age Scan product, which involves a user submitting a facial image from the camera on their phone:
Yoti Age Scan is a secure age-checking service that can estimate a person’s age by looking at their face. ... The user does not have to register to use the service, and does not have to provide any information about themselves. Therefore, no identity document need be presented. The user simply presents their face in front of the camera.
... Yoti Age Scan works quickly, returning an age estimate in around 1 to 1½ seconds.101
2.119
Yoti submitted that the software is configurable to meet whatever accuracy threshold is required by the business or regulator:
With a threshold set at 25, the average false positive rate across 14 to 17 year olds is 0.31%. This figure continues to decline as our training data increases in size.102
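The threshold mechanism described in this evidence can be sketched as follows. The fallback step is an assumption added for illustration (in practice a user estimated below the threshold might be offered a document-based check instead); the buffer between the restriction age (18) and the threshold (here 25) is what absorbs estimation error.

```python
def age_check(estimated_age: float, threshold: float = 25.0) -> str:
    """Apply an estimation threshold: only estimates at or above the
    threshold pass automatically; anyone estimated below it is routed to
    a fallback verification method rather than refused outright."""
    if estimated_age >= threshold:
        return "pass"
    return "fallback_verification"

print(age_check(31.2))  # 'pass'
print(age_check(22.0))  # 'fallback_verification' (plausibly adult, but inside the buffer)
```

Raising the threshold lowers the false-positive rate for minors at the cost of sending more adults to the fallback method, which is why the evidence describes it as configurable by the business or regulator.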
2.120
Yoti submitted that the product had been used in a variety of settings, including live-streaming services, retail self-checkouts, and dating and pornography platforms, and that approximately 200 million checks had been performed in the nine months that the product had been available.103
2.121
Mr Corby, Executive Director of the AVPA, told the Committee that age-estimation software could also use social and behavioural information from a social media account.104
2.122
Similarly, the eSafety Commissioner explained that advances in technology were enabling platforms and services to identify users from behavioural and online signals:
How an individual interacts and engages online leaves traces that can be utilised to identify whether they are an adult or a child. For example, a handle or username, image tags, hashtag usage, gesture patterns, web history, content interaction, IP address, location data, device serial number, contacts – all can be used to measure what age-bracket that you might fall under.
These signals are sometimes used by social media platforms, alongside third-party verification systems, to flag users who might be underage on their site. There are a few examples of technology that utilise these signals for automatic age-gating purposes.105

Age screening

2.123
As noted above, the Committee heard evidence about earlier, less sophisticated methods such as age screening and age gating, which may involve a user self-declaring their age, often at the point of access or registration.106
2.124
The eSafety Commissioner explained that age gating allowed online services and providers to restrict access to content to people over a particular age:
Age gating can simply restrict access to the content, providing users with an error message or re-directing them to more age-appropriate content. Alternatively, content can be locked for access and only released once a PIN code or other type of age verification process has taken place.107
2.125
However, the eSafety Commissioner also noted that basic forms of age gating, which trust the user to declare their correct age, are easily circumvented.108
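The two forms of age gating described by the eSafety Commissioner can be sketched as follows; the function and return values are illustrative only. The sketch also makes the circumvention point concrete: a gate keyed to a self-declared date of birth has no stronger input than whatever the user types.

```python
from typing import Optional

def gate_content(age_verified: bool, pin_supplied: Optional[str], unlock_pin: str) -> str:
    """Age gate: redirect by default; unlock content only once the user has
    completed an age-verification process or supplied a valid PIN code."""
    if age_verified:
        return "content_unlocked"
    if pin_supplied is not None and pin_supplied == unlock_pin:
        return "content_unlocked"
    # Basic gating: show an error message or redirect to age-appropriate content.
    return "redirect_to_age_appropriate_content"

print(gate_content(False, None, "4821"))  # 'redirect_to_age_appropriate_content'
print(gate_content(True, None, "4821"))  # 'content_unlocked'
```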

Online and physical-world rights and obligations

2.126
Lastly, in addition to the principles relating to age verification discussed earlier in this chapter, the argument was put to the Committee that there should be stronger alignment between individuals’ rights and obligations in the physical world and corresponding rights and obligations in the online world.
2.127
Mr MacGibbon explained:
One of the observations I would make is that we have somehow separated our online lives and our expectation for what occurs online from our offline lives. Yet now it would be trite to say that the connectivity between our online and offline activities is such that they’re one single entity, and we should aspire as a nation to create the concept of an online civil society, one which reflects our offline society. To do that we do need to take action. We do need to consider regulations, and we do need to consider what roles and responsibilities governments have, as well as the companies that provide services online, and the role of individuals in that online civil society.
... I believe there are ways we can improve the online environment. I am not going to say they are perfect but I certainly would say that we owe it to the people of Australia to reflect Australian values online, just as every nation should reflect its values in the online space of its community.109
2.128
Mr Corby made a similar point, suggesting that a good starting point would be to consider ‘[w]hat happens in the real world, and are we trying to at least do as well as that?’110
2.129
Mr MacGibbon went on to relate this idea to the issue of age verification for age-restricted products and services:
If I was to walk into a restricted premises today where I could gain access to legal yet restricted pornography, no-one would question who I was and my identity would not be known.
... Any system that we create online needs to somehow respect people’s privacy for lawful access to materials. If society says it is lawful offline, it should be lawful online. And any system that you create which has to involve some form of technology should reflect as closely as possible our offline lives.111
2.130
Mr MacGibbon argued that, at present, online privacy, safety, and security are ‘much less assured’ than in the physical world:
... we think we’re anonymous and ... we think that what we do online doesn’t matter. ... while there might be technical and legal solutions to this, there is a broader social question about how society views its online activities, and we’re very immature when it comes to that.112
2.131
In relation to children and young people in particular, both Mr MacGibbon and Ms Inman-Grant emphasised that parents need to be engaged in their children’s online lives in the same way they are in their everyday lives.113

Committee comment

2.132
Australians are increasingly accessing a wide range of products and services online.
2.133
The Committee accepts the proposition that what is legal in the physical world should be legal in the online world. In the same way, the Committee is concerned to see that age restrictions that apply in the physical world are also applied online.
2.134
In face-to-face commerce, children and young people are restricted from accessing a range of adult products and services at the point of sale, including alcohol, tobacco, and mobile phone services. Potential consumers are required to show appropriate personal identification before accessing these products and services.
2.135
The Committee considers that the same principle should apply to accessing online pornography and online wagering. This is discussed in further detail in the following chapters. While outside the scope of this inquiry, it was also put to the Committee that online sales of alcohol should similarly be restricted to individuals whose age has been verified using an effective method. As a matter of principle, the Committee accepts this proposition.
2.136
Evidence to the inquiry suggests that methods for online age verification have advanced significantly in recent years. The Committee is confident that suitable technology exists to support a regime of mandatory age verification for age-restricted products and services.
2.137
However, it is the Committee’s view that a prerequisite for the implementation of any system of mandatory age-verification is the establishment of robust technical standards, particularly in relation to the privacy of users’ personal information.
2.138
As such, the Committee recommends that the Australian Government develop appropriate technical standards to underpin online age verification. The Committee considers that it is important for this work to begin now in order to support future policy decisions in relation to age verification. This work would also underpin the Committee’s recommendations in relation to online pornography and online wagering later in this report.
2.139
The Committee recognises the benefit in users having a range of options available to verify their age. It is apparent that the various methods on offer have different strengths and weaknesses in relation to the key attributes identified above for an effective age-verification model. Any single method may involve a trade-off, for example, between its accuracy and effectiveness, and protecting the privacy of users’ information. Some may offer high levels of accuracy and protection but at greater cost to the business and/or individual user.
2.140
As such, the Committee’s expectation is that these standards would not seek to prescribe particular age-verification methods.
2.141
Instead, the standards would establish a baseline that all age-verification methods offered to customers in Australia would be required to meet. For example, with respect to privacy and security, based on evidence to the inquiry, the Committee believes that at a minimum all age-verification methods should be required to:
preserve a user’s privacy such that no personal information other than the user’s age-related eligibility is shared between the age-verification provider and the age-restricted site; and
minimise the retention of, and where possible eliminate the storage of, personal information, so as not to create a ‘honeypot’ of sensitive data; where some data must necessarily be stored, it must be stored securely.
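As an illustration only of how the first of these requirements could be met, an age-verification provider might issue a signed token asserting nothing but age eligibility, so the age-restricted site never sees the user’s identity details. The sketch below uses a shared-key HMAC purely for brevity; a real system would use asymmetric signatures and a vetted protocol, and all names here are assumptions.

```python
import hashlib
import hmac
import json

# Illustrative key material only; a real exchange would use asymmetric signing
# so that sites can verify tokens without being able to mint them.
PROVIDER_KEY = b"illustrative-key-material"

def issue_token(over_18: bool) -> str:
    """Provider side: after verifying the user's age by any method, emit a
    signed claim containing ONLY the age-eligibility result."""
    claim = json.dumps({"over_18": over_18})  # no name, no DOB, no identifiers
    sig = hmac.new(PROVIDER_KEY, claim.encode(), hashlib.sha256).hexdigest()
    return f"{claim}|{sig}"

def site_accepts(token: str) -> bool:
    """Site side: check the signature and the claim; learn nothing else."""
    claim, sig = token.rsplit("|", 1)
    expected = hmac.new(PROVIDER_KEY, claim.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected) and json.loads(claim)["over_18"]

print(site_accepts(issue_token(True)))   # True
print(site_accepts(issue_token(False)))  # False
```

Because the token carries a single boolean, the provider can also discard the underlying verification data immediately, addressing the second requirement.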
2.142
The Committee’s aim is to ensure that, whichever method a customer chooses to use to verify their age, they can be assured that the process will be easy, safe, and secure.

Recommendation 1

2.143
The Committee recommends that the Digital Transformation Agency, in consultation with the Australian Cyber Security Centre, develop standards for online age verification for age-restricted products and services.
a. These standards should specify minimum requirements for privacy, safety, security, data handling, usability, accessibility, and auditing of age-verification providers.
b. Consideration should be given to the existing technical standards in Australia and overseas, including but not limited to the UK Age Verification Certificate, the PAS 1296 Age Checking code of practice, the Trusted Digital Identity Framework, and the European Union General Data Protection Regulation.
c. Opportunities should also be provided for consultation with industry, including private age-verification providers, and members of the public.
2.144
In developing these standards, the Committee also encourages the Digital Transformation Agency to consult with ID Care, an Australian not-for-profit organisation which provides specialist advice and support to individuals and organisations in relation to online identity and cyber security. The Committee expects that consulting with ID Care and similar organisations will help to ensure that any standards adequately address security and privacy concerns and are fit-for-purpose in the Australian context.
2.145
As noted above, the Committee envisages that these standards would be mandated as part of any policy decision in relation to mandatory age-verification for particular products or services. These standards may also assist in educating and providing assurance to the public in the implementation of any such regime.
2.146
At the same time, the Committee recommends that the Australian Government develop an age-verification exchange for the purpose of third-party online age verification. Similar to the existing identity exchange (a component of the government’s Digital Identity program), the age-verification exchange would enable users wishing to access an age-restricted site to choose from a range of age-verification providers.
2.147
To avoid duplication, the Committee recommends the age-verification exchange be developed as an extension of the Digital Identity program.

Recommendation 2

2.148
The Committee recommends that the Digital Transformation Agency extend the Digital Identity program to include an age-verification exchange for the purpose of third-party online age verification.
2.149
The Committee anticipates that such a system would support the development of a competitive ecosystem for third-party age verification in Australia, including public and private sector age-verification providers. This would ensure that users have a choice about how they wish to verify their age, and would also protect users’ privacy by ensuring that age verification is carried out by a third party.
2.150
Age-verification providers wishing to participate in the system should be required to meet appropriate technical standards (see Recommendation 1).
2.151
The Committee also expects that the system would enable free or low-cost age verification by leveraging existing government verification services. This would assist in minimising compliance costs for small businesses required to implement mandatory age verification.
2.152
The Committee also encourages the Australian Government to consider measures to educate users—particularly children and young people and their parents and guardians—about online privacy. In this regard, the Committee acknowledges the ongoing work of the Office of the eSafety Commissioner and congratulates both Ms Inman-Grant and her staff for their important work in this area.

  • 1
    Age Verification Providers Association, Submission 200, p. 3.
  • 2
    Age Verification Providers Association, Submission 200, p. 3.
  • 3
eSafety Commissioner, Submission 191, p. 8.
  • 4
    eSafety Commissioner, Submission 191, p. 14.
  • 5
    Equifax, Submission 189, pp. 2-3.
  • 6
    UNSW Law Society, Submission 58, p. 7.
  • 7
    eSafety Commissioner, Submission 191, p. 11.
  • 8
    Age Verification Providers Association, Submission 200, pp. 4-5.
  • 9
    Mr Alastair MacGibbon, Private capacity, Committee Hansard, Canberra, 6 December 2019, p. 24.
  • 10
    Mr Alastair MacGibbon, Private capacity, Committee Hansard, Canberra, 6 December 2019, p. 26.
  • 11
    UNSW Law Society, Submission 58, p. 8.
  • 12
    Dr Julia Fossi, Expert Advisor, Office of the eSafety Commissioner, Committee Hansard, Canberra, 6 December 2019, p. 16. See also: eSafety Commissioner, Submission 191, p. 5.
  • 13
    eSafety Commissioner, Submission 191, p. 5.
  • 14
    See, for example: eChildhood, Submission 192, pp. 24-25; Equifax, Submission 189, p. 1.
  • 15
    Mr Iain Corby, Executive Director, Age Verification Providers Association, Committee Hansard, Canberra, 5 December 2019, p. 7.
  • 16
    TrustElevate, Submission 190, p. 1.
  • 17
    Ms Amelia Erratt, Head, Age Verification, British Board of Film Classification, Committee Hansard, Canberra, 5 December 2019, p. 5.
  • 18
    Mr Peter Alexander, Chief Digital Officer, Digital Transformation Agency, Committee Hansard, Canberra, 6 December 2019, p. 42.
  • 19
    Mr Matt Strassberg, General Manager, External Relations Australia and New Zealand, Equifax, Committee Hansard, Canberra, 6 December 2019, pp. 39-40.
  • 20
Mr Peter Alexander, Chief Digital Officer, Digital Transformation Agency, Committee Hansard, Canberra, 6 December 2019, p. 42; Mr Jonathon Thorpe, Head of Identity, Digital Delivery Division, Digital Transformation Agency, Committee Hansard, Canberra, 6 December 2019, p. 42.
  • 21
    Mr Iain Corby, Executive Director, Age Verification Providers Association, Committee Hansard, Canberra, 5 December 2019, p. 7.
  • 22
    Ms Amelia Erratt, Head, Age Verification, British Board of Film Classification, Committee Hansard, Canberra, 5 December 2019, p. 5.
  • 23
    Yoti, Submission 172, p. 11.
  • 24
    Age Verification Providers Association, Submission 200, p. 4.
  • 25
Mr Robin Tombs, Chief Executive Officer, Yoti, Committee Hansard, Canberra, 6 December 2019, p. 30.
  • 26
    Ms Julie Inman-Grant, eSafety Commissioner, Office of the eSafety Commissioner, Committee Hansard, Canberra, 6 December 2019, p. 18. See also: Yoti, Submission 172, p. 11.
  • 27
    Mr Chern Eu Kuan, Student Contributor, UNSW Law Society, Committee Hansard, Canberra, 6 December 2019, p. 47.
  • 28
    Mr Chern Eu Kuan, Student Contributor, UNSW Law Society, Committee Hansard, Canberra, 6 December 2019, p. 47. See also: UNSW Law Society, Submission 58, p. 6; Family Zone, Submission 202, p. 2.
  • 29
    Ms Amelia Erratt, Head, Age Verification, British Board of Film Classification, Committee Hansard, Canberra, 5 December 2019, p. 4.
  • 30
    Alliance for Gambling Reform, Submission 179, p. 2.
  • 31
    Ms Amelia Erratt, Head, Age Verification, British Board of Film Classification, Committee Hansard, Canberra, 5 December 2019, pp. 2, 4.
  • 32
    For example, see: Collective Shout, Submission 178, pp. 10-11; British Board of Film Classification, Submission 187, p. 1; eChildhood, Submission 192, p. 22; Age Verification Providers Association, Submission 200, p. 5.
  • 33
    Mr Alastair MacGibbon, Private capacity, Committee Hansard, Canberra, 6 December 2019, pp. 23-24.
  • 34
    British Board of Film Classification, Submission 187, p. 12.
  • 35
    TrustElevate, Submission 190, p. 3.
  • 36
    Eros Association, Submission 65, p. 3.
  • 37
    Dr David Sandifer, Submission 171, p. 3.
  • 38
    Ms Amelia Erratt, Head, Age Verification, British Board of Film Classification, Committee Hansard, Canberra, 5 December 2019, p. 1.
  • 39
    Mrs Liz Walker, Deputy Chair, eChildhood, Committee Hansard, Canberra, 6 December 2019, p. 56.
  • 40
    Mr Matt Strassberg, General Manager, External Relations Australia and New Zealand, Equifax, Committee Hansard, Canberra, 6 December 2019, p. 39.
  • 41
Mr Robin Tombs, Chief Executive Officer, Yoti, Committee Hansard, Canberra, 6 December 2019, pp. 28-29.
  • 42
    eSafety Commissioner, Submission 191, p. 5.
  • 43
    Age Verification Providers Association, Submission 200, pp. 3-4.
  • 44
    British Board of Film Classification, Submission 187, pp. 6-7.
  • 45
    British Board of Film Classification, Submission 187, pp. 6-7.
  • 46
    British Board of Film Classification, Submission 187, p. 7.
  • 47
    British Board of Film Classification, Submission 187, p. 7. See also: Ms Amelia Erratt, Head, Age Verification, British Board of Film Classification, Committee Hansard, Canberra, 5 December 2019, p. 3.
  • 48
    British Board of Film Classification, Submission 187, p. 11.
  • 49
    British Board of Film Classification, Submission 187, p. 11.
  • 50
    Ms Amelia Erratt, Head, Age Verification, British Board of Film Classification, Committee Hansard, Canberra, 5 December 2019, pp. 1, 3, 6.
  • 51
    Ms Amelia Erratt, Head, Age Verification, British Board of Film Classification, Committee Hansard, Canberra, 5 December 2019, p. 6.
  • 52
    British Board of Film Classification, ‘Age-verification Certificate’, <https://bbfc.co.uk/about-classification/age-verification-certificate>.
  • 53
    eChildhood, Submission 192, pp. 19-20.
  • 54
    TrustElevate, Submission 190, p. 1.
  • 55
    TrustElevate, Submission 190, p. 1.
  • 56
    Dr Rachel O’Connell, Co-founder, TrustElevate, Committee Hansard, Canberra, 5 December 2019, p. 12.
  • 57
    TrustElevate, Submission 190, pp. 2-3. See also: Dr Rachel O’Connell, Co-founder, TrustElevate, Committee Hansard, Canberra, 5 December 2019, pp. 12-13.
  • 58
    Digital Transformation Agency, Submission 188, pp. 1-2.
  • 59
    Digital Transformation Agency, ‘Digital Identity’, <https://www.dta.gov.au/our-projects/digital-identity>.
  • 60
    Digital Transformation Agency, Submission 188, p. 2.
  • 61
    Mr Peter Alexander, Chief Digital Officer, Digital Transformation Agency, Committee Hansard, Canberra, 6 December 2019, p. 41.
  • 62
    Digital Transformation Agency, ‘Trusted Digital Identity Framework’, <Trusted Digital Identity Framework>.
  • 63
    eChildhood, Submission 192, pp. 22-27.
  • 64
    Australia Post, Submission 199, pp. 2-4.
  • 65
    Digital Transformation Agency, Submission 188, p. 2.
  • 66
    Mr Peter Alexander, Chief Digital Officer, Digital Transformation Agency, Committee Hansard, Canberra, 6 December 2019, pp. 42-43.
  • 67
    Digital Transformation Agency, Submission 188, p. 2.
  • 68
    British Board of Film Classification, Submission 187, p. 10; eSafety Commissioner, Submission 191, pp. 10, 12-13; Age Verification Providers Association, Submission 200, p. 4.
  • 69
    eSafety Commissioner, Submission 191, pp. 12-13.
  • 70
    eSafety Commissioner, Submission 191, p. 4.
  • 71
    eSafety Commissioner, Submission 191, p. 6.
  • 72
    For example, see: AVSecure LLC, Submission 74; Department of Home Affairs, Submission 146; Yoti, Submission 172; Digital Transformation Agency, Submission 188; Equifax, Submission 189; TrustElevate, Submission 190; Australia Post, Submission 199; Age Verification Providers Association, Submission 200.
  • 73
    eSafety Commissioner, Submission 191, p. 7.
  • 74
    For example, see: eSafety Commissioner, Submission 191, pp. 15-17; eChildhood, Submission 192, pp. 29-32; Family Zone, Submission 202, pp. 2-3.
  • 75
    For example, see: Department of Home Affairs, Submission 146, p. 2.
  • 76
    For example, see: Yoti, Submission 172, p. 10; Australia Post, Submission 199, p. 3.
  • 77
    Australian Government, ‘Identity Matching Services - what are they?’, <https://beta.idmatch.gov.au/our-services>.
  • 78
    Department of Home Affairs, Submission 146, p. 2.
  • 79
    Department of Home Affairs, Submission 146, p. 2.
  • 80
    Australian Government, ‘Identity Matching Services - what are they?’, <https://beta.idmatch.gov.au/our-services>.
  • 81
    Department of Home Affairs, Submission 146, p. 2.
  • 82
    Responsible Wagering Australia, Submission 174, p. 4.
  • 83
    For example, see: AVSecure LLC, Submission 74, p. 2; British Board of Film Classification, Submission 187, p. 10.
  • 84
    Ms Amelia Erratt, Head, Age Verification, British Board of Film Classification, Committee Hansard, Canberra, 5 December 2019, p. 5.
  • 85
    Ms Amelia Erratt, Head, Age Verification, British Board of Film Classification, Committee Hansard, Canberra, 5 December 2019, p. 5.
  • 86
    Ms Amelia Erratt, Head, Age Verification, British Board of Film Classification, Committee Hansard, Canberra, 5 December 2019, p. 5.
  • 87
    Equifax, Submission 189, p. 2.
  • 88
    Equifax, Submission 189, pp. 1-2; Mr Matt Strassberg, General Manager, External Relations Australia and New Zealand, Equifax, Committee Hansard, Canberra, 6 December 2019, pp. 37-38.
  • 89
    Equifax, Submission 189, p. 2.
  • 90
    Dr Rachel O’Connell, Co-founder, TrustElevate, Committee Hansard, Canberra, 5 December 2019, p. 13.
  • 91
    Yoti, Submission 172, pp. 10-11; Department of Home Affairs, Submission 146, p. 2.
  • 92
    Yoti, Submission 172, p. 3; Australia Post, Submission 199, p. 3.
  • 93
    Mrs Julie Dawson, Director, Regulatory and Policy, Yoti, Committee Hansard, Canberra, 6 December 2019, p. 31.
  • 94
    eSafety Commissioner, Submission 191, p. 9.
  • 95
    Department of Home Affairs, Submission 146, p. 2.
  • 96
    Australian Government, ‘Identity Matching Services - what are they?’, <https://beta.idmatch.gov.au/our-services>.
  • 97
    Department of Home Affairs, Submission 146, p. 2.
  • 98
    Department of Home Affairs, Submission 146, p. 2.
  • 99
    Parliamentary Joint Committee on Intelligence and Security, Advisory report on the Identity-matching Services Bill 2019 and the Australian Passports Amendment (Identity-matching Services) Bill 2019, October 2019.
  • 100
    eSafety Commissioner, Submission 191, p. 7.
  • 101
    Yoti, Submission 172, pp. 10-11.
  • 102
    Yoti, Submission 172, p. 11.
  • 103
    Yoti, Submission 172, p. 11; Mr Robin Tombs, Chief Executive Officer, Yoti, Committee Hansard, Canberra, 6 December 2019, p. 28.
  • 104
    Mr Iain Corby, Executive Director, Age Verification Providers Association, Committee Hansard, Canberra, 5 December 2019, p. 10.
  • 105
    eSafety Commissioner, Submission 191, p. 8.
  • 106
    eSafety Commissioner, Submission 191, p. 6; Age Verification Providers Association, Submission 200, p. 3.
  • 107
    eSafety Commissioner, Submission 191, p. 6.
  • 108
    eSafety Commissioner, Submission 191, p. 6.
  • 109
    Mr Alastair MacGibbon, Private capacity, Committee Hansard, Canberra, 6 December 2019, p. 23.
  • 110
    Mr Iain Corby, Executive Director, Age Verification Providers Association, Committee Hansard, Canberra, 5 December 2019, p. 8.
  • 111
    Mr Alastair MacGibbon, Private capacity, Committee Hansard, Canberra, 6 December 2019, p. 23.
  • 112
    Mr Alastair MacGibbon, Private capacity, Committee Hansard, Canberra, 6 December 2019, pp. 24, 26.
  • 113
Mr Alastair MacGibbon, Private capacity, Committee Hansard, Canberra, 6 December 2019, pp. 25-26; Ms Julie Inman-Grant, eSafety Commissioner, Office of the eSafety Commissioner, Committee Hansard, Canberra, 6 December 2019, p. 11.
