Chapter 5 - Impacts on teachers and education system


Upskilling teachers, students, and communities

Role of educators

5.1 Stakeholders agree that the role of educators will inevitably change with the integration of generative artificial intelligence (GenAI) in education. Educators will predominantly become facilitators of learning.[1] AI is not expected to replace educators; rather, the norm would involve a hybrid approach between human educators and GenAI as a collaborator.[2] Educators would remain imperative to students’ acquisition of knowledge, skills, and experience,[3] and GenAI tools could help to build teachers' capabilities.[4] The Tertiary Education Quality and Standards Agency (TEQSA) highlighted that:

It is critical that the policy objective is to use AI to support educators to be more effective, rather than aiming for efficiency gains that could lead to fewer educators.[5]

5.2 TEQSA further commented that:

While generative AI tools have the potential to revolutionize teaching and assessment practices, careful consideration of the purpose of education, the role of educators, and the evolving landscape of knowledge will be critical in harnessing the benefits of AI in a way that is inclusive and mitigates potential negative consequences. To effectively navigate an AI-dependent environment, ongoing professional development will be essential for teachers, school support staff, administrative personnel, and policymakers.[6]

5.3 Educators will need to provide what AI cannot,[7] such as human connection, interaction, and role modelling,[8] as well as promoting ‘uniquely human skills’.[9] Educators will need to continue to foster within themselves and their students key attributes, skills, and human values, and ensure they are not lost.[10] This is critical for student development, wellbeing, and learning outcomes.[11] Those key attributes include ‘cultural sensitivity, resilience, relationships, curiosity, critical thinking, teamwork, innovation, ethics, civic engagement, and leadership’.[12] Other stakeholders have also highlighted creativity, healthy scepticism, problem-solving, and human agency as important attributes.[13]

5.4 It is essential that educators teach students to think critically to assess AI-generated outputs, especially given the risks of algorithmic bias, inaccuracy, and misinformation and disinformation. The Centre for Digital Wellbeing (CDW) commented on the need to prioritise this skill, as 'the integration of generative AI tools carries the risk of children, students, and teachers becoming passive consumers rather than active thinkers’.[14]

5.5 There may be a decrease in content transmission as a teaching practice, and a greater focus on skills development. Educators may no longer be required to equip their students with certain knowledge and could instead focus on what students need to know to use higher order thinking and to operate safely as citizens and professionals.[15] As TEQSA highlighted:

The rapid advancement of large language models is forcing educators to think carefully about what knowledge still needs to be taught when so much information can be so readily synthesised by AI.[16]

5.6 Several stakeholders agreed that GenAI will have a substantial impact on teaching practices and learning across all education sectors.[17] GenAI pedagogical practices must be informed by evidence-based research, underpinned by educational principles, and involve professional judgement.[18] GenAI pedagogical practices should also involve listening to educators about their use of GenAI in teaching, evolving best practice, and sharing that information with the profession. The Queensland University of Technology argued for ‘…leadership and regulations around the use of generative AI that give educators rules and guardrails within which to innovate and develop practice’.[19]

5.7 There are differing views about whether there will be a place for traditional components of education. The Australasian Academic Integrity Network asserted that there will be ‘a shift away from traditional educational learning [and] teaching’,[20] while Mr Anthony England, Director of Innovative Learning Technologies at Pymble Ladies’ College (PLC) said ‘balancing the use of AI with traditional learning methods is critical to develop a holistic set of skills in students’.[21] Professor Leslie Loble AM, Industry Professor at the University of Technology Sydney (UTS), argued that ‘the core fundamentals of good education’ will remain crucial, including literacy and numeracy, as well as critical thinking, computational thinking, and ethical reasoning.[22]

Educating students in AI

5.8 There was broad agreement among stakeholders that educators would need to impart to students the skills and qualities required to use GenAI tools safely, responsibly, and ethically. This will also be necessary to help students navigate life generally as the technology becomes embedded into society. Monash University (MU) said that educators will need to determine ‘how to teach with and teach about GenAI’.[23]

5.9 MU further stated that:

By engaging strategically with AI tools, students and educators can develop their capacity for safe, responsible, and effective use of non-human tools, deepen critical thinking skills, build an understanding of the uses of big datasets, and understand the consequences of misuse.[24]

5.10 The Australian Academy of Science also emphasised the importance of artificial intelligence (AI) literacy:

By demystifying and scaffolding the use of AI in our teaching, we will equip students with the skills and knowledge needed… Without appropriate education, Australia risks falling behind not only on technological advancements but also in identifying and dealing with their misuse. As such, collaboration between academia, industry and government is crucial to create a well-rounded environment for the development and implementation of AI in Australia.[25]

5.11 The University of Melbourne (UoM) highlighted that the Council of Europe considers that AI literacy comprises technological and human dimensions.[26] Similarly, MU said AI training should encompass:

● technological literacy, or understanding how machines work and how to work with them

● data literacy, which is the fluency to interpret and utilise the information on which technology operates, and which is generated by it

● human literacy, which cultivates human traits such as entrepreneurship, ethics, care, leadership, and understanding of intercultural contexts.[27]

5.12 Regarding technological and data literacies, people will need to learn how to use GenAI tools appropriately and confidently, the technology’s benefits, and risks such as biased, inaccurate, or outdated results.[28] Similarly, in the context of the Australian Curriculum, the Australian Curriculum, Assessment and Reporting Authority (ACARA) said:

For students to understand what AI is and how it works, they need to be taught about the concepts of chance, data and algorithms, to explore the risks and challenges of AI, its diverse applications and how to leverage it for positive impact as either users of AI or designers of digital solutions.[29]

5.13 As described in Chapter 2, there are various initiatives to promote AI literacies for students. For example, there are pilot projects at the state level—like in South Australia, New South Wales and Queensland—and at the institutional level, for instance at MU and PLC. The Grok Academy, with Amazon Web Services (AWS), is integrating digital and AI skills into school classrooms.[30] The Grok Academy is providing K–12 students with free online and self-paced cloud learning resources that are consistent with the Australian Curriculum, and aims to roll this out throughout Australia.[31]

Future workforce and national interest

5.14 It is in the national interest to promote AI literacies. AWS explained that having a digitally skilled workforce is crucial to creating a prosperous Australia, and that this is best achieved by equipping school and tertiary students through curricula.[32] Students will ‘become the next generation of AI leaders’. The Australian Academy of Technological Sciences and Engineering (AATSE) expressed a similar view:

It is vital that we not only prepare students for this, but give them… a competitive advantage by training the next generation of AI leaders who can both use and improve AI to build a stronger nation.[33]

5.15 Educators at the school and tertiary levels need to prepare students for the use of GenAI in current and future workplaces.[34] Workplaces will increasingly involve ‘human-AI collaborative relationships’,[35] especially as businesses and government find productivity gains.[36] Further, AI will ‘profoundly alter future employment types and the skill sets needed to service them’, and employers will want GenAI literate graduates.[37] Edith Cowan University said that graduates would need to tell potential employers ‘what they offer, over and above the outputs of AI tools’.[38]

5.16 Swinburne University suggested making the ‘use of generative AI tools… a key component of student learning and thus fostered through targeted work-integrated learning pedagogy that embeds opportunities for students to experience general as well as discipline-specific generative AI workplace applications’, such as internships and industry-linked projects. Swinburne University further recommended that universities and regulatory bodies stay abreast of industry practices involving GenAI and update their curricula accordingly.[39]

Upskilling educators

5.17 The broad ecosystem in the education space requires AI training and literacy, as the National Tertiary Education Union (NTEU) pointed out:

This should extend to all staff who are engaged with teaching, learning and research, as well as administrative and professional/technical staff, where there is the expectation or necessity for AI to be part of their activities. In particular, casual, sessional, contract and other staff employed non-permanently should be supported by the institution in relation to their professional development around the use of AI.[40]

5.18 Education students at university also require AI training.[41] AWS suggested having it ‘within the teaching qualification and broader teacher accreditation and professional development frameworks’.[42] Integrating this into the curriculum at university is discussed in Chapter 4. Additionally, some stakeholders said it should be developed in consultation with the higher education (HE) sector, accreditation and registration bodies, and industry partners.[43]

5.19 The current education workforce also needs to be equipped with the knowledge and skills to teach with, and about, GenAI. Professional development (PD) should cover the purpose of GenAI, how to use it safely and responsibly, how to integrate it into teaching practices, the technology’s risks and benefits, and the implications of key policy changes.[44] This training is important to ensure that teachers themselves have solid AI literacy that they can in turn build in their students,[45] that the tools’ uses align with the curriculum,[46] and that the uses improve student outcomes.[47] PD and training are needed to mitigate risks such as ‘… job insecurity and skills obsolescence among educators’.[48]

5.20 AI tools are already being deployed in educational settings, and educators often feel ill-prepared by their institutions. The NTEU said its ‘members have felt that these things have been pushed on them’.[49] There is also some resistance to GenAI amongst teachers.[50] Teachers’ attitudes towards GenAI vary, partly due to a lack of school support, access to experts and communities of practice, ethical issues around cheating and inequitable access, and a lack of evidence of the benefits for teaching and learning.[51] Effective communication about GenAI is needed, which would also help to ease anxiety and dispel myths around the technology.

5.21 There was a consensus among stakeholders that PD needs to be ongoing, so that educators can stay abreast of technological advancements and can adapt their pedagogical practices accordingly. PD also needs to be rolled out quickly and in a targeted way.[52] Additionally, it needs to be accessible and easily digestible, especially given that educators are time poor and experiencing burnout.[53] Accordingly, if GenAI training becomes compulsory, teachers should be supported by their employers to complete it during school hours.[54]

5.22 PD may include online and in-person webinars, seminars, modules, guides, and other resources.[55] The Grok Academy is working with AWS to provide primary to secondary school teachers with free PD resources.[56] ACARA is creating modules and is ‘… keen to partner with jurisdictions and sectors and teacher professional associations to support teachers to plan and implement the curriculum relevant to understanding AI’.[57] Stakeholders have also said that there is a lot of collaboration and information-sharing in the education sector, including from early adopters.[58]

5.23 Some other suggestions regarding upskilling educators include:

  • significant investment in PD for all affected staff and support for States and Territories to roll it out
  • consideration of other countries’ approaches. For instance, Singapore has ‘a national strategy for the inclusion of generative AI in Initial Teacher Education and also a national professional development program’.[130] Currently, there is no national rollout plan in Australia
  • the National AI Schools Taskforce could consider the provision of teacher training as part of the implementation of the Australian Framework for Generative AI in Schools
  • PD in AI as the next body of work regarding the ‘National Teacher Workforce Action Plan and all of the investments in supporting teachers in their craft and trying to reduce teacher workload’
  • collaborations between educational institutions, governments, industry and non-profits to assist educators
  • a virtual schools hub, which may be an ‘industry-government partnership that then leverages the work of universities’
  • La Trobe University suggested ‘funding at universities to deliver evidence-based courses (including short courses/microcredentials) …’.[59]

AI champions

5.24 One repeated idea was to have AI champions throughout the educational system in Australia. AI champions could assist educators, and consequently students, as well as corporate functions, through peer-to-peer learning and sharing.[60] This already exists in specific institutions, like PLC. Mr England stated:

Normally best practice informs policy and what we do. But we don't have that luxury, so I need my staff—those that are willing to experiment. I want to give them familiarisation with tools, and then get them to go and use it and share their insights with their colleagues…Sixty AI champions will be established across the college over 2024, and those people will be charged with promoting its use within their subgroups…[61]

5.25 The eSafety Commission already has a network of 900 eSafety Champions within schools, many of whom are deputies and wellbeing teachers. The Commission also provides pre-service training and PD opportunities for teachers. The Commission’s eSafety providers, accredited independent organisations, also support schools. Lastly, the eSafety Commission has risk assessments of emerging technologies, and other resources, for schools.[62]

5.26 Several stakeholders highlighted the importance of ensuring equal access to PD, so that educators and students do not get left behind, thereby widening the digital gap. Lower socio-economic schools could receive additional resources to access GenAI tools, resources, and implementation support.[63] AI champions could be funded,[64] and communities of practice could be established, as well as partnerships between well-resourced and disadvantaged schools. AI itself could also be used to identify educators’ skill gaps and recommend resources.[65]

Role of parents and guardians

5.27 It is important that parents and guardians understand, and are comfortable with, the use of GenAI in schools, to help support students. Parental permissions are required and are causing delays in the use of the tools.[66] The Australian Council of State School Organisations (ACSSO) stressed that it is vital to raise awareness among parents and guardians about what GenAI is, how it can be used, and its possible benefits and risks.[67]

5.28 Parents and guardians have varied views about their children using GenAI in education, and varied levels of AI literacy. The Federation of Parents and Citizens Associations of New South Wales reported that its members had raised both benefits and risks of GenAI for students.[68] According to ACSSO, as well as the Association of Heads of Independent Schools of Australia’s (AHISA) survey report, parents were especially concerned about academic integrity, including the detection and punishment of their children for using GenAI in assessments, as well as privacy breaches.[69] Figure 5.1 highlights some of AHISA’s findings about parents’ views of GenAI:

Figure 5.1

Source: AHISA Member Survey: The use of generative AI in Australian independent schools, July 2023, Association of Heads of Independent Schools of Australia, Submission 82, p. 25.

5.29 Many stakeholders could raise awareness, such as the schools themselves,[70] parent groups, external providers, or the government. The eSafety Commission already has readily available information for parents.[71] AHISA hoped the Australian Framework for Generative AI in Schools would help gain parents’ trust.[72] It is up to schools to transparently share information about ‘what they're doing and how they're intending to use it…’ and to provide support.[73] There also needs to be a focus on how GenAI can assist students who are in the ‘low-equity achievement gap’.[74]

Lack of evidence of impacts

5.30 Despite the potential for GenAI to revolutionise education, there is broad consensus on the need to strengthen the evidence base about the short-term and long-term impacts of GenAI on education. This includes a lack of evidence about the effects of these tools in specific and nuanced contexts. To date, there is no compelling evidence that GenAI tools provide any advantage to learning. More broadly, according to the United Nations Educational, Scientific and Cultural Organization, it is not apparent that educational technology (EdTech) has improved learning at all, despite millions of dollars being invested worldwide.[75]

5.31 Like many stakeholders, the University of Sydney argued for research into ‘the safe and effective use of generative AI to improve evidence-based teaching, learning and assessment in the Australian school, vocational and higher education sectors’.[76] Similarly, the University of South Australia said investment into research and development (R&D) is important to ‘rigorously evaluate the role of AI in education’, including its impacts on teaching and learning and workforce requirements.[77]

5.32 A solid and tested evidence base is also needed to inform the development of tools, policies, and practices.[78] Better evidence is also required to manage risks of using GenAI in education. For instance, schools in Australia are ‘seeking guidance and evidence-based strategies to implement generative AI through a risk management framework’.[79] Without solid evidence backing the use of GenAI in educational settings, there could be serious and harmful consequences.[80]

5.33 The lack of a strong evidence base is linked to GenAI in education being an emerging field with few experts.[81] Further, the technology is rapidly changing, and it is in its early stages of adoption in the Australian education system.[82] It is unclear what works with GenAI in education and why, particularly around:

  • the efficacy of the tools and whether they align with educational goals
  • the tools’ effects on education systems, pedagogical practice, and learning practices
  • possible benefits and risks to users
  • the broader impacts on the human condition.[83]
5.34 Many countries are investing in R&D on AI in education. In 2023, the United States announced seven new AI centres, with two focussed on education. Stakeholders noted similar efforts and substantial funding for collaborative research centres in the United Kingdom, China, and in some European countries (€80 million to create the National AI Education Lab to research the development and uptake of AI in education).[84]

5.35 The Committee heard ‘that Australia is lagging behind competitor nations when it comes to our investment in AI and indeed research more broadly’, which is needed to develop GenAI products of our own.[85] Likewise, Professor Shazia Sadiq, Fellow at AATSE, stated that:

Australia is lagging behind in terms of investment into large collaborative research centres for AI in education. The good news is that we are not lagging behind in terms of our equity into global research and knowledge systems. There are many amazing researchers in Australia… who are world leaders in this space—Monash, UniSA, UQ, University of Sydney, to name a few…[86]

5.36 Stakeholders identified possible ways to address the issue of the lack of evidence about GenAI and its impacts, including:

  • regulation and guidelines to protect consumers[87]
  • funding for ongoing R&D, and monitoring and evaluation[88]
  • establishment of research centres[89]
  • ‘collaboration between educators, researchers, policymakers, and technology developers’[90]
  • trials of evidence-based programs.[91]

Possible impacts on assessment

5.37 There was general agreement among stakeholders that GenAI would have considerable impacts on assessment design and practices across all levels of education.[92] This is because GenAI presents major challenges to assessment integrity, as well as some opportunities.[93] GenAI is already impacting approaches to assessment in schools and HE,[94] and some significant shifts are underway.[95]

Risks

5.38 GenAI tools are prompting a fundamental questioning of how to approach assessments, especially the written essay. This type of research-based assessment has long been plagued with issues such as contract cheating and plagiarism, which could be exacerbated by GenAI. GenAI presents issues of authorship, which also affects academic and research integrity. These concerns also apply to assessments as GenAI can be used by students to cheat and muddy the waters about who produced what content.[96]

5.39 There will need to be new ways to verify students’ identity and work when completing assessments.[97] Whilst AI detection tools exist, they have limited effectiveness in identifying AI generated content.[98] This is especially so ‘given the emergent nature and widespread accessibility of generative AI tools and large size of many higher education classes’. Plagiarism using GenAI poses reputational risks to students and their careers, institutions and the education sector at large. The Australasian Academic Integrity Network argued that ‘strategies and resourcing are needed to address significant risks of misuse and falsification by students claiming the outputs of generative AI as their original work’.[99]

5.40 Many universities have not been using AI detection tools as they are still in the early stages of development.[100] The UoM has been helping to test a tool launched by Turnitin in April 2023, which aims to identify AI generated content. The UoM’s tests have revealed ‘that these tools may be more likely to flag false positives where human authors use simple, predictable, or consistent word choices and sentence structures’ and forewarned that the ‘reliability of detection tools may vary as new, more sophisticated large language models are developed’.[101]

5.41 Further, many stakeholders agreed that it will be difficult for schools and HE providers to certify whether the desired learning outcomes have been met if GenAI has been used.[102] This could make it complex for accreditation bodies to know whether to award degrees to students.[103] It is a minimum requirement under TEQSA’s Higher Education Standards Framework that educators design assessments that can accurately show whether a student has demonstrated the required skills and knowledge.[104] TEQSA warned that:

It is crucial that the education sector develops new methods of assessment that can ensure learning outcomes in an age of AI tools to prevent an uncoupling of learning and assessment, which could have far-reaching consequences.[105]

5.42 Another issue is that HE institutions have inconsistent policies and practices when it comes to GenAI and assessments, with some banning it and others supporting it.[106] This creates an uneven playing field for students, and different expectations on educators. At the HE level, TEQSA could play a leadership role.

5.43 Schools have some guidance around assessments and GenAI. The Australian Framework for Generative AI in Schools includes the following:

Learning design: work designed for students, including assessments, clearly outlines how generative AI tools should or should not be used and allows for a clear and unbiased evaluation of student ability.

Academic integrity: students are supported to use generative AI tools ethically in their schoolwork, including by ensuring appropriate attribution.[107]

Possible shifts

5.44 There was broad agreement among inquiry participants that traditional methods of assessment have become less effective.[108] Students can get GenAI to complete much traditional assessment for them, such as producing essays.[109] To support academic integrity, the education sector needs to shift towards more ‘authentic’ assessment practices that focus on testing human skills, such as critical thinking, and ‘knowledge integration [and] ethical practice’.[110] Assessing for knowledge of key content will remain important.[111]

5.45 A change towards more authentic assessments could, however, be more labour-intensive for educators to design and implement, such as in-person examinations (e.g. defending a thesis).[112] There will be a greater need for one-on-one student-staff interaction to ensure learning outcomes have been achieved and that cheating has not occurred.[113] This is particularly difficult for educators with hundreds of students, so sustainable strategies need to be developed and the student-teacher ratio may need to be reconsidered.[114] The change towards authentic assessments could also require educators to adopt extra evaluation measures.[115] Educators could also have an increased workload if they are required to ‘continuously develop new methods of assessment that assess students at a level beyond the levels of AIs’.[116]

5.46 The assessment transition could involve understanding and using GenAI for ‘the design and conduct of assessment’.[117] Educators could collaborate with GenAI to create more tailored assignments designed to ‘prompt students to apply their knowledge and to foster critical and creative thinking’.[118] Students could be asked to integrate GenAI as part of the assessment.[119] The University of Sydney has developed a two-lane approach to assist educators:

Lane 1 includes secure assessments predominantly in a live, supervised setting, designed to be as authentic as possible. Lane 2 includes setting assignments where students are encouraged and taught how to collaborate with Gen-AI productively and responsibly, focusing on assessing the process of learning as well as the product of that learning.[120]

5.47 Several stakeholders highlighted a likely shift from product-orientated assessment towards a greater focus on the processes of learning. UTS said this could involve ‘students documenting and reflecting on how they have tackled a task, aided by analytics that capture activity traces, and enable novel forms of personalised feedback’.[121] Students could be tested on critically evaluating the AI generated content used in their assessments, including modifying it, as well as critiquing the learning process itself.[122] This collaborative approach with GenAI would also help to prepare students for using GenAI in the workplace.[123]

5.48 GenAI also presents opportunities for assessments. Assessment could take advantage of new ways of learning with GenAI.[124] Students may create more sophisticated responses to assessment by using GenAI.[125] Western Sydney University outlined certain benefits of using GenAI for assessments, stating:

personalised student assessment and individualised feedback, tailored learning paths, augmented assessment (e.g. immersive and real-world simulations), increased formative assessment—feedback at scale, immediate interventions for students at risk.[126]

5.49 Schools looking to address the risks and challenges of GenAI are implementing solutions such as smaller, more frequent assessments; on-the-spot tests and exams; and oral presentations to verify students’ working knowledge and ensure academic integrity.[127]

5.50 Stakeholders had many recommendations about how to move forward with assessments given the emergence of GenAI, such as:

  • reassess the purposes of assessment and what might be required to verify that learning outcomes have been achieved[128]
  • develop assessment guidance that increases resilience to GenAI and takes advantage of the benefits of GenAI[129]
  • have programmatic assessment, use EdTech, combine ‘educator and peer feedback with automated feedback’, and promote the quick adoption of robust assessment design principles.[130]
5.51 The University of Sydney concisely recommended:

  • Adapting to GenAI, rather than trying to ban or outrun the technology.
  • Rediscovering what it means to be human and assessing these skills and attributes.
  • Refocusing on the desired student learning outcomes.
  • Assessing the learning process as well as the product.
  • Co-creating outputs with GenAI.
  • Evaluating outputs co-created with GenAI.
  • Asking students to describe and reflect on their use of Gen-AI and the lessons learnt.[131]

Potential impacts on academic and research integrity

5.52 The use of GenAI poses many risks to academic and research integrity. Ai Group raised critical issues, including accuracy of data, use in peer review, authorship, intellectual property (IP), and ethical and privacy concerns.[132] This chapter discusses further views on academic integrity as it relates to assessments and plagiarism.

Authorship

5.53 The Australian Research Council (ARC) explained the issue of authorship regarding the use of GenAI in research:

Using generative AI tools to generate text and passing that off as original could undermine the norms around authorship. Traditional attribution of authorship assumes that the author has applied their intellect, skill and effort, and appropriately acknowledged and cited the work and ideas of others that have been drawn upon as part of that content. But when generative AI tools are used, it can become difficult to identify what is work genuinely authored by that researcher or research team, or where authors have drawn upon the work of others, without acknowledgment.[133]

5.54 Dr Aaron Lane from the RMIT Blockchain Innovation Hub stated that GenAI ‘does not meet the threshold of authorship’. Dr Lane asserted that GenAI should be considered in the same vein as a Google search or Wikipedia in the production of material or research content such as journal articles, as people do not disclose the use of databases to help locate resources. Furthermore, Dr Lane acknowledged that the process of experimentation takes time, but will yield new norms in research.[134]

5.55 In its Policy on the Use of generative AI tools, the ARC cautioned applicants against using GenAI when developing their grant applications.[135] The Copyright Agency argued that originality, specifically human authorship, is a central requirement in the context of GenAI use in Australia. This is because the assessment of student work and academic research in Australia is underpinned by the assumption that the work produced and submitted is the original work of the student. A submission that is wholly or partially generated by GenAI is not the result of the student’s effort; the student is not the author.[136]

5.56The UoM has put authorship policies in place for students and researchers using GenAI in their work. The UoM’s Authorship Policy requires the authors listed to have made a ‘significant intellectual or scholarly contribution to a research output’, and be willing to take responsibility for the contribution, thereby excluding GenAI tools being named as authors.[137]

5.57MU has similarly banned GenAI from being listed as an author and states that ‘Users of GenAI are responsible for the output they use–any errors, inaccuracies in data and plagiarised work that appears in the work will be attributed to the author’. Authors must disclose where they have used GenAI to create an output and cannot solely use GenAI to develop work.[138]

5.58Stakeholders identified ways to ensure academic and research integrity with GenAI use, which include:

  • consistent guidance from TEQSA, including the sharing of useful resources, given the broad nature of the current Higher Education Standards Framework (Threshold Standards) 2021; updates to the Threshold Standards should be monitored so that they reflect technological advances[139]
  • the establishment of a fund, through the ARC or a Centre of Excellence, to support research into the development, use, and impact of GenAI on education[140]
  • guidelines and processes for research and academic integrity on what constitutes appropriate and ethical use in sourcing and acknowledging information.[141]

Research and data

5.59How research degree students, academic supervisors, or researchers interact with GenAI will vary with the context and intent of use. One example of how researchers are using GenAI is to develop research capabilities, from bibliographic surveys to robotics and literature reviews.[142] The UoM did not believe that GenAI use in research currently poses a significant risk to publication integrity, but that may change. The University explained that generated text ‘is rarely at an appropriate academic level and is often wrong or absurd’. Moreover, if research material were found to have used GenAI products, it would carry serious penalties.[143]

5.60It is important that robust and appropriate protocols are followed to ensure the integrity of research products. Australia has a research integrity framework that is overseen by the ARC and the National Health and Medical Research Council (NHMRC). The ARC and the NHMRC help to implement the 2018 Australian Code for the Responsible Conduct of Research, which includes the principles of honesty, rigour, transparency, fairness, respect, recognition, accountability, and promotion of responsible research practices. These principles apply to research conducted in Australia; researchers and institutions are expected to follow them, and they would also cover the use of GenAI in all elements of research.[144]

5.61Several stakeholders raised concerns about the accuracy and timeliness of research data used in research projects. TEQSA contended that GenAI has the capacity to not only generate fake data and images, but also entire studies and journal articles. It can be difficult to detect and discern images generated by GenAI, which can compromise the integrity of research.[145]

5.62GenAI is also prone to ‘hallucinations’, which can render its data unreliable. There is no expectation that GenAI needs to be truthful, even if the public thinks that it is. In fact, an argument can be made that GenAI platforms are not research tools in themselves; rather, they can be considered ‘interlocutors in research conversations and design and brainstorming conversations’, or a data extraction service that can synthesise data.[146]

5.63Students from The Grange P–12 College told the Committee that GenAI platforms may not have up-to-date information which reaffirms the need to corroborate any sources used in research projects to reach more objective conclusions.[147] Furthermore, GenAI does not necessarily cite references, it searches the internet and takes information from sources and evaluates the data within its own software and parameters to answer questions.[148]

Peer review process

5.64Stakeholders identified risks associated with using GenAI in the peer review process. TEQSA raised concerns that ‘the administrative burden of the scientific peer-review process may result in reviewers outsourcing the review to AI systems to either provide the reviewer with a summary or provide feedback’. The risk here is that GenAI platforms do not have the level of expertise that a human expert may possess.[149]

5.65The ARC asserted that GenAI may compromise the quality and integrity of the peer review process, or even breach the Australian Code for the Responsible Conduct of Research, 2018 (the Code), by ‘diminishing these contributions and, potentially, producing text that contains inappropriate content or commentary that is generic and lacking in rigour’. If the Code is breached due to GenAI, it could seriously damage the ‘credibility and trust in the research endeavour, both at an individual, institutional and sector level’.[150]

5.66Expert Panel member Associate Professor Julia Powles, Director of the University of Western Australia Tech and Policy Lab, welcomed the ban on GenAI in the peer review process, and noted:

You cannot, for example, put in a submission that you have been reviewing and say, 'Translate this for an eight-year-old,' as much as you might like to, because there are no guarantees about where that information will end up. It is useful to have such a clear position from two of our national institutions.[151]

Intellectual property

5.67IP remains a problem for research integrity, as well as data integrity. The ARC warned that content produced by GenAI may be based on the IP of others and may be factually incorrect or hallucinated.[152] Some institutions have raised concerns about the disclosure of IP and confidential information. This is because GenAI tools such as ChatGPT may retain and review prompts that are entered to help AI trainers improve their systems. The UoM has advised researchers not to share confidential information or innovations as GenAI prompts, for fear that the IP may no longer be owned by the university.[153]

5.68TEQSA is working with institutions to ensure that they are proactively managing the risks posed by GenAI and IP when ‘sensitive pre-published research findings, doctoral theses presented for examination or grant applications are uploaded to a third-party platform’. If improperly managed, ‘AI has the potential to dilute the quality of published research, obscure genuine research in a sea of AI-generated content and ultimately undermine the public’s trust in the scientific process’.[154]

5.69The Committee was informed that information entered into commercial GenAI tools, including ChatGPT, may enter the public domain and be accessed by other users and third parties.[155] The Group of 8 (Go8) shared concerns about maintaining data confidentiality, particularly regarding the health and medical disciplines as these studies rely on personal data and confidentiality. The Go8 explained that tools in the public domain risk the release of personal data in an ‘uncontrolled and unauthorised manner’.[156]

5.70The ARC has articulated that the release of material into GenAI tools constitutes a breach of confidentiality, and advised that it will remove AI-generated content from its assessment process.[157] However, the introduction of Microsoft Copilot may allow users to keep some data safely within a secure bubble, thereby facilitating increased use of the tool in research.[158]

Committee comment

5.71The Committee was impressed by the evidence presented on the possible impacts of GenAI on educators, the education workforce more broadly, teaching practices, assessment, and academic and research integrity. The Committee recognises that change is occurring quickly, and that people need support to keep up and to maximise the technology’s benefits—for their individual or institutional needs. While there will be upfront time investments and costs in learning how to use GenAI, the Committee considers it a worthwhile long-term investment to realise the benefits.

5.72Stakeholders agree that the role of educators will inevitably change with the uptake of GenAI, but that their primacy should remain as a human interface. Teachers should still be responsible for teaching the fundamentals of education, and emerging technologies like GenAI can be embedded into pedagogical practices.

5.73It is apparent to the Committee that AI literacy and capacity-building is vital for educators, the broader workforce including policymakers, students, and their parents and guardians to learn how to use GenAI appropriately. They need support and training to be prepared and be comfortable with the technology. This applies to schools, TAFEs, and universities.

5.74As such, a huge uplift is required nationally, including training for pre-service teachers, and professional development for existing teachers. For TAFEs and universities, this requires them to integrate GenAI into all courses as mentioned in Chapter 2, including to equip pre-service teachers. For existing teachers, one common issue is that they tend to not have time to complete more professional development, which requires employers to ensure teachers have support to do so.

5.75The Australian Government, in collaboration with the States and Territories, can lead the way on building AI literacy. This would ensure consistency, especially given the Australian Framework for Generative AI in Schools and rollout of GenAI tools in schools. Many resources already exist and need to be harnessed and made accessible in a coordinated way. The Committee recognises that the Australian Government has created a Digital Technologies Hub and sees value in using it as a single online repository for information on GenAI. A cluster model of AI champions could also be established to assist everyone, including marginalised schools, to embed GenAI.

5.76It is expected that GenAI will also have considerable impacts on the broader education workforce, and the design and implementation of assessments. These impacts will require adjustments to education policy and practice. The Committee notes the good work by the HE sector on dealing with assessment, as well as academic and research integrity, including to detect GenAI-related plagiarism.

5.77The Committee supports the establishment of a Centre for Digital Educational Excellence. The Centre should work collaboratively with regulatory, delivery and policy agencies in governments, as well as with the technology and education sectors.

5.78The Centre would capture best practice and data worldwide and locally in terms of GenAI use in education settings, as well as the adaptation of both curriculum and pedagogy to reflect the impact GenAI will have on both education and the world of work.

Recommendation 18

5.79The Committee recommends that the Australian Government work with State and Territory education departments to train educators and other staff in maximising the benefits of GenAI tools in educational settings, including:

  • training for pre-service teachers
  • professional development for existing teachers.

Recommendation 19

5.80The Committee recommends that the Australian Government support teachers in schools to build students’ skills through project-based learning, inquiry-based approaches, and real-world problem-solving activities that demonstrate the risks of the technology.

Recommendation 20

5.81The Committee recommends that the Australian Government, in collaboration with the State and Territory governments, develop and implement a national training rollout plan for:

  • educators and broader education workforce through professional development and training, including virtual and in-person short courses and learning modules
  • students, through teacher delivery and online resources
  • parents and guardians, through information campaigns, school-led meetings, and online resources.

Recommendation 21

5.82The Committee recommends that the Australian Government encourage:

  • the use of the existing Digital Technologies Hub as a one-stop online repository of training and resources for educators, students, and parents and guardians to learn and teach about GenAI
  • a community of practice of AI champions, comprising lead educators and early adopters of AI in schools, TAFEs, and universities.

Recommendation 22

5.83The Committee recommends that universities and TAFEs embed GenAI competencies and skills across all courses and degrees.

5.84The Committee recommends that universities provide pre-service teachers with training in AI literacy in their degrees, including built-in industry practice.

Recommendation 23

5.85The Committee recommends that Tertiary Education Quality and Standards Agency work with higher education providers to develop standards and frameworks, including authorship policies, to guide universities in maintaining research and academic integrity regarding GenAI.

Recommendation 24

5.86The Committee recommends that the Australian Government establish an innovation fund for universities to undertake research and development on the positive and negative impacts and potential application of the use of GenAI in education.

Recommendation 25

5.87The Committee recommends that the Australian Government establish a Centre for Digital Educational Excellence, modelled on the existing Cooperative Research Centres, which would act as a thought-leader in relation to both the use and development of GenAI in school and university settings.

Ms Lisa Chesters MP

Chair

21 August 2024

Footnotes

[1]University of South Australia (UoSA), Submission 29, p. 6; Curtin University, Submission 41, pp. 1–2.

[2]Curtin University, Submission 41, pp. 1–2.

[3]Tertiary Education Quality and Standards Agency (TEQSA), Submission 33, p. 3.

[4]Pymble Ladies’ College (PLC), Submission 93, p. 9.

[5]TEQSA, Submission 33, p. 3.

[6]TEQSA, Submission 33, p. 4.

[7]PLC, Submission 93, p. 8.

[8]Independent Schools Australia (ISA), Submission 22, p. 11; Australasian Academic Integrity Network (AAIN), Submission 58, p. 6.

[9]Edith Cowan University, Submission 17, p. 4.

[10]AAIN, Submission 58, pp. 6–7; Monash University, Submission 3, pp. 2–3.

[11]Independent Education Union of Australia (IEUA), Submission 26, p. 3.

[12]The University of Sydney, Submission 44, Appendix C, p. [19].

[13]Mrs Kristen Migliorini, Founder and Chief Executive Officer, KomplyAi, Committee Hansard, 29 January 2024, p. 18; AAIN, Submission 58, p. 6.

[14]Centre for Digital Wellbeing, Submission 83, p. 8.

[15]Edith Cowan University, Submission 17, p. 2.

[16]TEQSA, Submission 33, p. 4.

[17]TEQSA, Submission 33, p. 4; Swinburne University of Technology, Submission 39, p. 3.

[18]ISA, Submission 22, p. 4; Australian Education Union, Submission 42, pp. 2–3.

[19]Queensland University of Technology (QUT), Submission 57, pp. 2–3.

[20]AAIN, Submission 58, p. 6.

[21]PLC, Submission 93, p. 8.

[22]Professor Leslie Loble AM, Industry Professor, University of Technology Sydney (UTS), Committee Hansard, 20 March 2024, p. 8.

[23]Monash University (MU), Submission 3, pp. 2–3.

[24]MU, Submission 3, p. 2.

[25]Professor Philip Poronnik, Chair, National Committee for Biomedical Sciences, Australian Academy of Science, The University of Sydney, Committee Hansard, 2 November 2023, p. 6.

[26]The University of Melbourne (UoM), Submission 34, p. 10.

[27]MU, Submission 3, pp. 2–3.

[28]UoM, Submission 34, pp. 9–10.

[29]Australian Curriculum, Assessment and Reporting Authority (ACARA), Submission 16, p. 1.

[30]Amazon Web Services (AWS), Submission 85, p. 5; Australian Academy of Technological Sciences and Engineering (AATSE), Submission 14, pp. 1–2.

[31]Ms Kylie Walker, Chief Executive Officer, Australian Academy of Technological Sciences and Engineering, Committee Hansard, 5 February 2024, p. 16; AWS, Submission 85, p. 5.

[32]AWS, Submission 85, p. 5.

[33]AATSE, Submission 14, p. 2.

[34]MU, Submission 3, pp. 2–3; Swinburne University of Technology, Submission 39, pp. 1–3; AATSE, Submission 14, pp. 2–3.

[35]MU, Submission 3, p. 2.

[36]Swinburne University of Technology, Submission 39, pp. 1–3.

[37]Swinburne University of Technology, Submission 39, pp. 1–3.

[38]Edith Cowan University, Submission 17, p. 2.

[39]Swinburne University of Technology, Submission 39, pp. 1–3.

[40]National Tertiary Education Union (NTEU), Submission 52, p. 8.

[41]School of Education, La Trobe University, Submission 91, pp. 2, 6.

[42]AWS, Submission 85, p. 6.

[43]AAIN, Submission 58, pp. 6–7; AATSE, Submission 14, p. 3.

[44]University of Technology Sydney (UTS), Submission 71, p. 5; QUT, Submission 57, p. 3.

[45]Professor Loble, UTS, Submission 49, p. 3.

[46]ACARA, Submission 16, p. 5.

[47]Professor Loble, UTS, Submission 49, p. 3.

[48]PLC, Submission 93, p. 9.

[49]Mr Kieran McCarron, Policy Officer, National Tertiary Education Union, Committee Hansard, 11 October 2023, p. 12.

[50]PLC, Submission 93, p. 9.

[51]School of Education, La Trobe University, Submission 91, pp. 2, 6; Victorian Association for the Teaching of English, Submission 10, p. 6.

[52]Western Sydney University, Submission 35, p. 2.

[53]Ms Julie Birmingham, First Assistant Secretary, Teaching and Learning Division, Department of Education (DoE), Committee Hansard, 6 March 2024, p. 3; PLC, Submission 93, p. 9.

[54]Ms Veronica Yewdall, Assistant Federal Secretary, Independent Education Union of Australia (IEUA), Committee Hansard, 11 October 2023, p. 7.

[55]PLC, Submission 93, p. 9.

[56]AWS, Submission 85, p. 5; AATSE, Submission 14, p. 3.

[57]ACARA, Submission 16, p. 5.

[58]Ms Birmingham, DoE, Committee Hansard, 6 March 2024, p. 3.

[59]UoSA, Submission 29, pp. 2–3; Western Sydney University, Submission 35, p. 2; QUT, Submission 57, p. 6; Ms Birmingham, DoE, Committee Hansard, 6 March 2024, p. 3; AWS, Submission 85, p. 5; Mrs Migliorini, KomplyAi, Committee Hansard, 29 January 2024, p. 20; School of Education, La Trobe University, Submission 91, p. 2.

[60]Dr Aaron Lane, RMIT Blockchain Innovation Hub, Committee Hansard, 9 November 2023, p. 14.

[61]Mr Anthony England, Director, Innovative Learning Technologies, PLC, Committee Hansard, 29 January 2024, p. 2.

[62]Mr Paul Clark, Acting Executive Manager, Education, Prevention and Inclusion Branch, Office of the eSafety Commissioner, Committee Hansard, 4 October 2023, p. 12.

[63]UTS, Submission 71, p. 5.

[64]Dr Lucinda McKnight, Research Fellow, Centre for Educational Impact Centre, Deakin University, Committee Hansard, 9 November 2023, p. 14.

[65]Dr McKnight, Deakin University, Committee Hansard, 9 November 2023, p. 14.

[66]Association of Heads of Independent Schools of Australia (AHISA), Submission 82, p. 5; Australian Science and Mathematics School, Submission 31, p. 3.

[67]Australian Council of State School Organisations (ACSSO), Submission 25, p. 22.

[68]Federation of Parents and Citizens Associations of New South Wales, Submission 43, pp. 2–6.

[69]ACSSO, Submission 25, p. 22.

[70]Mr Chris Davern, Assistant Secretary, Strategic Policy Branch, Strategy, Data and Measurement Division, Corporate and Enabling Services Group, Department of Education (DoE), Committee Hansard, 6 March 2024, p. 5.

[71]Mrs Lorraine Finlay, Human Rights Commissioner, Australian Human Rights Commission, Committee Hansard, 4 October 2023, p. 13.

[72]AHISA, Submission 82, p. 7.

[73]Mr Davern, DoE, Committee Hansard, 6 March 2024, p. 5.

[74]ACSSO, Submission 25, p. 22.

[75]United Nations Educational, Scientific and Cultural Organization (UNESCO), Global Education Monitoring Report 2023: Technology in education – A tool on whose terms?, UNESCO, 2023, p. 11, viewed 5 September 2024.

[76]Ms Birmingham, DoE, Committee Hansard, 6 March 2024, p. 3.

[77]UoSA, Submission 29, pp. 11–12.

[78]Cooperative Research Australia (CRA), Submission 88, pp. 5–6.

[79]ISA, Submission 22, p. 13.

[80]CRA, Submission 88, pp. 5–6.

[81]Ms Birmingham, DoE, Committee Hansard, 13 September 2023, p. 3.

[82]Mrs Danielle Cronin, Director of Education Policy, Catholic Schools New South Wales, Committee Hansard, 30 January 2024, p. 25; Mr Brad Hayes, Federal Secretary, Independent Education Union of Australia (IEUA); Ms Yewdall, IEUA, Committee Hansard, 11 October 2023, p. 8.

[83]ISA, Submission 22, p. 4; University of Technology Sydney, Centre for Research on Education in a Digital Society (UTS CREDS), Submission 19, p. 4; Ms Birmingham, DoE, Committee Hansard, 13 September 2023, p. 3; CRA, Submission 88, p. 6.

[84]Professor Shazia Sadiq, Fellow, AATSE, Committee Hansard, 5 February 2024, p. 15; Acting Professor Jason Lodge, Submission 24, p. 1; UoSA, Submission 29, pp. 11–12.

[85]Ms Vicki Thompson, Chief Executive, Group of Eight Universities, Committee Hansard, 20 September 2023, p. 1.

[86]Professor Sadiq, AATSE, Committee Hansard, 5 February 2024, p. 15.

[87]UTS CREDS, Submission 19, pp. 5–6.

[88]UTS, Submission 71, p. 2.

[89]School of Education, La Trobe University, Submission 91, p. 2.

[90]CRA, Submission 88, p. 6.

[91]ISA, Submission 22, p. 4.

[92]TEQSA, Submission 33, p. 1; UoSA, Submission 29, p. 5.

[93]Curtin University, Submission 41, pp. 1–2.

[94]UoSA, Submission 29, p. 5.

[95]School of Education, La Trobe University, Submission 91, p. 5; NTEU, Submission 52, p. 3.

[96]School of Education, La Trobe University, Submission 91, p. 5.

[97]UTS, Submission 71, p. 4.

[98]UoM, Submission 34, p. 6; AAIN, Submission 58, p. 9.

[99]AAIN, Submission 58, p. 9.

[100]AAIN, Submission 58, p. 9.

[101]UoM, Submission 34, p. 6.

[102]UTS, Submission 71, p. 4; NTEU, Submission 52, p. 3; The University of Sydney, Submission 44, pp. 5–6; Curtin University, Submission 41, pp. 1–2.

[103]Curtin University, Submission 41, pp. 1–2.

[104]NTEU, Submission 52, p. 3.

[105]TEQSA, Submission 33, p. 4.

[106]Claire Field, Submission 70, p. 9.

[107]Department of Education, Australian Framework for Generative Artificial Intelligence (AI) in Schools, 17 November 2023, viewed 15 August 2024.

[108]Australian Catholic University (ACU), Submission 68, p. 3; UoSA, Submission 29, p. 5.

[109]TEQSA, Submission 33, p. 4.

[110]Curtin University, Submission 41, pp. 1–2; Western Sydney University, Submission 35, p. 2; UoSA, Submission 29, p. 5; ACU, Submission 68, p. 3; Association for Academic Language and Learning, Submission 11, pp. 1–2.

[111]TEQSA, Submission 33, p. 4.

[112]TEQSA, Submission 33, p. 4; UTS, Submission 71, p. 4.

[113]NTEU, Submission 52, p. 3.

[114]UTS, Submission 71, p. 4; NTEU, Submission 52, p. 3.

[115]TEQSA, Submission 33, p. 4; UTS, Submission 71, p. 4.

[116]Mr Kieran McCarron, Policy Officer, National Tertiary Education Union, Committee Hansard, 11 October 2023, p. 11.

[117]AAIN, Submission 58, pp. 6–7.

[118]Monash DeepNeuron, Submission 75, p. 2; UoSA, Submission 29, p. 5; The University of Sydney, Submission 44, pp. 5–6; MU, Submission 3, pp. 2–3.

[119]UoSA, Submission 29, p. 5; The University of Sydney, Submission 44, pp. 5–6.

[120]The University of Sydney, Submission 44, pp. 5–6.

[121]UTS, Submission 71, p. 4.

[122]NTEU, Submission 52, p. 3; TEQSA, Submission 33, p. 4.

[123]UoSA, Submission 29, p. 5.

[124]UTS, Submission 71, p. 4.

[125]ACU, Submission 68, p. 3.

[126]Western Sydney University, Submission 35, p. 2.

[127]Mr Hayes, IEUA, Committee Hansard, 11 October 2023, pp. 5–6.

[128]TEQSA, Submission 33, p. 4.

[129]UTS, Submission 71, p. 4; AATSE, Submission 14, p. 3.

[130]UTS, Submission 71, p. 4.

[131]The University of Sydney, Submission 44, pp. 5–6.

[132]Ms Megan Lilly, Executive Director, Centre for Education and Training, Australian Industry Group, Committee Hansard, 29 November 2023, p. 2.

[133]Australian Research Council (ARC), Submission 77, p. 3.

[134]Dr Lane, RMIT Blockchain Innovation Hub, Committee Hansard, 9 November 2023, p. 16.

[135]ARC, Submission 77, p. 3.

[136]Copyright Agency, Submission 60, pp. 2–4.

[137]UoM, Submission 34, p. 7.

[138]MU, Submission 3, p. 7.

[139]UoM, Submission 34, p. 11; MU, Submission 3, p. 4.

[140]UoM, Submission 34, p. 11.

[141]AAIN, Submission 58, p. 10.

[142]Dr Richard Johnson, Deputy Chief Executive Officer, Australian Research Council (ARC), Committee Hansard, 4 October 2023, p. 3.

[143]UoM, Submission 34, p. 7.

[144]ARC, Submission 77, p. 4.

[145]TEQSA, Submission 33, p. 5.

[146]Dr McKnight, Deakin University, Committee Hansard, 9 November 2023, p. 19.

[147]Leo, Year 11 Student and Amy, Year 12 Student, The Grange P–12 College, Committee Hansard, 13 March 2024, p. 4.

[148]Leo, The Grange P–12 College, Committee Hansard, 13 March 2024, p. 4.

[149]TEQSA, Submission 33, p. 5.

[150]ARC, Submission 77, pp. 3–4.

[151]Associate Professor Julia Powles, Director, Tech and Policy Lab, University of Western Australia, Committee Hansard, 6 September 2023, p. 6.

[152]ARC, Submission 77, p. 3.

[153]UoM, Submission 34, pp. 7–8.

[154]TEQSA, Submission 33, p. 5.

[155]ARC, Submission 77, p. 4.

[156]Group of Eight, Submission 63, p. 4.

[157]ARC, Submission 77, p. 4; Dr Johnson, ARC, Committee Hansard, 4 October 2023, p. 1.

[158]Dr McKnight, Deakin University, Committee Hansard, 9 November 2023, p. 20.