This chapter focusses on the first of the inquiry’s Terms of Reference. It begins with a description of the current main measurements of school attainment in Australia, sets out the evidence in relation to these and goes on to focus on ‘measurements of gain’.
‘Measurement of gain’ refers to a student’s educational journey over time rather than, for example, an exam, which is a snapshot in time. The Committee contrasts ‘gain’ and ‘performance’, which are temporal measures of the impact of education, with ‘achievement’ and ‘attainment’, which are snapshots in time.
The importance of this kind of measurement is outlined by Dr Peter Goss in his work Towards an adaptive education system in Australia which states:
…teachers and schools must be better able to track the progress of their students over time. Putting the right data in the hands of teachers helps them judge their impact on learning and, in turn, fosters individual professional responsibility and collective efficacy, the best forms of accountability.
The Committee’s view is that in the absence of context, analysis, and interpretive skills, data is of very limited use. If teachers are to be given more raw information they need the time, skills and knowledge to make good use of it.
The Committee was interested in the entirety of a young person’s school experience. As a student’s educational attainment can decline at any point from kindergarten to year 12, it is important that appropriate measurements of gain are in place in all years of a student’s education.
This chapter will now outline the current main measurements of gain in schools, the National Assessment Program – Literacy and Numeracy (NAPLAN) and the Australian Tertiary Admission Rank (ATAR).
ATAR and NAPLAN as measurements of gain in schools
National assessment program
The National Assessment Program – Literacy and Numeracy (NAPLAN) is an annual assessment for students in Years 3, 5, 7 and 9. It has been part of the school calendar since 2008.
NAPLAN tests the sorts of skills that are essential for every child to progress through school and life, such as reading, writing, spelling and numeracy. The assessments are undertaken nationwide, every year, in the second full week in May.
NAPLAN is made up of tests in the four areas (or ‘domains’) of:
reading;
writing;
language conventions (spelling, grammar and punctuation); and,
numeracy.
NAPLAN tests skills in literacy and numeracy that are developed over time through the school curriculum.
NAPLAN results (or NAPLAN scale scores) are reported using five scales, one for each of the domains of reading, writing, and numeracy, and two for the language conventions domain (one scale for spelling, and one for grammar and punctuation). Each scale spans all year levels from Year 3 to Year 9 with scores that range from approximately zero to 1000. It is possible for a NAPLAN scale score to be negative.
Australian tertiary admission rank
Tertiary institutions in Australia have found that a selection rank based on a student's overall academic achievement is the best single predictor of success for most tertiary courses.
The Australian Tertiary Admission Rank (ATAR) provides a measure of a student's overall academic achievement in relation to that of other students. It is calculated solely for use by institutions, either on its own or with other selection criteria, to rank and select school leavers for their courses.
The ATAR is a rank, not a mark.
The Committee received evidence that indicated that NAPLAN and ATAR may not be the best measurements of student gain. They are measurements that show how much a student knows at a point in time, rather than, as discussed above, how much a student has increased their knowledge.
AITSL opined that NAPLAN and PISA focus on macro-level national trends that do not assist teachers in their day-to-day work. AITSL commented that:
Current national student performance measures, including NAPLAN and PISA, provide a big picture view of a particular point in time. While data of this nature can be useful in understanding student performance at the national level and identify trends over time to inform system policies and resourcing, it does not help to inform a teacher’s day-to-day approach to targeting strategies for improving the rate of learning for an individual or group of students.
The National Apprentice Employment Network (NAEN) described the current tools for measuring gain in schools as ‘patchy and inconsistent across the country’.
The Grattan Institute, quoting its report Targeted teaching, states:
The best schools in Australia are not necessarily those with the best ATAR or NAPLAN scores. They are those that enable their students to make the greatest progress in learning. Whatever a student starts from on the first day of the year, he or she deserves to have made at least a year’s worth of progress by the end of it. Any less, and our students will fail to reach their full potential. Sadly that is too often the case.
The University of New South Wales (UNSW) submitted that:
[M]easurements of gain, such as ATAR and NAPLAN are unreliable predictors of future academic success and capability. In fact, research suggests that such standardised scores may act as an academic de-motivator, offering only a form of ‘false currency’ by which students assess their suitability for Higher Education and certain occupations, with an unreliable sense of their academic worth.
The University of Technology Sydney (UTS) state that they:
…remain concerned at the current fixation of education authorities and the general public on the results of mass testing such as the Australia-wide NAPLAN and the international PISA tests. Our concern is that the emphasis on mass testing appears to assign a value to what is being tested at the expense of other aspects of learning; and that the testing of a narrow set of literacy and numeracy skills – as in NAPLAN – does not, and should not be interpreted as a fulsome description of a student’s abilities or capacities.
Nor is NAPLAN considered consistently effective across all age groups. Indeed, Ms Robyn Anderson, Senior Research Associate, Queensland University of Technology told the Committee that NAPLAN is:
…unreliable for children under the age of eight. They tend to rely on memory. Their thinking skills aren’t well developed. There’s a real crossover there. Their memories get full and they often start falling off in their abilities after year 3.
Professor Christine Ure, Alfred Deakin Professor and Head of School of Education, Deakin University observed that:
… schools just advertise their year-12 ATAR scores and that is really the social measure of the quality of that school, whereas it is about how well they connect students to understanding what work life is about. Again, going back to issues about measures, we don't have measures for that and we don't even have expectations for schools to be delivering that to students.
Conversely, the NSW Adult Literacy and Numeracy Council (NSWALNC), a membership-based organisation representing adult literacy and numeracy practitioners, researchers, program managers, teacher educators and provider organisations, does see value in tests such as NAPLAN. However:
…focussing on these skills in isolation of the different contexts in people’s lives are unlikely to lead to long term benefits.
The Grattan Institute’s submission suggests an employer is more interested in the effort and persistence a student has shown across their schooling than in their grades viewed in isolation:
Imagine an employer, faced with two recent school or university graduates with equally good (but not great) grades. Should they employ the student who had worked hard and improved their grades over time, or the student who used to have great grades but has been cruising? For most jobs, I would employ the former. If nothing else, I would be more confident that the first student understands the value of effort and persistence in life.
The evidence suggests that NAPLAN and ATAR may not be best suited for the particular purpose of measuring a school, student or cohort gain. However, as the next section shows, evidence from the Grattan Institute suggested that the data provided through NAPLAN could be better used to track student performance.
A new NAPLAN measure: years of progress
The Grattan Institute’s Widening Gaps: What NAPLAN tells us about student progress, Technical Report (Widening Gaps) seeks to measure and compare relative student progress on the NAPLAN test ‘in a way that is robust, easy to interpret, and comparable across different groups of students’. Commenting on NAPLAN scale scores it states the following:
While the scores are used to indicate whether a student is above NAPLAN national minimum standards for each year level, they have no other direct interpretation. The scores are an estimate of student skill level at a point in time, a latent concept – the numbers themselves have no particular meaning. Nor are the scores comparable across assessment domains.
Grattan Institute’s submission, quoting the Australian Curriculum, Assessment and Reporting Authority (ACARA), states that:
Students generally show greater gains in literacy and numeracy in the earlier years than in the later years of schooling, and that students who start with lower NAPLAN scores tend to make greater gains over time than those who start with higher NAPLAN scores.
Their submission states that the Widening Gaps report:
…presented a new measure, ‘years of progress’, which benchmarks student performance in NAPLAN to the typical student. It allows us to see if students are catching up or falling further behind relative to others.
The submission repeats two recommendations, which it suggests remain relevant today, from the Widening Gaps report:
Adopt Grattan’s new ‘years of progress’ approach to better understand relative student progress and learning gaps; and
Use analysis of relative student progress to inform system priorities, resource allocation and needs-based funding policies.
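The intuition behind a relative-progress measure of this kind can be illustrated with a short sketch. The following is a simplified, hypothetical illustration only, not the Grattan Institute’s actual methodology: the median scores and the linear interpolation are invented for the example. A student’s scale score is converted to an ‘equivalent year level’ on the typical student’s median curve, and progress is the difference between equivalent year levels at two points in time.

```python
# Simplified, hypothetical "years of progress"-style calculation.
# NOT the Grattan Institute's actual method; the median curve below
# is invented for illustration only.

# Hypothetical median scale scores for the typical student at each
# tested year level (invented numbers).
MEDIAN_BY_YEAR = {3: 400, 5: 480, 7: 540, 9: 580}

def equivalent_year_level(score):
    """Convert a scale score to an 'equivalent year level' by linear
    interpolation along the typical student's median curve."""
    years = sorted(MEDIAN_BY_YEAR)
    if score <= MEDIAN_BY_YEAR[years[0]]:
        return years[0]
    if score >= MEDIAN_BY_YEAR[years[-1]]:
        return years[-1]
    for lo, hi in zip(years, years[1:]):
        s_lo, s_hi = MEDIAN_BY_YEAR[lo], MEDIAN_BY_YEAR[hi]
        if s_lo <= score <= s_hi:
            # Interpolate between the two surrounding year levels.
            return lo + (hi - lo) * (score - s_lo) / (s_hi - s_lo)

def years_of_progress(score_then, score_now):
    """Progress expressed in years of the typical student's learning."""
    return equivalent_year_level(score_now) - equivalent_year_level(score_then)

# A hypothetical student moving from 420 (Year 3) to 510 (Year 5)
# has made 2.5 'years' of progress over two calendar years:
print(years_of_progress(420, 510))  # → 2.5
```

On this kind of measure, two students with the same raw gain in scale score can show different years of progress, because the typical student’s growth slows in later years; that is precisely the ‘catching up or falling behind’ comparison the submission describes.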
NAPLAN and ATAR are large scale testing regimes that produce large amounts of data. NAPLAN in particular, with its scale score, may not be best suited to measuring individual students’ gain, though evidence suggests that NAPLAN data can be used to provide measurements of student gain relative to a statistically typical or benchmark student.
The following evidence tackles the issue of what alternatives to NAPLAN and ATAR could be used as measurements of gain in schools.
Alternatives to NAPLAN and ATAR as measurement of gain in schools
Whilst the evidence above suggests that there should be alternatives to NAPLAN and ATAR as a measurement of gain in schools, no evidence pointed to an existing data measurement that could be used. Evidence to the Committee focussed on what could be measured to indicate student gain.
Ms Sonneman stated that the focus of measurement needs to move away from big data such as NAPLAN and from traditional academic domains:
We believe that, in Australia, there's been a large focus on big data—so the use of NAPLAN and standardised tests—as opposed to small data, data in the hands of teachers, to actually assess where students are at in their progress and how to improve their teaching. We've also focused a lot on the traditional academic domains, rather than the 21st century skills, such as creativity and resilience and communications skills. This is not necessarily because we haven't articulated clear goals for achieving those skills but is mainly because trying to measure progress in those skills is still in its infancy, although things are moving.
The NSWALNC expressed concern with a focus on measurements of gain in school as they see ‘education as a foundation for lifelong learning’. NSWALNC contend that:
… for measurements of gain in school to be in any way predictive of success in the workplace, those measurements need to encompass that broad range of cognitive and affective skills; the ‘soft’ employability skills. Unfortunately, such measurements have proven difficult to devise and the focus has been placed on performance of literacy and numeracy skills since this is easily assessable; the underpinning constituents of competence such as social and affective skills are therefore neglected.
The NSWALNC asserts that generic measurement of literacy and numeracy skills is not a ‘useful approach’ and suggested that the:
… the relationship between literacy, numeracy and workplace performance cannot be understood without a socio-cultural perspective. The literacy and numeracy skills required in any particular workplace, and indeed any particular role within that workplace, cannot be understood without reference to that specific context.
Measurements of gain such as the OLNA (Western Australia’s online literacy and numeracy assessment) and workplace learning feedback are, according to the NAEN:
… far more valuable to an employer, as they demonstrate both the baseline literacy and numeracy skills of the individual, in addition to the enterprising/soft skills required to adapt to a workplace environment.
The NAEN expressed concern that:
[W]hile the ATAR remains the pinnacle measure of success in schools, and attributes to the ranking of the school in the broader education environment, then the emphasis will continue to be on academic gain and not on the holistic growth and readiness of the student to manage life post-school, in whichever direction they choose to pursue.
The NAEN believe that:
[M]ore emphasis is required on the measurement of behavioural traits that employers seek, including qualities such as positive attitude, resilience, persistence, consistency of improvement and initiative.
In NAEN’s opinion school ranking systems:
… should be revisited and schools should be ranked on broader measures than ATAR achievements.
Year13, a group that uses digital tools and processes with stakeholders to engage and inform young people online about post-year 12 opportunities, argues that:
Measurement of gain in school is currently limited to academic success in the ATAR when we think about something measurable that is taken beyond the school grounds. Australian youth are becoming increasingly aware that their worth and intelligence is more than just the ATAR, despite their schools telling them otherwise. This is resulting in incredible disengagement within school classrooms, increased mental health issues and general distrust in the education system and wider government.
Year13 submit that young people should be provided other opportunities to highlight their unique characteristics and personalities and that a:
… more holistic education, where there are clearer and broader measurements of gain is integral in order for students to smoothly enter employment, further education and general life.
The Mitchell Institute agreed with the above proposition, stating that:
…measures of academic achievement are the key priority – as demonstrated by the emphasis that many schools place on lifting National Assessment Program – Literacy and Numeracy (NAPLAN) results and Australian Tertiary Admission Ranks (ATAR). These proxy measures of achievement tend to drive the priorities of teachers, school leaders and education departments, and are used as the main indicator of both student learning, and school and system effectiveness.
While literacy and numeracy are core foundations of learning, the research is clear that young people need more to thrive in the workforce and over their lifetime.
The evidence above suggests that neither NAPLAN nor, in particular, ATAR is considered the best measurement of student gain in education. It is also clear that studying for the ATAR, and the skills it tests, do not necessarily provide, in the eyes of some employers, the best grounding for employment.
The Mitchell Institute’s submission described soft skills as follows:
Capabilities, which are also widely referred to as non‐cognitive skills, enterprise skills, 21st Century skills or soft skills, are the set of skills, behaviours and dispositions which enable individuals to translate their knowledge and skills into meaningful action in changing contexts.
The Australian Curriculum uses the term ‘General Capabilities’:
The seven General Capabilities in the Australian Curriculum are: Literacy, Numeracy, ICT Capability, Critical and Creative Thinking, Personal and Social Capability, Intercultural Understanding and Ethical Understanding.
The Committee has decided to use the term ‘soft skills’ to encompass non-cognitive skills, enterprise skills and 21st Century skills. The Committee notes that care ‘needs to be taken in the context and exact nature of the definitions.’
The Committee received much evidence on the issue of soft skills, 21st Century skills or enterprise skills in its recent inquiry into innovation and creativity. The evidence to this inquiry, some of which is quoted below, was broadly similar to evidence received by the Committee previously.
In its report for the Innovation and Creativity Inquiry, the Committee recommended that Skills Service Organisations:
…require Vocational Education and Training providers to explicitly assess students’ development of soft skills such as effective communication, team work and problem solving in all relevant qualifications…
TAFE Queensland submitted that:
The most significant barrier that employers suggest in engaging school students in a school-based apprenticeship or traineeship, is the lack of soft skills many students possess. Feedback received through TAFE Queensland’s industry engagement activities indicates employers feel school leavers are not provided with the necessary employability skills as part of their schooling to be taken on as apprentices/trainees. This includes a lack of understanding of being work ready, for example:
Willingness to accept authority and critical feedback;
Use of mobile phones and social media in the workplace.
The University of New South Wales makes the point that Work Integrated Learning (WIL) is one way in which students can be helped to gain soft skills. Their submission states:
WIL provides an opportunity for students not only to apply theory to practice, but also to focus upon and develop essential non-technical or ‘soft’ skills relating specifically to the industry in which they will eventually be employed, and positioning them to navigate more strategically a shifting labour market.
WIL is another issue on which the Committee took a great deal of evidence during its Innovation and Creativity Inquiry. Having considered that evidence the Committee recommended that:
Australian Government funding to VET maximises the provision of work integrated learning opportunities; and
the Department of Employment investigate options for a national work integrated learning framework which includes reporting requirements.
Year13 informed the Committee of the frustration of young people whose:
… soft skills often go unrecognised and are consequently undervalued, and they maintain a negative association with their senior schooling years and the ATAR. This drastically encourages disengagement in the classroom, and there is an urgent need for a shift in the system to accommodate the future of work.
State Local Learning & Employment Networks (LLEN) succinctly summed up the evidence suggesting WIL is integral to the gaining of soft skills when they pointed out that:
Immersion of school students within the work place enhances the understanding of a modern workplace and the ‘soft skills’ required and supports their pathways into further education, training and or employment.
TAFE Directors Australia pointed out that:
…high-quality VET qualifications combine the hard and soft skills employers need with the knowledge that underpins those skills. This means that holders of these qualifications won’t need to be retrained as often or for as long as many of their peers.
The Committee discussed whether soft skills or 21st Century Skills could be tested under NAPLAN and the evidence provided by Ms Anderson was that ‘[I]t costs a lot of money.’
Soft skills can make the difference between a young person getting employed and not getting employed. However, these skills are difficult to teach and assess. They are usually ‘picked up’ in the workplace and, therefore, it greatly benefits students to have more access to workplaces to be able to show prospective employers that they have such skills.
However, it is difficult to measure gain in relation to soft skills. For example, Mr David Pattie from the Department of Education and Training told the Committee that:
…we definitely measure literacy and numeracy. We deliver it as part of the curriculum. These 21st century skills are a bit more difficult to measure.
Ms Beth Blackwood from the Association of Heads of Independent Schools of Australia said:
…there is still only rudimentary means of being able to measure those soft skills, there is undoubtedly an emphasis for all students, whether they take those VET programs or whether they take the university pathway, to acquire those skills. We know they need them to be adaptive in this industrial environment.
It is imperative that passionate and competent teachers are supported and recognised for the excellent work they do. Evidence to the Committee shows that the best teachers are those who have the greatest impact on their students’ learning. The key, then, is how to measure this impact.
Measuring teacher impact on individual students is not easy, and the Committee found that tests currently used to assess where students are in their progress can, in turn, show teacher impact.
The Committee sees a need for work to be done on how data from NAPLAN and ATAR can be better used as a measure of student gain, as NAPLAN data is in the Grattan Institute’s ‘years of progress’ measure.
As the Grattan Institute’s evidence shows, NAPLAN data can be used to provide better measurements of student gain. NAPLAN and ATAR should be viewed as important data sets that can be manipulated to provide different types of reporting. Research should be undertaken to look at the data sets provided by NAPLAN and ATAR testing and the different measurements that may be gained by using those data sets.
As the evidence cited above shows, there is a need for measurements of gain, referring to a student’s educational journey over time. Such measurements would be supported by students and employers.
Whilst the Committee acknowledges the importance of soft or 21st Century skills, teaching these skills should not come at the cost of teaching literacy and numeracy. Literacy, digital literacy and numeracy are the bedrock upon which all careers rely. School curriculums should have units of study that increase a student’s skills and capabilities in these areas as well as the areas of soft skills.
One of the most important ways in which a student can be prepared for work is to engage in work. Although not quantifiable in the way ATAR is, work experience should be seen as an excellent measure of gain.
In the VET arena this is best achieved through Work Integrated Learning (WIL). As shown above, the Committee has already considered, and made recommendations on, this issue. However, more work could be done at the secondary school level to assist students in recognising their career goals and their areas of strength, such as giving students the ability to participate in:
work experience and volunteer work;
adult learning environments; and
career assessment and career guidance activities.
Having regard to the evidence above the Committee makes the following recommendations.
The Committee recommends that further research be undertaken into the better use of NAPLAN and ATAR for the purpose of measuring gain, with appropriate metrics to be included as part of NAPLAN reports.
The Committee recommends that further research be undertaken into measurements of gain, referring to a student’s educational journey over time. Without limiting the foregoing, the Committee recommends that research be undertaken as to how to measure gain in relation to the teaching and learning of soft skills.
The Committee recommends that schools incorporate into their curriculum literacy, digital literacy, numeracy and soft skills to prepare students for post-school education, training and work.
The Committee recommends that stakeholders work together to enable secondary schools to increase delivery of:
work experience and volunteer work;
adult learning environments; and
career assessment and career guidance activities in order to assist students in recognising their career goals and their areas of strength.
The Committee recommends that the Australian Government review the differing ways in which industry connections are organised in schools compared with in VET and compared with in higher education, with a view to:
considering what constitutes best-practice;
looking at ‘what works’ to maximise access to quality work-integrated learning opportunities;
identifying ways that both education providers and industry can be proactive in establishing work integrated learning opportunities; and
considering how best to measure the success or otherwise of work-integrated learning arrangements.