Chapter 2

National Assessment Program – Literacy and Numeracy (NAPLAN)

2.1        This chapter covers the background to NAPLAN, its purpose, uses of the results data and the issues raised during the inquiry regarding the administration of national standardised literacy and numeracy testing.


2.2        Prior to the first NAPLAN tests in May 2008, each Australian state and territory managed its own literacy and numeracy testing regime, commencing in 1989 with the Basic Skills Test in New South Wales. Despite differences in the state and territory regimes, national comparative data was prepared on an annual basis from 1999 through a process called ‘equating'.[1] A similar process is still used today to enable comparisons between tests done in different years.[2]

2.3        In July 2003, the Ministerial Council on Education, Employment, Training and Youth Affairs (MCEETYA) agreed to work towards enhanced collection, consistency and reporting of literacy and numeracy performance data. The Schools Assistance (Learning Together – Achievement through Choice and Opportunity) Act 2004 prescribed the implementation of national tests by 1 January 2008. In July 2006, MCEETYA agreed that national literacy and numeracy testing for all students in years 3, 5, 7 and 9 would commence in 2008. The NAPLAN tests were then administered across all states and territories with the support of all education ministers.[3]

What is NAPLAN?

2.4        The National Assessment Program – Literacy and Numeracy (NAPLAN) commenced in Australian schools in 2008. Each year, all students in years 3, 5, 7 and 9 are assessed on the same days using standardised national tests in Reading, Writing, Language Conventions (Spelling, Grammar and Punctuation) and Numeracy.[4]

2.5        Parents receive individual reports for their children, showing how each child performed compared to the national average, and, in some states and territories, compared to the school average. Schools are provided with detailed student results at the same time as, or sometimes before, parents, depending on the timing of school holidays in individual states and territories. After that:

The results are released to the public in two stages. The first stage is the NAPLAN Summary Report, released in mid-September, showing results at each year level and domain by state and territory and nationally. The second stage is the NAPLAN National Report that includes detailed results by sex, Indigenous status, language background other than English status, parental occupation, parental education, and geolocation (metropolitan, provincial, remote and very remote). The National Report is released at the end of the year of testing.[5]

2.6        Testing at the national level is not a new concept: national assessments are conducted in countries including Belgium, Canada, France, Germany, the Netherlands, South Korea, Sweden, the United Kingdom and the United States.[6] Methods, of course, vary. In Canada, for example, the Pan-Canadian Assessment Program tests sample student groups in reading, mathematics and science literacy. Jurisdictions use the results to validate data from their own, separate jurisdiction-level assessments, as well as Canada's results in Programme for International Student Assessment (PISA) tests.[7]

2.7        A report on assessment systems published by the United Kingdom Qualifications and Curriculum Development Agency in 2007 provides details on a number of national assessment systems, such as that of France, where full-cohort literacy and numeracy testing is in place for Year 3 and Year 6 students and results are published by the Ministry of National Education's Division for Assessment, Evaluation, Potential and Performance.[8]

2.8        The committee majority notes the Australian Primary Principals Association's position on NAPLAN as being only one element of the education system:

...NAPLAN is only one source of information about student achievement and the primary curriculum is designed to promote the social and emotional development of children as well as their academic attainment across all areas of the curriculum.[9]

Who manages NAPLAN?

2.9        Following agreement at a Council of Australian Governments (COAG) meeting in October 2008, a national education authority, the Australian Curriculum, Assessment and Reporting Authority (ACARA), was established in December 2008 by the Australian Curriculum, Assessment and Reporting Authority Act 2008. One of the authority's key tasks is to assess the literacy and numeracy capabilities of the student population, and to this end ACARA manages NAPLAN. It should be noted that NAPLAN is just one aspect of the broader National Assessment Program (NAP).[10]

How are the tests developed?

2.10      ACARA explained that the process for developing the tests is comprehensive and involves input from experts providing services under contract, supported by expert review and state and territory officials. The process takes around 12 months and has five phases: test development; administration; marking; analysis; and reporting of results.[11]

What is the purpose of NAPLAN?

2.11      ACARA notes that the main purpose of NAPLAN testing is:

to identify whether all students have the literacy and numeracy skills and knowledge that provide the critical foundation for other learning and for their productive and rewarding participation in the community.[12]

2.12      The first step towards improvement is the identification of areas of need. As succinctly put in a highly regarded report on the world's top-performing school systems by McKinsey & Company:

All of the top-performing systems...recognise that they cannot improve what they do not measure.[13]

2.13      A large number of submissions considered NAPLAN tests to be useful diagnostic tools.[14] Others described NAPLAN testing as '...an important advance in addressing poor performance'.[15] The Association of Heads of Independent Schools of Australia (AHISA) stated that NAPLAN:

...can help in the development of targeted programs for the professional development of teachers and school improvement...[and]...has unique value in that it provides state/territory and national data that allows principals a broad brush comparative benchmarking of student achievement.[16]

2.14      The ACT Council of Parents and Citizens Associations echoed these views:

Council strongly supports national NAPLAN testing which provides parents with an additional resource on the progress of their child's education and has the opportunity to strengthen partnerships in learning between parents, teachers and schools.[17]

2.15      The Independent Education Union of Australia submitted that:

...the primary purpose of assessment and reporting is to provide meaningful information so as to improve student learning. The reporting process must be an integral part of the teaching and learning process.[18]

2.16      The committee majority also notes the submissions which did not consider NAPLAN tests to be necessary or beneficial, but instead, for example, found them to be a '...low-cost, broad-brush, rough assessment guide, not a fine-grained diagnostic tool'.[19]

How test results are used

2.17      ACARA noted that NAPLAN data serve a range of purposes.

2.18      The Australian Council for Educational Research (ACER) likewise identified a number of uses for the data.

2.19      The committee majority notes that NAPLAN data enable a direct comparison of results from one year to the next. NAPLAN helps schools identify successful programs and areas in need of improvement. Importantly, it assists education systems and governments to identify schools performing well or poorly,[22] which informs the allocation of resources.

2.20      State and territory governments have welcomed NAPLAN testing and confirmed that results are used:

...for system level reporting, for school accountability and for strategic planning. This data has enabled jurisdictions to target their support in resourcing schools and students with the greatest need... Moreover, the real power of the data derived from NAPLAN testing is through jurisdictional analysis whereby schools, and individual teachers, have access to a thorough diagnostic analysis of the performance of each student on each test, provided on a question by question basis.[23]

2.21      The government of South Australia reported that results also help education departments apply measures which assist in developing intervention plans for students who do not meet minimum standards.[24]

2.22      Mrs Sharyn Lidster, Acting General Manager, Strategic Policy and Performance, Department of Education, Tasmania, explained how jurisdictions use test data collected:

The NAPLAN tests are used extensively to support schools. Our jurisdiction, and others, provides the information back to schools. A lot of analysis is done that supports the schools. Workshops are run to help the schools to interpret the results and use them effectively to support their teachers. We also conduct workshops to assist senior people within schools to interpret the information and, where appropriate, we provide additional support for schools in relation to teacher development. Also, funding is provided to support the additional programs to improve the outcomes for students, where they are identified as performing below where we would expect them to be.

In Tasmania we also use the NAPLAN results to link to the teachers' assessments... We link the results of NAPLAN to those assessments and we provide the information back to the teachers to give them an objective piece of information that says that your assessments are consistent with the students' performance on the actual national tests.[25]

2.23      In addition, the data can be used by parents and caregivers to make informed decisions about the education of their child. Standardised tests allow parents to see how their children are performing compared to the national average, and are supported as an additional resource for measuring children's educational progress.[26] As noted by Professor Geoff Masters et al:

Parents and caregivers require valid and reliable information to evaluate the quality of the education the children are receiving, to make informed decisions in the best interests of individual students, and to become active partners in their children's learning. They require dependable information about the progress individuals have made (the knowledge, skills and understandings developed through instruction), about teachers' plans for future learning, and about what they can do to assist. There is also considerable evidence that parents and caregivers want information about how their children are performing in comparison with other children of the same age. And, if they are to make judgements about the quality of the education their children are receiving, they require information that enables meaningful comparisons across schools.[27]

2.24      The Australian Parents Council, putting aside its reservations about the way NAPLAN results are currently used, recognised that parents need and are entitled to information on their children's education and progress.[28] The Australian Education Union submitted that parents have a right to know about their children's progress, but stated that '...there is no inherent right to information concerning other children at the school'.[29]

2.25      Barrack Heights Public School members of the NSW Teachers Federation called for more safeguards around the use of test results in order to prevent profit‑based organisations, such as media outlets and real estate agencies, from manipulating data for financial gain.[30]

2.26      This tension between the rights and advantages of parents and teachers accessing information on student progress on the one hand, and how this information was being used on the other, was evident in other submissions too:

The results of testing should be used to inform the teacher on the progress of students, the effectiveness of their teaching strategies and to give feedback to students and parents. Test results should not be used to make odious comparisons between schools.[31]

2.27      Dr Ben Jensen emphasised how test result data should be used:

Any measure of school performance should not be viewed as an end in itself; they should be a basis of action. NAPLAN and My School should trigger actions that help Australian students.

First, NAPLAN results could be used to trigger actions to help underperforming students. Actions should be taken once a student performs at or below minimum literacy levels in the NAPLAN assessments and a development program and perhaps special assistance introduced until they are performing at appropriate levels. This has been successful in numerous high-performing countries. Second, schools labelled as underperforming on My School should be placed on a development program until they are performing at adequate levels. We are failing the students in these schools and we all need to ensure that these problems are addressed.[32]

Committee majority view

2.28      The committee majority agrees with the fundamental importance of literacy and numeracy, as supported by educational research. It understands that NAPLAN tests, while not in the traditional sense designed as 'diagnostic' assessments, can identify strengths and weaknesses in literacy and numeracy, and recognises the importance of measuring literacy and numeracy.[33] The committee majority notes the many uses for NAPLAN data and agrees that standardised testing is a useful instrument for informing system-wide policy decisions such as the allocation of resources. It helps schools to identify strengths and weaknesses of programs, allows for the comparison of results each year to identify trends and enables parents to follow performance and make informed decisions about the education of their children.

2.29      The committee majority is particularly drawn to the concept of providing underperforming students with a development program, the provision of which would be directly triggered by low NAPLAN results.

Recommendation 1

2.30      The committee majority recommends that ACARA and MCEECDYA explore and report publicly on ways in which to use below-average NAPLAN test results as a trigger for immediate assistance aimed at helping individual schools and students perform at appropriate levels.

Issues raised during the inquiry

2.31      Many submissions referred to national testing as a useful diagnostic tool, and supported it purely in that capacity.[34] The main issues raised in submissions and at the public hearings related to how the results are subsequently published and used. These issues are addressed in Chapter 3. However, submissions did raise some concerns in regard to standardised testing itself, and these are discussed below.

Test methodology and data reliability

2.32      Professor David Andrich in his submission to this inquiry noted:

The benefits that can arise from NAPLAN are based on the assumption that the quality, administration, analysis and reporting of the assessments is of the highest quality. If it is not of the highest quality, then unfortunately the assessments can not only be of little use, but can even be counterproductive. Because Australia has substantial skills and resources in educational assessment, there is no reason that NAPLAN should be of anything but highest quality.[35]

2.33      A number of questions regarding the reliability of NAPLAN test methodology and data were drawn to the committee's attention. Some submissions suggest the tests may be prone to error, unreliable, or for varying reasons and to varying degrees unsuitable for assessing student or teacher performance,[36] for example because they '...are not sufficiently long to produce data of sufficiently high reliability to enable individual intervention or clinical style decisions to be made'.[37]

2.34      Associate Professor Margaret Wu of Melbourne University had concerns regarding margins of error when using NAPLAN tests to measure student performance. Given that NAPLAN consists of only one 40-question test per subject area, Associate Professor Wu concluded that scores in fact contain large margins of error and as such '...do not provide sufficiently accurate information on student performance, student progress or school performance'.[38] This can be additionally problematic as student scores are used to measure school performance, when in fact

...the publication of NAPLAN results on the My School website should be deemed as providing false information to the public, as the red and green bars do not in any way show school performance as claimed by the government.[39]

2.35      Associate Professor Wu concluded that it is 'educationally unsound' to ask parents to make judgements on schools on the basis of NAPLAN results, and expressed her view that scores should not be published or accepted by the public without awareness of the impact of random fluctuations on results. She advocated against accepting scores '...if the confidence level of the results is not revealed'.[40]

2.36      The Australian Education Union (AEU) further illustrated the problem presented by the margin of error in NAPLAN tests:

If you are examining literacy, for example, there might be, for argument's sake, 1,000 things you expect a child to know at nine years of age. They may know 600 of them. They do not know 40 per cent of them and do know 60 per cent of them. Depending on how you pick the test items, they may be picked disproportionately from the 40 per cent they do not know or they may be picked disproportionately from the 60 per cent of facts they do know. When you take that into account, that is where the measurement error for a test arises from.

With a test of 40 items, which the NAPLAN tests are, the measurement error for a student is around 12 per cent. For example, a student whose parents are advised that they have achieved a score of 60 per cent in a literacy test in fact has a score somewhere between 72 and 48. How people can ascribe usefulness to the data or to the My School website in the way that they have is totally beyond belief.[41]
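The band arithmetic in the AEU's example can be sketched as follows. This is purely an illustration of the figures quoted above, not a description of how NAPLAN scores are actually calculated: the 12 percentage point error margin is the AEU's own figure for a 40-item test, and the `score_band` helper is an assumed construct for illustration.

```python
# Illustrative only: converts a reported percentage score into the band implied
# by a fixed measurement error, as in the AEU's example above. The 12-point
# margin is the AEU's figure for a 40-item test, not an official NAPLAN value.

def score_band(reported: float, error: float = 12.0) -> tuple[float, float]:
    """Return the (lower, upper) band around a reported score, clamped to 0-100."""
    return max(0.0, reported - error), min(100.0, reported + error)

# The AEU's example: a reported score of 60 per cent implies a true score
# somewhere between 48 and 72.
print(score_band(60.0))  # (48.0, 72.0)
```

Note that near the ends of the scale the band is clamped, so the uncertainty is not symmetric for very high or very low reported scores.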

2.37      Professor Geoffrey Masters, Chief Executive Officer of the Australian Council for Educational Research (ACER), which develops NAPLAN tests under contract to ACARA and manages the Programme for International Student Assessment (PISA) for the OECD, emphasised that the assessments are '...firmly grounded in 20 years experience through the state literacy and numeracy testing programs.'[42] Professor Masters explained that:

NAPLAN is also pretty firmly grounded in international best practice in tests of this kind. Of course, they are point-in-time tests, so they are limited in that sense. They only assess part of what is important in schools. There is inevitably a degree of imprecision, measurement error, around the estimates that they provide. But they do represent best practice internationally. Part of the reason that the Australian Council for Educational Research are managing the PISA tests for the OECD out of Melbourne is that we do have international expertise in the analysis and reporting of data, and we are applying that to the NAPLAN tests.[43]

2.38      Dr Peter Hill, Chief Executive Officer of ACARA, stated that NAPLAN tests:

...are provided to give an overall snapshot, and for that reason, unlike what was said earlier, we do not provide a score to parents. There is no score provided. In fact, a sheet similar to the one I have here is what is provided. It gives no score at all. It is quite a large dot on a continuum to indicate the position of the students, recognising that there is, indeed, always imprecision in our measures. However, as we aggregate those measures up to a school level and to a system level, then the more reliable those data become.[44]

2.39      Professor Masters made the important point that there is a distinction between measures of student, teacher and school performance:

Some people believe that it is possible to go fairly easily from measures of student performance to measures of teacher performance, school performance or system performance. I do not share that view, and I think most of my colleagues at ACER do not share that view, but it is a commonly held view... I think that at ACER and also at ACARA we have not believed that... [it]...is either educationally or statistically valid...to [as is the case in the UK] move quickly from test results to a number or measure for each school in the country, and that measure is supposed to represent the school's performance so every school can be lined up and ranked on the basis of this one number, based entirely on the test results.[45]

2.40      MCEECDYA met on 15 October 2010 to discuss enhancements to the My School website recommended by ACARA's My School Working Party. The enhancements, most of which will be implemented in December 2010, include depictions of margins of error. Indications of the range in which school average performance may be located, with 90 per cent confidence, will be displayed alongside result data.[46]

Committee majority view

2.41      The committee majority supports NAPLAN tests based on the understanding that they are grounded in international best practice. The committee majority recognises the concerns raised about test quality, and acknowledges that NAPLAN tests are subject to the same limitations in precision which apply to all such assessments. The committee majority believes that the current structure and appearance of the government's My School website leads users to draw unintended or mistaken conclusions about how much the student test result data presented can reveal about teacher and school performance.

2.42      The committee majority strongly supports the abovementioned MCEECDYA initiative on the premise that confidence levels will be displayed on the My School website with adequate prominence, thus providing the community with a greater awareness of the complexities of the test result data.

Testing smaller student cohorts

2.43      The committee was also made aware that using mean school scores to measure school performance is problematic where small cohorts are involved. In such cases the performance of just a handful of students, sometimes one or two, who achieve extremely high or low scores can have an excessive and, over time, erratic effect on the overall school result.[47]

2.44      The committee majority supports the Tasmanian Department of Education's belief that the median may be a better and '...more stable measure for schools with small student populations'.[48]
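The Tasmanian department's point can be illustrated with a hypothetical five-student cohort. The scores below are invented purely for illustration and do not come from any actual school's results:

```python
# Hypothetical illustration: in a small cohort, one extreme score shifts the
# mean markedly while the median is unchanged.
from statistics import mean, median

cohort = [480, 490, 500, 510, 520]    # five students, invented scale scores
shifted = [480, 490, 500, 510, 700]   # same cohort with one extreme high scorer

print(mean(shifted) - mean(cohort))     # the mean rises by 36 points
print(median(cohort), median(shifted))  # the median stays at 500 in both cases
```

With larger cohorts the same single outlier would move the mean far less, which is why the instability is specific to small student populations.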

Recommendation 2

2.45      The committee majority recommends that ACARA assess and report publicly on the potential benefits of moving to a system that reports the median rather than the mean school performance.

Data for researchers

2.46      The Australian Primary Principals Association (APPA) suggested that MCEECDYA amend the guidelines and protocols covering access to NAPLAN data so that qualified researchers can obtain access to de-identified data.[49] The committee majority supports MCEECDYA investigating how this request can best be met.

Testing year 3 is too early

2.47      It was suggested in a small number of submissions that Year 3 NAPLAN tests are too difficult and 'developmentally inappropriate' for this age group, and will ultimately shift teachers' focus onto, for example, persuasive essay writing at a time when they would otherwise concentrate on more basic writing skills.[50]

2.48      Professor Brian Caldwell posed the question:

...[A]re the problems facing Australia so serious that we require students as early as year 3 to complete 40 to 50 mostly multiple choice tests when the information that is furnished and the strategies that should be adopted have been known for at least a decade?[51]

2.49      In its submission the Northern Territory Government also touched on the question of NAPLAN and Year 3 students, saying that the diagnostic needs of the highest and lowest student achievers, particularly those in Year 3, are not currently addressed by NAPLAN.[52] The submission suggests a 'rigorous interrogation' of contextual bias and reconsideration of '...the range of difficulty of test items...to maximise information gathering opportunities at the top and lower ends of student ability'.[53]

NAPLAN and special needs students

2.50      Some submissions were concerned that the low NAPLAN scores attained by special education students or students with learning difficulties could distort school performance results.[54] The NSW Teachers Federation quoted a principal who received an apology from the parent of a child with severe learning disabilities because that child would impact negatively on the school results. This all points to '...a perverse incentive for schools to exclude students who are most in need of support'.[55]

2.51      One suggestion to address this issue was vetting such results prior to reporting NAPLAN test scores on My School.[56] Following this suggestion would mean that special education students would still be tested and their teachers and parents given results, but that their scores would not impact on the overall school results.[57]

2.52      Another concern raised in submissions was that if schools fear underperformance they may discourage students with learning difficulties, those from non-English speaking backgrounds or those who are low achievers from sitting the tests.[58] There was some media reporting alleging that schools discouraged students from attending on test days if they knew their results were likely to pull the school's overall performance down.[59]

2.53      The ACT Council of Parents and Citizens Associations stated that some parents had reported having to provide schools with written requests before children with a disability could sit NAPLAN tests.[60]

2.54      On this point, ACARA clarified:

Every child is eligible and is encouraged to participate in the testing. In the circumstances of students with significant disabilities, a decision is made in consultation between the principal and the parents as to the impact that testing might have on the student from the perspective of either their ability to perform or their inability to perform – the effect that might have on their confidence and their self-esteem. The requirements for that process are clearly set out in the administrative handbook, which is provided to principals in every school.

I suspect what we are looking at is a situation where a parent has had a discussion with a principal about the participation of a student and the principal might as a matter of prudence have decided that it might be worth confirming in writing the parent's consent for the child to participate so that there is no misunderstanding of the circumstances.[61]

2.55      The committee notes that the federal Department of Education, Employment and Workplace Relations (DEEWR) is working towards developing a nationally agreed definition of disability.[62]

2.56      Following particular allegations in Victoria that some students who did not sit the tests did not meet criteria for exemption, such as having an intellectual disability, the Education Minister, Bronwyn Pike MP, reported that between 4.7 and 8.6 per cent of students were absent or withdrawn from school on the day of NAPLAN testing, a particularly high rate. To address this, the Victorian government has requested that principals formally agree to ensure the highest possible level of participation.[63]

2.57      At its October 2010 meeting MCEECDYA formally endorsed ACARA's proposal to include student participation data more prominently on the My School website, including absences, exclusions and withdrawals.[64]

Committee majority view

2.58      The committee majority notes that the NAPLAN Frequently Asked Questions (FAQ) website advises that all Australian governments have committed to promoting maximum participation of students in the national assessment process.[65] Common national practices for providing students with special support, adjustments and accommodations for the administration of the NAPLAN tests have been agreed.[66]

2.59      The committee notes the concerning evidence and allegations of schools attempting to manipulate test results by urging potential low achievers to stay home. Despite the relatively small number of such cases and the commitment from federal, state and territory governments to address the issue by tracking participation rates, the committee majority is deeply concerned to ensure that parents of children with a disability are given adequate opportunity to access information about their children's progress, to which they are entitled like any other parent.[67]

Recommendation 3

2.60      The committee majority recommends that MCEECDYA and relevant jurisdictional test administration authorities examine and report publicly on ways to ensure that children with disabilities are not discriminated against and denied the right to participate in national testing.

NAPLAN and learners of the English language

2.61      The committee received submissions calling attention to possible inadequacies of the NAPLAN tests for some Indigenous students who may not speak Standard Australian English as their first language but are not treated as students with a Language Background Other Than English (LBOTE). Queensland Indigenous ESL and FNQ (Far North Queensland) Language Perspective stated:

The current measures of disaggregation according to Indigenous status, students with a language background other than English (LBOTE) and the Index of Community Socio-Educational Advantage provide a false and pernicious picture of Indigenous learner performance and are leading to inappropriate and wasteful measures of intervention.

...[W]ithout the requirement of second language assessment in schools for students being tested on NAPLAN, Indigenous learners are generally not placed within the LBOTE group.[68]

2.62      NAPLAN may also have limitations in capturing the progress of students starting to learn English as a second language,[69] which again may affect not only children of recent migrants, but also some Indigenous students who speak non‑standard English dialects.[70]

2.63      Submissions argued this is because NAPLAN tests do not collect adequate information about students who are learning English as an additional language or dialect, and may not recognise them as such before reporting their sub-standard results.[71]

2.64      Suggestions for addressing this included the collection of more specific data for students from a non-English speaking background.[72] Disaggregated data would enable specific support for these categories of students.[73]

Committee majority view

2.65      The committee majority supports the collection of more specific student data which would then be used to ensure appropriate support is available.

2.66      The committee majority notes that the My School website will in future display the percentages of students from a non-English speaking background.[74]

Recommendation 4

2.67      The committee majority recommends that ACARA analyse and report publicly on how NAPLAN tests are serving different groups of Language Background Other Than English (LBOTE) students.

Indigenous student performance

2.68      The committee majority notes the situation in the Northern Territory, where Indigenous students comprise over 40 per cent of the student population, with particular concern. International testing shows that Indigenous students are overrepresented in the lowest performance categories for both literacy and numeracy, and underrepresented in the highest.[75] This trend is evident nationally as well, with Indigenous students considered the most educationally disadvantaged.[76]

2.69      The Council of Australian Governments (COAG) reports a marked difference in NAPLAN results in the Northern Territory, where only between 62.5 per cent and 77.0 per cent of students reach the national minimum standard in all domains and in all years. All other states and territories perform well against the national benchmark, with varying room for improvement. The committee notes that this difference between the Northern Territory and other states and territories reflects particularly negatively on educational outcomes for Indigenous students, who comprise 40.7 per cent of the school population in the Northern Territory. NAPLAN results for non-Indigenous students in the Northern Territory are comparable to other jurisdictions.F[77]

2.70      The committee is aware of evidence which suggests that students who begin to fall behind without being caught and helped to catch up will continue to fall further and further behind, making these achievement gaps increasingly difficult to close as time goes by.F[78]

2.71      Speaking before the committee, Dr Peter Hill emphasised the importance of addressing quality of teaching in areas of greatest student need:

I think that what you said earlier about the heart of the matter being teacher quality is absolutely true. All of the international surveys point to teacher quality – or quality of teaching, should I say – being the key in all of this. What we need to use My School for is to understand where the problems are and then to have strategies that can come in, address quality of teaching... I am thinking particularly of our Indigenous students; we cannot have any pride at all in our record of achievement there. We need to put in additional resources and do what we can to improve the quality of the teaching that those students experience.[79]

2.72      The above sentiment was echoed by Professor Brian Caldwell, who stated that '...there is almost unanimity among researchers and policymakers that the most important resource of all is the quality of teaching'.[80]

Committee majority view

2.73      The committee majority believes that, in order to produce a more accurate and detailed picture of students who are achieving below the benchmarks, ACARA should investigate ways to provide access to rich information on lower achievers so that targeted support and resources can be made available.

2.74      The committee majority also believes that more emphasis must be placed on nurturing and developing the professional skills of teachers, in particular those working to improve outcomes for lower student achievers.

Recommendation 5

2.75             The committee majority recommends that ACARA investigate and report to MCEECDYA on enhancing NAPLAN to support the diagnostic needs of higher and lower student achievers.

Expanding testing

2.76      Dr Ben Jensen, Director of the School Education Program at the Grattan Institute, explained that the best predictor of a student's likely performance on a given day is their performance in the previous year:

There has been a lot of work done on this in the United States. In some states where there is serious testing in a wide variety of subjects it is annual or more frequently than that. They actually do not bother using the socio-economic background characteristics of students because it just drops out of the model. It is not important; it does not matter. What is much more important is your progression, how much you have progressed and, of course, where that begins.[81]

2.77      Dr Jensen, while not advocating increased frequency of testing, explained that the two-year gap between tests in Australia means that extrinsic factors such as socio‑economic background have an increased opportunity to influence individual student performance.[82] Australian students currently take NAPLAN tests only four times over the course of their schooling, in Years 3, 5, 7 and 9. More frequent testing would lessen the influence of extrinsic factors and provide a more accurate picture of student progress.

2.78      More frequent testing would also enable schools and policymakers to more accurately capture the progress of students who change schools between primary and high school. Currently, My School includes comparisons of Year 7-12 high schools and P-7 primary schools, despite the fact that in some jurisdictions Year 7 is the first year of high school and students may have been at a different school three months prior to the NAPLAN tests.[83] The absence of information on student performance in Years 6 and 8 makes it difficult to ascertain which school, the primary or the high school, is responsible for student performance in Year 7.

Committee majority view

2.79      The committee majority supports the collection and reporting of information about progress in schools and believes it should be expanded. In order to provide parents, teachers and government with the best possible record of student progress and immediately begin to address serious gaps between our highest and lowest achievers, a national test designed to measure improvement for students should be conducted every year. The committee majority believes that more frequent testing would help drive momentum for helping underperforming students.

Recommendation 6

2.80      The committee majority recommends that ACARA and MCEECDYA expand NAPLAN to include annual testing from years 3 to 10 in order to more accurately track student performance and give parents, teachers and policymakers a far better understanding of how students, teachers and schools are progressing.

Timing of test administration

2.81      Submissions pointed out that students are assessed against their respective year standard before being taught most of the year's curriculum because tests are administered in the first half of the year (with results not available until the second half). It was argued that this could disadvantage students whose schools teach relevant material later in the year, and could lead schools to alter teaching plans in order to attain better NAPLAN results.[84]

2.82      The timing of the tests also means that by the time teachers and parents receive student results, in the second half of the school year, it is too late to incorporate any resulting teaching requirements into that year's teaching program. This may have led to some schools pressuring teachers of Year 1 and 2 students to prepare students for concepts tested in Year 3 NAPLAN tests.[85]

2.83      Dr Peter Hill, Chief Executive Officer of ACARA, explained that:

The purpose [of national testing] has been to get a snapshot of student performance for reporting back at different levels: at the parent level, at the school level, at the jurisdiction sector level and the national level. That was the purpose from the beginning, and the purpose has never been diagnostic assessment.

Diagnostic assessment means that we look at the reasons why students are, perhaps, not performing. For that purpose we need an immediate feedback; these tests are broad in scope and would not be very useful for diagnostic purposes, particularly as the results come through very late.[86]

Committee majority view

2.84      The committee majority believes that the government's poor communication of the intended purpose of NAPLAN tests has led to widespread community misunderstanding or confusion about the capacity of the tests to diagnose why a child is performing at a particular level. The committee majority notes that then-Education Minister the Hon. Julia Gillard MP may have helped perpetuate an erroneous perception of the purpose of NAPLAN tests by stating, of the test result data:

It's important to teachers; they do value this diagnostic information to work out what they need to do next for the children in their class.[87]

2.85      The committee majority believes that a better communication strategy is needed to explain the true purpose of NAPLAN tests.

Security of the tests and allegations of cheating

2.86      The committee majority notes media reports of teachers and schools allegedly cheating to boost their NAPLAN results.[88] Save Our Schools pointed out that schools and teachers can cheat in various ways and argued that this calls into question the reliability of NAPLAN tests in measuring school performance.[89]

2.87      Submissions identified that increased accountability pressure may unintentionally increase the likelihood of cheating.[90] A number of suggestions were made to address this issue. The ACT Council of Parents and Citizens Associations recommended a review of the document that outlines how tests should be conducted.[91] The New South Wales Primary Principals' Association suggested publishing clear and uniform delivery protocols and highlighting the consequences of any breaches.[92]

2.88      Responsibility for test material during development falls to ACARA, which '...prescribes security requirements for states and territories, schools and principals in a nationally agreed document – National Protocols for Test Administration.' State and territory jurisdictions are responsible for security during test administration. Test administration authorities in states and territories are responsible for investigating any allegations of security breaches.[93]

2.89      The committee majority is aware that MCEECDYA has endorsed a range of measures aimed at enhancing test administration security. ACARA is now working with state and territory authorities to strengthen test security for 2011, and is:

...mounting a multi-level communication strategy in 2011 to further develop understanding of the required protocols for the management of test materials on the part of schools, principals and staff.[94]

2.90      ACARA has informed the committee of plans to include annual statements on its website detailing all reports of security breaches, the status of reported cases and outcomes of any subsequent investigations. Schools and individuals will not be identified. Education ministers are currently considering a draft of the first statement.[95]

Committee majority view

2.91      The committee majority believes that the community should be able to have confidence in the testing process and that uniform test administration guidelines should be developed and made publicly available as a matter of priority.

2.92      The committee majority notes that steps have been taken by state and territory education departments to investigate and address allegations of cheating, with more than 51 separate investigations under way.[96] The committee majority recognises that the government has been firm in its commitment to supporting investigations into all allegations of cheating.[97]

2.93      The committee majority also notes that the Australian Primary Principals Association has developed, and sent to the government for consideration, a set of principles and proposed safeguards aimed at mitigating the negative impacts of the test administration and reporting regime.[98]

Recommendation 7

2.94             The committee majority recommends that MCEECDYA explore ways for state and territory test administration authorities to more strongly enforce security protocols.

2.95      The committee majority believes that NAPLAN is an important foundation for measuring the performance of students, but that it needs to be strengthened in a number of ways. It needs to provide a more accurate and detailed picture for all students, particularly those not meeting performance standards. Test developers should progressively reduce the margin of error so that NAPLAN becomes a more accurate tool. Furthermore, to provide a better understanding of student progress trajectories from year to year, the committee majority believes that national testing should be conducted every year. This would be particularly beneficial for students who do not meet national benchmarks, as it would help build a sense of urgency and reduce the delay in delivering targeted assistance. These enhancements will provide a more accurate and detailed picture of students' ability without the influence of extrinsic factors, and will give policymakers and schools an informed picture of which educational programs are working and which are not.

2.96      Building on the enhancements outlined above, the committee majority also believes that substantial work is required to address the significant issues raised during the committee's inquiry in relation to the reporting of NAPLAN data on the My School website. These matters are covered in the next chapter.
