
NAPLAN: A Personal Perspective

The NAPLAN tests, completed in May, have now had their results released after a two-week delay, caused by the results of the standard written tests differing slightly from those of the online testing trial, in which about 20% of students participated this year.

The future of online NAPLAN testing has been explained in several teacher in-service programs, and it shows significant potential to deliver results within weeks rather than months and to grade students' performance in a more sophisticated way. Time will tell, but 2020 is the year in which ACARA hopes to have most students complete their NAPLAN tests online.

The ATC results are listed below and deserve some comment, admittedly from my personal perspective, though they reflect the consensus of the College Leadership Team.

The green shows where the ATC percentage is above the State figure, and the red where it is below.

What else does it tell us?

a) Girls do better than boys at Reading, Writing, Spelling, and Grammar and Punctuation, but not Numeracy. You can tell this by comparing the State results with the State Boys results. Where the overall State results are higher than the State Boys results, the State Girls must obviously have been higher than the State Boys. By contrast, in the right-side columns the reverse is apparent: the State Boys are higher than the State overall, so the boys have done better than the girls in Numeracy. Are these earth-shattering findings? No. The media never mentioned them this week, so they are obviously common knowledge from their perspective. What is interesting for ATC, though, is the significant percentage advantage our students gain as they progress through the years. The school has a strong argument that the improvement from Year 5 through Year 7 to Year 9 reflects the development of our students' skills. Admittedly these are different groups of students, but the trend is meaningful, especially as ATC still loses and gains students in the Year 6 to Year 7 transition.

b) The writing shows promise for us. In this area (Writing), which attracted the most media attention during the week, our students in Year 7 and especially in Year 9 (8.82% above State) showed excellent improvement. Their writing scores are encouraging but need to be seen in comparison with Numeracy. In many individual cases, for example, Year 9 students scored a Band 10 (the highest) for Numeracy but only a Band 8 for Writing.

The graph below shows that 44% of our Year 9 students were in Bands 9 or 10 for Numeracy, while only 14% were in the same group for Writing. So, in many cases, the students in the 44% group could not achieve as well in their writing. But it is not for want of trying in most cases; the Year 9s are an excellent group of students. To me, it shows that writing is a more sophisticated skill, needing more determination to succeed. Right now, the majority of Year 9s are confidently applying the SOH/CAH/TOA rules to their Trigonometry questions and can draw the graph of y = 3x – 4 within a minute. But the challenge of writing a “Hammer” paragraph with three different sentence types would slow most students up.

Our Year 9 students are regularly encouraged to see that, despite their pleasing Numeracy results, it will be their writing skills that set them ahead of thousands of their State peers in their 2021 ATAR assessment. Our writing program is well established from Year 6 through to Year 11 and is integrated across several subjects, not just English. Our Year 11 students lead the way in their determination to write well for the Queensland Core Skills Test in 2019, and their support for the practice programs set in place by Ms Kath Little cannot be faulted.

c) NAPLAN is a very good system. That is my view as a teacher of 40+ years. It disappoints me when I hear on ABC programs that there is a groundswell of discontent against NAPLAN. I am a long-time listener of the ABC, but at times they annoy me in that they can create a groundswell of discontent against just about anything: “too much caramel in MARS bars”, for example; for sure they will find some Mars bar expert who is bitterly disappointed in the direction the MARS bar saga is taking the country. To the clear majority of teachers who use the NAPLAN results, it is not a competition between schools. It was never designed to be. NAPLAN was established by Julia Gillard, as Minister for Education, in about 2008 to identify the schools that need the most assistance. It has worked extremely well in that regard. As a spin-off, we as teachers get an independent snapshot of where our students are with their basic skills every two years. The tests are very well written, at least in the Numeracy section. I have reviewed hundreds of their questions at various age levels, and I am so impressed that the team of writers can take some very straightforward Maths concepts and weave so many neat two- or three-step questions.

The NAPLAN results then allow us to compare a student's school performance with this outside quality assessment. It is helpful and, to me, reliable all the way to Year 12. In my seven years as a Science and Maths teacher at Brisbane State High, I depended on a State system called the MA rating (a scale of 1 to 15) and another called TOLA. NAPLAN seems far superior. I am happy to think that NAPLAN may evolve into a better online system, but talk of scrapping NAPLAN (here is the ABC driving the agenda again) leaves me deflated. What will they replace it with? Will education lose these teams of expert writers who create the tests year after year? Will we go back to guessing our students' abilities?

Long live NAPLAN I say.

Greg Quinn
Acting Dean of Learning


Author(s): Pendergast, Donna; Swain, Katharine
Abstract: In Australia, the introduction of the National Assessment Program for Literacy and Numeracy (NAPLAN) in 2008 marked a national shift towards an accountability-driven education agenda. Administered in Years 3, 5, 7 and 9, the tests sit predominantly across the grades commonly known as the ‘middle years’. Both advocates for and critics of NAPLAN note the potential of the test regime to impact on teaching and learning practices, especially as schools and systems strive to reach benchmarks. Coinciding with the introduction of NAPLAN testing, the nation experienced a renewed interest in middle years’ curriculum, pedagogy and assessment practices. This is evident in, for example, the Melbourne Declaration for Schooling (MCEETYA, 2008), which advocates the need for appropriate educational practices in the middle years to ensure optimal learning outcomes for young Australians. Among the practices is a call for authentic and reflective assessment with high expectations, evidenced by higher-order thinking challenges. It is within this context that this study of the 2009 NAPLAN test instruments of the literacy domain for Years 5, 7 and 9 was undertaken. Using Bloom’s Revised Taxonomy (Krathwohl, Bloom, & Masia, 1973) as the analytic framework, each test item was assessed to determine the cognitive complexity of the item. The analysis reveals that less than 10% of questions in the reading tasks afforded the cognitive challenge of higher-order thinking, and hence the interests of middle years’ education and NAPLAN testing might offer some tension in our education system. This paper presents a snapshot in time of a NAPLAN test and raises questions about the degree to which it contributes to the goal of increasing the cognitive challenge of learning and assessment practices in the middle years. It is argued that the NAPLAN test should be understood to be one insight into student achievement, but not the only or full story reflecting student capabilities. NAPLAN results should be employed in this knowledge and not be seen as a panacea for reforming the education system.