The NAPLAN 2017 summary results have been released with the usual mix of criticism, high hopes and panic that marks the yearly unveiling of data.
This year’s results will generate particular interest, as 2017 is the tenth time NAPLAN has been conducted since it was first introduced in 2008.
The final report is not due until December, but the summary results provide a useful opportunity to reflect not only on how young Australians have fared over the past year, but also over the past decade.
What does NAPLAN test?
NAPLAN takes place every year and assesses Australian school students in years 3, 5, 7 and 9 across four domains: reading, writing, language conventions (spelling, and grammar and punctuation), and numeracy.
NAPLAN is a “census assessment”. This means it tests all young people in all schools (government and non-government) across Australia.
NAPLAN uses an assessment scale divided into ten bands to report student progress through Years 3, 5, 7 and 9. Band 1 is the lowest and 10 is the highest.
Each year, NAPLAN data for every school in the nation is published on the publicly accessible My School website.
The Australian Curriculum, Assessment and Reporting Authority (ACARA), which manages NAPLAN and My School, suggests the test and website increase transparency, and allow for fair and meaningful comparisons between schools.
Others, however, argue the website has transformed NAPLAN into a “high-stakes” test with perverse consequences.
How do 2017 data compare to 2016 data?
Compared to 2016 results, 2017 data show:
- no statistically significant difference in achievement in any domain or year level at the national level;
- South Australia recorded the only statistically significant change of any state or territory, with a decline in Year 3 writing achievement;
- New South Wales, Victoria and the Australian Capital Territory continue to be the highest-performing jurisdictions, scoring above the national average across the majority of domains and year levels; and
- the Northern Territory continues to significantly underperform on all measures when compared with other jurisdictions (see, for example, Year 3 reading trends below).
How do 2017 data compare to 2008 data?
Compared to 2008, 2017 data show:
- no statistically significant difference in achievement across the majority of domains and year levels at the national level;
- statistically significant improvements at the national level in: spelling (years 3 and 5); reading (years 3 and 5); numeracy (year 5); and grammar and punctuation (year 3);
- Year 7 writing is the only area to show a statistically significant decline in achievement at the national level (based on data from 2011 to 2017);
- Queensland and Western Australia stand out positively, showing statistically significant improvements across a number of domains and year levels;
- despite high mean achievement overall, there has been a plateauing of results in New South Wales, Victoria and the Australian Capital Territory; and
- students have moved from lower to higher bands of achievement across most domains over the past ten years. This is illustrated in the following graph that shows band shifts in Year 3 reading (green) and Year 9 numeracy (blue).
How many students meet the National Minimum Standards?
Another important NAPLAN indicator is the percentage of students meeting the National Minimum Standards (NMS).
The NMS provide a measure of how many students are performing at or above the minimum expected level for their year level across the domains.
The 2017 national portrait remains positive in relation to the NMS, with percentages over 90% for the majority of domains and year levels.
Year 9 numeracy has the highest NMS percentage of 95.8% at the national level.
Year 9 writing has the lowest NMS percentage of 81.5% at the national level.
The Northern Territory continues to lag significantly behind the rest of the nation across all domains and years, with NMS percentages falling distressingly low in some cases. For example, only 50% of Year 9 students in the Northern Territory meet the NMS for writing.
What are the implications moving forward?
It is safe to say the nation is standing still compared to last year, and has made no dramatic gains since the test was first introduced.
This will be of concern to many, given one of the main justifications for introducing NAPLAN (and committing major investments and resources to it) was to improve student achievement in literacy and numeracy.
The general lack of improvement in NAPLAN is also put into stark relief by the steady decline in Australian students' results in the OECD's Programme for International Student Assessment (PISA).
Those committed to NAPLAN see improving the test as the best way forward, along with improving the ways data are used by system leaders, policymakers, educators, parents and students.
One major change in 2018 is that schools will begin transitioning away from the current pen and paper version to NAPLAN online. ACARA hopes this will produce better assessment, more precise results and a faster turnaround of information.
Schools will initially move to NAPLAN online on an opt-in basis, with the aim of all schools being online by 2019.
Only time will tell whether NAPLAN online has the desired effects, and whether the current pattern of stagnating results continues.
Glenn C. Savage is Senior Lecturer in Public Policy and Sociology of Education and ARC DECRA Fellow (2016-19) at the University of Western Australia.
This article was originally published on The Conversation. Read the original article.