If you found out your local school had a 60% NCEA Level 3 pass rate, would you know whether you should congratulate the principal or demand a sacking?
Students come to school from different starting points. If one school teaches kids who had never seen a book before their first day and another teaches kids whose parents mostly teach at the local university, it would be daft to expect both schools to deliver similar outcomes.
Praising or damning schools for their performance relative to fixed national benchmarks, then, is just a little silly. A 60% pass rate could be a failure or a triumph.
At the same time, differences in student outcomes at similar schools can be vast.
The Auditor General’s report on Maori education, released this month, makes for sobering reading.
Among small decile 1 primary schools, the percentage of Maori students meeting or exceeding the bar on National Standards ranges from just over 20% to just under 90%. Among small decile 2 secondary schools, the percentage of Maori students achieving at or above average NCEA Level 2 results ranges from just under 40% to about 95%.
Students’ backgrounds matter for educational outcomes. But if decile were destiny and student backgrounds were all that mattered, there would not be yawning gaps in performance among broadly similar schools.
The best small decile 2 secondary school would not have a 55 percentage point NCEA achievement lead over the worst-performing small decile 2 secondary school if all that mattered were the mix of incoming students. And the worst-performing small decile 1 primary school would not be more than 60 percentage points behind the best-performing small decile 1 primary school on National Standards.
[From Auditor General's report, Figure 7, page 24.]
The Auditor General’s report focuses on outcomes for Maori students, but the problem is much broader. Education Counts, the government’s education statistics office, finds substantial differences in NCEA Level 3 achievement rates among schools within the same decile and across deciles.
And so there is a lot that schools can do to boost performance.
To start figuring out what might produce such different results, the Auditor General’s team compared high- and low-performing pairs of schools matched on size, type and decile.
Consistently, the schools making better use of data were the better-performing ones. Collecting and analysing student achievement information helped successful schools to improve their practices and target support where it was needed.
It should not be surprising that organisations that set performance goals, monitor progress and update practice when needed wind up having better results than ones that do not.
What is disappointing is that the system does far less than it could to encourage this kind of evaluation.
Imagine a national supermarket chain experiencing this kind of variability in its stores’ profitability. Some stores do a great job of forecasting demand, staffing and stocking appropriately for their communities’ needs; others have a mix of empty shelves and rubbish tips full of wasted produce.
On checking into things, head office would find the worse-performing branches did not have staff able to forecast demand. Some were too small; others had simply never seen the value in it.
Does this seem the kind of thing that could persist for years without action from head office?
The Ministry of Education could easily provide some of that head office assistance for schools. Schools submit student performance data to the ministry and wait months to receive reports back that tell them little that they did not already know.
But the ministry is well placed to provide sector-level analysis that is simply unavailable to individual schools, both because it has access to better data and because it has highly competent statisticians.
The ministry has already made some moves in this direction. This year’s budget targeted increased funding toward schools serving students who were at risk of not completing NCEA.
The government’s analysis found that children from long-term welfare-dependent families were less likely to succeed at NCEA, so schools serving those students were allocated more funding.
As a starting move, it is laudable. But it still does not get at the nub of the problem. Why are some decile 1 schools able to deliver NCEA Level 3 completion rates four times higher than other decile 1 schools?
Schools that make better use of data have better outcomes for their students, but a ministry that made better use of data would be able to tell which schools, and which teachers, do particularly well with different groups of students.
Next week, The New Zealand Initiative will release Martine Udahemuka’s report on performance in the education sector.
There are fairly easy steps the ministry could take to help schools tell where they are succeeding and where they are failing relative to comparable peers, rather than against pass-rate benchmarks that are far too easy for some and impossible for others.
One size doesn’t fit all. Making better use of data would let schools and parents know how well their schools are really doing. Isn’t it time that parents had better measures of school success than NCEA pass rates and decile rankings?