It’s the standard critique of local government, and it’s often right. Councillors and mayors are more interested in spending a lot of money on flashy convention centres than on maintaining the pipes. Deferred maintenance can be some future council’s problem. Ribbon cutting at new venues brings benefits today.
But it isn’t just local government.
On 4 February, an important data pipe at Statistics New Zealand burst. Or had to be turned off. In either case, the nondescript notice went up: “4 February 2022: NZ.Stat is currently unavailable”.
I did not notice it at the time.
But five days later, Tze Ming Mok despaired that she was no longer able to do her job because Statistics New Zealand had taken down NZ.Stat, with no notice, and no apparent plans to replace the service.
She posted a screenshot of the update she had received from SNZ about the outage. SNZ told her, and she told the rest of us, that SNZ had taken the tool offline following a security review and “because the infrastructure supporting NZ.Stat has become outdated.” The tool would be unavailable “for the foreseeable future.”
And a howl went up from New Zealand’s data geeks on Twitter.
Dr Emily Harvey said, “If only they’d told us so we could have scraped all the data at the finest resolution pre-emptively. Is it time for data Twitter to pool their stored data extracts?”
Economist Brad Olsen drew an apt comparison: “This is the data nerd equivalent of Instagram going down. Some concerned looks and incredulous stares in the @InfometricsNZ office! #NZDotStatDown”.
NZ.Stat and Infoshare are the two main ways that the public, whether journalists, researchers and academics, or interested citizens, can access critical statistics held by Statistics New Zealand.
They are also the main way that public servants access the country’s statistics.
Infoshare is the creaking workhorse that holds most of the statistics that economists would use most days. I do not know how old it is, but I was using it in the mid-2000s. The interface is antiquated but it still somehow works.
A decade ago, Statistics New Zealand announced the launch of NZ.Stat in its Annual Report, describing it as “a world-class web tool for publishing statistics that enables businesses to find relevant information quickly and easily.”
Some of New Zealand’s vital statistics are only available via NZ.Stat. If you want the results of the annual household incomes survey, they’re found in NZ.Stat. Oodles of data on households are only available through NZ.Stat.
The system collapsing is not good.
As an interim measure, Statistics New Zealand has a form inviting the public to tell them what data they need, and whether they need it in 2-3 days, 4-5 days, or 5+ days. The form asks researchers who they are and why they need the data, presumably to help the agency triage requests.
I am certain that the agency is doing everything possible to make the best of a very bad situation. I have only ever had excellent interactions with their staff, who will all be doing their damnedest to get statistics to people who need them.
But it never should have come to this.
After the front fell off of NZ.Stat, one informed data analyst, David Friggens, reported that the system was running on software that was five versions out of date. NZ.Stat is built on OECD.Stat. The Australian Bureau of Statistics, by his report, has version 9 of OECD.Stat in beta. And while Statistics New Zealand had recently started a project to upgrade to Version 10, the version of NZ.Stat that failed was Version 5.
I asked Statistics New Zealand whether Mr Friggens was correct. Statistics New Zealand confirmed, via email, that they have been running “a legacy version of OECD.Stat software, with a project underway to upgrade this”, but did not specify which version has been in use.
To its credit, Statistics New Zealand had recognised some of the risk it faced. The agency's Statement of Strategic Intentions 2021-25 set out workstreams aimed at ensuring "core information technology systems are at less risk of failing". It sought to "identify risks to core systems and track the effectiveness of mitigating actions to ensure the stability of these systems."
Unfortunately, it came a bit too late.
For years Statistics New Zealand has, like the proverbial local government, chased after shiny new objects while largely ignoring the critical infrastructure necessary to keep the whole operation running.
The problem is not just budgets.
The problem is also priorities.
In 2018, Statistics New Zealand began a new project, Indicators Aotearoa New Zealand. The Agency held meetings across the country, and an online campaign, asking Kiwis what wellbeing meant to them. Data experts then tried to distil the essence of those meetings into indicators that might possibly be able to be measured.
It culminated in a meeting of about 150 people at the Michael Fowler Centre that included two hours of introductory remarks and a game of check-your-privilege Bingo. Then followed hours of discussion about potential new statistics across a range of wellbeing domains, including spiritual wellbeing.
New initiatives of this sort can be highly worthwhile if core business is not being neglected. But while attendees at the Indicators Aotearoa New Zealand event in Wellington were playing a hand-clapping game, Statistics New Zealand was botching the Census and core data infrastructure was decaying.
The Indicators Aotearoa New Zealand website is now live and active – though they have not yet found a way to measure spiritual wellbeing.
More recently, Statistics New Zealand has been devoting a lot of effort to better measurements of child poverty. Prior to Covid, the Government set reducing child poverty as a key part of its wellbeing agenda. The Child Poverty Reduction Act 2018 requires that the Government Statistician report annually on ten measures of child poverty. The Agency consequently produced a series of working papers explaining how it was defining the measures. It also substantially expanded the Household Economic Survey to provide the necessary data.
Trying to measure spiritual wellbeing is inherently risible.
Measuring child poverty most certainly is not.
But taking on new and expensive workstreams while not attending to the infrastructure supporting core statistics is very risky. The Household Economic Survey, which provides the child poverty measures, was only available through NZ.Stat – and now is only available by special request.
Because of the no-notice killing of NZ.Stat, a few other economists and I have started checking whether we can build an external mirror of Statistics New Zealand’s Infoshare archive. Losing access to NZ.Stat is bad. Losing Infoshare would be a disaster. And Infoshare is even older and more decrepit than NZ.Stat was before the front fell off of it.
Scraping and mirroring the giant spreadsheets underlying Infoshare would be valuable insurance in case Statistics New Zealand ever decides that it must abruptly end that service as well.
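In practice, that insurance amounts to little more than periodically fetching each published table and keeping a checksum manifest, so a mirror only re-stores files that have actually changed. A minimal sketch of the bookkeeping in Python follows; the table name is purely illustrative, and any real mirror would take its file list and download URLs from Infoshare's own catalogue rather than anything shown here:

```python
import hashlib


def sha256_digest(data: bytes) -> str:
    """Hex SHA-256 digest of a downloaded table, used as a change marker."""
    return hashlib.sha256(data).hexdigest()


def needs_update(manifest: dict, name: str, data: bytes) -> bool:
    """True if the table is new, or its contents changed since the last mirror run."""
    return manifest.get(name) != sha256_digest(data)


def record(manifest: dict, name: str, data: bytes) -> None:
    """Store the table's digest so the next run can skip unchanged files."""
    manifest[name] = sha256_digest(data)


# Hypothetical usage, one table per run:
#   data = download("hes_incomes.csv")   # fetch from the live service
#   if needs_update(manifest, "hes_incomes.csv", data):
#       save_copy("hes_incomes.csv", data)
#       record(manifest, "hes_incomes.csv", data)
```

Checksumming matters because re-downloading every table on every run is cheap, but re-archiving unchanged copies would bloat the mirror and obscure which releases actually carried revisions.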
The overall problem is hardly limited to Statistics New Zealand. Upgrading ministry back-end systems that enable both business-as-usual activities and new initiatives is never as appealing as announcing new things.
Central government likes to criticise local government for attending to every priority other than boring upgrades and maintenance to the underground pipes.
But when local government fails to fix its pipes, eventually everyone notices the sewage flowing out into the harbour. That sewage quickly makes headlines.
Central government ignored its data pipes and they have started breaking. The consequences are just as real as bursting wastewater pipes, but they are harder to see.
Perhaps both central and local government could, for once, try to make sure that core infrastructure is up to spec before taking on new nice-to-have initiatives.