A distinct advantage that the bench sciences and medical sciences have over the social sciences is the ability to run experiments. If a clinician wants to run a trial to check whether a new drug might work, the regulatory hurdles can be substantial, but the trial is certainly possible. If an economist wants to run an experiment to check the effects of a policy change, well, it can be more difficult to convince the government to give it a go.
But when a natural experiment comes along, like the proposed changes in Auckland school boundaries, you would think it would at least be worth collecting the data on it. Sadly, it isn’t being done. It should be.
But first, what’s a natural experiment?
Suppose an economist wanted to test whether students do better at one school rather than another. Differences in practice or in teaching methods at different schools could lead to different rates of success either at school or after graduation. But differences in outcomes could also be due to differences in the students that attend different schools, or to differences in peer effects within classrooms. Simple differences across schools in things like NCEA league tables or university attendance can have multiple possible explanations.
Randomly assigning students to schools might hit something close to a clinical gold standard, but good luck getting approval to do it.
Instead, economists generally have to use observational data on students, their families, the schools they attend, and their outcomes. New Zealand’s statistical framework for that kind of work is world-leading: researchers in Statistics New Zealand’s data labs can adjust for dozens of potential differences in students’ family backgrounds.
Those kinds of adjustments quickly show that, at least when it comes to NCEA outcomes, differences across schools are a lot smaller than you might have thought. Family background, and especially the education of students’ parents, explains a lot of the differences in outcomes across schools.
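The logic of that kind of adjustment can be sketched very roughly as comparing like with like: look at school differences within groups of similar family background, then average. A toy Python example, with entirely invented data (real data-lab work adjusts for dozens of background variables, not one):

```python
from collections import defaultdict
from statistics import mean

def adjusted_school_gaps(records):
    """Average each school's outcomes within strata of parental education,
    then average across strata, so differences in family background
    between schools are held constant. `records` holds
    (school, parent_education, outcome) tuples. Purely illustrative."""
    by_cell = defaultdict(list)
    for school, stratum, outcome in records:
        by_cell[(school, stratum)].append(outcome)
    schools = {s for s, _ in by_cell}
    strata = {st for _, st in by_cell}
    return {s: mean(mean(by_cell[(s, st)]) for st in strata if (s, st) in by_cell)
            for s in schools}

# Invented data: school X enrols more children of degree-holding parents.
records = [("X", "degree", 80), ("X", "degree", 82), ("X", "none", 70),
           ("Y", "degree", 81), ("Y", "none", 69), ("Y", "none", 71)]
gaps = adjusted_school_gaps(records)
# Raw means differ (X ≈ 77.3 vs Y ≈ 73.7); adjusted means are equal (75.5 each).
```

In this made-up case, the whole raw gap between the two schools comes from who enrols, not from what the schools do, which is exactly the pattern the data-lab adjustments often reveal.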
But some schools still show substantially better, or worse, outcomes than would be expected. For the high performers, is it because they’re doing a particularly good job? Or is it because they have managed to attract a particularly promising pool of students? The schools can observe things about students that are invisible to researchers in the data lab.
These kinds of selection effects are more difficult to deal with. Parents and prospective parents choose where to live, in part, because of the reputation of the local schools. For those able to choose from houses in different parts of town, those who care most about their children’s education will put the most weight on the quality of the local schools when buying or renting a home. If that school then appears to have very strong results, is it because the school is very good? Or is it because the school has attracted a lot of families who care far more than average about education and who also then provide their children with a lot of additional assistance?
In either case, the effects of the school’s perceived quality can turn into substantial school zone effects in property prices. Moving into the neighbourhoods zoned for the most popular schools will prove expensive. Some families then seek out-of-zone placement into those schools.
The Ministry of Education has considered out-of-zone placements to be ‘cream-skimming’ because students seeking out-of-zone enrolments come from families with more education and higher income than others at the neighbourhood school. But while cream-skimming, in popular usage, suggests an active role by schools in choosing more promising students, it could just as easily be the case that more educated parents are better able to tell whether their local school is a dud and consequently seek an out-of-zone placement. In either case there will be difficult statistical inference issues, but the latter mechanism could be rather less worrying.
And all of this finally brings us to natural experiments. Statistical inference is hard when selection issues are at play, and nobody would let a team of economists re-draw school boundaries just to test these kinds of effects.
But the Ministry of Education has proposed a substantial set of changes to school enrolment zones in Auckland. Some 135 enrolment zones are being changed so that schools that are at capacity do not need to build new classrooms, while schools with excess capacity would have something of a captive audience with few choices. People who chose houses because they thought their children might be able to attend one school will find they won’t be able to do so; others might find surprising new options available.
The changes in zoning can make for a wonderful natural experiment. Children whose families expected them to attend one school will wind up at another, for reasons entirely outside their control. One side of a street could remain in-zone, with the other side shifting out-of-zone. The luck of the draw will have determined which side of the street any family might have wound up on, along with their zone. If that leads to sharp differences in educational outcomes for kids on opposite sides of the street, that can tell us something about school quality.
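The side-of-street comparison amounts to a simple difference in means between families the new boundary happened to split: because neither side chose its zone, the gap is hard to attribute to anything but the schools. A minimal Python sketch, with invented scores rather than real NCEA data:

```python
import statistics

def side_of_street_comparison(in_zone_scores, out_zone_scores):
    """Compare mean outcomes for otherwise-similar families split by a
    re-drawn zone boundary. Because which side of the street a family
    bought on was effectively random with respect to the new line, the
    raw difference in means estimates the school's effect.
    Returns (difference, standard_error)."""
    diff = statistics.mean(in_zone_scores) - statistics.mean(out_zone_scores)
    se = (statistics.variance(in_zone_scores) / len(in_zone_scores)
          + statistics.variance(out_zone_scores) / len(out_zone_scores)) ** 0.5
    return diff, se

# Hypothetical outcome scores for households on each side of one boundary.
in_zone = [78, 82, 75, 80, 79, 83]
out_zone = [74, 76, 73, 77, 75, 72]
diff, se = side_of_street_comparison(in_zone, out_zone)
```

The design only works near the boundary, where families on either side really are comparable; further away, the usual selection problems creep back in.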
But there is a catch. Big changes in school boundaries will mean a lot more applications for out-of-zone placement. While it is easy to tell which schools students have wound up attending, and where they live, researchers cannot tell which families have applied for out-of-zone placements. They can only tell where the students wind up, and whether that school is outside of their local school zone.
The Ministry of Education, as best we can tell, has no plans to have schools provide it with lists of out-of-zone applications, and of the schools’ offers of out-of-zone placements. That information could be rather useful.
If schools use a lottery to determine which applications win, then researchers could compare results for lottery winners and losers as another way of assessing school performance. If schools use other criteria for deciding who to admit, that too could be known and could be used in checking the importance of selection effects. And heaps of applications for out-of-zone placements from families in any particular school’s zone could provide the Ministry with an early warning sign of problems at that school.
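The early-warning idea is simple enough to sketch: count how many of a school’s own in-zone families are applying to leave, as a share of its zone roll. A hedged Python illustration, with hypothetical school names and counts (the Ministry holds no such dataset today, which is rather the point):

```python
from collections import Counter

def exit_pressure(applications, zone_rolls):
    """For each school, the share of its own in-zone families applying
    for out-of-zone placement elsewhere. A high share could serve as an
    early-warning sign of problems at that school.
    `applications` is a list of (home_zone_school, applied_to_school)
    pairs; `zone_rolls` maps school -> in-zone family count.
    All names and numbers here are hypothetical."""
    leavers = Counter(home for home, dest in applications if home != dest)
    return {school: leavers.get(school, 0) / roll
            for school, roll in zone_rolls.items()}

apps = [("A", "B"), ("A", "B"), ("A", "C"), ("B", "B"), ("C", "A")]
rolls = {"A": 10, "B": 20, "C": 5}
pressure = exit_pressure(apps, rolls)
# School A's own families are queuing to leave; school B's are staying put.
```

None of this is sophisticated; it just requires that someone collect the application lists in the first place.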
Even if the Ministry of Education does not see it that way, what it is proposing for Auckland is an experiment. But experiments are not nearly as useful if nobody bothers collecting the relevant data from them. If the Ministry really wants to impose a pile of zoning changes that will have substantial effects on families all across the city, it could at least have the courtesy of gathering the data that could help show the effects of the experiment.