LANCE WIGGS: Measure then fix – why 107 academics are wrong

A group of academics signed off on a letter against school league tables. The stated logic may work in an academic research setting but is inappropriate to apply to the real world. Let business model the data.

Lance Wiggs
Wed, 18 Jul 2012

A group of academics signed off on a letter against school league tables. The stated logic may work in an academic research setting but is inappropriate to apply to the real world. We should instead publish the measurements, improve the measurements and their context over time and, most importantly, focus energy and resources on understanding the issues and helping the schools at the bottom of the league.

Let's look at the reasoning used in the media briefing note:

1. National Standards data are unsuitable for comparing schools

The performance of schools cannot meaningfully be compared with each other unless it can be demonstrated that assessment measures, processes and moderation have been used consistently across schools. <snip>

The argument is that results vary widely between schools and across years: a combination of measurement error from inconsistent assessment and small samples, and of the National Standards measuring only numeracy and literacy rather than more holistic skills.

However, to improve something we first need to measure it, and if we can't measure it accurately then an approximation will do. In business that means using customer surveys with clear sampling bias, reacting more to the customers who complain and even believing what we read in the papers. We know all of these sources are incomplete and biased, but we can account for that somewhat, and we are much better off for using the input. The online advertising industry is a lovely example: the system it uses to measure traffic is clearly wrong, but it is wrong for everyone, and it is only the starting point for a conversation.
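As a rough illustration of how a biased source can still be useful once the bias is accounted for, here is a minimal Python sketch, with invented numbers and group names, that reweights a survey dominated by complaining customers back to their true share of the customer base:

# A minimal sketch, with invented numbers, of correcting a biased customer
# survey by reweighting each respondent group to its known share of the
# customer base (complainers answer surveys far more often than happy customers).

survey_scores = {"complained": 4.2, "did_not_complain": 7.9}       # average score per group
survey_share = {"complained": 0.60, "did_not_complain": 0.40}      # share of survey responses
population_share = {"complained": 0.10, "did_not_complain": 0.90}  # share of actual customers

# Naive estimate: average the survey as collected
naive = sum(survey_scores[g] * survey_share[g] for g in survey_scores)

# Reweighted estimate: weight each group by its true share of the customer base
reweighted = sum(survey_scores[g] * population_share[g] for g in survey_scores)

print(f"naive estimate:      {naive:.2f}")       # dragged down by over-sampled complainers
print(f"reweighted estimate: {reweighted:.2f}")  # closer to the real picture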

It's far easier to start a conversation about the quality of a school when confronted with a combination of socio-economic data about the catchment area and the National Standards results over time.

2. The contextualising data are incomplete

Many elements of the school’s local community context affect teaching and learning processes and children’s achievement. These include socio-economic and other intake differences (such as ethnicity, student transience rates, the proportion of English language learners or children with special needs) and other school and area characteristics (local labour market, urban/rural location, popularity compared to surrounding schools).

There are also internal school contexts, such as past leadership or reputational issues, significant staffing changes or schools being damaged.

Many attempts at comparing school performance do not even try to use the best available statistical methodologies. Instead the school decile rating is typically used as a proxy for all these contextual indicators.

I agree. The National Standards data are only one piece of the puzzle, and the puzzle needs to be completed.

However, we need to start somewhere: create a minimum viable product and steadily improve it over time. While many criticised the early versions of the iPhone, Xero and even Powershop, it was the steady improvement in functionality and usability that won consumers over. It's the same with a measurement system that relies on a variety of data. Some of the early data will be wrong, and some of the things worth measuring will be missing, but we should accept that and steadily improve the quality and context over time. If we don't have the right socio-economic data, for example, then someone will find it and mash it up with the National Standards data. The 107 academics are ideally placed to perform this work.
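As a sketch of what such a mash-up could look like, assuming the National Standards results and school decile data were published as simple open files (the file names and column names below are hypothetical), in Python:

import pandas as pd

# Hypothetical open-data files: the published National Standards results and
# each school's socio-economic (decile) context. Names are assumptions.
results = pd.read_csv("national_standards_results.csv")  # school_id, year, pct_at_or_above
context = pd.read_csv("school_decile_data.csv")          # school_id, decile, roll_size

# Join the achievement data to its socio-economic context
combined = results.merge(context, on="school_id", how="left")

# A first, crude contextual view: average achievement by decile and year.
# This is a starting point for a conversation, not a verdict on any one school.
by_decile = (
    combined.groupby(["decile", "year"])["pct_at_or_above"]
    .mean()
    .reset_index()
)
print(by_decile.head())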

The reference source of information on schools will no doubt be the website that combines the highest quality information in a way that is meaningful to parents, teachers and students. Releasing the data in an open form is the first step towards creating complete school reports across a broad spectrum of facets.

I understand the natural academic reluctance to release data that is potentially wrong, and I see the same in business, where companies sometimes do not want to release an imperfect product. But while they are polishing the bezels yet again, competitors are releasing inferior but higher-selling versions. Similarly, we should release the data and call on the power of academics, hundreds of thousands of parents and even students to provide both sunlight as a disinfectant and the right context.

3. League tables are educationally harmful

The compilation and release of achievement data in league tables to enable comparison of schools has the potential to cause harm: to learners, teachers, schools and local communities.

These harmful behaviours include: ‘teaching to the test’ and ‘narrowing of the curriculum’; valuing of some students over others because of their ability to perform and to conform; prioritising the teaching and other support given to some students over others in order to maximize the numbers that ‘reach the standard’; and damaging effects on students’ anxiety levels and conceptions of themselves as learners – ‘I’ll be below standard’.

All systems can be gamed, and business is no exception. Larger businesses often make decisions based on their corporate structure and internal politics rather than on the facts at hand. Inside a business, the 'wrong' people may be promoted because they are better at understanding their boss's requirements than their harder-working colleagues.

However, it's a lot easier to understand where there are performance issues when we have at least some degree of measurement. Early warnings are easy to spot in some areas of a business, such as sales departments that have targets, and much harder in others, such as marketing. But it's common to have at least some concrete measurements on your personal scorecard.

The really smart businesses understand that people will game the system to over-achieve on their personal scorecards, and they make sure that personal goals are 100% aligned with corporate goals. Nobody complains about the sales guy who brings in another $1 million of business by pushing harder at the end of the quarter - in fact it's common to see spiky sales results timed around bonus periods. Harmful gaming, such as stealing customers, is deadly, however, and is jumped on very quickly.

It can be the same with schools. No doubt the current system, whatever it is, is gamed. Some schools will get more resources than others simply because they are better at working the system or at fundraising from other sources. Point England school has done incredible things on a tiny budget because they worked in a smarter way.

So any measurements must cascade from the goals of the education system, and we should fight to remove bias and the potential for harmful gaming. I would imagine that getting students through the National Standards is the ultimate purpose of the system, and that means increasing the quality of the standards themselves each year as well as helping students achieve. We need to be constantly aware of the potential negative impacts and, quite simply, measure and set goals for them. If losing poor students early is a problem, then make student retention a critical measure of performance, and so on. It may become a real battle, but ultimately a catch-all "don't game the system or else" will bring out the best in our educators.

4. The political argument for league tables is weak

The argument that the Ministry of Education should release league tables in order to prevent the media doing so, does not address the problems that their effects will be damaging and the data used to compile the tables will be incomplete. <snip>

A long piece containing several arguments about why releasing the data is bad.

However, while releasing the data might be considered bad by the academics, it is not by me, nor, more pertinently, by at least some parents. Even if only a small minority wanted access to the data, and this is not a small minority, New Zealand has a policy and an obligation to provide it. Arguing against releasing data is quite remarkable for a group of academics. It should be easier to understand school performance than it is to read about individual students' private lives on Facebook.

In particular, the moral principle of social justice demands that the situation of the most disadvantaged in our society should not be made worse through the release of official information.

Businesses often ignore fundamental problems, such as flaws in their business model, until it is too late to do anything about them. As a consultant to businesses from tiny to large, I generally focus on these problems, which were often ignored before my arrival. A great reporting system that highlights emerging problems, and a smart management team that follows up on them, keeps the expensive consultants at bay.

It's the same here - the moral principle of social justice demands that the situation of the most disadvantaged in our society be identified and fixed, and not hidden from public view. We can fix these broken schools, and we don't have to look further than Wellington High or Pt England to find great examples.

I am somewhat dismayed at the attitude of the educators, although I do understand their reluctance to release what, from an academic research perspective, is seen as incomplete data.

From a society perspective there is at least some demonstrated demand from parents.

From a business perspective there are a number of businesses and individuals who would love to mash this data up to create something new and useful.

But, most of all, from an educational perspective, releasing the data as a league table will allow us all to ask the hard questions of everyone involved - how are we going to help the schools at the bottom? 

Lance Wiggs is an independent consultant providing management, strategy, growth and valuation consulting to industrial, media and internet based businesses. He blogs at Lancewiggs.com
