Curves in the wrong places

Stuart Middleton
New Zealand Education Review
Vol 14 No.38, October 2, 2009, p16
APN Educational Media (NZ) Ltd
Wellington 

Back in 2004 I wrote about the impact of The Plunket Chart on the community’s understanding of educational assessment and evaluation (see Education Review, Bell Curve Babies, 18 February 2004). I suggested that the obsession of the New Zealand community with norm-referenced assessment was in part due to the influence of that chart on successive generations of New Zealanders as it fed into people, along with mother’s milk and supplements bought from the Rawleighs man, a belief that such reporting could be applied to all aspects of human performance.

I received in response a very thoughtful letter from a senior Plunket person who had read the piece in good humour and assured me that the uses of The Plunket Chart were now tempered with an understanding of its limitations. I had been a little less than charitable towards The Plunket Chart on two grounds. My early life was lived in the shadow of very poor performance on the chart: not only was I below that dark line of normality, I was also outside the shaded area of acceptability.

Actually, Plunket's use of such reporting was entirely proper. Physical characteristics can legitimately be reported using means and standard deviations, which conform to the humped shape we have come to know as the Bell Curve.

I was shocked however to open the Sunday newspaper to be greeted with the headline “Plunket-style tables for school reports” above an article that exclusively reveals the exciting news that charts in the style of the Plunket chart are being developed to report on the National Standards being promoted in this country and which are to be publicly reported by schools in 2012.

Now let’s clear the decks of one thing – I favour clear statements of learning targets for students at different ages and anything that will allow a child’s progress to be better communicated to parents and caregivers.

But can a Plunket style chart shake off the distortions of the irrelevant norm-referenced basis of its progenitor? An educational standard is in no way similar to the average weight or height of a growing baby. If educational standards were to be set on the same principles as those which generate a mean or an average then they would not in any sense be standards – they would be simply statements of where the average performance is – and that could be good, bad or indifferent. If there is any commitment in the introduction of National Standards to lifting the performance of the education system then they must certainly not be current mean performance.

The example exclusively revealed in the newspaper imports so many of the features of The Plunket Chart that one wonders whether the “good idea” of using it as a model for the national standards reporting hasn’t overtaken a clear examination of what is being reported. The rising dotted line of progress suggests that there is a connection between living longer and meeting the standards. The shaded areas getting incrementally wider for “just below” and “well below” make no sense and have no statistical validity in the way The Plunket Chart has. The obvious way to report standards is to show that the standard is just that and a child’s performance can then be described in terms of its relationship to that standard. So the successive standards could and perhaps even should be reported as a straight line with the child’s achievement relative to that charted simply above, on or below the standard. There are many ways this could be done.

The principle that is paramount in reporting the standard is that this is a report on each and every individual student, not on groups of students. Inevitably, someone somewhere will be collating the information and producing claims, of little validity, that this group (i.e. school) is better than that group (i.e. some other school). What parents need to know is how their child is performing in relation to the standards.

Now all of this will get a bit tricky if the standards are taken as group measures. Malcolm Gladwell of Tipping Point fame has recently challenged us to think about performance differently. His book Outliers lists a series of factors that impact on performance.

He rates the following as key influences on performance – practice (10,000 hours is critical), timing (being born early in the year is helpful), upbringing (of course), cultural legacy (big challenges here for the system) and lucky opportunities (some are in the right place). So the national standards and the reporting on them might be only part of a complex jigsaw of high performance. Success in education is the critical foundation so the reporting should be rigorous, clear, and about individual progress.

Another troubling element of the proposed chart is the suggestion in its design that the standards are reported on the basis of Year 1, Year 2, Year 3, and so on. It would be more important for us to report on the standards expected at critical transition points (age 5 for entering school, age 11/Year 6 for the transition to intermediate school, age 13/Year 8 for the transition to secondary school and age 15/Year 10 for the transition into the senior secondary school or into other pathways).

Standards at these key points are crucial, and no amount of reporting on the points in between will provide comfort for parents who know that their little ones are simply not prepared for the transitions. No other years are as important as the ones in which key transitions are expected of the students.

Accepting that the Plunket Chart is a good model for reporting educational standards would be on a par with accepting that passing the test for a driver's licence is evidence of both the skill and intellect required to drive a motor vehicle, or that turning 18 allows the community to relax knowing that the skills for safe use of alcohol are assured.

We can do better than this.
