Thursday, April 29, 2010

Problems with measurement in education

Earlier this week Neil had an interesting post, The promised education post, looking in part at the ICT (Information and Communications Technology) test results for Australian schools. Then today the Sydney Morning Herald's education editor, Anne Patty, discusses a report from the Centre for Independent Studies on indigenous disadvantage as measured by the NAPLAN test results. The common element in both is the use of standardised test results.

The problem with all studies of this type is knowing exactly what you are measuring and, in fact, what it all means.

In the case of the ICT tests, they broadly show an overall improvement in test results between 2005 and 2008, but with a widening gap between top and bottom. This was arguably inevitable in statistical terms.

As Neil notes, there will always be a proportion of students who simply lack the ability. However, if you assume that all students across the spectrum improve their performance by the same relative amount, say 10%, then the gap between top and bottom must rise in absolute terms.
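To make that concrete, here is a minimal numerical sketch (the scores and the 10% figure are hypothetical, not taken from the actual ICT results): a uniform relative improvement leaves the ratio between top and bottom unchanged, but it scales both ends of the distribution, so the gap measured in score points grows.

```python
# Minimal sketch with hypothetical scores: a uniform relative improvement
# leaves the top/bottom ratio unchanged but widens the absolute gap.
top, bottom = 80.0, 40.0      # hypothetical test scores
improvement = 0.10            # everyone improves by 10%

new_top = top * (1 + improvement)        # 88.0
new_bottom = bottom * (1 + improvement)  # 44.0

print(top - bottom, new_top - new_bottom)  # 40.0 44.0 - the gap has widened
```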

Neil also quotes a Sydney Morning Herald story, again by Anne Patty, on the city/country divide in computer literacy. The story is headed City-rural divide hits computer literacy. Again, this type of result was arguably inevitable in statistical terms.

There is a known if complicated relationship between socio-economic status and school performance.

Many inland country areas have been losing people. Further, the economic restructuring that took place from the 1980s meant that those leaving included many middle class families - bank staff, managers in offices such as Telecom or local county councils, technicians. Their jobs either vanished entirely or were relocated to larger centres or the city.

In coastal areas that have experienced high population growth from retirees or those moving for lifestyle reasons, the new jobs associated with population growth have generally been lower-level jobs in retail or services. Further, the proportion of people on welfare has risen. The New England/NSW Mid North Coast now has, I think, Australia's lowest per capita income.

Given these trends, you would expect a growing gap between city and country performance. However, and this is important, that gap does not mean that equivalent students in city and country perform differently. Indeed, we know from experience over a number of years that country students can, and often do, perform better than equivalent city students.

The CIS study, full text here, by Professor Helen Hughes and Mark Hughes, attempts to use the NAPLAN test data on numeracy and literacy to measure indigenous education disadvantage. This, they argue, remains very high. As part of their study, they rank Australia's schools on the test data; the majority of the 150 worst-performing schools are wholly or majority indigenous. In a NSW context, Anne Patty links this to the so-called "flight of the white", actions by white families in some country towns to move their children from public to non-Government schools, increasing the Aboriginal proportion in public schools.

As an aside, this links to another post of mine, Teaching in country NSW. Most, if not all, of the schools Thomas has put on his list have very significant proportions of Aboriginal students. According to the MySchool web site, for example, the proportion of Aboriginal students at the Boggabilla Central School is now 99%. By the way, have a look at the comments section of the post, where Thomas and I are talking about the hoops he has to jump through to actually get a teaching position. It's a remarkably complicated process!

My problem with the Hughes/Hughes paper is actually working out what it all means. Australia's Aboriginal and Torres Strait Islander peoples are not a single whole. The paper provides information that hints at the variation in NAPLAN results across the country. However, it really focuses on, and is driven by, problems in indigenous education in certain communities in Northern Australia, which it then uses to generalise. In so doing, it suffers from the same problems as current Australian Government policy.

My problem with the Hughes/Hughes paper illustrates a broader issue: the overall difficulty with data such as NAPLAN or the ICT test results is not just to understand what the data means, but to decide what you do with it once you know that.

The ICT test itself is meant to be a measure of basic competence, the capacity to do. For example, the standard set for Year 10 states that students will be able to:     

“generate well targeted searches for electronic information sources and select relevant information from within sources to meet a specific purpose, create information products with simple linear structures and use software commands to edit and reformat information products in ways that demonstrate some consideration of audience and communicative purpose”.

If you find that the gap between the best and worst is widening, does this mean that you should redistribute resources to improve the performance of the worst? If the gap between city and country is widening, do you redistribute resources to improve country performance relative to city?

Resources are limited. The totality of targets set by Australian Governments already exceeds the resources available to deliver those targets. We actually have to make judgements as to what is possible. If the community response is that resources must be found, then people must accept that other things will suffer.

We also need to recognise that single performance measures, whether ICT or NAPLAN scores, cannot be treated in isolation. There is only so much you can do to improve performance on a single measure.

Let me link this back to my point about changing economic and demographic structures in country NSW.

Social and economic deprivation in any area affects performance across a whole variety of indicators. You either address the causes of that deprivation or you move the people. Otherwise, no matter how much money you throw at improving performance on one indicator, you are going to have a performance problem as measured by that indicator.

3 comments:

Rummuser said...

Jim, I give a link to a blog I visit frequently which has two embedded videos. I have watched and listened to both on the blog as well as on YouTube a number of times and have since then read "The Element: How Finding Your Passion Changes Everything" by Ken Robinson. You will be amazed at how incisive he is on this very subject, and his advice is as important to everyone connected with education as yours.

Rummuser said...

Sorry, I omitted the link.
http://paddyanglican.blogspot.com/2010/04/education-revisited-ken-robinson.html

Jim Belshaw said...

Thanks, Ramana. I couldn't watch the first video - it's blocked in Australia on copyright grounds. I will try to listen to the second. Sounds interesting.