Thursday, December 01, 2016

TIMSS 2015: what does it mean?

The release of the TIMSS 2015 report attracted much media and political attention (examples here, here) because of the way it suggested that Australia is falling behind in maths and science performance at school. The report describes TIMSS in this way:
The Trends in International Mathematics and Science Study (TIMSS) is an international comparative study of student achievement directed by the International Association for the Evaluation of Educational Achievement (IEA). TIMSS 2015 represents the sixth such study since TIMSS was first conducted in 1995.  Forty-nine education systems tested at Year 4 level and 39 tested at Year 8 level. In Australia, TIMSS is managed by the Australian Council for Educational Research (ACER) and is jointly funded by the Australian Government and the state and territory governments.
The goal of TIMSS is to provide comparative information about educational achievement across countries to improve teaching and learning in mathematics and science. It is designed, broadly, to align with the  mathematics and science curricula in the participating education systems and countries, and focuses on assessment at Year 4 and Year 8. It also provides comparative perspectives on trends in achievement in the context of different education systems, school organisational approaches and instructional practices; and to enable this, TIMSS collects a rich array of background data from students, schools and teachers, and also collects data about the education systems themselves.
This report is a first look at the results from TIMSS 2015. Focusing on the achievement results in mathematics and science at Year 4 and Year 8, this report will be followed early in 2017 by the full Australian National Report, which will examine achievement more fully and incorporate descriptive and analytical findings using the background and demographic data.
Looking at the results, I had real difficulty in understanding just what TIMSS told us and what we might do about it. A key reason lies in the correlations among the variables measured.

The results suggest a positive correlation between academic performance and the socioeconomic status of families, as measured by books at home, the educational attainment of parents and access to learning supports. No surprise there.

The results suggest that kids in metropolitan areas are likely to do better than kids in regional areas, who in turn do better than kids in remote and very remote areas. Indigenous kids perform less well than non-Indigenous kids. No surprises in either case.

Now consider this pattern. Regional areas have fewer higher-income families and a smaller proportion of highly educated people. That feeds into lower academic performance. Indigenous people have lower incomes and educational attainment too, and are also more likely to live in regional areas. So the measures are interrelated.
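That interrelation matters because a raw comparison between groups can attribute to one variable an effect that is really carried by another. Here is a minimal sketch, using entirely synthetic numbers (nothing below comes from the TIMSS report), of how a metro-versus-regional score gap can largely vanish once socioeconomic status is held roughly constant:

```python
import random
from statistics import fmean

random.seed(1)

# Hypothetical illustration with made-up numbers (not TIMSS data).
# Scores here depend ONLY on a socioeconomic index, but location is
# correlated with that index, so a raw metro-vs-regional comparison
# shows a gap that location itself did not cause.
students = []
for _ in range(10_000):
    metro = random.random() < 0.7
    ses = random.gauss(0.6 if metro else 0.4, 0.15)   # metro skews higher
    score = 450 + 200 * ses + random.gauss(0, 25)     # no direct location effect
    students.append((metro, ses, score))

raw_gap = (fmean(s for m, _, s in students if m)
           - fmean(s for m, _, s in students if not m))

# Compare only students in a narrow SES band: the gap largely disappears.
band = [(m, s) for m, ses, s in students if 0.45 < ses < 0.55]
band_gap = (fmean(s for m, s in band if m)
            - fmean(s for m, s in band if not m))

print(f"raw metro-regional gap:      {raw_gap:5.1f} points")
print(f"gap within a fixed SES band: {band_gap:5.1f} points")
```

Stratifying like this is only the crudest way of untangling correlated measures; the point is simply that aggregate gaps, read on their own, can mislead about causes.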

This simple point goes to the way the statistics are used and the conclusions drawn from them. Much commentary has really dealt with the aggregate results. Here I quote Stefanie Balogh in The Australian (link above):
The alarming results from the four-yearly Trends in International Mathematics and Science Study last night sparked calls for Australia to “wake up’’, reject short-term fixes, raise the effectiveness of teaching, and improve retention and training of qualified maths and science teachers.
I suppose that we could call this an education focused response. Here are a few more examples from the same story:
Education Minister Simon Birmingham said despite increased funding Australia was not achieving sufficient improvements. “The fascination of some policymakers and special interest groups with how much money is being spent on schools has been to the detriment of the real questions we should have been asking that would turn around these declining trends — ‘how should the money be best distributed?’ and ‘what are the initiatives in schools that are proven to lift results that we should be backing?’ ’’ 
Victorian Education Minister James Merlino said high-quality maths and science education was a “key part of making Victoria the education state’’ and the state had set a target to increase the numbers of students excelling in scientific literacy by 33 per cent and maths by 25 per cent over the next 10 years. 
Federal Labor deputy leader Tanya Plibersek said the results underlined why needs-based funding was vital and “poor kids in poor schools need extra help to get better results’’. “Only around 7 per cent of the six years of Gonski needs-based funding had flowed in 2014,’’ she said, insisting it would be “completely wrong’’ to draw links between the results and funding. The ACT outperformed other states and territories, except for Victoria, on a state-by-state breakdown in Year 4 maths. The ACT and Victoria again performed well in Year 8 maths, while the ACT was ahead in Year 4 science and Year 8 science. Results in NSW for both science and maths declined.
I go back to what I said. I'm not sure what the statistics mean or what conclusions to draw from them.

Starting with some very general points. What is the purpose of education? How much weight should be placed on one set of measures in one area? Does a focus on simple, specific measures actively disadvantage students whose strengths lie elsewhere? Are the so-called STEM courses in fact the be-all and end-all?

Continuing. To what degree do the results simply reflect social and economic change, including the hollowing out of the middle class especially in regional Australia and the rise of socially disadvantaged communities? To what degree can we expect education to solve problems that are not educational at their base? How do we tailor education to meet local needs instead of statistical aggregates?

Focusing on the last question. In writing my biography of my grandfather, a long-serving NSW Minister for Education, I had to research the history of education, especially but not only in NSW, over one hundred years.

Oh dear, I am feeling jaundiced. The modern debate on education has become so boring, so standardised, so based on universal standards, that it's hard to identify a single new idea. Please correct me. Surely I'm not right? In responding, it would help if you could identify ideas and initiatives that link actions to the needs of local or regional communities.


Anonymous said...

Wish you hadn't raised this reporting Jim. I had just got my head around the 'revised fact' that sex was totally a social construct, and here we are with our thought leaders blithely sub-categorising our students as to male and female?


Jim Belshaw said...

That is a problem, kvd. I'm not sure what to recommend to ease your confusion.

2 tanners said...

"Rich information" that leads to indexed outcomes is almost a contradiction in terms, or a clear admission of waste. I haven't read the report, but it sounds from Jim's summary like a limited world view focused through a more limited lens of marks being used to make sweeping statements about the state of education and recommendation of 'how to put it right'. I can only hope this is not the case.