The education world is awash with data, Misty Adoniou writes. In Australia, federal education policy takes its direction from the results of international testing regimes such as PISA, PIRLS and TIMSS. A to E reporting, the Australian Curriculum, the national teacher standards and the MySchool website can each trace their origins back to government panic that Australia was losing the international education race, a race refereed by these international tests.
At state and territory level, governments are driven by our own home-grown education race between the states and territories, as measured by NAPLAN.
The pressure to win the NAPLAN race, or at least improve on last year’s placings, is directly passed onto schools. To ensure they pull their weight, their results are made public each year on the MySchool website.
More and more data-collecting tools are passed on to schools to measure performance, in the hope that more data will improve NAPLAN scores and help their state win the NAPLAN race. Data walls proliferate in staffrooms, and Excel spreadsheets swamp teachers' computers.
We are told the data collecting isn't about comparing schools, systems or states; it's about accountability, which makes it hard to argue with. Of course teachers should be accountable: they have a hugely important job, the education of Australia's future. However, it is unfair to be held accountable to a regime you have had little to no say in. And for the last decade all the talk has been about testing, and nobody has been talking about teaching.
Skim the surface
Politicians and bureaucrats skim the surface of the international and national test data, interpreting the numbers without any real understanding of the tests themselves, what they measure, and therefore what kind of teaching they presuppose.
It would be surprising to hear that any education minister or their advisers had even seen the PISA test papers, or read the Year 5 NAPLAN Reading Magazine, for example. Nonetheless, they set policy direction based on the spreadsheet numbers the tests generate. It's like pronouncing the failure of an experiment without even knowing what the experiment was.
Strip away all the spin, and a press conference would sound something like this:
Minister: “We’re going with a brand new approach.”
Reporter: “What will it be replacing?”
Minister: “I don’t know.”
Reporter: “What was wrong with the old approach?”
Minister: “I don’t know, but whatever it was the numbers looked bad.”
Reporter: “So, what will this new approach be?”
Minister: “I thought we’d try something I remember from when I was in infants school.”
What is missing from the big data puzzle is the expertise of the teacher: a scientist with a four-year degree and daily field experience, whose training covers not only data collection and interpretation, but who also knows what it is we are testing, and what we should be teaching.