Australia’s leaning tower of PISA:

Reflections of a Canadian exchange teacher

The 2018 report by the Programme for International Student Assessment (PISA) has distressed many Australians, given the nation’s dismal results.

PISA, which in 2000 began measuring maths, science, and reading literacy among 15-year-old students in its 79 participating countries and regions, provides an internationally agreed-upon framework for tracking student performance.

The test is taken by approximately 600,000 students from such diverse places as Denmark, France, Indonesia, Morocco, and the Philippines.

The test is administered every three years and focuses on a student’s ability to apply their learning to solve problems. Results of the report influence national priorities, policies, and practices on assessment, curriculum, and performance targets.

My interest stems from my experience as a high school science teacher and as a scholar. As a Canadian, I have participated in two teaching exchanges: in Melbourne in 1999 at a co-ed public school, and now, 20 years later, here in Sydney at a Catholic girls’ college.

I also hold a doctorate in Comparative and International Education from the University of Alberta in Edmonton, where I currently live and teach with Edmonton Public Schools.

My PhD, and my ongoing related research, examines education and socioeconomic development in Arctic Indigenous communities.

As such, I have a unique vantage point, especially given that the PISA report’s release coincided with the end of my exchange in Sydney.

My observations are also informed by the many stimulating conversations I have had with other Canadian teachers on exchange, as well as my valued Australian colleagues.

The PISA report highlights significant differences in maths, science, and reading literacy, as reflected in the two countries’ international rankings:

At the provincial or state level, the learning gaps are even more pronounced. Alberta’s results in maths, science, and reading place it in the top tier: third in the world in reading, fourth in science, and the second-highest maths score in Canada.

Conversely, the steepest declines in state performance occurred in NSW. According to one report, NSW students are about three-quarters of a school year behind their Canadian counterparts in reading, maths, and science.

Another report laments that Australian students are “among the worst in the world” for class discipline, ranking 70th out of 77 participating nations as measured by PISA’s index of disciplinary climate.

PISA’s results have sparked a flurry of explanations (and solutions) from experts, ranging from the state of teacher training and parental involvement to classroom discipline, curricula, and socioeconomic divisions.

While these factors impact student success, I believe they are mostly symptoms, rather than underlying causes. A more careful analysis involves unearthing foundational aspects of schooling – particularly those relating to assessment and graduation requirements.

Depending upon their level of achievement in Grade 9, students in Alberta are streamed into either academic or non-academic courses in Grade 10, which they must then pass before being granted entry into specialised senior-level courses.
For instance, a student taking Grade 10 Science must earn a passing mark of 50% before enrolling in a Grade 11 Science course; otherwise, the course must be retaken (often the following semester). High school diploma requirements include credits in senior Mathematics, Science, English, and Social Studies, with the option to pursue either an academic or non-academic stream.

In my Canadian classes, performance feedback is both ongoing and cumulative, with students usually being evaluated each week.

Whereas some assessments receive little weighting and are designed primarily to provide formative feedback, exams or major assignments are weighted heavily and represent summative evaluations.

By the end of a five-month semester, students will therefore have completed and received feedback on approximately 15 to 20 assessments that cumulatively contribute towards a final numeric grade.

Regular assessments provide clear benchmarks to continually track progress, as well as opportunities to consolidate learning and achieve closure once a topic is completed.

Regular feedback also drives pedagogy: teachers know whether or not students have mastered a concept.

Both class-awarded marks and external diploma exams are then used as a significant (if unspoken) component for principals to assess teacher performance; in turn, school averages are used to track school performance within a district.

The case in Australia is radically different. NSW graduation requirements include compulsory units in English (and Religion in Catholic schools), with the remainder of the units to be fulfilled through optional courses.

In terms of assessment, students must complete a minimum of three (and a maximum of four) assessment tasks per course over a 10-month school year (compared to the 15 to 20 or so evaluations my Canadian students receive over a five-month period).

In my experience, teachers are reluctant to spend time developing and grading informal assessments, as students do not adequately prepare for them, knowing the marks do not count.

Opportunities to provide crucial feedback (particularly in courses like Mathematics and Science, where concepts build on one another) are sporadic at best.
Formative evaluations therefore remain ineffectual and unenforceable: aspirational platitudes from educational authorities far removed from the day-to-day realities and pressures of classroom life.

At the same time, lax graduation requirements create downward pressure on student motivation and achievement in courses students do not wish to take, or do not see themselves taking, in their senior years.

What the PISA data fails to illuminate is the inordinately high attrition rates occurring in both Mathematics and Science during this pivotal transition into the senior years – an attrition that has significant implications for adult literacy, not to mention the need to procure skilled labour from overseas in the STEM (Science, Technology, Engineering and Math) related fields.

It is not too much of a stretch to link these challenges with the discipline problems being reported, as motivation on the part of students and teachers alike is continually eroded by a system that conspires against those who strive to deliver or receive a good education.

To end on this depressing note, however, would paint a skewed picture. What the PISA results also fail to capture are the many outstanding learning opportunities occurring in Australian classrooms.

Here, I am referring to long-term assessments that emphasise project- and inquiry-based learning. In both Melbourne and Sydney, my senior Environmental Studies and Biology students worked in partnership with a local university (Melbourne) and an environmental organisation (Sydney), where they were required to compile empirical data and secondary sources into a detailed report.

Similarly, Grade 8 students in Sydney undertook a remarkable and highly successful 10-week interdisciplinary STEM project on sustainable housing.
The benefits gained from such rich learning opportunities are not easily measured by a one-time snapshot of data gathered from paper-and-pencil testing.

Andrew Hodgkins

As a Canadian exchange teacher, I have certainly come to value and appreciate the growth these learning opportunities foster in students and teachers alike, as well as their impact on school culture. Indeed, the challenge will be to return home to a system where the pressure to ‘teach to the test’ conditions both teacher and student motivation.

Despite their limitations, PISA reports help countries link the local context of teaching and learning to international empirical results.

Here, I am reminded of the sociologist Seymour Lipset’s dictum: “An observer who knows only one country knows no countries. Without comparison, there is no way of knowing whether a particular practice or behaviour is unique to the society in question or common to many” (cited in Fukuyama, 2012, p. 18).

The question, then, is not so much which system is better, but rather how we can learn from best practices across nations. As I have argued, a balance must be struck between delivering regular, meaningful assessments and providing rich learning opportunities.

However, this balance cannot be achieved without considering the institutional context, in which realistic, attainable, and stringent graduation requirements are needed to put students on solid ground.

If a radical restructuring of schooling is to realign this leaning tower of PISA, it must first confront the bureaucratic inertia embedded in the very institutions responsible for making changes. Otherwise, I fear that Einstein’s definition of insanity – doing the same things over and over and expecting different results – will be reaffirmed in three years’ time with PISA’s next report.

Do you agree with Andrew’s assessment of Australian education? Write to us at

Alberta students best in Canada at reading, science, international test results show. (Dec. 3, 2019).
Australia’s dismal school results state-by-state. (Dec. 3, 2019). big-spending-state-continues-slide-down-education-rankings-20191202-p53g7a
Australian students ‘among the worst in the world’ for class discipline. (Dec. 4, 2019).
Fukuyama, F. (2012). The origins of political order. Great Britain: Profile Books Ltd.
PISA 2018 results. (n.d.).
While individual schools might shine, PISA results show our education system is stagnating. (Dec. 6, 2019).
Why are Australian students lagging behind Canada? (Dec. 4, 2019).