AI in education: putting your view forward

The voice of IEUA members is being heard on the use of artificial intelligence (AI) and digital technologies in schools, Emily Campbell writes.

IEUA Federal Secretary Brad Hayes said our union was actively contributing to various consultations and inquiries underway at state, territory, federal and international levels that seek to respond to the impact of AI in education settings.

“In May 2023, a federal parliamentary inquiry was established to examine the risks and opportunities associated with generative AI tools in school and tertiary education settings,” Hayes said.

“Shortly after, the IEUA made a submission to the House Standing Committee on Employment, Education and Training inquiry into the use of generative AI in Australia’s education system.

“Most recently, our federal union made a written submission in response to the draft framework published by the Federal Government’s Artificial Intelligence (AI) Taskforce, which outlines core elements and principles to guide the use of AI in schools,” he said.

In some jurisdictions such as Queensland, state and territory governments have held their own inquiries, engaging with education stakeholders to determine the best course of action regarding AI in schools.

Hayes said IEUA representation on groups such as the Federal Education Minister’s AI in Schools Taskforce was crucial to ensuring the insights of practising teachers and education staff informed policy.

“Our schools are places of innovation and creativity where new technologies are embraced to benefit student learning; however, IEUA members have identified serious issues to be managed in the use of emerging AI programs in education,” Hayes said.

“These concerns include the veracity of information, respect for intellectual property, data security, and online privacy and safety for staff and students.

“School employers and governments have a joint duty of care to their staff and students to address these concerns, so generative AI tools are not a threat to privacy, equity and wellbeing, but rather a valuable resource that equips learners for future-focused education,” he said.

Voice of teaching profession crucial

In submissions to both the House Standing Committee and the AI in Education Taskforce inquiries, our union reiterated that the voice of the teaching profession must be paramount during consultation to develop appropriate guidelines on the use of AI and digital technology in education.

“The National AI Taskforce and all other agencies collaborating on AI guidelines must include meaningful engagement with the education workforce and their union representatives,” Hayes said.

“The professional judgement and autonomy of education workers and school leaders must be respected because practising classroom teachers are best placed to decide how, when and where AI or any digital technology is deployed in their classrooms,” he said.

Submission to House Standing Committee

Several key considerations were outlined in our federal union’s submission to the House Standing Committee, highlighting the threats and opportunities presented by AI in education.

Hayes said the submission emphasised that the primacy of relationships and teachers’ professional judgement and autonomy must not be undermined by AI technologies.

“Professional relationships between teachers, support staff and students in achieving learning outcomes and preserving student wellbeing cannot be trivialised,” he said.

“Any diminution of the role of human relationships, risk to employment, or deskilling of teachers and support staff is likely to have long-term negative consequences for students, families and school communities.

“Differentiation of content and assessment practices must remain within the control of classroom teachers, not dictated by education providers as digital technologies advance,” he said.

The submission acknowledged that developing strong critical literacy skills in students of all ages was more important than ever.

“In consultation with teachers, careful review of current curriculums should be undertaken to ensure the sequential and age-appropriate development of ethics and critical literacy skills in students,” Hayes said.

“AI policy responses and amended school practices designed to safeguard the integrity of student work and academic assessments must also be evaluated.

“Teachers require ongoing access to high-quality professional development related to AI in education.

“The impact of AI on teacher workload and work intensification must be monitored, and teachers must be given additional time and resourcing to undertake PD and manage such challenges,” he said.

Hayes said capitalising on the opportunities presented by AI must be balanced with protections for staff and students.

“The responsibility for managing safety risks is a burden that cannot lie solely with education practitioners.

“Government agencies, education systems, and employers must take joint responsibility for a co-ordinated approach.

“The rush to implement emerging technologies should be tempered by a considered approach to managing risks, including the spread of misinformation and threats to security and privacy,” he said.

Another key consideration raised in the submission was the need to prevent AI from perpetuating inequity by ensuring all schools have access to AI and digital technology.

“Students from disadvantaged backgrounds must be given equal opportunity to engage with AI in education,” Hayes said.

“Equitable access for all students, including from regional and remote areas, should be guaranteed through significant and ongoing investment in infrastructure and the elimination of any disparity between levels of use available to schools,” he said.

Feedback on draft framework

The draft framework released by the federal AI Taskforce focuses primarily on the benefits of AI for education outcomes.

It is premised on considerable confidence in establishing the safe use of generative AI in Australian schools.

“In our feedback, we indicated the draft framework does not give enough attention to the considerable risks for education outcomes and safety,” Hayes said.

Hayes said establishing national guidelines was important given the complexity of the issues, and that it was useful to develop principles to underpin the decisions of schools and teachers about how and when AI is deployed.

“The principles outlined in the draft are sound but insufficient,” he said.

“While the IEUA supports a framework that acknowledges the role of school leadership in providing support and guidance for classroom teachers, there are concerns that the guidelines, in their current form, place the burden of risk identification and risk management on schools and teachers.

“This is inappropriate, particularly where schools are understaffed and under-resourced and where the nature and potential impact of different risks evolve as technologies change and adapt.

“The draft guidelines do not currently acknowledge the need for a co-ordinated, systemic and frequently updated approach to identifying and managing risks arising from AI use in school settings.

“Developers of AI technologies and governments should be taking a role in risk identification and management, in consultation with school staff,” he said.

The risks in question are wide-ranging; they include, but are not limited to:

  • breaches of student privacy
  • erosion of professional skills
  • exposure to harmful content
  • perpetuation of social and economic injustice, and
  • negative impacts on human relationships.

Need for two new core elements

Hayes said the IEUA called for two new core elements to be added to the draft framework to better guide improved education outcomes for all student cohorts. The new core elements are:

  • consultation with the teaching profession (and other school staff, including librarians and cultural competence educators) and their unions, and
  • a requirement that decisions about the use of generative AI in schools be subject to a Teacher Workload Impact Assessment.

“Employers and governments have a joint duty of care to students and teachers to consult closely with education professionals in their workplaces and through their unions to address insights and concerns.

“The draft framework must also address workload and wellbeing to have meaningful application.”

Hayes said regular and ongoing consultation with practising teachers would be necessary to monitor how AI usage affects teachers, students and the wider community.

“Our federal union will continue to actively ensure members’ voices are heard in this space.

“Ultimately, AI must be the servant of teachers and students, not their master,” he said.

Members can access the full submissions to the House Standing Committee inquiry and in response to the AI Taskforce’s draft framework at: www.ieu.org.au/policy-submission