Generative AI: What next for teachers?

Since the launch of ChatGPT and the rapid adoption of generative artificial intelligence (AI) worldwide, there has been extensive debate about how this new technology will fit into education, Katie Fotheringham writes.

Dr Tiffani Apps is a senior lecturer at the School of Education at the University of Wollongong, Associate Academic Program Director of Digital Technologies for Learning and Co-Head of Postgraduate Studies.

Dr Apps’s work examines the impact of educational technologies on schools and how children, parents and teachers engage with digital systems, platforms and tools.

The Australian Framework for Generative Artificial Intelligence (AI) in Schools (the framework) was introduced this year.

Dr Apps says the framework paints AI in an unrealistically positive light and overlooks many issues that will arise in schools.

“The potential of [the framework] to guide the responsible and ethical use of AI in schools is not to be understated,” Dr Apps says.

Dr Apps says a ‘one size fits all’ approach is not ideal when implementing this new technology in a school setting.

“The support required for teachers will look different in every school,” she says.

“It’s a tricky situation because it’s so complex – we need to be able to think about how we support both students and teachers to understand the nature of AI and then be able to use the tools effectively.

“Because of this, there are sort of two layers to the situation, and often we can see very clearly in the history of educational technology that there’s always lots of hype around new tools, but teachers are often not very well supported to understand them and integrate them into their practice.

“I welcome a framework to support teachers to be able to understand and implement the tools – my concern is when we have vague, generalised statements around what we might expect AI to be doing, we are not able to best support teachers to understand and use the tools,” she says.

Understanding AI imperative

Dr Apps says a priority should be educating teachers and students on what generative AI is and how it works, as there may be misconceptions about the technology.

“Much of the research that we’ve been doing ahead of the advent of generative AI is exploring how teachers understand AI,” Dr Apps says.

“Until the advent of generative AI, AI was already embedded in many of the tools that we use, and it’s largely designed to be invisible in our practice.

“Because of this, there are many who are confused about what AI is, and all of the media hype around generative AI tools such as ChatGPT adds to this confusion.

“Teachers need to be supported to understand what generative AI is, what it’s capable of, and then also to understand the impacts it will have on students now and in the future.

“There is also a need to think carefully about how it fits with the existing curriculum and the kinds of needs that students have, because students’ needs are very diverse.

“Something that teachers excel at is taking a standard curriculum or tool and then catering for the needs of the students in front of them,” she says.

What’s missing from the framework

Dr Apps says several key issues were overlooked in the current framework.

“The document overlooks the complicated nature of generative AI and many of the issues that will arise with mainstream use in schools – including exacerbation of existing digital inequalities, trespasses on the privacy and data rights of teachers and students, intellectual property rights, teacher competencies and contribution to negative environmental impacts.”

Dr Apps says years of research revealed a plethora of privacy and data issues in relation to AI.

“There are huge risks around our privacy and the diversification of our work, as well as intellectual property risks for students and teachers.

“It’s important to understand that the ChatGPT API is open and is sharing data as part of the machine learning, so by its very nature there are privacy risks associated with that.

“The most important thing teachers can do to protect themselves is understand those risks by engaging in some professional learning surrounding that.

“We also see many educational jurisdictions, such as South Australia, now looking to build their own versions of generative AI to provide a safer platform for students and teachers.”

Dr Apps says the social implications of AI are just as important to understand as the technical side.

“Schools should be making a real commitment to exploring what AI is through the digital literacy capability, with a focus on the ethical nature of it and the social impacts of those tools – these tools are not neutral,” she says.

“Having conversations with students that are social rather than technical in focus is part of building their digital capability – building their understanding of the impacts of these tools on our lives is a great entry point to understanding AI through a social lens.”

Relevant PD critical

Dr Apps says it is important that specialised professional development (PD) be made available to teachers.

“In my experience of delivering PD over the last 15 years, anything that’s situated in a particular context is always more valuable to teachers because they can relate it to their specific practice,” she says.

“However, broader online courses, lectures and webinars can be useful as a starting point.

“Many schools run their own school-based professional learning regularly.

“I think that those spaces can be good because they provide an opportunity for teachers to feel safe – new technology is always risky, and it is important for teachers to feel supported and be honest about how they are feeling,” she says.

Dr Apps says the University of Wollongong runs a free annual webinar on emerging technologies.

“The learning team aims to provide an overview and then to give some example strategies of what it might look like in the classroom with connections to existing curriculum as a first point for discovery around those emerging technologies,” she says.

“There is also a range of different providers who do face-to-face or broader online PD opportunities connected to the Australian curriculum – the Digital Technologies Hub tends to have some good case studies and examples of practice as well.”

IEU position

Terry Burke, IEU-QNT Branch Secretary, says our union made submissions throughout the development stage of the framework and remains conscious of the risks that could arise for members following mainstream adoption.

“It is important that we do have national guidelines and, given the complexity of the issues, it is important to identify principles that should underpin the decisions of schools and teachers regarding how and when AI is deployed,” Burke says.

“The principles themselves are sound, but they are insufficient in that there is a need for teachers and schools to have time to engage in the deep analysis of the risks that emerge when working with AI for education purposes – the responsibility for managing risk should not reside solely with teachers and school leaders,” he says.

Burke says the guidelines should acknowledge the risks associated with mainstream generative AI use in schools.

“It is important that the framework is developed in consultation with the profession and updated regularly to ensure that it recognises and addresses new and emerging risks and does not inappropriately shift the responsibility for anticipating, preventing and responding to negative events to teachers and other school staff,” he says.

The University of Wollongong has online university-supported PD sessions, resources and information related to the impact of AI on learning and teaching.
Access the information hub at ltc.uow.edu.au/hub/collection/ai-in-education
Examples and case studies from the Digital Technologies Hub can be accessed at digitaltechnologieshub.edu.au