This post is authored by Dr Jenni Greig | School of Psychology
I’m a board-gamer and love stories with mystical beings, fantastic worlds, quests … and, yes, sometimes dragons.

At the 2023 Faculty Learning and Teaching Symposium, I heard about generative AI (GenAI) creeping into learning and teaching spaces. GenAI took a particular form in my mind: an overwhelming dragon, sitting on a tantalising pile of the treasures of education. It beckoned those who wanted the treasures without having to work for them, only to loom large as they got too close, breathing fire and destruction. In short, a threat to the learning I want my students to embrace.
Ok, I have an over-active imagination.
That symposium coincided with thinking about updating an essay for one of my subjects, Psychology of Ageing. The task asked students to explore a question about the role of social and economic advantage in ‘successful ageing’.
One Symposium presenter, Kim Bailey, talked about ChatGPT hallucinating legal cases, and described incorporating GenAI outputs into classroom activities. I wondered whether I could pre-empt students using GenAI to write their essay by showing them that I’d already done it.
I went to ChatGPT to have a play. I tried putting the essay question in as a prompt, then tried adding extra instructions, such as length, sources and referencing, and specifying that this was for a Psychology subject. Regardless of the specific prompt, I was never impressed with the output. It was always superficial, covering some key ideas with the lightest touch, claiming various sources had ‘demonstrated’ this or that, but with no critical engagement with the literature, or even mention of any specifics of the research evidence. Some notable issues in the final version were:
- Despite asking for a 2,000-word essay, the output was 1,400 words, including the reference list and pointless headings.
- Only six of the 14 references could be verified as actually existing in the form given in the output.
- It presented one view of ‘successful ageing’, without reference to differences in culture, personal values or narratives, which I would hope my students would be able to discuss.

Comfortable that ChatGPT would not be producing high-quality essays for a while, I realised the output had stimulated a new idea. This is where the quest started: rather than writing the essay, students could be guided through a critique of this output.
The first stage of the quest was to work with educational designers to ensure the task instructions and marking rubric were clear and fit-for-purpose.
Next, scaffolding had to be built into instruction. Students needed support to understand how to engage critically with this output. David O’Sullivan (formerly with the Academic Skills team) and I created a class to help students develop critical reading and thinking skills, even demonstrating how to read a journal article to glean more than superficial information.
Two years on, and the quest continues. As I learn more about GenAI, I still have reservations. These include the use of IP, including First Nations CIP, to build LLMs without permission or remuneration; environmental concerns; ownership of outputs; and unknown long-term effects on cognitive processes, including learning, memory and attention. I’m continuing to think about how to do scaffolding well, taking on feedback from my two markers, Kirstie Northfield and Isabella Bishop, and from students. GenAI is still a dragon for me; there are still threats and unknowns. But we can confront it with our students, demonstrating not only that GenAI isn’t a clever short-cut to learning, but that it is another source that needs to be engaged with critically. It can be a good reason to incorporate instruction in critical reading and writing. It can be a quest that enhances learning.

Acknowledgement
The author would like to acknowledge her colleagues and collaborators in this quest: David O’Sullivan, Kirstie Northfield, and Isabella Bishop. Thanks also to the DLT Educational Designers for their input.