The real story about genAI is the way students approach it

When ChatGPT exploded onto the scene in late 2022, higher education went into crisis mode. The knee-jerk reaction was predictable: lock it down, ban it outright or develop elaborate detection systems. But what if we’ve been asking entirely the wrong questions? The real story isn’t about the technology at all – it’s about the mindset students bring to using it.

Consider two different students approaching the same marketing assignment. The first student, Alex, uses ChatGPT to generate a definition of ‘interactive marketing’, then critically examines it, identifies what’s missing, compares it with course materials and crafts a more comprehensive definition. Alex is displaying a mastery goal structure – focused on genuine understanding rather than just task completion.

Meanwhile, Jordan takes ChatGPT’s definition, makes minimal tweaks and submits it quickly to move on to the next task. Jordan exemplifies a performance goal structure – concerned primarily with efficiency and grades rather than deep learning.

Basic knowledge or deeper learning?

Our research reveals these approaches aren’t just philosophically different – they produce dramatically different learning outcomes. Students using generative AI (genAI) with a mastery mindset achieved significantly higher grades than their peers who took procedural shortcuts. More importantly, these mastery-oriented students demonstrated substantially higher critical thinking skills, applied knowledge and learning autonomy.

The data tells a compelling story: when students approached genAI as a collaborative tool for knowledge construction rather than a shortcut to task completion, they operated at the higher levels of Bloom’s taxonomy – analysing, evaluating and creating. Meanwhile, those using AI merely to regurgitate information remained stuck at basic remembering and understanding.

This challenges us as educators to face important questions: What if, instead of focusing on restriction and detection, we reimagined how we design learning experiences to foster mastery-oriented engagement with these tools? Instead of asking ‘how do we stop students from using AI?’, perhaps we should be asking, ‘how do we teach students to use AI to deepen rather than shortcut their learning?’

Specifically, our research highlights three key insights for higher education:

First, the way students approach AI tools significantly influences learning outcomes. Students who viewed AI as a means to construct and augment their knowledge showed higher overall marks and better assignment performance than those using procedural approaches. This pattern was consistent across multiple metrics, at both undergraduate and postgraduate levels of study.

Second, different approaches to AI use directly impacted thinking capabilities. Students who used AI to augment knowledge demonstrated dramatically higher rates of critical thinking, applied knowledge and learning autonomy compared to those who didn’t use this approach. This suggests that when properly integrated into learning processes, AI can enhance rather than diminish higher-order cognitive skills.

Third, these findings align with established learning frameworks. Students who adopted mastery approaches using AI demonstrated achievement across higher levels of Bloom’s taxonomy.

Mastery-oriented engagement

These results challenge the approach to AI that many institutions and educators have adopted, one focused primarily on restriction and detection. Rather than viewing genAI as a threat to academic integrity or a shortcut that undermines learning, educators should consider how to design learning experiences that encourage mastery-oriented engagement with these tools.

Our study considers some practical ways educators and institutions can approach this. For example:

• Course design can scaffold students’ learning from basic knowledge construction through to more complex augmentation tasks;

• Assessments can be structured to promote mastery goal orientations by requiring students to compare, contrast and critically evaluate genAI outputs against their own understanding; and

• Educators can teach explicit prompting strategies that optimise learning.

As genAI technologies continue to evolve, our research provides timely evidence that our focus should shift from restriction to education – teaching students not just how to use these tools, but how to approach them with a mindset that enhances rather than diminishes learning.

By encouraging students to critically engage with AI outputs and use them as a stepping stone to construct their own understanding, we can help ensure these powerful tools enhance rather than detract from educational outcomes.

For educational institutions navigating this rapidly changing landscape, the message is clear: rather than asking whether students should use AI, we should be asking how to guide them in using it effectively.

The answer may lie in fostering mastery goal structures that encourage students to use AI as a tool for knowledge construction and augmentation, rather than a substitute for their own critical thinking.

Dr Jessica Pallant is a marketing lecturer at RMIT’s School of Economics, Finance and Marketing in Australia. Her expertise spans consumer behaviour and market research, with a particular focus on how consumers interact with emerging technologies. Jessica’s current research examines AI’s transformative impact on work, education and creativity. Her published work on the paradoxes of AI in education and student outcomes offers insights for both academic and industry audiences. Beyond academic publishing, she shares her expertise through industry workshops. Through her research and teaching, Jessica continues to bridge theoretical understanding with practical applications in our rapidly evolving digital landscape.

This article is a commentary. Commentary articles are the opinion of the author and do not necessarily reflect the views of University World News.