
Higher education lacks solutions to the challenges of the AI era

The first months of 2023 mark the moment when artificial intelligence (AI) became a real issue for the general public, with the extraordinary hype surrounding the capabilities of OpenAI’s ChatGPT.

One of the most visible implications of this large-scale adoption and use of AI is that education, and especially assessment, has been challenged, or revolutionised, by the advances of generative AI.

Noam Chomsky noted that ChatGPT in education is “basically high-tech plagiarism” and “a way of avoiding learning”. Regardless of how it is presented, ChatGPT and the large suite of similar AI applications reveal the urgent need to rethink assessment and change our industrial, neoliberal models of education.

Having worked for the last three decades in universities in different parts of the world, I have noted that many academics tend to become overnight experts in the latest fad just to prove their own relevance in a competitive and largely dysfunctional academic culture.

To see this happening again is not surprising, but the hype surrounding ChatGPT is new and different: a real stampede of academics who, with no research, academic papers or even a tweet posted on artificial intelligence before January 2023, are now writing opinion papers, holding conferences and speaking with an air of absolute authority about the proper use of AI in education.

This unprecedented wave of opinions and pseudo-solutions comes mostly from aggressive ignorance, a fear of missing out on the new hype and a complete misunderstanding of the field.

It would be amusing if it were not so dangerous. There are serious implications for students’ privacy and their future, for learning and the quality of teaching, for the immediate and long-term future of our institutions of higher education and for the future of civil societies.

Despite all that is at stake, and the general call for change, we see the same curriculum design, the same kind of assessments that are detached from higher learning and the same delusional statements that change is underway.

Higher education has indeed been in a state of change for many years – and not for the better – at a time when artificial intelligence is altering our workplaces, societies and interactions.

Education services and the markets

The ideological turning point in the rush to convert universities into rigidly hierarchical, amenable pillars of salt can be found in the 1990s.

There was a strong impulse after the 1950s to make universities more profit-oriented and more ‘relevant’ to the markets and profiteering than to culture and ideals such as the common good, or to nurturing and building a civil society.

Of course, mission statements and other decorations about these values can be found on university walls. But the ethos and priorities of higher education have slowly changed, and the idea of dissent and courage has been replaced by a culture of fear, uncertainty and compliance, which supports the managerial push to better serve the economy and corporate ideals.

The first major step towards this radical change can be seen in the Marrakech Ministerial Meeting of 1994, where representatives of 124 governments and the European Communities set up the World Trade Organization (WTO), defining the basic framework for trade relations among members, guided by a narrow market-oriented ideology.

In this agreement, we find ‘education services’ specifically mentioned as part of markets that are guided by profit-oriented policies.

A few years later, in 1999, the WTO conference in Seattle adopted the conclusions of the ‘Millennium Round’, in which education was added as an integral part of international markets, regulated by the same rules of trade used for commercial entities.

From that moment, education became a commodity like coal or steel, or cars and fridges: a product subject to trade and market exchanges.

The academic world has been dramatically changed by the WTO agreements, and scholars have been replaced by ‘managers’ and political decision-makers. Higher learning has become a simple line in a marketing brochure in the new commercial field of higher education, while gimmicks promising efficiencies and the ability to serve the market have been adopted without critique.

It should come as no surprise that we find banal news about reputable universities enrolling students in online courses and degrees with no teaching or tutorials.

As long as profit is secured, and businesses running under the label of higher education report billions in profit, we have no reason to care that graduates may go through their studies with no real experience of higher learning.

A lack of ideas and solutions

This is the higher education context where the large-scale adoption of generative AI makes it highly plausible that students will ask AI to complete their assignments while teachers use AI for assessments and marking, in a process devoid of any meaning or utility.

The radical dissent and intellectual courage required to find solutions for the unprecedented challenges higher education faces as a result of AI – such as misinformation, manipulation and propaganda, the rise of fascism and dangerous ideologies, and economic, environmental or cultural crises – cannot find oxygen in what is left of academic culture in 2023.

The intellectual depth needed to interrogate and prepare students and universities for the risks and opportunities opened up by the advancement and wide adoption of AI has been stifled as academics stand defenceless or too feeble to come up with solutions.

The stage is dominated by academics who are unable to deal with this challenge: we have those who are too superficial to understand technology, its uses and dangers, and who promote a simplistic and blind techno-optimism, which also serves to advance their own careers.

Then there is an influential group of academics who are able to understand the more complex implications of technology for learning and teaching but who sell the same techno-optimism, influenced by vested interests, as rare reports such as Google Academics Inc reveal.

Lastly, we have the informed and insightful minds of academia, struggling to survive in a culture of fear and distaste for alternative views, too intimidated and worried to fight for what looks like a lost cause or to engage in an exhausting and futile battle over the new hype.

Universities have little time to change before we see the results of the current intellectual collapse. Most probably, if we continue along the current path, in less than a decade we will see a new iteration of ChatGPT writing a nice obituary for decrepit institutions of education that have lost any relevance for our societies and culture.

Dr Stefan Popenici is academic lead of quality initiatives at Charles Darwin University in Australia.