What next for generative AI in universities? Experts explain

Michaël Lobet, an applied physicist at Belgium’s University of Namur and at Harvard University, admitted to a hidden assumption: “I thought my students knew AI. But actually, they don’t.” A study at Namur found that only 13% of students use ChatGPT for their studies. “So, we must teach students how to use the AI tool properly.”

Lobet was speaking at a 23 June webinar on “Beyond ChatGPT – What next for generative AI in higher education?” hosted by the European University Association and European League of Institutes of the Arts. The moderators were Thomas Jørgensen, EUA director of policy coordination and foresight, and Maria Hansen, executive director of ELIA.

The University of Namur study surveyed 1,233 bachelor students anonymously earlier this year. It was consistent with a March survey by the Pew Research Center in the US, which found that just 14% of adults have used ChatGPT for entertainment, to learn something new, or for their work.

Since so few students use ChatGPT, there is an opportunity to bring generative AI into the classroom: to teach students its proper use, to stress the importance of checking ChatGPT’s frequently false information, and to advance teaching and learning more broadly.

When designing a class activity, a guiding question could be: “What can our students do that ChatGPT or generative AI cannot do?” said Lobet.

“Because definitely we must change our learning goals,” he said. No longer can academics merely ask students to, for instance, write about a writer’s life. “Now AI is part of the game, we should ask for a text about a writer’s life, say, but the student must double check the veracity of the generated text.”

Generative AI is inspiring a revolution around creativity, Lobet continued, providing some examples. One teaching activity he found on Twitter asked students to use ChatGPT to explain Hegel’s philosophy of the dialectic, using an episode of The Simpsons. Another asked students to write wedding vows, using the style of Snoop Dogg.

“We can use AI to generate such texts and compare the different styles and the different forms.” ChatGPT can be used to help students improve their writing skills, or to write code, debug it, and explain what a piece of code is doing. “It can transform class activity.”

Art schools are AI-immersed

Interestingly, generative AI is considerably more prevalent in the arts and arts education.

Moderator Hansen said ELIA conducted a small survey among its 280 members. “We found that a large majority of arts students already integrate generative AI in their work and have for some time.” However, few art universities have an AI policy.

Pawel Pokutycki, an interaction designer, researcher and senior lecturer at the Design Academy Eindhoven and Royal Academy of Art in The Hague, the Netherlands, told the webinar that higher arts education had been “looking into the possibilities and controversies around generative AI for a couple of years now. It’s our daily bread today.

“Our students and tutors have been engaging in creative work with AI machine learning quite enthusiastically, full of curiosity but also with a certain level of criticality”.

Pokutycki, who works around the world, is “highly enthusiastic about the role of generative AI in the arts. If references are mentioned, if there is critical reflection on the tools used, there's a certain level of transparency. We are ready to accept it and move on with these new developments”.

The extraordinary development of creative technologies, tools and websites has led art experts “to perhaps see AI as a form of artistic intelligence. Obviously, it’s a discussion point to what extent AI is artistic or not, and how it is challenging us with our artistry”. Students tend to take one of three positions towards generative art: ‘why not’; ‘yes, but’; or ‘no thanks’.

“If we are transparent about AI processes in art education,” Pokutycki said, “then I think there is no need for more explicit or detailed policy for the implementation of AI in future.”

AI is also nothing new at the Zurich University of the Arts, Switzerland’s biggest arts university, said Grit Wolany, an AI scout. Her job is to observe developments – “horizon scanning the dynamic AI field” – collect information and disseminate it across the university.

“What is new is the accessibility and wide use of new generative AI tools like ChatGPT and [image generators] Midjourney and Stable Diffusion. It is also new that part of the creative process can now be done by machines,” she told the webinar.

The Zurich University of the Arts (ZHdK) digital skills and spaces team supports university members in digitalisation and skills building around new technologies and new forms of work. The university has a ‘cross-universal’ approach, coordinated by the ZHdK digital council – experts in digitalisation who advise management and assist in implementing transformation projects.

Wolany said: “Our approach is critical curiosity. We want to combine human curiosity with professional expertise, but also with systematic investigation of the topic. Then hopefully we can face the challenges of AI, which are high complexity, fundamental impact for the creative professions and the crazy high speed of development, which is sometimes hard to follow.”

AI action at the University of Bergen

The University of Bergen (UiB) in Norway has created a cross-faculty network called UiB AI, which facilitates interdisciplinary collaboration around AI, said Jill Walker Rettberg, a professor of digital culture and co-director of the new Center for Digital Narrative, a centre of research excellence.

Generative AI is researched and taught across all faculties. For instance, some people are working on learning analytics in biomedicine and medical imaging, while others are researching how journalists use AI to generate stories. In digital culture, researchers are using AI and exploring its use in art, in narratives and in culture generally.

Walker Rettberg’s new book, Machine Vision: How algorithms are changing the way we see the world, is due out in September. Her next book, Loving AI, will explore why humans are so fascinated with AI.

“UiB held seminars and conferences and found that people really want to know about AI. We’ve focused on cross-disciplinarity, letting different groups talk,” Walker Rettberg told the webinar. “The seminars have been really popular.” The university is also integrating AI into curricula and has a dedicated study programme focusing on AI in several faculties.

“We recently established tiny courses, worth 2.5 European (ECTS) credits, which are designed so all students can take them. Some are online, some are only taught on campus. They are very small so that they are accessible for any student,” said Walker Rettberg.

“They are immensely popular. We have hundreds of students taking each of the courses and they have now become available to staff and outsiders as well. I think there’s a huge hunger for knowledge about ChatGPT. As universities, it’s our responsibility to give that to people.”

She added: “It is incredibly important that we, as a university, make sure our students and staff have the basic knowledge to be able to use AI in a responsible and constructive way.”

The University of Bergen does not have a general generative AI policy, because there are differences between disciplines. But the social science faculty has established clear guidelines, particularly regarding student writing and exams.

Students have pointed out that it may be difficult to ascertain what ‘acceptable usage’ of AI is. The lines are blurred, said Walker Rettberg. “Without clear guidelines, it can be frustrating for academics, who may see clear ChatGPT use in a student’s work but do not know what is acceptable or what to do.”

Lobet highlighted a multi-author paper published in ACS Nano, “Best Practices for Using AI When Writing Scientific Manuscripts”, which provides useful guidelines for academics in their work. It stresses the need to clearly acknowledge the use of ChatGPT or other AI tools, and to check and check again everything that AI says. No AI-generated text should be used verbatim, to avoid the risk of plagiarism, and it is the author’s responsibility to check and verify all sources.

Walker Rettberg concluded: “People really want to understand AI and we should support that with small, accessible courses, up-to-date seminars and events. It is important to think about cross-disciplinarity, and really use the breadth of the university, but also to think about discipline specificity, because each subject needs to discuss what their students need.”

Some concluding thoughts

Hansen spoke about the excitement of people working in AI. Healthy criticality is imperative, she said, and students must be brought into the conversations at all times. There are serious ethical issues and questions around biases. For universities, the ‘why not’, ‘yes, but’ and ‘no thanks’ positions could work well in starting conversations around AI.

Thomas Jørgensen said the webinar provided a good example of the multidisciplinary aspects of generative AI. Rather than focus on ChatGPT, universities should look at how AI is more broadly used. Universities and academics are thinking hard about the consequences of using new technologies, and the importance of critical dialogue came up again and again.

The EUA, said Jørgensen, is watching the AI space. Among other aspects, it looks at legislation, at how AI is used on the ground, the experiences of European universities, and creative ways in which AI is being used. “That is a big takeaway. This is not just a threat. This is also a new, constructive and creative way of using technology.”