SOUTH AFRICA

ChatGPT – Calm your inner Luddite, keep your inner sceptic
“Calm your inner Luddite, hold on to your inner sceptic,” is one of the messages for educators contemplating ChatGPT and other large language models from Dr Roze Phillips, a futurist who straddles the worlds of work and academia. “Trying to outsmart AI is not a viable strategy” – for example, trying to outsmart AI to ensure that students are not plagiarising. Rather, Phillips told a discussion hosted by the Academy of Science of South Africa (ASSAf), many people are suggesting that educators make new technologies part of the learning process.
It is essential to have a “growth mindset”, which means being able to ask excellent questions, change, and grow from failure. That is just as important as getting the right answer.
“I’m in the world of work and in the world of education, where the job is to prepare students for the world of work. How does the world of work really work?” asked Phillips, who was a medical doctor and then a researcher for the South African Medical Research Council before moving into the corporate world, and then back into academia.
“Whether in medicine or law or finance, technologies are part of how people work. To deny our students these technologies – to hold that away from them in assessments – is not useful because we are not preparing students. The job of technologies is to help us be better at work.”
On 22 February 2023 the Academy of Science of South Africa hosted the 11th ASSAf Presidential Roundtable Discussion, titled “The Implications of ChatGPT for Assessment in Higher Education”. It was moderated by Professor Jonathan Jansen, ASSAf president and distinguished professor of education at the University of Stellenbosch.
Other panellists were Dr Franci Cronje, research associate at IIE Vega School, a design and marketing institution in Cape Town, and Professor Johannes Cronje, professor of digital teaching and learning at Cape Peninsula University of Technology.
Their enlightening presentations showed the 1,400 participants many ingenious uses of ChatGPT and AI-assisted technology for academics and students in the arts and the sciences.
Humanising the inanimate object
It is important not to look only at ChatGPT but at large language models, AI and technology generally, and what these tell us about what we should and should not do, said Phillips, founder and CEO of Abundance at Work and adjunct faculty at the Centre for Business Ethics of GIBS Business School.
“I spend a lot of time on ChatGPT, to be honest, and I find that the conversations are quite enlightening. Obviously technologies are helping us to organise the world’s information. But what technologies are also doing is starting to simulate being human.” With ChatGPT, “when you sit there and type in text, you are having a peer-to-peer conversation”.
Conversing with ChatGPT also reminds Phillips that it is a master in the realm of language. “As educators we do not quite understand how complex language is and how language itself can manipulate.” For instance, language can take a position; it can be optimistic or cynical.
“As students and as members of society, being able to distinguish how we use language is going to be a fundamental skill that we are all going to have to have in the coming years.” People need to think about prejudices, misinformation, and how language is being used.
ChatGPT and other large language models organise the world’s information the way the world has input that information. “It has the same biases that humans have. It gets things wrong as we humans do. It simulates being human. It’s almost a childlike version of being human and it is learning exceptionally fast.”
She recalled spending many months travelling around South Africa with a little robot called Romo, conversing with people about the impacts of technology on the world of work. “I realised that we spend a lot of time humanising inanimate objects while dehumanising ourselves.
“Our young people are doing exactly that. They are humanising the inanimate object. We as educators have to be very careful about how we also get taken in by technology.”
Plagiarism and assessment
In education, the initial focus on ChatGPT has been on the plagiarism battle. “I am worried that we will take our eye off the student, and that we should never do. Is education about being as efficient as possible in assessing students? Or is it about the effectiveness of their learning?”
“ChatGPT can be a partner in learning and it can make the learning process as important, if not more important, than purely the assessment,” Phillips said.
There is a need to lift the veil on assessment. Many students find ways to pass assessments but go through the year worrying about exams, focusing on the product rather than the learning.
In the United States, ChatGPT has been able to answer correctly some of the questions in medical exams. Two things became clear from this.
First, ChatGPT can become more accurate as it looks at more data, because it is learning from databases. “It can become more consistent and it can perform better in terms of holding the integrity of its arguments. It’s not as good as human beings are, but it’s getting better.”
Second, and more fascinating, Phillips said, being a generalist seems to make ChatGPT especially good at applying information. For instance, because ChatGPT draws on philosophy, biology, law and various other subjects, it is able to present strong arguments in, say, a legal or a medical case.
“When we teach students, are we teaching specialists or are we also teaching generalists? We are going to have to learn more about how the transdisciplinary nature of knowledge works.”
A word to the wise
What are we learning about how we are learning? “This is the curse of knowledge. As experts, it’s becoming increasingly difficult for us to teach. For the world of work, students need to learn how to collaborate – students already know how to compete.”
Tools like ChatGPT work well in conversation and can be used as a peer in education. “We can do learning in an asynchronous way, where people are not coming together but ChatGPT is the partner in learning. I argue that would be the best way of using large language models,” said Phillips.
“The job of the educator is to hold the hand of the student as they go through the process of learning, and to remind them of what the integrity of the learning process requires. It’s not about getting the answer, it’s about the process of learning. And the job of the student is to learn how to learn – not just what to learn, but also how to learn.”
Science and technology are great at telling people what is, and technologies are evolving to do almost anything. ChatGPT and other AI-assisted tools have led Phillips to understand the importance of distinguishing between what is versus what ought to be.
A big problem – and an opportunity – for people is how to keep up with technology. “Our technologies are progressing faster than our wisdom.” There must be a focus on lifting humanity’s wisdom, said Phillips, and that is all about moral progress.
She urged educators to learn how language models work and to understand the complexities of language. Understand too the ethics involved in the use of technologies in education, and teach that to students. “If we can do that, then we will produce better humans, who figure out how things ought to be, and not just the way things are.”