
Standards matter: Exploring AI’s impact on education quality

The audience at Dr Karen Panetta’s keynote address at the 2024 ABET Symposium, “Science Fiction to Science Fact: The impact of AI on higher education”, was forgiving when she reversed the logic of The Matrix (1999) and cast us as an anthropomorphic AI system that asserts: “Yes, Dracula has an aversion to garlic.”

However, the sobering fact is that Hollywood films aside, there isn’t enough data to back up this hallowed assertion, said Panetta, the dean of graduate education for the School of Engineering and professor of electrical and computer engineering at Tufts University, Boston, in the United States.

We were less sanguine about a mistake Panetta showed us, made by a real AI program she was working with to improve traffic flow in Boston’s notoriously dangerous tunnels.

The AI system correctly identified the bicycles that were illegally in the city’s car-only tunnels. An audible intake of breath followed when we saw that the system dutifully identified as a bicycle what was plainly a person in a wheelchair moving through the tunnel.

The problem, Panetta told the plenary, was “garbage in, garbage out” (meaning: “If you put garbage in, you get garbage out”), a point she developed during an interview with University World News, and which presenters in break-out sessions either referred to directly or alluded to.

The symposium, held in Tampa, Florida, from 4 April to 5 April, brought together more than 700 academics, administrators and business leaders, who enjoyed presentations by experts in AI on such topics as using AI to enhance engineering education; the ethical dimensions of AI; and promoting gender equity and retention in engineering.

There was also a range of sessions devoted to the technicalities of applying for or renewing ABET accreditation, in which participants learned best practices.

ABET is a Baltimore, Maryland-based organisation that accredits 4,674 STEM programmes in more than 900 colleges and universities in 42 countries and provides professional programmes.

“ABET is a quality assurance organisation with a focus on assuring quality in higher education. So, we are concerned about this, because part of our purpose is access for all to quality education. And we believe that education is essentially a means to improve society as a whole,” said Jessica Silwick, ABET’s chief financial officer and chief operating officer.

“Our big, lofty goal is that one day everyone around the world will have access to a quality education as education provides a means and a mechanism to solve problems, to make improvements to quality of life,” she said.

Handling hallucinations

Generative AI systems may be, as Karen Panetta explained, built on old computer architecture, but they differ in ‘kind’ from, say, the computers used for computation (or even word processing) four or so decades ago.

The difference lies in the so-called ‘black box’ that contains the virtual neural network, formed by inputting (or training on), in the case of ChatGPT, the equivalent of four times the data held in the Library of Congress of the United States. The neural network consists of layers of interconnected nodes joined by millions of weighted connections.
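To make the “layers of interconnected nodes” concrete, here is a purely illustrative toy in Python – our sketch, not a description of how ChatGPT is actually built – showing a tiny feed-forward network whose nodes are joined by weighted connections:

import numpy as np

# Toy three-layer network: 4 inputs -> 8 hidden nodes -> 2 outputs.
# Large language models follow the same layered idea, but with vastly
# more weighted connections than a few dozen.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 8))    # connections: input layer -> hidden layer
W2 = rng.normal(size=(8, 2))    # connections: hidden layer -> output layer

def forward(x):
    """Pass an input vector through the layers of interconnected nodes."""
    hidden = np.tanh(x @ W1)     # each hidden node sums its weighted inputs
    return np.tanh(hidden @ W2)  # each output node does the same

print(forward(np.array([0.5, -1.0, 0.3, 0.8])))

What happens inside such a network is exactly the part Panetta calls the black box: the output depends on millions of learned weights rather than on rules anyone wrote down.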

“What happens in the black box is not always predictable, nor is it always correct,” Panetta told University World News: ‘exhibit A’ being the misidentification of the person in the wheelchair.

More than one presenter used the term “hallucination” to describe some of the more phantasmagoric responses generative AI systems produced.

For instance, when Professor Brock Croft, who teaches human centred design and engineering (HCDE) at the University of Washington (Spokane) and directs the university’s BSc programme in HCDE, asked the large language model to diagram how liquid mercury would have been extracted “in the old days”, the response resembled an Escher print with mounds of earth, wheelbarrows and weird arrows.

Panetta’s story of the AI program she and a graduate student developed to identify COVID-caused lung damage also serves as a cautionary tale. The stats were impressive as the program correctly identified well over 98% of normal lungs, lungs with pneumonia, and COVID-damaged lungs.

Panetta’s graduate student was, however, upset because a competing technology had a 100% success rate. It took, Panetta told us, some investigation, but the eureka moment arrived: the rival technology had labels under each medical image.

“It had learned to read the label,” Panetta said sardonically.
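Why could the rival system “read the label”? If the class label is printed on each scan, a model can score perfectly without ever examining the lungs. A minimal, hypothetical safeguard – not Panetta’s actual pipeline, and with coordinates invented for the sketch – is simply to crop the labelled strip away before any image reaches the model:

from PIL import Image

# Hypothetical guard against label leakage: the burned-in label sits in a
# strip at the bottom of each scan, so it is removed before the image is
# ever shown to the model. The strip height is illustrative only.
LABEL_STRIP_HEIGHT = 40  # pixels occupied by the printed label

def strip_label(path: str) -> Image.Image:
    """Return the scan with the labelled strip cropped away."""
    scan = Image.open(path)
    width, height = scan.size
    return scan.crop((0, 0, width, height - LABEL_STRIP_HEIGHT))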

The value of AI-enhanced education

Croft’s presentation on how ChatGPT 4 facilitated the redesign of Professor Patti Buchanan’s critical professional ethics course, which both she and her students found boring, is a signal example of what AI-enabled pedagogy can accomplish – and in very little time.

“I suggested that she tried doing an activity around the professional engineering standards. So students can become familiar with them,” he told us. He suggested that the course could use learning cards, though he did not yet have any specific ideas for them.

Enter ChatGPT 4 and the prompts: Use the National Society of Professional Engineers’ standards to create an active learning activity for college engineering students in which they use actual examples to support their learning; the activity should use a deck of cards that are centred around the standards; write the instructions of the activity and what should be on the deck of each card. You can use case studies as a framing device for this.
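For readers who would rather reproduce the exercise programmatically than in the chat window, a minimal sketch using the OpenAI Python SDK might look like the following (this is our assumption – Croft worked in ChatGPT itself, and the model name shown is illustrative):

from openai import OpenAI

client = OpenAI()  # assumes an OPENAI_API_KEY is set in the environment

prompt = (
    "Use the National Society of Professional Engineers' standards to create "
    "an active learning activity for college engineering students in which "
    "they use actual examples to support their learning; the activity should "
    "use a deck of cards that are centred around the standards; write the "
    "instructions of the activity and what should be on the deck of each "
    "card. You can use case studies as a framing device for this."
)

response = client.chat.completions.create(
    model="gpt-4",  # illustrative model name
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)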

Croft explained that instead of thinking of himself as speaking to a robot, he thought of explaining to a friend what he wanted to happen in the session.

“ChatGPT 4 can cope with natural language description, and the more precision that we add to the description, usually the better and more rich the outcomes are,” he said.

“Chat dutifully created an ethical decision-making game that leverages cards printed with different real world ethical quandaries. And what's exciting about this is that not only does the tool seem to understand exactly what we wanted, but it also is very detailed in its course design,” he added.

Pointing to the lesson plan, Croft continued: “Note that it gives a title and a learning objective for the session here. The title is ethical decision making in engineering, a card-based activity. And then the objective, which is what we wanted to do engaging students in active learning, focused on ethical decision making around the NSPE standards.”

One of the scenarios involved a civil engineering project that was progressing well until the client wanted to use substandard materials to cut costs. The card also included a couple of questions to prompt students to think about what kind of quandary this could present and which NSPE standard the quandary specifically relates to.

What was lacking, Croft said, was a backstory, so he then asked ChatGPT 4 to write it as a real story with concrete details.

“It provided a background, a turning point in the story, a description of what the dilemma actually is, along with some bullet points of the details. The ethical challenges are also presented here, followed by ethical questions that are raised. So, it paints quite a picture in a bustling city. A civil engineering firm, Horizon Structures, is working on a high-profile infrastructure project that aims to enhance the urban landscape. It's been progressing smoothly, but they want to save money.

“What’s really cool about this is that we also generated something that directly engaged students in responding to those NSPE … and we achieved [it] in less than an hour, literally less than an hour,” he said.

Panetta, too, pointed to a notable success of AI-enhanced education in dentistry. She showed us slides indicating how, by tracking students’ eye movements and matching them to a trained clinician’s, students can be taught in real time where to focus their attention and, therefore, where decay can be spotted.
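As a purely hypothetical illustration of the idea – not the system Panetta showed – matching a student’s gaze to a clinician’s can be reduced to comparing fixation points and flagging large deviations:

import numpy as np

# Hypothetical sketch: compare a student's gaze fixations on a dental
# radiograph with a trained clinician's and flag where attention drifts.
# The coordinates and threshold are invented for illustration.
clinician_gaze = np.array([[120, 80], [130, 85], [300, 210]])  # (x, y) pixels
student_gaze = np.array([[118, 82], [250, 40], [305, 212]])
ATTENTION_THRESHOLD = 50  # pixels

for i, (c, s) in enumerate(zip(clinician_gaze, student_gaze)):
    distance = np.linalg.norm(c - s)
    if distance > ATTENTION_THRESHOLD:
        print(f"Fixation {i}: refocus here (off by {distance:.0f} px)")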

The theme of caution

A theme that ran through the sessions was caution. Or, as Neil Gaught, special advisor to ABET for strategy, explained: for all its promise and all it can now enable technologists, researchers, professors and students to do, AI presents a challenge to an organisation like ABET that is committed to “confidence”.

“The core of ABET, the single organising idea that the whole organisation is built around (and that’s underpinned by values), is confidence. We’re in the business of assuring confidence,” he said. “And that extends from students wanting to have confidence in the programmes in which they enrol to parents wanting confidence in them and employers wanting confidence in these programmes’ graduates.”

To drive home the point, Gaught quoted from ABET’s website: “Our approach, the standards we set and the quality we guarantee, inspires confidence in those who aim to build a better world, one that is safer, more efficient, more inclusive, and more sustainable.”

He said standards matter but in the case of AI, what are they, who is setting them and where is the transparency? In the symposium sessions, speakers and audience members raised many concerns about how we can have confidence in AI, fears over unethical use and worries about the biases in datasets used. Some believe there should be something like a Hippocratic Oath – the oath of ethics historically taken by physicians – for AI engineers.

At the end of her keynote address, Panetta raised questions about the ethics of data collection in, say, health studies (for example, is the source data for health studies demographically and regionally collected? Are the proper protocols in place?). She then called for a “Food and Drug Administration-like organisation” to certify the veracity of AI as one way of dealing with the problem of deep fakes.

In defence of diversity

ABET is a private organisation based in Maryland, so it is not subject to the laws that pertain to Florida’s public post-secondary sector. However, since Florida is one of more than two dozen states, including Texas, that have banned or sharply curtailed diversity, equity and inclusion (DEI) offices, it was notable in our interview to hear Silwick’s defence of DEI initiatives, especially within STEM fields not traditionally associated with such advocacy, underscoring ABET’s commitment to accessible education for all students.

“An electrical engineer needs technical expertise. Obviously, right? So, they've got to understand their field of study, and how to apply the principles and how to do the work to create the solutions. But the world's issues are too complex to simply apply theories …

“You may go to school and learn thermodynamics, and how to use those principles when needed. But when you go out into the world, and you're creating solutions for people of all backgrounds and socio-economic statuses, you need to learn how to listen to and work in diverse teams and atmospheres. That is why diversity, equity and inclusion is so important to ABET.

“Because our graduates are going out there to make the world a better place, they’ve got to know how to be able to hear the perspectives, hear the needs of people, and how to create solutions that are going to work for them,” she said.

Sticking to values

Silwick added: “We’re committed to our values as an organisation and helping ensure a respectful learning environment for everyone. Our values based upon sound data and just basic knowledge tell us that diversity, equity and inclusion lead to innovative thinking. Innovative solutions lead to more efficient and higher-powered teams.

“Our graduates are focused on going out there and creating these solutions. And creating a means to make the world a better place. It's essential that they have the ability to work with diverse groups as well as learning how to include those voices.”

In countries where DEI policies do not exist, Silwick said that ABET is respectful both of its educational mission (confidence) and of the different missions and purposes of the universities whose programmes it accredits. This is not always directly connected with a country's politics or policies around DEI.

“I would say there are different opportunities and ways that we see programmes and universities applying principles of DEI into either their instruction, or into their university and our programmes,” she said.

She added that in some instances, depending upon where a country is on the continuum, the fact that a woman can go to school and obtain a degree is a step in the right direction.

“We believe that wherever we go, wherever ABET is, we’re providing opportunities, that we are encouraging continuous improvement, that we are raising awareness … We don’t judge where you're at on your progress.

“If there were alarming human rights violations that were happening at campus, obviously, that would be brought to our attention. But we've been very fortunate enough not to be in any of those situations. So, again, we’re there to assure quality education and access to education,” said Silwick.

*The story was slightly altered after publication.