Education can help us stay ahead of the disinformation wars

Halfway through our interview, Lukas Andriukaitis, an expert in Russian disinformation campaigns, explains why he dislikes the term ‘fake news’.

“I don’t think it says a whole lot because ‘fake news’ is ‘fake’ by definition. More important is that it is disinformation – disinformation disguised as news and/or information. The techniques have not fundamentally altered since the days of the KGB.”

But what is different, he says, is the complexity that digital communications allow and the fact that the internet reaches “way more people than just handing out fliers”.

The Belgium-based Atlantic Council’s Digital Forensic Research Lab (DFRL), of which Andriukaitis is associate director, is one of several institutes devoted to exposing and debunking disinformation on the internet, and understanding the intended goals of both state and non-state actors.

Two others are the new Information Integrity Lab (IIL) housed in the Professional Development Institute at the University of Ottawa (UO) in Canada and the Network Contagion Research Institute (NCRI) at Rutgers University in New Jersey, United States, which has been training 100 students each year for the past three years.

The 18 February media release announcing the IIL, Canada’s first cross-disciplinary lab designed to research and expose fake news, could not have been timelier. Six days later, after a massive disinformation campaign, led by pro-Russian politicians, that included trumped-up claims that Ukraine was mistreating ethnic Russians in the breakaway Donbas provinces of Donetsk and Luhansk, Russian President Vladimir Putin’s army invaded Ukraine.

According to UO President Jacques Frémont: “Knowledge, facts and truth are being challenged, and challenged very aggressively. Disinformation and fake news are being used not only by individuals and organisations, but also by state actors to destabilise entire societies, to severely erode public confidence in private and public sector organisations and to attack our core beliefs in freedom, equality, the rule of law and human rights.”

Frémont links the battle against fake news to the core mandate of higher education institutions: “research, critical thinking and the advancement and circulation of knowledge and facts”, as does Rutgers Psychology Professor Joel Finkelstein, who is chief science officer and cofounder of NCRI.

Russia’s three-part campaign

The disinformation campaign Russia unleashed prior to attacking Ukraine (and the one being waged today) consists of three parts.

In addition to claims about Ukraine’s actions in Donetsk and Luhansk, the first part of the disinformation campaign repeats Putin’s irredentist assertion that Ukraine does not have the right to exist as a separate country, made most famously in an essay he published last summer.

A number of websites and internet influencers parrot what Andriukaitis calls Putin’s “mismanagement of historical facts”: that Ukraine was created by Vladimir Lenin, the founder of the Soviet Union (Ukraine was one of the constituent ethnic republics of the USSR); that half of Ukraine’s territory was given to it by the Russians; and that the Ukrainians and Russians are one people and members of the same Russian Orthodox Church, despite the Ukrainian Orthodox Church having achieved autocephaly (independence) in October 2018.

According to Andriukaitis, not every website or post needs to repeat all of Putin’s overarching narrative to help support it. Botnets produce tweets that may repeat only one part of Putin’s fabricated story, for example, the claim about the Russians and Ukrainians belonging to one true church.

Taken together, these tweets contribute to the ‘political imaginary’ desired by Putin. As defined by sociologist Craig Browne of the University of Sydney, Australia, and Paula Diehl, chair of political theory and the history of ideas at Christian-Albrechts University of Kiel in Germany, the ‘political imaginary’ is the “collective structure that organises the imagination and symbolism” of an individual’s political thought; in this case, the idea that the Ukrainian Orthodox Church is at best a theological error.

“It’s often hard to connect the dots,” Andriukaitis told University World News. “And it’s hard to know whether the actors are connected to Russia or working with it. They might be, you know, just useful idiots in the West. But if you take a step back, you see the whole thing, you can see recurring messages supported by various stories and posts.”

The second part of the disinformation campaign denied that Russia was planning an invasion, despite the documented build-up of its forces around Ukraine. Andriukaitis likens this claim to those on pro-Russian websites charging the West with having started the war in Syria that began a decade ago.

“One of my favourite quotes was that Russia has never started a war, which is hilarious,” says Andriukaitis, who has taught personal digital security and introduction to advanced digital forensics at the College of Europe, Natolin in Warsaw, Poland.

Russia invaded Georgia in 2008 to support pro-Russian insurgents in the breakaway provinces of South Ossetia and Abkhazia; in 2014 Russia invaded and then annexed Crimea; and on 17 September 1939, under the secret protocols of the Nazi-Soviet Pact, the Soviet Union invaded Poland.

At first glance, the third part of the disinformation campaign appears to undercut the second because it identifies a casus belli. Among the items that made up this fake news was Putin’s claim that the Russians had to liberate Ukraine from the drug-addled Nazi gangsters who had taken control of the country.

These assertions were so outrageous that fact checkers easily debunked them, says Andriukaitis. The Nazi charge is ludicrous because Volodymyr Zelenskyy, Ukraine’s president, is Jewish and lost family members in the Holocaust, and because Ukraine’s far-right parties poll in the low single digits.

Since Russia invaded Ukraine, the disinformation campaign has continued. On 5 March, for example, the day I started writing this article, CNN debunked a series of (fake) postings that purported to come from the network, one of which reported that an American had been killed in Ukraine – a claim repeated a few hours later on the floor of the United Nations Security Council by the Russian ambassador to the United Nations.

Tracking code words

Code words or expressions like ‘Hail Honkler’, the visual of which is a clownish figure, are important for creating group cohesion among “people who don’t really operate in the real world and spend hours and hours online looking for increasingly sensational and dark material”, says Finkelstein.

The seemingly edgy humour and ambiguity of ‘Hail Honkler’ are intentional. It reads as a joke unless you are in the know and recognise it as a way to get around restrictions on ‘Heil Hitler’ on platforms such as YouTube and Reddit.

For NCRI’s researchers, however, these code words and visual memes are traces that they feed into programs like Pushshift, a large-scale social media ingestion engine used by more than 300 universities across the world.

“We use data-driven machine learning analyses that take their findings and are able to sort of understand how the depth and breadth and extent of networks that we are seeing are connected,” says Finkelstein. Both Finkelstein and Serge Blais, executive director of UO’s Professional Development Institute, liken the tracking of code words to the way weather satellites look for weather patterns.

In its Insights Report of 1 March 2022, for example, the NCRI showed that between 1 and 23 February a topic network formed by over 29,000 tweets denounced the New World Order and linked COVID-19 vaccines, the Great Reset conspiracy and the trucker convoy, as well as “cryptically antisemitic framings such as [George] ‘Soros’ and ‘Globalist’”. (A topic network is a cloud of words connected to each other by contextual similarity; it can be imagined as the idea-mapping that university and college teachers use to teach brainstorming.)
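NCRI’s actual pipeline is far more sophisticated and is not public, but the core idea behind a topic network can be sketched in a few lines of Python: count how often pairs of terms appear in the same tweet and keep the heavily weighted pairs as edges. The function name, thresholds and toy tweets below are illustrative inventions, not NCRI’s data or code.

```python
from collections import Counter
from itertools import combinations

def topic_network(tweets, min_count=2):
    """Count how often word pairs co-occur in the same tweet.

    Heavily weighted pairs correspond to the thick lines between
    central nodes in a topic-network diagram.
    """
    edges = Counter()
    for tweet in tweets:
        words = sorted(set(tweet.lower().split()))
        for pair in combinations(words, 2):
            edges[pair] += 1
    # Keep only edges that recur, dropping one-off pairings
    return {pair: n for pair, n in edges.items() if n >= min_count}

# Toy sample standing in for a real tweet corpus
tweets = [
    "great reset and the convoy",
    "the convoy exposes the great reset",
    "vaccines and the great reset",
]
net = topic_network(tweets)
```

In a real analysis the corpus would run to tens of thousands of tweets, stop words would be filtered out, and the surviving edges would be laid out graphically so that clusters such as the one described above become visible.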

This topic network, shaped vaguely like a hot air balloon, all but vanished on 24 February, the day the Russians invaded Ukraine. It was replaced by one in which Ukraine and Putin were the largest central nodes, the point at which the lines indicating relationships came together.

“Popular right-wing Instagram accounts like dc_draino, the_typical_liberal and dreamrare have insinuated that Ukrainian President Zelenskyy is a crisis actor [who is] part of a deep state plot to bring about the new world order,” the NCRI reported.

These linkages allow analysts like Alex Goldenberg, senior intelligence analyst at the NCRI, to “understand the dynamics of these groups and what fuels their growth”.

It is not possible to know precisely which of the right-wing influencers Russia supports – or even, Blais told me, whether they are human as opposed to the digital expression of an artificial intelligence (AI) algorithm.

Nor is it known which left-wing digital actors Russia supports, but, says the NCRI, groups that use “communist emoji symbols (such as the hammer and sickle, the symbol of Communist Russia) express support for communist regimes”. Bot accounts disseminated the hashtag #abolishNATO, created by @mountainchen24, which NCRI calls a “communist influencer”.

What is known is that the Russians are “paying or supporting both the radical right and the radical left”, says Andriukaitis. Further, he told me that he does not think it’s a coincidence that “those highly pro-Kremlin, highly anti-EU and anti-NATO groups were among the most aggressive antivaxxers. I think it is part of a longer game to sow discord.”

Professor Christian Leuprecht, who teaches political science at Queen’s University and at the Royal Military College of Canada (both in Kingston, Ontario), and who has criticised the laxness of Canada’s laws on foreign actors’ financial influence in the country’s politics, makes a similar point with reference to Project Lakhta, the Russian name for its meddling in the 2016 US presidential election, in which Donald Trump lost the popular vote but won the Electoral College and, thus, the presidency.

“The Russians may not have cared who became the president of the United States. What they’re interested in doing is throwing into question whomever is the president of the United States. For them, success is polarising American society. It’s undermining the institutions or showing that they are paralysed that is the aim.”

For his part, Andriukaitis thinks that the Russians wanted Project Lakhta to swing the election in Trump’s favour, but not just to sow discord (as was clear from candidate Trump’s statements). Rather, Putin had another goal, which Andriukaitis stated in chilling words: “To show that Russia is almighty and all powerful and that Putin decides who will be president of the United States.”

Case studies

One of the 50 case studies that NCRI uses to train the next generation of analysts is about the far-right Boogaloo boys in the United States and how the institute’s analysis allowed police to thwart a possible blood bath in Richmond, Virginia on 20 January 2020.

A few weeks earlier, NCRI’s analysts monitoring Twitter spotted a new set of memes and photos posted by the Boogaloo boys, who had not previously been on NCRI’s radar.

Photos showed young men, their faces hidden by balaclavas, wearing body armour and carrying semi-automatic weapons. Some of the men wore patches with skulls on them, an in-group reference to the Atomwaffen Division (AWD).

Founded in the United States in 2013, the AWD (Atomwaffen is German for ‘atomic weapons’) has recruited members from retired and serving US armed services personnel and has tried to recruit at Boston University, the University of Chicago, the University of Central Florida and Old Dominion University in Virginia. Among the AWD’s goals are instigating a race war and violently overthrowing the government of the United States.

The Boogaloo boys also wore shoulder patches featuring Pepe the Frog. Developed as a comic figure by Matt Furie in 2005, Pepe the Frog was soon appropriated by right-wing groups and popularised on 4chan, a largely unmoderated discussion board, which has posts of Pepe the Frog wearing night vision goggles and holding an automatic weapon.

Using Pushshift and other software, NCRI’s analysts tracked how the Boogaloo boys’ postings had become increasingly apocalyptic and violent. NCRI’s software also identified Facebook groups that post links to sites showing how to use 3D printers to make firearms and high-capacity magazines.

“We were able to combine our social media investigation engine, social media analytics, with open-source software to assess that the Boogaloo boys wasn’t just a meme; it was an actual group that represented a kinetic [real world] threat,” says Goldenberg of the NCRI.

NCRI gave this information to a number of police forces, and Goldenberg briefed the US Army’s counterterrorism unit. Prior to the rally in Richmond, the FBI arrested a number of Boogaloo boys who planned on inciting violence by shooting into the crowd.

“There were going to be thousands of people there that were heavily armed,” says Goldenberg. Had the Boogaloo boys fired into the crowd, “it would have been a nightmare”.

Fake photos and videos

A large percentage of fake news postings include faked or mis-attributed photos and videos.

In mid-January, at the height of the supply chain disruptions, a picture purporting to be from CityNews in Toronto was posted to both Facebook and Twitter. It showed a woman standing before empty supermarket shelves beneath the superimposed headline, “Empty Canadian Grocery Store Shevles (sic) Could Become a Larger Problem”, a caption lifted from another article.

CityNews told Reuters: “Our logo is being used on a photograph that is not ours and we didn’t use the photo in our news coverage either.” The image, in fact, was a stock photo from Getty taken in a British supermarket.

Since the Russian invasion of Ukraine there has been a veritable torrent of doctored or completely faked images. One, for example, shows the American actor Steven Seagal, to whom Putin granted Russian citizenship in 2016, in the uniform of the Russian army.

Organisations like CNN, IIL, DFRL and NCRI have a number of ways of authenticating photos and videos. The first is checking the provenance of the image; seasoned analysts know which organisations can be trusted to release a genuine image or video. They also know when an image passes the so-called ‘smell test’.

A second analytical tool available to these organisations is a reverse image search. Essentially, this involves entering the image into a search engine that scours the internet and databases looking for the same or a very similar image. As Andriukaitis explains, if you suddenly have an image of a Russian plane being shot down, a reverse image search might find the exact same image posted 10 years ago from the war in Syria.
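Commercial reverse image search engines rely on proprietary indexes of billions of images, but the underlying trick of matching near-identical images can be illustrated with a ‘perceptual hash’: reduce each image to a coarse fingerprint that survives recompression and light edits, then compare fingerprints. A minimal sketch, using tiny 2×2 ‘images’ with invented pixel brightness values:

```python
def average_hash(pixels):
    """A simple 'average hash': 1 where a pixel is brighter than the
    image mean, 0 otherwise. Near-identical images (recompressed,
    lightly edited) produce near-identical hashes."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(h1, h2):
    """Number of differing bits; a small distance suggests the same image."""
    return sum(a != b for a, b in zip(h1, h2))

original     = [[10, 200], [30, 250]]
recompressed = [[12, 198], [31, 247]]  # slight compression noise
unrelated    = [[240, 15], [250, 5]]

assert hamming(average_hash(original), average_hash(recompressed)) == 0
assert hamming(average_hash(original), average_hash(unrelated)) > 0
```

A search engine applies the same idea at scale: it stores fingerprints for every indexed image and returns those within a small distance of the query, which is how a ‘new’ photo of a downed plane can be matched to one posted a decade earlier.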

As anyone who has scanned a document knows, the resolution of digital photos is measured in dpi (dots per inch) and, on screen, in pixels. A number of analytical programs can determine whether the compression rate in one part of an image differs from the rest of the image. This is a tell-tale sign that an image has been Photoshopped or otherwise altered.
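The full compression check (often called error level analysis) requires re-encoding the image with a JPEG codec, but the underlying statistical idea can be sketched: tile the image, measure the noise in each tile, and flag tiles whose noise level differs sharply from the rest, which can indicate a pasted-in region. The pixel values below are invented for illustration.

```python
def block_variances(pixels, block=2):
    """Variance of pixel values inside each block x block tile.
    Uniform compression noise gives roughly uniform variances;
    a spliced-in region often stands out as an outlier tile."""
    rows, cols = len(pixels), len(pixels[0])
    tiles = {}
    for r in range(0, rows, block):
        for c in range(0, cols, block):
            vals = [pixels[i][j]
                    for i in range(r, min(r + block, rows))
                    for j in range(c, min(c + block, cols))]
            mean = sum(vals) / len(vals)
            tiles[(r, c)] = sum((v - mean) ** 2 for v in vals) / len(vals)
    return tiles

# A toy 4x4 'photo': noisy background, except a perfectly flat
# bottom-right tile standing in for a pasted-in patch.
image = [
    [100, 110, 100, 110],
    [110, 100, 110, 100],
    [100, 110, 100, 100],
    [110, 100, 100, 100],
]
tiles = block_variances(image)
suspect = min(tiles, key=tiles.get)  # the tile with the least noise
```

Real forensic tools work on JPEG compression artifacts rather than raw variance and visualise the result as a heat map, but the principle is the same: regions that were compressed differently from their surroundings stand out.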

Satellite imagery is used to authenticate photos of, for example, buildings and other large outside structures such as bridges. “If someone is showing a photo of a burning building, we can use satellite images to look and find that building on a map and see if photos taken from space on that day show whether the building was, for example, on fire,” Andriukaitis explains.

On 2 March, after some confusion about which building had been hit by a Russian missile, this method was used to determine that it was a Kharkiv National University building housing the sociology department, and not the police station next door, that had been struck. The two buildings are architecturally similar.

Addressing social media ‘blind spot’

For more than four decades, before the creation of the IIL, UO’s Professional Development Institute had run about a thousand courses for both public and private organisations, training some 10,000 people annually.

Among the organisations the Professional Development Institute has worked with are the New York Police Department and the Washington DC Police Department, which it helped to understand what disinformation is, how to identify its nefarious or insidious purpose and how it seeks “to divide, to devise, to distort, brew up dogmatic activity, or lead to violence”, says Blais.

“Most police leaders will tell you that they have a bit of a blind spot as to what goes on in social media,” Blais told University World News.

They understand, for example, the back alleys of the neighbourhoods they patrol. They understand what is happening on the street. They know who’s out there and who to watch for. But social media has evolved rather recently, very, very rapidly and very haphazardly and doesn’t fit into traditional ‘situational analysis’, he says.

Blais, whose office at UO is a few short blocks from where the truckers set up camp on 29 January and shut down the centre of Canada’s capital city, pointed to the so-called ‘Freedom Convoy’ as an example of the police failing to take seriously information gleaned from social media and make it part of integrated threat analyses.

To be complete, police planners have to know, Blais says, “who’s bullying whom on social media” to see “where are the hotspots, who’s calling for violence, who’s calling for social unrest” across multiple platforms and being heard by an audience numbering into the hundreds of thousands or even millions.

As both Blais and Leuprecht emphasised, the thoughts of the leaders of the ‘Freedom Convoy’ were freely available on the internet or, in the parlance of the field, their plans were “open-sourced”.

“Everybody knew, my kids knew, that the trucks were coming to Ottawa. It took them a week to get here and they posted their plans,” Blais says, alluding to the fact the convoy leaders like Pat King posted videos – and gave interviews – in which they said the truckers were prepared to shut down Ottawa until the government reversed the mandate that truckers coming into Canada had to be fully vaccinated against COVID-19.

The so-called ‘memorandum of understanding’ the truckers released went further. It called for the dissolution of the government led by Prime Minister Justin Trudeau (which was re-elected last October) and its replacement by a committee that included members of the truckers’ leadership. Other postings threatened Trudeau directly. King, for example, posted a video in which he openly said, “Trudeau, someone’s going to make you catch a bullet someday.”

For his part, Leuprecht says of the truckers: “They were 10 days coming right across the country. It’s not like they just showed up overnight. We had open-source intelligence … [that] made it clear from the beginning that they were here [in Ottawa] to stay. And from the beginning, some of them, not all of them, but some of them had some sort of seditious intent. ‘We’re here to bring down the government’ may not have been explicit, but they were here.”

And yet, as has been widely reported, the Ottawa Police Service (OPS) and the Ontario Provincial Police (OPP) ignored this digital information. Not only did they fail to stop the truckers from reaching the downtown core; they effectively invited them into it.

The OPS later explained that based on a protest last year, they expected the truckers to remain encamped at the foot of Parliament Hill for the weekend only. The occupation of Ottawa ended weeks later, on 20 February, after the police from the OPS, Royal Canadian Mounted Police, the OPP and more than 10 other jurisdictions moved in.

The dense lines in the topic network for the last week of the truckers’ protest, contained in the NCRI Insights Report of 1 March, indicate that the Russians and, likely, other state and non-state actors aligned with anti-Western governments saw the truckers’ protest, or, more precisely, Canada’s seeming inability to deal with it effectively, as an opportunity to underscore the weakness of democratic countries.

These posts are part of the narrative advanced by the Russians, which, according to Leuprecht, is that “basically democracies don’t work, that they’re completely dysfunctional”.

‘Reviewing’ false information

When he turns to discussing fake news, Blais speaks candidly: “Our role is not to counter every piece of bullshit that is out there. That would be impossible.”

Blais would like to see what he calls a “bottom-up” effort to testify to the veracity of an image or posting. Modelled on familiar online review systems, it would see platforms like Twitter, Facebook and YouTube include a ‘true’ or ‘false’ button that viewers could use to indicate whether a posting is factual. Ideally, he says, you would be able to attach a video or another posting that counters what an individual judges to be factually incorrect.

Even this real-time system would have struggled to cope with the story of there being secret US biolabs in Ukraine that, as Ottawa investigative journalist Justin Ling showed in a series of tweets, shot around the internet on 3 March.

Having been seeded by the Russians a few weeks earlier, on 3 March it became the subject of at least one QAnon video post as well as being featured by pro-Kremlin influencers on a number of platforms before Russian Foreign Minister Sergei Lavrov mentioned it in a press conference, also on 3 March, as being close to a casus belli.

The video, which was posted to TikTok, purported to identify the location of these secret sites. A Twitter account belonging to someone calling himself Dimitri Alperovitch (but who is not the chairman of the Washington DC-based think tank Silverado Policy Accelerator) is, according to Ling, known to be linked to QAnon. This account tweeted two map overlays.

The first purported to show the location of the secret laboratories that had received US funding. The second was a map of Russian airstrikes. Ling deadpans the supposed coincidence, noting that were they to exist, such labs would almost certainly be in cities, and cities are Russia’s primary targets.

QAnon’s reasoning, he shows, runs backwards: it assumes the labs exist and then treats Russia’s bombing and shelling of cities as proof that they do.

On 3 March, the pro-Kremlin disinformation channel StalkerZone, which is linked to the separatist leaders in Donetsk (which Russia formally recognised as independent three days before invading Ukraine) posted four stories.

Among the titles are: “What are Secret US Biolaboratories Doing in Ukraine?”, “The US Has Opened Dozens of Genetics Laboratories on the Russian Border: What are they Hiding?” and, echoing Putin’s claims about Nazis in Zelenskyy’s government, “The Nazi Origins of US Biolaboratories”.

At 12.29pm Ottawa time (EST, the same as New York), Ling tweeted: “This stuff is just popping up everywhere”. Two and a half hours later, Ling tweeted that Vasily Prozorov, who purports to have been a contractor with the Ukrainian security services before defecting to Russia in 2019, claimed (earlier in the day) that the Russians had liberated the US biolab in Kherson (a city that fell to the Russians that day).

About the same time that Prozorov is said to have tweeted, Sergey Sudakov, a political scientist and regular on the state-run network Sputnik, tweeted that the US was working on a “deadly virus” in a biolab in Kharkiv. The post had already been viewed 160,000 times by the time Ling tweeted about it.

In a press conference in Moscow, late on 3 March, Russian Foreign Minister Lavrov repeated the fake news. “We have data [to show] that the Pentagon is preoccupied about the chemical and biological installations in Ukraine because the Pentagon built two biological war labs and they have been developing pathogens there in Kievan [Kyiv] and in Odessa. And now they are concerned that they may lose control over these labs.”

The importance of knowledge and context

Towards the end of our discussion, I ask Finkelstein and Goldenberg what professors who are not at universities with institutes devoted to fighting fake news can do.

“Because the only thing that’s capable of determining what is information is knowledge, we need knowledge about how to sort out what that [a meme or a piece of ‘news’ on the web] is in context,” says Finkelstein.

To understand information as it is packaged on social media platforms requires digital literacy and the historical context to identify misinformation. “We’re creating pathways for students to be exposed to a multidisciplinary mode of critical thinking that is designed to get outside the boxes, both in terms of being analysts and in terms of being outside disciplinary silos.”

Perhaps because he was speaking from Vilnius, Lithuania, a scant 35km from the border of Belarus, Russia’s only ally in Europe, Andriukaitis struck an almost fatalistic tone when I asked what he wanted to say directly to University World News readers, for all his faith in DFRL’s analytical techniques to identify disinformation, distinguish true from fake pictures and understand the purpose of a disinformation campaign.

“Disinformation wars are happening at the moment. And I feel like they might be the prelude to bigger things.”

After referring to the war in Ukraine and to the Chinese, who are now following the Russian digital disinformation playbook, he says: “Information wars might be one step from actual wars, but we should not underestimate how serious disinformation wars are and how much damage they can cause.”