For years now, many people I know have been sharing online content with me on all kinds of subjects. COVID certainly got this ball rolling. Much of the content shared simply wasn’t true, or at least wasn’t supported by any known facts. Some, I’m sure, was created for nefarious reasons, though it was often hard to know the real motivation behind an “influencer” jumping onto and pushing a narrative, beyond the obvious money to be made from emotive clickbait. Spending time trying to find out whether any of it held substance became tiring and disheartening. Often this content wove falsities through a base of truth, or eventually buried the truth entirely.
Lately though, this sharing has increased. Along with my pulse rate.
Kremlin-influenced content around Ukraine is obviously especially troublesome to me, and it’s rarely free of falsities. This is nothing new: Russia has provided a masterclass in how to use the internet to its advantage. They’ve spread doubt about what “actually” happens in Ukraine; around Crimea and Donbas this was rife. But it doesn’t stop at Ukraine. The Kremlin’s tentacles reach far deeper and wider globally than most realise. Even to here in Australia. They’re all about fostering division and eroding the pillars of our societies to weaken us.
The Kremlin knows how to spread rage and hate—though willing participants can be found in most countries. They often feed into areas of genuine concern for people, exaggerating, skewing or outright falsifying facts around these. Growing what may have been a fair concern into fear, despair, hate, anger. Leading like-minded people into online echo chambers and groups which further amplify these emotions, even weaponising them. Sadly, it seems the Kremlin barely has to start the ball rolling—we humans do the rest now.
The anger at rallies. The hate being shown towards fellow humans. The mistrust toward governments and organisations. The changes within politics itself. Not hard to see. But much is harder to see.
Then there were the comment sections on news posts. Stories of Ukrainian suffering—hospitals bombed, children killed, entire cities under attack—were met with waves of fellow Australians posting callous dismissals, victim-blaming, or outright celebration of Russian victories. The sheer volume and viciousness looked coordinated, even though I thought it probably wasn’t.
And now the ‘random’ posts popping up in my algorithms. Posts from the Russian embassy. Posts from Australian political groups and individuals saying ‘Putin isn’t really that bad.’ Pictures of Trump and Putin with a shining cross between them. But what was behind this? How can people look past the thousands of proven and documented horrendous war crimes and crimes against humanity that Putin has allowed to happen and somehow excuse this invasion?
I told myself I was being paranoid. That my immersion in Ukraine’s reality—the medical missions, the documentation of Russia’s systematic destruction—was making me see patterns that weren’t there.
But the talking points were too similar. Too perfectly aligned with what I knew Russian state media was pushing globally. The same deflections about Ukraine. The same divisive topics being rerun. The same erosion of trust in institutions, elections, media, expertise itself.
So I did what any good researcher does when they suspect confirmation bias: I asked for a second opinion. I asked ChatGPT to conduct a deep analysis of Russian influence operations in Australia and whether the patterns I was seeing were real or imagined. I made sure to ask for references too, to check it wasn’t made-up rubbish.
What came back confirmed that what I’d been noticing wasn’t coincidence. It was strategic. It was documented. And it was working.
What the Analysis Shows
The full analysis below examines how Russian state-backed actors have targeted Australia through state-sponsored troll networks that ASIO is actively investigating, the Pravda Australia propaganda operation designed to “poison” AI chatbots with Russian narratives, and strategic amplification of Australian political figures whose messaging aligns with Kremlin interests.
What struck me most was reading specific Australian politicians quoted, their statements placed alongside the Russian narratives they echo. Anti-Ukraine rhetoric lifted from RT. Climate denial serving petrostate interests. Sovereignty arguments undermining international cooperation.
This isn’t about claiming anyone is a conscious Russian agent. It’s about recognising how influence operations amplify existing divisions, finding fracture lines in our society and applying pressure.
I want you to read the full analysis—to sit with the documented evidence from intelligence agencies, parliamentary inquiries, and investigative journalism. To understand this isn’t speculation. It’s operational reality.
When evil presents itself as light…
What concerns me most is what I’m seeing in Christian communities. Just as in America, Christians—people whose faith calls them to truth and justice—are being specifically targeted by disinformation wrapped in the language of faith and freedom. The communities that should be most resistant to authoritarian propaganda are often most vulnerable, because the packaging speaks their language.
Differences and divisions have always existed. Climate, health, politics, finance, immigration, freedoms, beliefs, values. But the Kremlin is too often behind the deepening of these divisions, weaponising the emotions around them. Please be aware of this. It’s more widespread than most realise and is certainly weakening countries from within. The USA is a glaring present example. The Kremlin has been hard at work there for a long time and is now reaping dividends.
The goal isn’t to make Australians love Russia. It’s to make us suspicious of each other, doubtful of our institutions, cynical about truth itself. To fragment the social cohesion that makes democracy work. To ensure that when autocrats commit atrocities abroad, Western publics are too divided to respond.
The best defence is knowledge. It’s asking where information comes from. It’s noticing when narratives align too perfectly with authoritarian interests. It’s maintaining complexity—recognising you can have concerns about policy without swallowing disinformation, that you can question government without believing elections are rigged, that you can have faith without falling for content that weaponises it.
Most importantly, it’s remembering that Ukrainians are dying right now to defend principles we claim to value. When we let Russian propaganda convince us that supporting Ukraine is against Australian interests, we’re not just failing them. We’re failing ourselves.
Read the analysis. Sit with the evidence. Make up your own minds.
But please, make them up with your eyes open.
