
“The brain is the next battle space.” — James Giordano, neuroethicist at Georgetown University Medical Center

“It takes less time to make up facts than it does to verify them.” — Christopher Paul

“The human mind, scientists contend, is built for belief.” — Matthew Hutson, Psychology Today

“It’s increasingly difficult for people to navigate what’s true and what’s false in their own memories.” — Robert Nash

“There is irony in the fact that Putin’s invasion aimed at the ‘denazification’ of Ukraine is repelled by a Russian-speaking president of Jewish origin.” — Tatiana Zhurzhenko

Have you ever remembered something and then wondered whether it was a genuinely real memory or something you merely imagined? The human mind is a powerful thing. It can supply some of the most detailed memories of our past and repress other memories that bring us stress and emotional pain. What if I told you that media outlets, governments, and commercial enterprises have developed, and are using, techniques that can not only stimulate our remembrances but also manufacture them through the use of words and images? Once these memories have been manufactured, they can be used to manipulate us into believing something that has no basis in reality. Ultimately, this is the primary aim: to manufacture within people’s memories a false reality that favors a specific agenda. I want to use a discussion of these techniques as a foundation for understanding how governments, such as Russia in its present invasion of Ukraine, can attempt to justify their unconscionable actions to their own people and to other nations around the world.

The question is, “Will Russia’s post-Soviet-era propaganda machine be able to weaponize the memories of its own people and those of its former satellite allies?” It has been interesting to note that, even amidst the heavy bombardment of lies and manipulation, the actions of the Russians still seem to evoke only anger and contempt.

The contemporary Russian model for propaganda has been characterized as “the firehose of falsehood” because of two distinctive features: high numbers of channels and messages, and a shameless willingness to disseminate partial truths or outright fictions. In the words of one observer, “[N]ew Russian propaganda entertains, confuses, and overwhelms the audience.”

Russia is by no means alone in its attempt to manufacture thoughts and beliefs. One of the realities each of us must deal with is our own susceptibility to the machinations of information outlets and the way they shape our understanding of the world, especially of current events. Our minds are under attack; human memory has been weaponized by technology, distancing us from fact and fomenting disagreement. If the devolution of discourse needs a timeline, look to the words of the year named by leading dictionaries: post-truth in 2016 (Oxford), fake news in 2017 (Collins English), and misinformation in 2018 (Dictionary.com).

One thing is certain: there is far more than meets the eye when it comes to understanding the shifting landscape of geopolitical conflict and relationships. The world has quietly transitioned from a two-dimensional world, where nations rise up against one another over economics, land, natural resources, and religious ideologies, to a world where the grievances of the distant past are now as much a reason to enter into armed conflict as any present reality.


The current reality is that neuroscience research is a heavily invested-in field of study. The ability to manipulate the human mind is the holy grail of neuroscience exploration. Though most of the research is originally designed to assist in the rehabilitation of those born with neurological conditions or those who suffer neurological damage through accident or injury, it yields too many military and geopolitical applications to ignore. Frankly, there is a lot of money in the weapons R&D community. Few of us have the educational background to really understand the research, so, rather than bore you with the details, we will broad-stroke the science and look primarily at the applications. If you are interested in learning more, I have included a few articles below that give a good background on the science.

Interestingly, a massive amount of funding has gone into the ability to alter cognition through research into what psychologists call source-monitoring errors.

When a scene or fact comes to mind, the brain tries to identify its source: Was it stored in memory, or are you simply imagining it? We often use unconscious guidelines, or heuristics, to determine a source. If a scene is pictured in rich detail, you are inclined to assume you actually experienced it and to nestle it in your own personal timeline. More systematic conscious processing might also be used: if you know you were somewhere else when the event happened, you reason that you can’t have lived it.

Things seem truer upon repetition, a phenomenon known as the illusory truth effect. Psychologists offer two main explanations. First, if you hear something a lot, especially from many people, you reason that it’s probably true. Nine out of 10 dentists agree. “With the internet, it no longer matters how bizarre your belief is,” says Stephan Lewandowsky, a cognitive scientist at the University of Bristol in England. “If you jump online, you can find a community of like-minded people. Flat Earthers are a prime example of that.” The first step toward remembering something is believing it.

We’re also more likely to say an event really happened when told of it by a trusted source—a parent or a close friend or mentor. And we are prone to accept events that align with our ideologies. Using images of fabricated public events, Elizabeth Loftus from the University of California found that liberals were more likely than conservatives to remember Bush’s (fake) vacation during Katrina, and conservatives were more likely to recall Obama’s (fake) handshake with Ahmadinejad.

Here is an example of the way our memories and realities are being manipulated. “The landscape has shifted in the last couple of years,” says Hany Farid, a professor of computer science at Dartmouth College and a veteran of digital-image forensics. “You have the technology to create sophisticated and compelling fakes. You have a delivery mechanism with unprecedented speed and reach. You have a polarized public willing to believe the worst about adversaries. And then you have bad actors, whether state-sponsored agents disrupting a global election, people trying to incite violence, or those looking to monetize fake news for personal gain. This is the perfect storm of an information war.” It is a war our memories are not adapted to win.

The origins of the current weaponization of historical narratives by the Russians can be traced back to Ukraine’s Orange Revolution. In the wake of the 2004 presidential elections, spin doctors working for the Ukrainian Party of Regions (a pro-Russian political party funded by the Kremlin) denounced their opponent Viktor Yushchenko as a radical nationalist and created a semantic link that equated Ukrainian nationalism with fascism and, by extension, with Nazism. This association, which drew on an old Soviet ideological cliché, was deployed in political propaganda and picked up by Moscow in support of Viktor Yanukovych’s political efforts against Yushchenko. When Russia supported the anti-Yushchenko campaign in Ukraine, it built on its experience of memory wars with the Baltic countries.

Ukraine’s divided political elites have used conflicting memories and antagonistic historical symbols as a tool for mass mobilization. Russia has profited from these domestic Ukrainian “memory wars” in its efforts to weaken Ukraine and prevent its re-orientation toward the West. In spring 2014 antagonistic historical narratives were weaponized by the Kremlin and its allies in Ukraine to fuel unrest, undermine public institutions, and delegitimize the Ukrainian state. 

These same fake-narrative machinations are being used again today in Ukraine by the Russians to justify their actions. Their narrative is this: Ukraine has been infiltrated by anti-Russian Nazis who are tormenting the Russian-speaking populations of Ukraine and must be dealt with in order to protect those people. Putin even went so far as to say that it is the Russian government’s responsibility to protect Russian speakers anywhere in the world they may be found. Think about that for a second. The Russian American population is estimated at 3.13 million. Does Putin see it as his responsibility to protect Russian speakers in the US, even to the point of invading the US to secure their safety?

It seems internal conflicts over memory and history are deemed irrelevant in the face of mortal danger. The ongoing war has already produced a huge collective trauma that Ukrainian society will need decades to work through. And a new national narrative is already in the making, one that places this war in a decolonization perspective and challenges the established cultural hierarchies in Ukrainian-Russian relations. The long-lasting ramifications of this war, and its relentless attempts to redefine the memories of millions of Ukrainians, Russians, and other Eastern Europeans, will remain with us for quite some time to come.


When you don’t just know something but can replay it in your mind’s eye, will you listen to someone telling you, basically, that you’re not just wrong but crazy? This is the reality for billions living in the world today. Information is not the only thing being weaponized. The weaponization of our own memories can be even more destructive than the propagandizing of false information, though the two are closely related.

Research suggests that experiences with a high degree of emotional impact have a higher chance of being encoded into our long-term memories. Memories that drastically exceed the threshold of what the brain registers as a normal event often stay with us our entire lifetime. These mechanisms present a powerful gateway for emotional manipulation and have been exploited by governments and media since the early days of propaganda and advertising.

With the emergence of the internet, an explosion of cloud-connected devices, and recent developments in data-mining technologies, the ability of governments and corporations to gather, analyze, and use personal information has been weaponized at an unprecedented scale. Using artificial intelligence (AI), they slice through large datasets of timestamped, geo-located, labeled information—defamiliarizing it, remixing it, repackaging it, and projecting it back on us in the form of targeted personalized ads and snippets of information. With this immense and unregulated power they have taken control of our individual shopping habits, news feeds, and entertainment choices to shape the beliefs, memories, and emotional states that direct our political, social, and emotional lives both online and off.

We are seeing this happen politically, not just in Ukraine or here in the US but globally. There is a polarization within what used to be semi-tolerant societies. The manipulation of information and of our memories is no trivial matter. It is breathing life into anger and division among people who need to be able to work together. The unreliability of information, or its outright absence, has impaired our ability to reflect, categorize, forget, and ultimately think freely. We have been molded and gradually worn down, and an alarming number of individuals today show symptoms resembling those of trauma victims.

It has caused me to take a much harder look at what I believe in my daily studies. I am spending as much time trying to understand the historiographical leanings of a writer as I do processing the actual things they are writing. We all come with an agenda, myself included. The question I am asking myself is, “How much am I willing to simply believe without taking a deeper dive into the research, facts, and conclusions of those doing the investigation?” What are their presuppositional dispositions, and what are they trying to get me to believe? On the other hand, I have found that just as much of the blame lies with us. I have been guilty of intellectual laziness myself. It is easier to read a short social media post and believe it than to pick up a thoroughly investigated book and see how the author arrived at their conclusions. I have decided that I am not going to do that anymore. I am going to set aside my TV and my social media and take more time seeking to really understand not only the information landscape but also myself. I am beginning to take a harder look at my own presuppositions and to ask how they are shaping my understanding and, even more, my memory. Am I remembering things that simply never happened? I think we would all be surprised to see how many things didn’t happen the way we recollect them, or worse, didn’t happen at all. Do we have the intellectual integrity to ask ourselves these questions? Or will we just stay on the cruise ship of our imaginations, sunning on the main deck, waiting for our next buffet adventure?


How can we contain the viral spread of misinformation and inoculate our memory against it? It’s possible to marshal a number of defenses, although they are only partially effective against an assault that is rapidly advancing in sophistication.

Information. Retractions don’t always erase bad information from memory, but Stephan Lewandowsky has found that detailed retractions work better than simple ones, and repeating retractions enhances their effectiveness. Lewandowsky also recommends providing new facts to supplant wrong ones. Explaining where false facts came from heightens suspicions of bad sources. “We’ve evolved to believe things,” he says. “We’re not good at letting go of a belief unless it’s replaced with an alternative that explains the world equally well.”

Regulation. Banning misinformation is generally difficult because of First Amendment rights. However, scholars looking at “deep fakes” argue that some speech is not protected: speech that is fraudulent, defames private citizens, incites violence, or impersonates government officials. Individual creators and sharers of fake news are often hard to track down, and social media platforms can’t easily be sued as publishers. Still, Facebook, Twitter, and Google are not immune to new regulation that assigns them more responsibility for their effects.

Filtering. The same types of AI technology used to mimic and manipulate humans online are also being used to filter out fake and bad-faith content. But fact-checking and the refinement of newsfeed algorithms require news analysis, moral judgment, and common sense, none of which can yet be automated. Hany Farid insists, “This is a very human problem; it’s going to require human intervention.”

Tilting toward disbelief. “The only real weapon is cynicism,” Robert Nash proffers—while immediately recognizing its unworkability. Even if it were possible to pull off, questioning everything would come at the cost of everyday functioning. “Even as an expert in memory I don’t go around distrusting my memory.” But a little questioning goes a long way.

The follow-up.

Moscow Is Using Memory Diplomacy to Export Its Narrative to the World…

Finland, Sweden Move Closer To Joining NATO Amid Russian Aggression…

The feed-back.

For your comments or questions about any of our digests please feel free to write to me at:


© 2019 • More Than Meets