Why the Kremlin Loves Social Media

Russian troll farms and social media bots are now old school. The Kremlin’s favorite way to sway U.S. elections in 2024, we learned this week, makes use of what many Americans consider a harmless pastime — content created by social media influencers.

A DOJ indictment on Wednesday alleged that content created and distributed by a conservative social media company called Tenet Media was actually funded by Russia. Two Russian government employees funneled nearly $10 million to Tenet Media, which hired high-profile conservative influencers such as Tim Pool, Benny Johnson and Dave Rubin to produce videos and other content that stoked political divisions. The indictment alleges that the influencers — who say they were unaware of Tenet’s ties to Russia — were paid upward of $400,000 a month.

It’s the latest sign that Russia’s online influence efforts are evolving, said Pekka Kallioniemi, a Finnish disinformation scholar and author of “Vatnik Soup,” a book on Russia’s information wars set to be published Sept. 20. Influencers with a fanatical following are far more successful at spreading disinformation than bots and trolls, he told POLITICO Magazine in an interview.

“These people, they are also idolized. They have huge fan bases,” he said. “They are being listened to and they are believed. So they are also a very good hub for spreading any narratives in this case that would be pro-Kremlin narratives.”

This conversation has been edited for length and clarity.

Why are far-right social media influencers ripe targets for Russia? How has the Kremlin been able to infiltrate far-right media so effectively?

The main reason is that they share a similar ideology. This kind of traditionalism and conservatism is something that Russia would also like to promote: They show Putin as the embodiment of traditionalism and family values. And this is very similar, of course, in U.S. politics. Anti-woke ideology is also behind this.

There are also these kinds of narratives promoted by people on the left. It is an extremely cynical system where the whole idea is to polarize the U.S. population by taking extreme ideologies and extreme ideas and pushing them to a U.S. audience.

So it isn’t just a right-wing thing, it happens on both sides? 

Yes, and I would emphasize that it is far-left and far-right. It is the far ends of the political spectrum that are both targeted. The narratives [on the left] are the same as the ones promoted by right-wing influencers.

How have Russia’s influencing tactics been changing? Is there a reason behind that evolution? 

If you go way back to the launch of Russia’s Internet Research Agency in 2013, they started mass-producing online propaganda using these so-called troll farms. Later on, they also started using automated bots. But in addition, the Russians seem to be using these big, big social media accounts that are called “superspreader” accounts. They are being utilized to spread the narrative far and wide. This term came from Covid-19 research: one study found that just 12 accounts were responsible for two-thirds of Covid vaccine disinformation, and Robert F. Kennedy Jr.’s account was actually one of them. Similar studies in the geopolitical sphere have found that a lot of this disinformation is spread through superspreader accounts. Russia has probably realized this, and this incident is a good indicator that such accounts are being utilized by the Kremlin.

What about the superspreader accounts does the Kremlin find useful? 

Because their reach is so big. They have usually organically grown to be popular. Whereas with troll and bot accounts, the following is not organic. They usually have a smaller following, and it’s very hard to spread these narratives outside the network. So if you have a main hub — a superspreader account with 2 million followers — it is much easier to spread a narrative because these accounts already have a huge reach and a big audience and sometimes their content even goes into the mainstream media or traditional media.

These people, they are also idolized. They have huge fan bases. Huge superspreader social media personalities — they are being listened to and they are believed. So they are also a very good hub for spreading any narratives that would be pro-Kremlin narratives.

Would you say that the rise of social media has helped Russia’s disinformation campaign? 

Of course. Before social media, they had a lot of difficulty penetrating Western media. It happened, but not often. So social media has been a useful tool for Russia to spread its propaganda. They were the first ones to actually utilize social media for these kinds of mass disinformation campaigns and information operations, and they had a really good head start in that sense. It took Western media and intelligence services years to figure out the whole thing.

The Internet Research Agency was established in 2013. First, they started in a more domestic environment, defaming the opposition, Alexei Navalny and so on, and of course Ukraine. But after that, when there was no more opposition left in Russia, they moved on to U.S. audiences and the 2016 U.S. elections.

It is also worth mentioning that they are probably using AI now, and will be in the future, because it automates things. It's so much cheaper and also more effective. You can create huge volume by using AI. So for example, what Russian operatives have done is create fake news sites or blogs whose content is completely generated by AI, but sometimes they inject Russian narratives or propaganda manually. There are hundreds of these blogs. Also, of course, they use the traditional system of bots and trolls to then make these stories seem much bigger. It's kind of a multilevel system, and sometimes one of the superspreader accounts picks up a story, and then it really goes viral. It's a very sophisticated system that is still not very well understood.

Are you surprised at all by this DOJ indictment that involves two Russian media executives pushing pro-Kremlin propaganda in the U.S.?

I was not surprised. For a long time, people have thought, “There is no smoking gun, there is no direct evidence of any kind of foreign influencing.” But now this is it — and I think that this is just the tip of the iceberg. There's so much more happening, especially through these shell companies located in the United Arab Emirates or the Czech Republic, or wherever, because Russia is very good at masking money flows.

What is the ultimate goal of Russia’s disinformation campaign? Electing Donald Trump? Or is there a broader objective? 

They want to polarize and divide countries, especially the U.S., which has a two-party system. Whenever a country is focused on domestic disputes and arguments, its foreign policy becomes much weaker. We saw that with the Ukraine aid that was delayed for months and months, and that's basically their goal: to create these internal conflicts so that the foreign policy of various countries becomes weaker and more indecisive.

So they want division and also for people to stop paying attention to what Russia does? 

Yes. But the famous thing about Russian disinformation is that it rarely even mentions Russia. So it's usually talking about other issues, for example, the southern border of the U.S. or woke culture or losing traditional values. I think the main narrative that is pushed is that the U.S. shouldn't send any more money to Ukraine, because there are so many domestic problems that should be fixed instead.

And the reason is that when you start investigating Russian culture in general, you realize that it's not really that traditional or conservative at all. You see that they have very big problems, and they are actually quite secular. The image that Russia tries to create of itself is not the same as reality. So they just decide, OK, let's not talk about Russia at all. Let's talk about other countries and their problems. It's very different from China. China likes talking about China and how great it is. So Russia is the complete opposite in that sense.

Some people refer to Americans sympathetic to Kremlin arguments as “useful idiots.” Is that a fair characterization of this situation? Has there been a change in the type of “useful idiots” Russia is seeking out?

I'm pretty sure that the owners of Tenet Media, Lauren Chen and Liam Donovan, knew what they were getting into. There were a lot of signs that they knew the money was coming from Russia. As for the influencers? I'm not sure. I think almost all of them have stated that they didn't know. But it raises questions if somebody is willing to pay you $400,000 for four videos a month. There has to be due diligence. You have to think: Where is this money coming from? Why is somebody willing to pay so much for producing YouTube videos that get maybe 100,000 or 200,000 views, which isn't that much? Maybe they didn't know, but they certainly didn't do their due diligence. They didn't do proper background checks of where the money was coming from, because that was a lot of money.

When it comes to seeking useful idiots, I think it's pretty much the same as before. There is a counterintelligence acronym called MICE. Basically, it lists what motivates somebody to do espionage: money, ideology, compromise or ego. This is a very simplified model, but I think it fits quite well in this propaganda domain. So there's usually something that motivates these people. And I think “useful idiot” as a term is not very good, because a lot of these people, they are not idiots. They might be greedy. People have different motivations to do things. But I think the basic idea behind the so-called useful idiot is still the same. It is somebody who's willing to work for a foreign nation, usually in order to undermine their own country.

So who do they seek out to spread propaganda? What kind of person are they looking for? 

I think a lot of these people who are doing it very well are usually charismatic and in some ways controversial. They know how to create controversy around topics and on social media. Creating controversy usually also brings engagement — people like your content, share your content, comment on your content. So charismatic people are probably the most valuable assets right now.

Do you think people have a growing understanding of Russia’s disinformation campaign? And to what degree do they care? 

I think a lot of people simply don't care. Most people care about inflation, food prices, energy prices, the kind of stuff that actually affects their day-to-day life. If somebody is being paid to promote Russian narratives, I don't think a lot of people care about that, because it doesn't really affect their life that much. But the interesting thing is that Russian narratives usually revolve around these day-to-day topics. In the indictment, the narratives being pushed were about food prices and everything becoming too expensive and so on. So Russia also promotes this day-to-day stuff in their disinformation. But yeah, I don't think people care as much as they maybe should.

Ahead of the election, how can we be vigilant against Russia’s disinformation campaigns?  

Well, I've always said that the best antidote to this is education, but I think it's too late when it comes to the November elections. Finland is a great example: We have a very good education system that promotes media literacy, critical thinking and cognitive resilience against propaganda and disinformation. I think this would be the best solution.

In general, people should be more critical of what they read, especially on social media, and realize that there are people who are willing to spread lies and fake news just for engagement. Always remember that people might be paid to spread these stories, as we just witnessed with Tenet Media. So critical thinking as a general rule is a good way to stay vigilant.

But also, I always say that people should just close their computers and smartphones and go out and just live their lives and enjoy it. The digital world can be pretty hostile, and it can bring out these negative emotions. Maybe take a break and go for a hike. Just enjoy life.
