CrossOver podcast – episode 2 “Dangerous Liaisons”
How do the algorithms that recommend posts on social media work? Do they reinforce foreign propaganda from China, among others? And how can you protect yourself against this online?
In this podcast, we look at how social media algorithms favor certain Russian and Chinese state media. For example, our research shows that reporting on the war in Ukraine on YouTube in Belgium is dominated by Chinese and Turkish state media. We try to understand how these algorithms work and how they amplify the influence of foreign propaganda on the Belgian and European public.
With the cooperation of:
Jan Walraven (interviewer, Apache)
Bram Souffreau (Apache, CrossOver Project)
Hind Fraihi (Apache, CrossOver Project)
Zara Mommerency (Mediawise)
Marieke Rimaux (voice over)
Each podcast of the CrossOver toolkit is accompanied by an interactive quiz. Did you listen carefully? Come and test your knowledge!
Voice over: You’re listening to the CROSSOVER podcast. Crossover is a European project that explores the amazing behind-the-scenes of algorithms that make the world go round.
But what exactly are algorithms? An algorithm is a precise and unambiguous sequence of instructions and operations, written by developers, that solves a problem. You can kind of compare it to a list of ingredients and instructions for the recipe of a complicated dish.
In this podcast, we will talk primarily about social media algorithms, which automatically determine which posts appear on timelines, based on personal preferences and popular trends, among other things.
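To make the idea above concrete, here is a minimal, entirely hypothetical sketch of how such a recommendation algorithm might score posts: a weighted mix of personal affinity and overall popularity. The field names and weights are invented for illustration; no platform's actual formula is public.

```python
# Toy feed ranking: combine how well a post matches the user's interests
# (affinity) with how much engagement it has (popularity). Illustrative only.

def rank_feed(posts, user_interests, affinity_weight=0.7, popularity_weight=0.3):
    """Return posts sorted by a toy relevance score, highest first."""
    # Popularity is normalized against the most-engaged post in the feed.
    max_engagement = max(p["engagement"] for p in posts) or 1

    def score(post):
        topics = post["topics"]
        # Affinity: fraction of the post's topics the user follows.
        affinity = len(topics & user_interests) / len(topics) if topics else 0.0
        popularity = post["engagement"] / max_engagement
        return affinity_weight * affinity + popularity_weight * popularity

    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": "a", "topics": {"ukraine", "politics"}, "engagement": 120},
    {"id": "b", "topics": {"cooking"}, "engagement": 900},
    {"id": "c", "topics": {"ukraine"}, "engagement": 40},
]
feed = rank_feed(posts, user_interests={"ukraine"})
print([p["id"] for p in feed])  # → ['c', 'a', 'b']
```

Note how the low-engagement but on-topic post "c" outranks the highly popular but off-topic post "b": personal preferences and popular trends are traded off against each other, exactly the two ingredients mentioned above.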
Those social media algorithms influence our relationships, our consumer behavior, our culture and, most importantly, the way we inform ourselves and perceive the world. For example, an algorithm on, let's say, YouTube ensures that if you type in a Dutch-language search term related to the war in Ukraine, you will mainly see news from the Netherlands.
But do we really know what they are, how they work and do we know the economic or ideological logic behind them? And how can we try to control them, rather than endure them? Understanding algorithms, being able to analyze them critically and learning how to protect yourself against them: these are the goals of so-called “algorithmic literacy,” and this is also the area that the Crossover project is exploring with a series of analyses and podcasts.
In this episode “Dangerous Liaisons” we will talk about algorithms, propaganda and resistance.
On March 1, 2022, at the beginning of Russia’s invasion of Ukraine, the European Union banned Russian media outlets Russia Today and Sputnik in order to cut off the Kremlin’s propaganda. A few months later, on YouTube, information about Ukraine in Belgium was dominated by media from China and Turkey, and the voice of the homegrown media was much harder to hear on the video streaming service. What is going on?
And how do algorithms relate to state propaganda on social media?
Through the CrossOver Dashboard, a tool that allows us to analyze algorithms, we will try to understand how algorithms work and how they can strengthen the grip of foreign state media on Belgian and European public opinion. In this podcast we will interview Bram Souffreau, co-founder of Apache, and Hind Fraihi, journalist at Apache, to help us do so.
As a final conclusion, our Media and Information Literacy expert, Zara Mommerency, will explain how we, as citizens, can all fight against online propaganda.
Jan Walraven: Hello, I’m Jan Walraven from Apache. What if recommendation algorithms favor certain state media? That is the question we will explore today with our guests. We have two partners from the European CrossOver project here with us in the studio: Hind Fraihi and Bram Souffreau. They both work at Apache, and they will help us get a clearer picture of the situation. Hello Bram, you are co-founder of Apache, a Belgian media outlet specializing in investigative journalism. You also work on the CrossOver project, which published a study in June 2022 titled “Are Youtube’s algorithms addicted to state-controlled media?”. Can you tell us a little more about it?
Bram Souffreau: My pleasure! And I want to point out that this research was conducted by our partners at Check First, experts in algorithms.
Jan Walraven: The study begins with the European Union banning Russian media outlets Russia Today and Sputnik from broadcasting in EU countries.
Bram Souffreau: For the European Union, it was about protecting itself from Russian propaganda, especially online propaganda, in the context of a war. Ursula von der Leyen, the president of the European Commission, was clear when she said that “the state-controlled media Russia Today, Sputnik and their subsidiaries could no longer broadcast their lies justifying Putin’s war.”
Jan Walraven: And did that ban work? Did it achieve its purpose?
Bram Souffreau: Yes and no. Technically, these Russian state-controlled media were quickly and effectively banned in Europe: their broadcast licenses suspended, their social media accounts and their websites blocked. Even though some people could still access their content, by and large it worked. Sputnik France, for example, quickly went out of business because they were no longer getting revenue from social media. The problem, however, is the media that took their place.
Jan Walraven: What state-controlled media or other media have taken their place?
Bram Souffreau: We have studied this with the dashboard developed within the CrossOver project. This tool analyzed the search results and videos recommended by YouTube’s algorithms when a Belgian user typed the keyword “Russia” into the search bar, and it did so before and after the banning of Russia Today’s French branch, RT France. In February 2022, at the beginning of the war in Ukraine, RT France appeared as the main source and their videos became the most recommended. When RT France was banned on YouTube, it was quickly replaced: first came the rise of France 24, a French-speaking channel, and then CGTN-Français (China Global Television Network), the international French-language branch of China’s state media, a tool to promote China abroad. This medium, which has already been banned in the United Kingdom since February 2021 because it is under the control of the Communist Party of China, is still legal in Europe and sits at the top of the YouTube recommendations for news on Ukraine. So what we actually see is that Russian state channels have disappeared from YouTube but have been replaced by Chinese propaganda from this channel, CGTN.
Jan Walraven: Do you think CGTN Français will become an important news medium for Belgians, considering it is at the top of YouTube search results?
Bram Souffreau: A major news outlet? Not really, I think. What we do find is that CGTN Français scores well when it comes to Ukraine. From a survey we also found that CGTN Français’s videos about the Uyghurs, the Chinese minority oppressed by the Communist Party, the Chinese state, also get a lot of views, and that they actually influence Belgian viewers as well. But when it comes to other topics that have nothing to do with the war in Ukraine, Russia or the Uyghurs, we do not notice that CGTN scores much better or is recommended more than other channels.
Jan Walraven: And to conclude, how did CGTN Français manage to get so high in that list of recommendations from YouTube?
Bram Souffreau: That is just very difficult to explain or comprehend. YouTube’s algorithm is, let’s say, a kind of black box; nobody really knows how it functions. So we don’t really know why certain search results score much better than others, or why certain videos, for example from CGTN, score better than similar videos from, say, Western channels. We have some clues, but of course nothing is 100% certain.
A first clue is that the algorithm apparently prefers content like that of CGTN’s French branch because there is a certain buzz around it, or because it is popular. The algorithm also pushes accounts that produce and upload videos very often, and it surfaces the content that internet users share and comment on the most. It also pays more attention to accounts that have more followers than others. But YouTube’s algorithm, which as we know is not transparent, leaves us in the dark about how and why certain videos are or are not recommended to users.
So we don’t know that; we can’t verify it.
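The clues mentioned here, upload frequency, engagement, and follower count, can be sketched as a toy visibility score. Everything below is an invented illustration of how such signals could combine; YouTube's real ranking remains a black box, as the interview stresses.

```python
import math

# Toy "visibility" score rewarding the three clues from the interview:
# frequent uploads, high engagement (shares + comments), many followers.
# Weights and formula are illustrative assumptions, not YouTube's algorithm.

def channel_boost(uploads_per_week, shares, comments, followers):
    """Toy score: frequent, engaging, well-followed channels rank higher."""
    engagement = shares + comments
    # Logarithms keep huge counts from dominating entirely.
    return (2.0 * uploads_per_week
            + 1.5 * math.log1p(engagement)
            + 1.0 * math.log1p(followers))

# A high-volume state channel versus a small local newsroom.
frequent_state_channel = channel_boost(
    uploads_per_week=40, shares=5000, comments=2000, followers=1_000_000)
small_local_channel = channel_boost(
    uploads_per_week=3, shares=80, comments=40, followers=20_000)
print(frequent_state_channel > small_local_channel)  # → True
```

Under these assumed weights, a channel that floods the platform with content and generates buzz easily outscores a smaller domestic outlet, which would be consistent with what the dashboard observed.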
Voice over: It is the end of April 2022. Two months have passed since the invasion of Ukraine began, and RT France and Sputnik were banned. The information war is entering a new phase as the state propaganda of China, Russia and, to a lesser extent, Turkey have a strong presence on YouTube in the coverage of Ukraine. At the expense of the national media. This is according to an investigation in Belgium, in which Hind Fraihi of Apache participated.
Jan Walraven: I have Hind Fraihi here with me in the studio; she is an investigative journalist at Apache and works on the CrossOver project. Hello Hind, welcome.
Hind Fraihi: Hello Jan.
JW: You already worked with Check First, a Finnish partner in CrossOver, and you use the CrossOver dashboard intensively. The situation regarding online disinformation and propaganda, certainly in the war between Ukraine and Russia, has not improved. What happened, what’s going on?
HF: First, I’d like to briefly say something about the dashboards, which were developed by our Finnish partner Check First, a data company. They built a piece of software, a dashboard, a monitoring system, let’s say, where we can track and monitor recommendation algorithms. By recommendation algorithms we mean what content is recommended to Belgian internet users on social platforms. We follow about six platforms, from Reddit to Twitter, Facebook and YouTube, but also Odysee, less known in Flanders but already widely known as a platform for far-right users.
JW: And do you find it disturbing that Belgian users on YouTube who are looking for information about the war in Ukraine mainly get those foreign media, foreign state media, on their plate?
HF: I can’t be against that in itself; what I can be against is how that news is used, and there you see some disinformation techniques applied here and there. In the case of Russian propaganda, they are actually out to destabilize Europe. They want chaos. And in the case of Chinese state broadcasters, you mainly see self-glorification and self-promotion, according to our research into the coverage of the Uyghurs.
JW: Does it also have to do with language? Are videos in Dutch less recommended because Dutch covers a smaller language area, so that foreign media producing videos in English or French are favored by YouTube?
HF: The small language area obviously plays a role; we are a language minority. But it is actually very interesting to see on our dashboards that there is a greater influence from Dutch media on Belgium than the other way around, from Belgium to the Netherlands. We saw this very clearly with the demonstration around the freedom convoy in mid-February, when there was a huge mobilization on social media to come and demonstrate and paralyze Brussels, preferably with a vehicle, against the corona measures and the vaccine policy. And there you saw that some of the content shared on our social media came from the Netherlands.
JW: And what can you tell us about the online disinformation that circulates on these platforms and how the algorithms of, for example, YouTube deal with it?
HF: I can tell you a lot about that, and at the same time not much. It’s like Bram just said: algorithms don’t always show their cards, or at least the Big Tech companies don’t always answer, or don’t know how to answer, our questions about how algorithms work, and especially about what is recommended to Belgian internet users. Now, CrossOver is a project we submitted in 2021, so even before Russia invaded Ukraine. At the end of February came the invasion, and then it became even more, let’s say, super exciting to follow up on those dashboards.
What have we been able to conclude so far? Well, if you enter search terms related to the war in Ukraine, for example Kiev, Donbass or Ukraine itself, then nine times out of ten you end up with foreign media. Before Russia Today was banned, Russia Today was especially recommended, but after the ban on RT we saw that the ground was taken by China Global Television Network, a Chinese state channel and multilingual media giant. It makes news content in Spanish, English and French, and of course that also helps to pull that algorithm ahead.
JW: We have also asked Google and YouTube how that algorithm is put together, but we have not yet received the secret recipe from them.
Voice over: On May 25, 2022, nearly 70% of the 50 most viewed videos on YouTube were related to the Russian-Ukrainian conflict, according to Check First’s research. These videos are also special because they have higher engagement than videos on other topics. Is there a way to capitalize on this participation factor to combat propaganda? Let’s ask our media literacy expert.
JW: Hi Zara Mommerency, welcome to our studio! You do all sorts of things around media literacy; can you briefly explain to us what exactly you do?
Zara Mommerency: Hi, I work at the knowledge center ‘Mediawijs’. What we actually do is get citizens, both young and old, to engage actively, critically and creatively with media, as in using and understanding media. For example, if you look at online information, that means: how does news reach us, how does social media work, how is disinformation actually disseminated? Those are some of the things we respond to.
JW: Apache did a study on the recommendation algorithms of YouTube and other social media, as we have already discussed. The research showed that on YouTube, many videos of Chinese state media, often propaganda videos, are recommended, so that Belgian users see those videos first when they search for information about Ukraine on YouTube. Can we do something about that? Should we just accept that, or…?
ZM: Well, I think we should certainly not be too afraid. On the one hand, for those algorithms there is regulation on the way that will force them to be more transparent, so that we will also know a little more about what information they push up. On the other hand, if we are aware that propaganda is out there and we learn to recognize it, we will also be able to better assess the risks involved. Knowing what propaganda is, is certainly essential in the fight against disinformation, and I think we can work on that with media literacy, making young people more aware of the techniques behind propaganda. And once you know those techniques exist, you will recognize them more quickly in the information you see, read and hear online, also in those videos on YouTube, for example.
JW: So an awareness of the fact that propaganda is present on social media, on the Internet, is a first important step.
ZM: First of all, know what propaganda is: the fact that there are people who want to influence public opinion in order to direct your thinking and your actions. We also see that typical techniques are very often used, such as playing hard on your emotions, strong ‘us vs. them’ thinking, or responding to people’s needs. These are things that, once you know about them, you will recognize. That’s an important step in arming yourself against them.
JW: What is the most important characteristic to recognize a propaganda video and then when you recognize it, what can you do? Can you do something about it?
ZM: I think what happens most is playing on your emotions, both positive and negative. So what we always say is: when you watch a video and notice that it really does something to your emotions, you have to stop, zoom out, just let it come to you, and think about why you feel so angry or sad about it, and what it does to you. The idea is that when something appeals to your emotions, you are so moved by it that you are much more likely to believe it and share it. And that’s the problem, because once you start sharing it, the algorithm thinks: that’s something a lot of people want to see, and it pushes that upwards. Then of course the propaganda spreads very quickly. So that idea of: once it does something to you, don’t share it right away, just leave it and watch it again later. That’s really a first step to arm yourself against it.
JW: Let the emotion calm down a little, broaden your horizons, maybe don’t just look on social media, don’t just look on YouTube. Are there other channels?
ZM: I think it’s always good to look at more than one source anyway, whether on social media or in traditional media: what is being said about a topic on the radio, on TV, in the newspaper; you look up another news site, and so on. But it’s also very important that you actively look up opposing voices yourself. For example, if you follow a certain political party, maybe also follow another party on the spectrum to see what they say about the same topic. In a way you are then also going to confuse your algorithm a little, because you don’t follow everything along the same line. You look for different opposing voices on different platforms, and that’s actually very rewarding to get as much information, or as many perspectives, on a topic as possible. What also works very well, if you want to bring in some more, or more varied, information, is to occasionally clear your search history, clear your cookies, search in an alternative browser, or even use an alternative search engine like DuckDuckGo.
If you then go on YouTube, you already get a whole different home page, because that is no longer an algorithm that has all your personal information ready. So I’m saying you can actually do a lot to keep propaganda from necessarily getting all the way to you.
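The effect of clearing your history can be modeled as a toy: a recommender that draws from your watch history when it has one, and falls back to generic popularity when it does not. The catalog, topics, and fallback rule below are invented purely for illustration.

```python
from collections import Counter

# Generic catalog popularity, used when there is no history (a fresh profile).
CATALOG = ["news", "news", "news", "music", "cooking", "gaming"]

def recommend(history, n=2):
    """Toy recommender: the user's most-watched topics, or, with no
    history at all, the overall most popular topics instead."""
    source = Counter(history) if history else Counter(CATALOG)
    return [topic for topic, _ in source.most_common(n)]

# A profile with watch history gets more of the same...
personalized = recommend(["gaming", "gaming", "cooking"])
# ...while a cleared profile (history and cookies wiped) gets the generic mix.
fresh = recommend([])
print(personalized)  # → ['gaming', 'cooking']
print(fresh)         # 'news' first: generic popularity, not your habits
```

The personalized profile keeps seeing what it already watched, while the cleared profile starts over from the catalog-wide defaults, which is exactly why a wiped history yields a different home page.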
JW: And not just to not let that propaganda get to you, but to actually, as you put it, confuse the algorithm?
ZM: Actually, one of the main things you can do is simply ignore a message, because if you don’t respond to it and don’t like it, it won’t be spread further by the algorithm. So that’s a very easy one: when in doubt, just don’t share it and ignore it.
JW: So to wrap it all up, I’ll recap some tips: keep a varied diet of different (news) sources, let the emotions calm down a bit before you share something, and be aware of your own click-and-read behavior, so disinformation has less of a chance.
Voice over: This podcast was produced by Savoir Devenir as part of the Crossover project, funded by the European Union. Responsible for the concept: Sophia Hamadi and Pascale Garreau. Responsible for the technical production: Marieke Rimaux and BNA-BBOT. Responsible for editing and writing: Savoir Devenir, assisted by Apache. Find the notes of this podcast, the Dashboard and all the episodes on the site Crossover.social