CrossOver podcast – episode 3 “How algorithms changed my job”

Based on the “making of” of an article she wrote for Apache, Hind Fraihi explains in concrete terms how algorithms have changed her work.

Subscribe to the CrossOver podcast.

In the article, she addresses the conspiracy theories that can be found en masse on the Odysee platform, often described as the “YouTube of the far right”.

Building on this conversation, our media literacy expert, Zara Mommerency, explains why it is essential to understand and take into account these new ways of producing information in order to improve our media consumption.

The podcast is a production by Savoir Devenir, released under a Creative Commons license.
Technical production: Marieke Rimaux and BNA-BBOT Bruxelles
Writing: Savoir Devenir and Apache

Voice over: [00:00:05] You’re listening to the podcast Crossover. Crossover is a European project that explores the surprising world behind algorithms through a dashboard. The dashboard is a tool used to monitor the algorithms of social media, search engines and certain platforms.

Algorithms are a finite and unambiguous set of instructions and operations written by developers to solve a problem, just as a recipe describes how to prepare a dish. They influence our relationships, our consumption behavior, our culture and, most importantly, the way we inform ourselves and perceive the world. But do we really know what they are and how they work? Or what economic and ideological logic lies behind them? And how can we try to control them rather than simply submit to them? Understanding algorithms, critically analyzing them and learning how to protect ourselves from them: these are the goals of media literacy. And that is precisely the area that the Crossover project explores in this podcast series. In this episode, titled “How algorithms changed my job”, we talk about algo-journalism: that is, how new algorithm-based investigative techniques are changing the journalist’s work. More specifically, how are these new techniques, increasingly used by news editors, changing the way information is produced, and how do they help combat disinformation? That is what we will explore in this podcast.
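To make the recipe analogy concrete, here is a minimal, purely illustrative sketch of the kind of finite, unambiguous steps an algorithm consists of; the scoring rule is hypothetical and is not code from the Crossover project:

```python
# A toy "algorithm as recipe": a short, unambiguous list of steps.
# Hypothetical illustration only; not code from the Crossover project.

def rank_posts(posts):
    """Rank posts by a simple engagement score, highest first."""
    def score(post):
        # Step 1: weigh shares more heavily than likes.
        return post["likes"] + 3 * post["shares"]
    # Step 2: sort all posts by that score, descending.
    return sorted(posts, key=score, reverse=True)

feed = [
    {"title": "Local news report", "likes": 120, "shares": 4},
    {"title": "Outrage-bait clip", "likes": 80, "shares": 60},
]
# The "recipe" surfaces the post that provokes the most sharing first.
print([p["title"] for p in rank_posts(feed)])
```

Even a rule this small embodies an editorial choice (shares count triple), which is exactly the kind of hidden logic the episode invites us to question.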

First, we let Hind Fraihi have her say. We will talk about how algorithms have changed her profession as a journalist, using the “making of” of an article she wrote for the news site Apache. Hind investigated how conspiracy theories circulate en masse on the Odysee platform, often described as a YouTube of the far right. Next, our expert in media and information literacy, Zara Mommerency, will explain why it is critical to be intentional about these new types of media channels and how we can follow the news in other ways to broaden our perspective.

Jan: [00:02:50] Hi Hind.

Hind: [00:02:51] Hello Jan.

Jan: [00:02:52] Hind, you are one of the journalists for Apache who has been researching algorithms, especially on social media, since early this year, early 2022. Can you tell us a little bit more about this new form of journalistic research on a phenomenon that is fairly new?

Hind: [00:03:14] Yes, I’m going to give it a shot. The term algo-journalism builds on the better-known term “data journalism”. By that we actually mean an approach that goes back to the late 1950s, when statistical data was first used to create precision journalism. Today the term encompasses all journalistic practices that use data processing: sorting information, reviewing it, presenting it visually or even using robots to write articles. And what is happening now, at an early stage, is that we have bigger and bigger and more precise databases. We call that Big Data: collecting and analyzing a large amount of information in a short period of time. And we almost systematically use algorithms with artificial intelligence to process all that. That pretty much reflects my research within the Crossover project.

Jan: [00:04:39] Yes. You guys within Crossover not only research algorithms, you also use algorithms to conduct that research. How exactly does that work?

Hind: [00:04:51] First of all, I do my research on platforms that use algorithms: social platforms from Twitter to Google News to Facebook and Mastodon. I then make a selection from the information these algorithms provide to me. And secondly, I use Crossover’s dashboard, which allows me to study topics that come up on social media, topics that are actually trending and recommended by certain algorithms.
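As a rough idea of what such monitoring involves, here is a minimal sketch that assumes posts have already been collected from the platforms; the function and the crude word filter are hypothetical, and the real Crossover dashboard is a ready-made tool whose internals are not described in the episode:

```python
# Minimal sketch of trend monitoring over collected posts.
# Hypothetical; not the Crossover dashboard's actual implementation.
from collections import Counter

def trending_keywords(posts, top_n=5):
    """Count word occurrences across posts and return the most common."""
    counts = Counter()
    for post in posts:
        for word in post.lower().split():
            if len(word) > 4:  # crude filter to skip short filler words
                counts[word] += 1
    return counts.most_common(top_n)

posts = [
    "monkeypox panic spreads on fringe channels",
    "new monkeypox conspiracy video recommended again",
]
print(trending_keywords(posts))  # [('monkeypox', 2), ...]
```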

Jan: [00:05:29] And so what comes up on that dashboard are the topics that are apparently important on social media. And then you go on to look at that to see what might underlie that.

Hind: [00:05:40] That’s right. We actually have several social media platforms that we follow, that we monitor. We then see what topics, what information comes up there and we then work with that.

Jan: [00:05:54] Do you see this as a regular or a normal source of information? This dashboard and social media posts? 

Hind: [00:06:03] Yes, those are sources of information, but they’re not news stories, obviously. It’s just content posted by people, or by opinion makers, by different organizations or parties, media outlets, self-proclaimed or not self-proclaimed news sites, you name it. But it is unfiltered and unverified, so of course you have to look at it with some suspicion. It’s not an ordinary information source. So of course it’s up to us to verify those sources, like all sources. And as I said during the first episode of the podcast, “The keyboard fighters”, the information that the algorithms put forward on social media does not necessarily match reality.

Jan: [00:07:00] As a journalist, what advantage do you see in using or researching algorithms? What do you get out of that as a journalist?

Hind: [00:07:09] It does come in very handy to sort and organize a mass of information, and data mining actually helps with that. The Crossover dashboards allow us to detect social trends and to spot topics early. We also track Odysee, for example, also known as the YouTube of the far right. It is a video platform that is especially popular in America and somewhat less known in Flanders. And then it turned out that Vlaams Belang MP Dries Van Langenhove has a channel on it, where hate speech and conspiracy theories go together seamlessly.

So it certainly has an early signal function, in this case anyway. Those dashboards are actually a useful tool to try to understand or chart the economics of attention. The battle for attention has become a kind of profit, a virtual prize: winning the public’s attention. And then, of course, emotions are magnified and fear is fed, especially on platforms like Odysee. And yes, it’s actually very interesting to be able to closely monitor that struggle and that economy of attention using algorithmic journalism.
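To illustrate what such an early signal could look like in practice, here is a minimal sketch with invented numbers and an invented threshold; it flags a topic whose daily mention count suddenly jumps above its recent average, and it is not the project’s actual detection logic:

```python
# Hedged sketch of an early-warning signal: flag a topic whose latest
# daily mention count spikes well above its recent average.
# Numbers and the factor-of-three threshold are invented for illustration.
from statistics import mean

def is_spiking(daily_counts, factor=3.0):
    """True if the latest day is at least `factor` times the prior average."""
    *history, today = daily_counts
    baseline = mean(history) if history else 0
    return baseline > 0 and today >= factor * baseline

mentions = [4, 6, 5, 7, 31]  # last value: today's sudden jump
print(is_spiking(mentions))   # True -> worth a journalist's attention
```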

Jan: [00:08:49] That algo-journalism provides opportunities to detect trends early on to maybe see earlier than others that Dries Van Langenhove has a channel on Odysee. But are there limits to this kind of journalism? What should you be wary of? What are the limits to this?

Hind: [00:09:07] Yes, of course there are a lot of limits to that, but algo-journalism is purely a method. It is not a miracle cure, nor will it be a surrogate for the journalist per se. So field research remains hugely important. The information we get from algorithmic processing, we also have to use with caution. There are, of course, all kinds of tools that tell you what the audience might like or should like, what topics are trending, what words you can use in a title to get better rankings. Those tools are all very useful, pleasant too, and they can of course help to streamline that very large mass of information. But in the end it is the journalist in his or her own right who has to filter it, verify it and check the sources. That can’t just be replaced by a tool.

Jan: [00:10:06] You’re pointing it out: the journalist’s role remains important. The possibilities of algorithms, including artificial intelligence, are great, or at least are being grandly presented. So do you see the journalist, and yourself, eventually being replaced by a kind of robot, a kind of algorithm that uses artificial intelligence to produce articles?

Hind: [00:10:29] No, no, not at all. I don’t believe that. I think we will have more and more high-quality data, hard data, to work with, and more and more sophisticated algorithmic tools to process all that. That in itself is a very exciting prospect. But of course we still need journalists to analyze certain things, make connections, verify, and do field research to check certain things. The virtual world does not always correspond to the real world. So the idea that journalists would be replaced by robots, I don’t believe in that at all. Robots, or artificial intelligence more broadly, are very useful for quickly processing large amounts of data and producing factual articles, but they will never be capable of in-depth analysis. So in itself I have nothing against using artificial intelligence and algorithms. Robots can be useful for articles that, for example, track elections in near real time, so highly quantifiable data to display. For sports results or financial statements, that’s very useful, but then you really just have the data per se, not the analysis.

Jan: [00:12:02] Let’s go back to Dries Van Langenhove and his presence on Odysee. You wrote an article on Apache about that with the title: “Dries Van Langenhove also roams Odysee, the YouTube of the far right”. For that research, you partly used the Crossover dashboard we just talked about to investigate that very controversial platform. Can you briefly explain what exactly Odysee is and why you’re following this platform with the Crossover dashboard?

Hind: [00:12:35] Odysee is a video platform. It’s popular in America, to a lesser extent in France and to an even lesser extent in our country, in Belgium. But it’s gaining some popularity; that’s why we follow it. It is also a platform that claims to be censorship-free. There is no moderation there, which gives a lot of space to, let’s say, freedom of speech. Nothing wrong with that, of course, but what we see or encounter a lot of there are posts that feed strong feelings of distrust of everything and everyone, and also a lot of hate speech, conspiracy theories and extreme right-wing hate messages. You can say: “That’s diversity, plurality of opinions”. But of course that stops at posts and messages that are racist or anti-Semitic or call for violence. So you can ask a lot of questions about how censorship-free a platform is allowed to be. And then of course we noticed, at a very early stage, that there was a Belgian parliamentarian there: Dries Van Langenhove, known in the Belgian political landscape as a member of Vlaams Belang and founder of Schild en Vrienden, which has been discredited and prosecuted for internet memes calling for racism, anti-Semitism and violence.

Jan: [00:14:23] And how did you approach that research, specifically Dries Van Langenhove’s channel on Odysee? And why Van Langenhove? Was he the only Belgian or Flemish person of interest already present on Odysee? What was the reason?

Hind: [00:14:41] The reason was actually also messages about monkeypox. That pretty much came up on our radar, on the virtual radar, and we wanted to zoom in on that. We saw it trending mainly on the video platform Odysee, to a lesser extent on other media. And then little by little you get into the training ground of the young generation of right-wing extremists on Odysee, where Dries Van Langenhove also already has a channel. Not coincidentally, he started his channel there during the pandemic, let’s say in the slipstream of a lot of the criticism that came against the corona measures, you name it.

Jan: [00:15:31] And he’s kind of building a community of like-minded people there?

Hind: [00:15:36] Oh, that’s actually a very small community. If you compare it to YouTube, he has a lot more followers there than on Odysee. But let’s say he’s dipping a toe in the water there to test how far he can go.

Jan: [00:15:53] For this research, you worked with Check First, a partner in Crossover that specializes in algorithms. What was the added value for you as a journalist of working with Check First and getting a better understanding of how those algorithms work?

Hind: [00:16:10] Journalism cannot be improvised. The teams at Check First, software developers, trained me and other colleagues to use the dashboards. Since then, we have regularly exchanged information about trends, keywords and changes: recommendations that we should follow up on according to current events. We look for clues, for what has news value, of course. And there is a lot of that lately, if you look at the successive large-scale crises, from Covid-19 to the war in Ukraine. So there is a lot of material for making fake news or disinformation.

Jan: [00:17:01] So there are a lot of leads and topics about which fake news and disinformation is circulating online on different platforms. How to better deal with that as a news consumer, how to guard against that, that’s what we’re going to talk about with our media literacy expert Zara Mommerency. Welcome Zara.

Zara: [00:17:23] Hello Jan.

Jan: [00:17:24] Zara, do you think the general public is aware of these new methods in journalism, this algo-journalism?

Zara: [00:17:32] If I have to follow my gut feeling, I think people in general don’t really know how journalists work; very little is known about the methods journalists use, let alone about this algo-journalism. At the same time, we shouldn’t overlook the fact that journalists are increasingly transparent about how they work, so people also know a little bit about how data and algorithms are used in their work. So it’s kind of double.

Jan: [00:17:59] So that’s a dimension that media literacy does factor in.

Zara: [00:18:03] Definitely. Yeah, studying and understanding that algo-journalism actually falls within algorithm literacy and, more broadly, within data literacy. That means you have the necessary knowledge, attitudes and skills as an individual to actively, critically, creatively and very consciously deal with and use data, to understand data, and to understand what impact it actually has on our lives. When I then look at those algorithms, that also means knowing how certain information comes to us through those algorithms, what choices are made by those algorithms, and what impact that has on our news consumption, on our information consumption.

Jan: [00:18:41] Can you identify or explain exactly what skills are involved?

Zara: [00:18:46] For example, learning to analyze data and learning to use it is a skill on the one hand, but it’s also about learning to understand those algorithms and learning to evaluate the role of data and algorithms in society: knowing, for example, that automated decisions are actually made by AI.

It is also about knowing that algorithms cannot fully grasp how we actually think as humans, what we do. Machines just can’t learn the same norms and values as humans. Knowing that algorithms are not neutral is also very important; that’s knowledge you have to have. If we then look within that new context of news production that is linked to algorithms, all those skills actually let us make better choices about what kinds of media and what information we consume, and make critical choices, for example to consume media that is supported by fact-checking tools. And when we talk about that dashboard used in the research you just mentioned: more advanced users can certainly already use tools like that dashboard when they have doubts about something, when they want to investigate something themselves, when they want to do their own analysis.

Jan: [00:19:58] Let’s come back briefly to the article we talked about earlier with Hind regarding Odysee, the platform where conspiracy theories and the like are rampant. What can be done to prevent people from disappearing into such a conspiracy theory, or losing themselves in one by being active on these platforms? How do we keep them from falling down the rabbit hole?

Zara: [00:20:25] I think there are multiple answers to that. On the one hand, it’s about having knowledge of how information is processed on social media. Having knowledge about the platforms themselves is very difficult, because new ones keep coming and going, so there’s not really an unequivocal answer to that. It’s also about being aware that there is such a thing as a filter bubble, that there are things like echo chambers. On the other hand, it’s about arming yourself against disinformation. And when we look at conspiracy theories specifically, it’s about learning how to recognize a conspiracy theory, knowing why people spread certain disinformation and conspiracy theories, and what their intentions are. I think that’s very important. Especially as a news user, you just have to be very critical. And of course you should still be able and willing to do fun things on social media, but also be aware of what’s going on there.

Jan: [00:21:17] Odysee markets itself as a kind of sanctuary where freedom of speech is absolute, where there is no censorship. A slogan, or rallying cry, that now also echoes at Twitter since Elon Musk took over the platform; he also promises, or promised, absolute freedom of speech. How do you see that?

Zara: [00:21:42] Gee. That freedom of speech is a bit of an ethical issue, I think, or the question of what free speech is… Where does it begin? Where does it end? Where is the line with discrimination, with hate speech? Who decides what can be said and what cannot be said? It’s very scary to think that it’s now one person, Elon Musk actually, who is going to determine that. I think you have to be aware that what you read, see and hear, for example on Twitter or on other platforms, is certainly not all true. But as a user, you also just have to be very critical and very conscious of what you post and what you are doing online.

Jan: [00:22:20] So as internet users, we should all be critical and careful with the things we see and also post ourselves online. Is that a good summary?

Zara: [00:22:26] I think that’s a good conclusion. 

Voice over: This podcast was produced by Savoir Devenir, as part of the European Crossover project. This project was funded by the European Union. Responsible for the concept: Sophia Hamadi and Pascale Garreau. Responsible for the technical production: Marieke Rimaux and BNA-BBOT. Responsible for the writing: Savoir Devenir, with the contribution of Apache. Want to know more about this podcast and all the other episodes? Go to Crossover.social
