Propaganda, misinformation, the DSA, Section 230, and the US elections
An interview with Dr Lukasz Olejnik (November 3, 2024)
The following is a transcript of our recent interview with Dr Lukasz Olejnik on Masters of Privacy. The original recording can be found on the podcast’s website, as well as on your favorite podcasting feed.
Introduction
OK, very well. We've got Dr Lukasz Olejnik here today. He is an independent cybersecurity, privacy and data protection researcher and consultant, and a Senior Visiting Research Fellow at the Department of War Studies at King's College London.
He holds a Computer Science PhD from INRIA (the French Institute for Research in Digital Science and Technology) and an LL.M. from the University of Edinburgh. He worked at CERN (the European Organisation for Nuclear Research) and was a research associate at University College London.
Lukasz was also associated with Princeton’s Center for Information Technology Policy, and Oxford’s Centre for Technology and Global Affairs. He was also a member of the W3C Technical Architecture Group.
As if that weren't enough, our guest is a former cyberwarfare advisor at the International Committee of the Red Cross in Geneva, where he worked on the humanitarian consequences of cyber operations. He is the author of scientific articles, longer pieces, and analyses; he has been quoted by leading publishers across the world; and he has written two books, the latest of which, Propaganda, was published this year. I have it here with me.
With Lukasz we are going to talk about the concept of propaganda to then move into misinformation in the context of the EU Digital Services Act and the threat that this piece of legislation could pose to freedom of speech in a healthy democracy. I will mention controversial measures recently taken by the Spanish government against certain media, and we will also talk about Section 230 in the US for comparative law purposes. Eventually we will land on current external threats to a peaceful outcome in the upcoming US elections.
Quite a lot to chew on. Let's go for it.
Sergio Maldonado
Lukasz, thanks for coming.
Lukasz Olejnik
Hello, thank you for the invite.
SM
Congrats on your new book, Propaganda.
LO
That was a lot of work, so yes, I am happy that it's finally out.
SM
I can see it's a lot of work. As I said, I have only read about 20% of it, but I'm impressed by how much structure you've given it. It's very digestible as a result.
LO
That's from my PhD years; I like structuring things, the more sections the better. Some of them are really short, and I know that.
SM
So, I like the historical background to the notion of propaganda: the in-fighting within the Church, or rather the Protestant movement against the Catholic Church, the Pope and so on. It's a good background, but how would you define "propaganda" yourself?
LO
I would define propaganda as informational influence to achieve an effect. This is a very general definition because it doesn't mention who is exerting the influence, who the source is, who the recipient is, what the target is, what the goal is... All of that can be made more precise, but at the most general, high-level view, propaganda is informational influence to achieve an effect.
SM
Very well, yes. Everyone has heard the stories of Bernays in the past century getting women to smoke, for example, or getting the public to change their mind on key issues; maybe even Hollywood is in itself a gigantic propaganda machine, or was. So, can we say that there is no absolute truth, and that everything is subject to interpretation and therefore subject to influence?
LO
I disagree. You can have factual, 100% truth, for example in physics. You can have verifiable theorems, testable with experiments.
SM
That's very good. So, let's take Galileo. Back in the day, when he started claiming that the planets were not spinning around the Earth, but rather that all of the planets, including the Earth, were spinning around the Sun, he was spreading misinformation, or at least that was the case according to the Church at the time, on the basis of the knowledge that was then well accepted. Can we describe that as misinformation, for example?
LO
Absolutely. And that would be misinformation not only from the Church's perspective, because the geocentric model was the established view of the world; other people, experts and lay persons alike, also held that view. So it was quite a bit more complex than just the Church criticizing him.
SM
So let's say that science is evolving, as happens with vaccines. Today we are able to say that something is true because we have new methods of testing, and there may be something that we believe today that may be proven wrong in the future. So, what do we do with the element of bad faith? Do we require that as a precondition for misinformation?
LO
Obviously intention is very important: whether the goals are, for example, malignant or harmful. Propaganda may be seen as informational influence, but it may use various approaches. It can use, for example, information that is factually incorrect but is being spread without any intention to deceive.
You may also have disinformation, where there is an intention to spread something that is verifiably untrue, with malicious intent. And you can have a state actor, or even a commercial company, originating such disinformation, which ordinary people, normal social media users, then spread convinced that it is true; they are unwittingly spreading misinformation, but not intentionally.
And then you have a third prong, which is information that is verifiably true, used to achieve an impact or effect. So you can classify it in three ways, depending on the goal, the tools used, and also the circumstances, for example any accompanying events.
SM
Ok, very good. Let’s put that into practice then. So you mentioned Doppelganger, a Russia-based group, in your recent newsletter. What is this about?
LO
So, according to recent information, it is a Russian state-supported group that is manufacturing narratives and, let's say, informational takes that support the Kremlin's point of view. They are also engaged in sowing divisions in Western societies, and they do this in different ways, for example by setting up fake websites that portray themselves as media. They even copy the design of esteemed outlets such as the Guardian, hence the name Doppelganger: they are pretending to be real media, and some people may make a spelling mistake in the address, or they are referred there, for example, by clicking on a link. It looks like the Guardian, but they are seeing distorted messages. So that is informational influence.
SM
So, yeah, I am really sad to see that these things actually work. I know some people who have fallen for them and then spread them further with the best of intentions. So, sadly, they work.
But then we get into blurrier scenarios, right? Things are not always so clear-cut as misinformation, with that bad intention, bad faith, with an agenda behind it. I want to enter that blurry space and look at what the Digital Services Act (the DSA) in the EU is asking us to do, or is already doing. Isn't there a risk when an administration, a government, local or national, can label a piece of news as misinformation, even though we're talking about news media that are possibly shooting in every direction, maybe leaning towards one political party or the other? Isn't that a risk to democracy and to free speech?
We've got something that starts smelling like this in Spain, with the government recently labeling certain online media as "pseudo-media", as not real media, and it happens to be publishers that attack and criticize the government. I am not fully familiar with the details, but I understand they're going to be deprived of advertising resources from public institutions or, I guess, large entities in which the government has a controlling stake. I think that's a very slippery slope.
But the DSA mentions the word misinformation 13 times; I've counted. And then we've seen people within the European Commission breaking free and deciding what amounts to misinformation. I'm thinking, of course, of Thierry Breton, who is no longer with us at the Commission. He told Elon Musk that he would not be allowed to interview Donald Trump on X, or at least that the interview should not be available in Europe, because it would amount to misinformation, on the assumption that Donald Trump cannot control himself and will lie through every pore. But still: how do you see that kind of intervention from the European Commission?
LO
Returning to your arguments: indeed, there may be a balanced and objective view on the issue, but in the end the DSA needs to be implemented by Member States, and so it will be subject to political processes. It would be a pity if some governments designated as misinformation something they simply disagree with. That's not the point, and it will be subject to the interpretation of courts. The inclusion of these very tools for fighting misinformation and disinformation is the result of years of hyping disinformation as a really dangerous risk; it was several years in the making, and that's how political processes in Brussels work.
So now it's included, and indeed the DSA will be the major European law that is, or may be, used to regulate free speech, or modulate it. No one advertised the DSA as something that might be used as a tool of censorship, because blocking entire media outlets, or entire social media platforms, would amount to censorship, a sort of state censorship predating even the First World War.
That's also how the trigger-happy Thierry Breton abused the DSA's capabilities. He first did it a year ago, in 2023, during the riots in France, when he threatened to block Snapchat or TikTok because they were simply a means of exchanging information, or perhaps of coordinating the unrest, with no proof of this disclosed by the French state, to my understanding. I can understand that such media may sometimes be misused, that they may be used to coordinate riots; that may happen. But threatening to block entire platforms on a not very strong pretext probably crosses a few lines, and this also happened regarding TikTok recently. It is quite problematic, because policy-makers should be worried and very careful about issuing such orders officially, on paper, as this Commissioner did. And we need to remember that the power to block access to such a medium lies with the actual state: the Spanish state, the French state, the German state... they have the power to do it, and it works as follows: they may issue a blocking order, it applies for four weeks, and then a court may inspect the rationale and decide whether to extend it. So yes, it's all written into the law, although the parameters are completely undefined as of now.
SM
So yeah, it makes sense that it happens at the national level. Certain things are illegal in Germany, for example, that may not be illegal in France, and even consumer protection laws differ at the regional level, for example in Spain. The closer you get to the consumer, the closer you get to those sensitivities; but it is also more dangerous because, as you get closer, the censorship of media has more political impact, and that's my worry. And now we're opening that debate both in the US and in Europe (well, in Europe it's already been opened thanks to the DSA). In the US there's talk of canceling Section 230, or reverting those precepts, which are in essence the same thing we did in Europe back in 2000: if you have a platform and people publish harmful content on it, you're not liable for it, because it's very hard to monitor; and if you do decide to moderate it (after that CompuServe case in the US, for example), you will not be liable for the moderation effort that you undertake.
LO
You are not liable unless you take some action on the content, unless you modify it somehow. In Europe it's quite similar, but the DSA puts a dent in that; I mean, it changes it a bit, in that platforms are not liable, but once they receive a notification of something potentially infringing they have an obligation to take action.
SM
Yes, and we have this notion of the “trusted flagger” in the DSA. Are you familiar with this?
LO
Yeah, that will likely be delegated to some NGOs, which would be more or less fact checkers, and I am not sure how that could work in practice. I would be very cautious, because someone still needs to find those NGOs, and you know how it is: if they have any inclination, they may be more inclined to fact-check certain sides more than others.
SM
Yes, that's where I wanted to go, and I wanted to get your opinion. We can establish that whenever something is purely scientific, we know what's true and what's not. As we depart from that and get into the opinions of others, or into what we think others are thinking, or what we think is behind a politician's decision, in the sense of what his or her plan is, we enter very muddy waters. While I had your book with me these past few days, I was reading an article in the New Yorker about a man called Lehane, who used to work for the Clintons and came up with very effective concepts like the "right-wing conspiracy", which I guess back in the day wasn't seen as such. He then moved into the private sector, where, through very effective public campaigns with very questionable ads, he worked to increase the limits on the money people can recover after medical malpractice, and eventually he moved on to lobby for Silicon Valley. He's very good at projecting the right words and having an impact on the political agenda through propaganda.
So, if we intervene on misinformation, on the one hand we could be hampering free speech; on the other hand, the manipulation of words is going to keep happening. And when we open the debate, for example in the US about Section 230, both sides of the aisle have different perspectives: Democrats tend to ask for more moderation on Meta and other social media platforms, whereas Republicans ask for less moderation, because they feel it's hurting them, and they want more free speech, or at least some of them do. Where do you stand in that debate? How do we find a balance?
LO
Well, my understanding is that once the Member States name their Digital Services Coordinators, these will form a working group and try to establish some standards of operation on how to enforce the DSA. Those will be soft regulations. They will claim that they are based on the DSA, and perhaps on other laws; in the end, of course, it will all be subject to court oversight, including the European Court of Justice. But returning to your point: we need to distinguish here, because today we have various roles and various ways to describe such activities. We can say that lobbying may be PR; it may be communication expertise; it may be a number of roles around design and narrative, including political commercials. A hundred years ago the name would have been propaganda, but today we have a more versatile vocabulary, and some of it has simply been legalized because it's transparent: we know who finances it, and we can clearly distinguish a legal commercial ad for washing powder or Coke, even one using persuasive techniques.
So it's not that someone trying to persuade or influence an audience is always harmful; there needs to be a balance. In Europe, for example, there is a European Court of Human Rights verdict which says that when it comes to political campaigning there is really broad permission for political influence, because otherwise there would be a risk to society and the democratic order: someone could always claim that something should not be welcome in the public debate, and that could start affecting democratic elections and things of the kind.
SM
Look, we’re a few days away from the election, here in the US.
LO
This is also why Western societies may be a bit prone to external influence: external actors might exploit such a permissive stance to inject harmful narratives, or to inject harmful influencers, like the recently exchanged Pavel Rubtsov. There is an understanding that he may have been a Russian military intelligence operative, but he is also a famous and recognized journalist in Spain under the name Pablo Gonzalez, and some people in Spain doubt to this day that he was actually a Russian intelligence operative, even after seeing him welcomed at the Moscow airport by President Putin.
SM
Oh, that’s impressive.
So, one last question for you. Back to the election. Something that I heard or read, but who knows, right?, something orchestrated by Russia and others, is the idea that if Trump loses, it will be because the election has been stolen. That campaign seems to be effective; it seems to be working, from where I stand: the idea that millions of illegal immigrants are being allowed to vote without an ID to steal the election, so that if Trump doesn't win, as he should, the election will have been rigged. If you wanted the US to face a civil war, I cannot think of anything more effective. How do you see the US surviving something like this?
LO
Indeed, in the book I also consider the possibility of a coup d'état, I mean, trying to hijack power in a state starting from information rather than resources. Information alone is not sufficient, because you still need strength and power, and by that I mean quasi-state structures, or parts of a state structure, to actually hijack power. The US in particular is quite resilient to that, because it's basically a federation: you have a number of states, all of them with their own law enforcement and armed units, and there are also dozens of federal agencies. So it's not that easy to hijack power; you can't take over power in a state with propaganda alone. You can't.
SM
Thank you. That's a relief. I want to believe that; that's helpful.
LO
Thank you.
References:
Propaganda, by Lukasz Olejnik
Lukasz Olejnik on Cyber, Privacy and Tech Policy Critique (newsletter)
"Doppelganger in action: Sanctions for Russian disinformation linked to Kate rumours"
"Journalist or Russian spy? The strange case of Pablo González" (the story of Pavel Rubtsov), The Guardian
"Silicon Valley, the New Lobbying Monster" (mentioning Chris Lehane's campaigns), The New Yorker
Financial Times: clip purporting to show a Haitian voting in Georgia is among "Moscow's broader efforts" to sway the race
"Pseudo-media": Spain proposes tightening rules on media to tackle fake news