Voted Number One PR Podcast in Goodpods
Sept. 8, 2024

The Disinformation Playbook

There is an anatomy to a disinformation campaign.

In this episode we reveal the tactics that make false narratives so dangerously effective: from the subtle beginnings of a lone post, to the orchestrated amplification by bots, and finally the unwitting spread by receptive hosts. These campaigns infect the public psyche faster than you can blink.

But we can fight back. We can invest in early warning systems, develop proactive diagnosis and intervention, and perhaps most uniquely, we can conduct experiments to test different interventions and techniques on audiences – simulating exposure to disinformation in a controlled environment. 

Listen For:
4:40 Understanding the Life Cycle of Disinformation
9:36 The Role of Receptive Hosts (or DisInfluencers)
11:15 Importance of Early Warning Systems
15:06 Experimentation and Simulated Exposure 

Guest: Shayoni Lynn, Lynn Group
Website | LinkedIn

How Hate Spreads: A briefing note from The Misinformation Cell - Lynn

Mpox and Conspiracy Theories: Mpox and conspiracy theories - Lynn

Rate this podcast with just one click 

Stories and Strategies Website

Do you want to podcast? Book a meeting with Doug Downs to talk about it.

Apply to be a guest on the podcast

Connect with us

LinkedIn | X | Instagram | YouTube | Facebook | Threads

Request a transcript of this episode

Support the show


Transcript

Doug Downs (00:05):

In 2018, the confirmation hearings for US Supreme Court Justice Brett Kavanaugh were a flashpoint of controversy. Amidst the intense scrutiny, several allegations of sexual misconduct were brought against Kavanaugh. One particularly sensational claim was promoted by attorney Michael Avenatti and his client Julie Swetnick, who alleged Kavanaugh had been involved in gang rapes during high school. The claim was widely covered by many mainstream media outlets, amplifying the narrative without proper verification. The allegations were later found to be lacking credible evidence, and Avenatti himself was discredited. Another example of misinformation is the birther movement, which falsely claimed that President Barack Obama was not born in the United States and was therefore ineligible to be president. This conspiracy theory was primarily fueled by folks like Donald Trump, who repeatedly questioned Obama's birthplace despite overwhelming evidence he was born in Hawaii. The birther movement persisted for years, sowing doubt and division, and serving as an example of how disinformation can be used to undermine the legitimacy of political opponents.

(01:17):

Even after Obama released his long-form birth certificate, some continued to question its authenticity. These two stories show that myths and disinformation come at you from all angles, and they reveal a common thread: the willingness of individuals and groups to spread myths and disinformation when it serves their purpose. Remember what Mark Twain once wrote: "A lie can travel halfway around the world while the truth is putting on its shoes." Want to know something? Mark Twain never said or wrote that; it's just attributed to him. Today on Stories and Strategies: the disinformation playbook. Because in the game of truth, some players are experts at moving the goalposts.

(02:22):

My name is Doug Downs, and just as we get started with this episode, big thanks to Vince Nero, who publishes the BuzzStream newsletter for digital PR. Vince included Stories and Strategies in his list of best PR podcasts, saying, "What I like is they cover the intersection of marketing and PR." And that's MarComms, right? "In addition to the content itself, this podcast stands out to me because of its top-notch production quality." Right to my heart, and this is where I have to thank our producer, Emily Page. Thanks, Em, and thanks, Vince, for mentioning us in your post on LinkedIn and your blog article. Vince and I met earlier this morning. We're going to have him on an episode of the podcast. He's going to talk about search engine optimization, so look for that coming up. My guest this week is Shayoni Lynn, joining today from Cardiff in Wales. Hey, Shayoni.

Shayoni Lynn (03:13):

Hi Doug. Thanks for having me again.

Doug Downs (03:16):

Good to see you again. How are things in Cardiff? Still warm and sunny, with rolling hills of green, right?

Shayoni Lynn (03:21):

I mean, it's never warm, but yes, climate change and all that. It's warmer than it used to be.

Doug Downs (03:26):

Yeah, yeah. Shayoni, you are the CEO and founder of the multi-award-winning behavioural science communications consultancy Lynn Group. You are a CIPR and PRCA Fellow, and the chair of PRCA Cymru, and I didn't know what that was. What is PRCA Cymru? What is Cymru? I don't know what that means.

Shayoni Lynn (03:50):

Cymru is Welsh for Wales, so it's the Welsh-language name for Wales. It's nothing other than the Welsh language. I love that you said Cymru. I'm sure all the Welsh listeners will really appreciate that.

Doug Downs (04:03):

Well, you told me seconds ago, before we started recording, so...

Shayoni Lynn (04:06):

That was meant to be a secret.

Doug Downs (04:08):

The hamster didn't quite fall off the wheel. How do I say "public relations" in Welsh?

Shayoni Lynn (04:12):

Oh, no, no, I don't speak Welsh, so don't...

Doug Downs (04:15):

Sorry, I put you right on the spot. You're also listed in the PR Week Power Book, the definitive list of the most influential and respected communications professionals in the UK. It is always great to have you on the podcast, Shayoni. You outline three macro steps to a disinformation campaign, just in case I want to start one. What are those steps?

Shayoni Lynn (04:40):

Please don't start one. But the life cycle of a disinformation campaign has certain specific patterns, I suppose, that we should all be aware of as PR and comms practitioners. The first is that it typically starts small. False narratives could start with a lone post, a couple of posts on Facebook or a few posts on Twitter. It might seem really innocuous at first, but the people who understand the nature of misinformation and are researching it will be able to identify tropes and codes and signals from those very small, innocuous-looking initial posts. And then once you've got a couple of those lone posts out in the information environment, a disinformation campaign will typically build with bots. The narrative could also just organically latch on, but typically bots are deployed to push it out into the information environment. You'll probably see them repeating the same content, having the same message, so it's bot activity.

(05:53):

And so they create this sort of noise, if you know what I mean. They do it so it suggests that the narrative is legitimate by virtue of volume alone. And then finally, you wait for receptive hosts. So typically when a false narrative is bubbling under, you will find that there are certain people who will share this content, whether wittingly or unwittingly, because it serves some sort of benefit to them. For some, such as high-profile disinformation influencers, it's because it engages their base, it gives them engagement, it gets them coverage, it furthers the ideology into their communities. For others, like the general public, those narratives might conform to their existing beliefs. And I know we are going to talk about this analogy in a bit more detail, but if you think about the analogy of a virus, the receptive hosts lack an immune system.

(07:00):

So they share this narrative because they're receptive to it, and they push it further and further into the information environment, and the information ecosystem spills into multiple channels. And with enough momentum, this type of disinformation campaign will latch on to the public psyche. And this spread, although it might seem like it takes a bit of time to build, can be rapid and cross-channel and create monumental amounts of false content which ripple through a population faster than you can blink. And so by the time organizations wake up, realize this has happened, and start debunking and fact-checking, the damage is done.
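Shayoni's description of the bot stage names a concrete signal, many accounts pushing out the same message, but no tooling. Purely as an illustrative sketch of how that repetition signal could be approximated in Python, with the similarity and account thresholds being arbitrary assumptions rather than anything Lynn describes:

```python
from difflib import SequenceMatcher

def normalize(text):
    """Lowercase and collapse whitespace so trivial edits don't hide duplicates."""
    return " ".join(text.lower().split())

def flag_amplification(posts, similarity=0.9, min_accounts=5):
    """Cluster near-identical posts and flag clusters pushed by many accounts.

    posts: a list of dicts with "author" and "text" keys (a simplifying
    assumption about the data feed). Returns clusters that match the pattern
    described in the episode: same message, many mouths.
    """
    clusters = []  # each cluster: {"text": canonical text, "authors": set of authors}
    for post in posts:
        text = normalize(post["text"])
        for cluster in clusters:
            if SequenceMatcher(None, text, cluster["text"]).ratio() >= similarity:
                cluster["authors"].add(post["author"])
                break
        else:
            clusters.append({"text": text, "authors": {post["author"]}})
    # Volume alone suggests legitimacy, so one narrative repeated by many
    # distinct accounts is the cluster worth escalating to a human analyst.
    return [c for c in clusters if len(c["authors"]) >= min_accounts]
```

A real system would also weigh posting times, account age, and network structure; this sketch captures only the repetition-by-volume pattern.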

Doug Downs (07:44):

Wow. Two key pieces stood out to me there: the bots and the receptive hosts. So let's break that apart. These bots: I've got this image in my mind of stormtroopers programmed by Darth Vader. Is there a Darth Vader character with an army of stormtroopers, the bots, seeing the posts and putting the bots to them? Or are these bots programmed by an algorithm, looking for something that agrees with the perspective they're already programmed to agree with, and then spreading that disinformation? How do the bots work, first?

Shayoni Lynn (08:20):

So probably a bit of both. We know that organized disinformation campaigns, organized false information campaigns, are often at some point sponsored by state actors, typically certain state actors that we can, I suppose, point to. They could also be sponsored by individuals who work in certain political groups where there's an agenda being met, or where there is a financial motivation in being part of a disinformation ecosystem. So there will be both, I suppose: individuals who are leveraging technology to proliferate a narrative, as well as potentially algorithmic ways of sharing the information, which may or may not be intentional. I'm not too sure, actually, because I haven't really delved into that. That's a really good question, and I'm going to take that away and start looking into it.

Doug Downs (09:22):

Okay. And those receptive hosts, because that's the next layer: are they influencers, or what we might call disinfluencers of some kind? Are they people we know, or think we know, and already trust? How does that work?

Shayoni Lynn (09:36):

So I love the term disinfluencers, and it's not a term that I can take credit for. It was coined by Marc Owen Jones, who, if you're aware, did some incredible real-time analysis of the information environment during the recent UK riots and came up with this term. And he called Elon Musk the disinfluencer-in-chief, which I also think is absolutely brilliant. But going back to your question on receptive hosts, and again to that analogy and lens of viral transmission: think of it as ideas passing between those susceptible to an ideology, infecting the minds of a population who may believe false narratives in good faith. Or they might not believe them at all; one of the key insights is that sharing misinformation is not believing misinformation. So often these receptive hosts may share a lot of false information without believing in it. I think that analogy of a virus infecting a population is a good one to remind us of how it proliferates and passes through continents and populations. And Cambridge University professor Sander van der Linden, who is one of the most well-known names in misinformation research, refers to this analogy as well in his book Foolproof, which I recommend you read.

Doug Downs (11:07):

Perfect. So if we know the playbook, what are the steps to prevent, and, when needed, respond and remedy?

Shayoni Lynn (11:15):

Yeah, that's a good question. So think about three steps to being proactive; I think that's the way we are trying to get our industry and our clients to think about response. The first is to invest in an early warning system. Now, this is an active monitoring system specifically built to identify misinformation. So not social listening, although some of that technology is used in this, but something specifically built to identify misinformation. This can be an incredibly powerful tool in your arsenal, because an early warning system will provide longitudinal data, and that can help flag potential information threats before they become a crisis. So once we've identified these key information threats, the second step, and it goes back to behavioral science, which is the podcast we did previously, is about diagnosis. So diagnosing those threats correctly.

(12:24):

Going back to that insight that sharing misinformation is not believing misinformation, it is vital to assess your information threats against your audience, to understand what is harmful, what is influencing our audiences and communities, and how we can protect them proactively. So this is all about being proactive and preventative in our solutions. And then finally, once we've understood the landscape in more detail and more nuance, it's about designing effective interventions that can help mitigate the harms from disinformation. And I think for too long, Doug, as an industry we've believed that we can fact-check our way out of this. We are starting to wake up to the fact that that's probably not the magic bullet we thought it was, because if you're fact-checking, if you're debunking, you're already in a crisis; it's a reactive response. So we need to consider proactive interventions like prebunking, critical thinking, and analytical reasoning, and also use experimental approaches, understanding what interventions work and what don't before you activate those interventions in the population.

(13:44):

And you can do that with things like, I was going to say synthetic audiences, but that's not exactly what I meant. You can do that by running experiments within your audiences to understand which techniques, which tools, which piece of content is most effective in getting people to pause and consider when presented with false information. And I think that's the key takeaway: how do we get our audience to pause when they are presented with false information in a live environment? False information is created to be emotionally responsive, right? It's created to affect our beliefs and our identities. So it's a really big ask, trying to get our audiences to pause, consider, and challenge themselves on their existing beliefs. I think that's a long-term strategy. It doesn't happen overnight, but if you invest in some of these methodologies, you can start priming your audiences to think differently when they're exposed to false information.
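Lynn's early warning system is proprietary, and the episode describes it only as active monitoring that provides longitudinal data and flags threats before they become a crisis. As a sketch of that flagging idea alone, here is a rolling-baseline spike detector; the window size and z-score threshold are invented for illustration:

```python
from statistics import mean, stdev

def flag_emerging_threats(daily_counts, window=14, z_threshold=3.0):
    """Flag days where a tracked narrative's post volume spikes above its own history.

    daily_counts: a chronological list of daily post counts for one narrative.
    Returns the indices of days exceeding the rolling baseline by z_threshold
    standard deviations, a crude stand-in for "flag it before it's a crisis".
    """
    alerts = []
    for day in range(window, len(daily_counts)):
        baseline = daily_counts[day - window:day]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and (daily_counts[day] - mu) / sigma >= z_threshold:
            alerts.append(day)
    return alerts

# Example: a quiet narrative that suddenly gets amplified on day 16.
counts = [3, 2, 4, 3, 2, 3, 4, 2, 3, 3, 2, 4, 3, 2, 2, 3, 48]
print(flag_emerging_threats(counts))  # -> [16]
```

The longitudinal framing matters: each narrative is compared against its own history rather than a global average, which is what lets a small but unusual spike surface early.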

Doug Downs (14:54):

Is that what you mean by simulated exposure?

Shayoni Lynn (14:58):

That's exactly the experimental setting that we were talking about, right?

Doug Downs (15:01):

Yeah. And have you done that? Are you able to share any stories of simulated exposure?

Shayoni Lynn (15:06):

Yeah. Well, as you know, at Lynn we are the only UK agency that has embedded experimentation into all our campaigns, so that's very much in our wheelhouse. It's a similar methodology to running experiments. What that means is you are exposing your audiences, through randomization, obviously using quite robust academic methodologies, to different interventions, different techniques, tools, and triggers, to see what helps them better assess the situation, better understand the accuracy of a source, for example, or better understand manipulation of content or imagery, and then make a different decision. So you set a baseline, then you run them through an experimental phase with the different items you want to test and assess against, and then you assess them afterwards to see if there's been a change in how they would respond in a real-time situation.
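The episode gives the design only at this level of detail: randomize, set a baseline, expose each arm to a different intervention, then measure again. Here is a minimal skeleton of that pre/post design; the scoring instrument and the interventions are hypothetical placeholders, not Lynn's actual protocol:

```python
import random
from statistics import mean

def run_prepost_experiment(participants, interventions, score_fn, seed=42):
    """Randomize participants across arms and report the mean score change per arm.

    score_fn(participant, phase) returns a numeric measure, for example how
    accurately a participant rates a mixed set of true and false headlines
    (a hypothetical instrument, not Lynn's).
    """
    rng = random.Random(seed)
    pool = list(participants)
    rng.shuffle(pool)  # the randomization step

    results = {}
    for i, intervention in enumerate(interventions):
        arm = pool[i::len(interventions)]              # assign participants to this arm
        pre = [score_fn(p, "baseline") for p in arm]   # baseline measurement
        for p in arm:
            intervention(p)                            # exposure: prebunk, warning label, control, etc.
        post = [score_fn(p, "post") for p in arm]      # post-exposure measurement
        results[intervention.__name__] = mean(post) - mean(pre)
    return results
```

Comparing each arm's change against a no-intervention control arm is what shows which technique actually moves the pause-and-consider behaviour before it is rolled out at scale.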

Doug Downs (16:12):

And fairly quickly, if you could: you touched on this, that facts at the end of the day are not going to win the day, probably because we're human. The right side of the equation has already been settled, and now we're arguing over the left side of the equation. I'll give you an example of what I mean by this. At the first murder trial I ever covered as a young cub reporter in my twenties, I listened to the final arguments from the prosecution and was completely convinced: this guy's guilty, he committed the murder, the jury is going to find him guilty, no doubt about it in my mind. And then of course the defense gets their opportunity, and the way this plays out in real life is these aren't 10-minute dissertations; they take a day or two to give their full spiel.

(17:02):

But as the defense laid out their case, there were all kinds of holes in the prosecution's case, all over the place. And I left going, well, now I don't just think they're not going to find him guilty; I think he might be innocent. And I didn't know what to think. Whose facts, at the end of the day, do I rely on? Facts are facts, but if I omit this set of facts and only focus on that set, I reach this conclusion as opposed to that one. This almost seems like a philosophical issue.

Shayoni Lynn (17:33):

Well, quite. So dialing it back to misinformation: a lot of the reason misinformation is effective, or disinformation campaigns are effective, is that it is created to really resonate with our inner belief systems and our identities. It uses psychology and behavioral science principles quite effectively to connect on a very deep-seated level. But if you look at misinformation, false information, whether intentional or not, it's contextual. Take a false claim about the US military, for example: that it hides information about UFOs from the public. Now, of course it does, for national security reasons, so it doesn't create a frenzy within the population. But note here that the statement "the US military hides information about UFOs from the public" carries an implied claim that there is a conspiracy afoot. An implied claim is one that is made by reference. So there are many ways to imply a claim.

(18:43):

You could have statements, and you see some of this in that court case you talked about, where a truth value might be absent, where a statement might leave out relevant information, or where a statement might be framed as a truth claim so that it is misleading. So there are lots of ways for us as individuals and organizations to present information to our audiences so that we are believed, and it may be intentional or not. We must remember that when it comes to implied claims, we are also dealing with motivated reasoning, which is a model of misinformation belief. What counts as true can sometimes depend on your perspective. So whose facts are true? It is a philosophical question, and I'm not sure we have the answer to it. But I think we can look to science, we can look to outcomes, and if you ladder back through the conspiracy theories, there's a lot we can use to debunk some of it. But ultimately, will we be believed? That is yet to be seen.

Doug Downs (19:45):

Yeah, that's because we're human, I suppose. I want to introduce a new feature to this podcast. I want to ask you, Shayoni, to leave a question for our next guest, and you have no idea who it's going to be. Neither do I; that's coming up next week. But leave a question behind and we'll ask them to answer it in our next episode.

Shayoni Lynn (20:05):

My question is about the social media platform which I shall always refer to as Twitter, and about how Elon Musk is not just proliferating his ideology, because we know over time he has become more anti-progressive and more anti-woke. There is now a big question about the responsibility of social media platforms; Zuckerberg provided that letter to Congress. I would like your next guest to consider the future of social media. Do we think that this is a moment in time, and people will revert back to usual behavior and go back to Twitter because we are habituated to it and we know that it works? Or do we think that this is a pivotal moment where we will change how we consume information on social media and the types of social media that we access?

Doug Downs (21:06):

Brilliant. I love it. And in my mind, I've got this micro-media splintering effect, which we've seen with traditional media, where they don't go away, but there are just a lot more of them, and I pick the ones that already agree with my opinion. That makes me smart, right?

Shayoni Lynn (21:21):

Yeah. But the challenge is, if you start policing some of these platforms, there are a lot of unregulated fringe platforms that people can go to, where there is so much more threat to individuals and communities. So you can ban something, but then do you ban everything? Well, I dunno. Big questions.

Doug Downs (21:42):

Shayoni, as always, thank you so much for your time. I love spending time with you.

Shayoni Lynn (21:46):

Thank you so much for having me, Doug. The feeling's mutual, and I hope to catch up with you again.

Doug Downs (21:53):

If you'd like to send a message to my guest, Shayoni Lynn with Lynn Group, we've got contact information in the show notes. Stories and Strategies is a co-production of JGR Communications and Stories and Strategies Podcasts. We're launching a listener feedback series where I'm asking for five minutes of your time in exchange for $50. Fifty bucks for five minutes; we get together on a Zoom call. All I want to ask you is how you found the podcast and how you would describe the podcast, just good audience research that goes into any PR or marketing campaign. There's a link in the show notes for the first 10 people we get to interview. Thanks again to our producer, Emily Page. And lastly, do us a favor: forward this episode to one friend. Thanks for listening.