When the Media Quote isn’t Human: Fake Quotes – Real Damage

What if the expert quote you just read in a news article wasn’t written by a human — but by AI?
That’s already happening.
A PR tool called Synapse is selling agencies the ability to fire off automated expert pitches to journalists, complete with research, personal-sounding anecdotes, and polished email copy — all with minimal human input. It promises one person can do the work of five and crank out twenty media pitches an hour.
But is this innovation, or is it a warning sign for the future of public relations?
In this episode, we’re unpacking what Synapse means for PR and media. We’ll explore why this kind of automation raises ethical alarms, how journalists are likely to respond, and what PR professionals need to do right now to protect trust, credibility, and the real value we bring to the table.
Listen For
6:12 Creepy or Clever? How Synapse Targets Reporters
7:48 Fabricated Experts: Ethical Red Line Crossed
10:35 Should the PR Industry Be Regulated?
11:16 How Journalists Will Fight Back With Closed Networks
22:40 Don’t Blame the AI—Blame Ourselves
Stories and Strategies Website
Curzon Public Relations Website
Apply to be a guest on the podcast
Connect with us
LinkedIn | X | Instagram | YouTube | Facebook | Threads | Bluesky | Pinterest
Request a transcript of this episode
David Olajide (00:01):
You've probably heard this story before. A young apprentice, left alone in his master's workshop, decides to take a shortcut. Disney's Fantasia made it famous, but the lesson behind it is older than cartoons and just as relevant today. Take a listen.
Farzana Baduel (00:22):
The sorcerer's apprentice was eager, ambitious, and tired of the endless chores. Day after day he hauled water, scrubbed floors, and stacked firewood, dreaming of the day he would have real power. One afternoon, when the sorcerer left, the apprentice saw his chance. He took the old spell book down from its high shelf, traced his finger over the fading script, and whispered the incantation. With a shudder and a glow, the broom at his feet stood upright, grabbed the pail, and began to fetch water on its own. The apprentice watched wide-eyed as his burden disappeared. No more backbreaking work, no more sweat, just results. Fast, easy, efficient. But soon the bucket overflowed. The floors began to flood. Panicking, the apprentice shouted at the broom, grabbed it, even smashed it in half, only to watch two brooms rise where one had stood, both working twice as fast. What had been a triumph turned into chaos, and the water kept coming.
(01:27):
By the time the sorcerer returned, the workshop was an ocean. With a single word, the master calmed the room. The brooms dropped, and the apprentice collapsed at his feet, drenched and ashamed. The sorcerer wasn't angry, he was disappointed. Power without wisdom, he reminded the boy, is no power at all. And so the apprentice learned that the tools we use, whether magic, muscle, or machine, are only as good as the hands and minds that guide them. A shortcut might save you time, but it won't save you responsibility. Without care, without judgment, without understanding, even the best tool can turn on its master. Today on Stories and Strategies, the rise of AI in public relations. When we hand our work over to artificial intelligence, are we gaining power or flooding the workshop? Can PR survive if we trade real human connection for synthetic speed? My name is Farzana Baduel.
Doug Downs (02:42):
And my name is Doug Downs. Now, recently the UK's Press Gazette uncovered the inner workings of Synapse. It's an AI tool sold by the Lithuania-based PR agency Wellstone PR. Synapse is designed to automatically respond to journalist requests for expert comment on platforms like HARO (Help a Reporter Out), Qwoted, and ResponseSource. That's one thing it does, and I've got a problem with that. By pulling information from books, podcasts, and reports, the tool also generates personal-sounding email pitches. So it creates quotes in response to a reporter's query, and it can create pitches, sometimes even inventing anecdotes, with minimal human input. Wellstone claims that one PR editor using Synapse can handle the workload of five people, writing 20 pitches an hour. It's not about the quality, it's about the quantity, right? It's a numbers game. Just flood the area and something's liable to stick. This might sound like an efficiency win for agencies, but it raises serious questions, Farzana, about ethics, trust, and the future of PR, and where we're going here.
Farzana Baduel (03:54):
And Doug, we are going to dive into this. I was absolutely shocked when I read the article in Press Gazette. Now, we're going to break it down into three different areas. Number one, we're going to talk about efficiency meeting ethical risk. That's a dilemma we in the PR industry are going to face more and more. Number two, we're going to discuss how journalists are going to respond to this, because this is just the start. And third, we're going to talk about the future of PR. Is the human-centered nature of PR going to be lost to automation? So let's dive in. First of all, efficiency meets ethical risk. In terms of efficiency, wow. And also in terms of cost saving. When you're running an agency, instead of having a team of five crafting pitches, getting approvals, editing and all of that, you've got this machine that just churns out pitches.
(04:46):
And yes, it is super efficient, which means it's going to cost next to nothing compared to the cost of humans: hiring them, giving them office space, and all the auxiliary costs involved in employing people. But the ethics. I mean, I can see why agencies are drawn to using these tools for the cost savings, the speed, and the media coverage. But what about the ethics? You're a former journalist, Doug. Tell me, if you actually knew a PR was using this, how would you feel?
Doug Downs (05:26):
Yeah, I was a journalist for 12 of my 15 years in broadcasting, and there's this sacred sense, as a journalist, that we are soldiers for democracy, and that's never gone away. In its purest form it's actually quite beautiful. There's no other entity designed that way. Whether you think the media is actually behaving this way is a whole other question, but by its design it's supposed to be the watchdog. You can't buy it. You cannot win favor from it by spending more money. In theory, it simply holds to account those who need to be accountable. So that's the sacred territory of a journalist.
Farzana Baduel (06:08):
Doug, how does it work though? I mean, how does Synapse work?
Doug Downs (06:12):
So it'll make pitches. Let's say an agency is hired to pitch Farzana Baduel to be interviewed in your newspaper, talking about public relations. What the AI tool would likely do is scrape that editor or that reporter and understand what their interests are. It's something I preach to my media training clients: before you make a pitch, what else has this reporter covered? Do they lean left? Do they lean right? Do they talk a lot about hot dogs? Do they have a daughter? Are they married? Where do they live? How well do you know the reporter?
Farzana Baduel (06:45):
Sounds a bit creepy.
Doug Downs (06:46):
Yeah, yeah, every breath you take, I'm watching you. But the AI can do some of that, and probably really effectively, because journalists tend to post a lot, right? So on the pitch side, I wouldn't like it as a journalist, but there's a discussion and an argument there, and I can't label it unethical. The other thing Synapse is doing, though: let's say I, as a journalist, post on HARO, Help a Reporter Out, saying, look, I need an expert on peanut butter and jam. And suddenly AI is able to generate a quote and say, I have this expertise, my name is Jerome Jones, I am a peanut butter and jam expert, and here's the real secret to it. It makes up a quote that's completely fabricated. No one's actually being quoted. There is no Jerome Jones. So they're
Farzana Baduel (07:47):
Just making the experts up. They're
Doug Downs (07:48):
Just making that crap up. And even if they're not making
Farzana Baduel (07:51):
Up the experts, terrible,
Doug Downs (07:52):
If they're just making up the quote to tailor it to the ask, that's unethical. Full stop, unethical. Yeah,
Farzana Baduel (08:01):
I mean, let's face it, journalists have pretty low trust in PRs anyway, let alone in this tool that is making things up, making a person up completely. That is deception of the highest order. I'd like to know who's using this tool, and on what level they think it's acceptable. I just don't understand. I'm absolutely shocked, because it's one thing to use automation to create pitches and to use ChatGPT for research and all that, but another thing entirely to make up an expert. It's this whole new world of post-truth. Does it even matter anymore? Do people have to be real? I mean, it's extraordinary.
Doug Downs (08:53):
Well, Brett Farloe, the CEO at HARO, Help a Reporter Out, was on our last episode, and he mentioned that what he sees is journalists surrounded, like seven to one, fourteen to one, by PR people, you and me. So journalists already feel completely surrounded. Now there's AI out there too. Can you imagine what that feels like to a journalist who feels they are a soldier for democracy? They feel like democracy is under attack.
Farzana Baduel (09:28):
It feels like they're under siege, really, when I think about it. These journalists are already underpaid, and PRs already outnumber them. And on top of that, we're now using AI to supercharge the pitching. They don't have the resources to fact-check like they used to, so they assume the PR sending them information means the experts are real. And here we have a tool PR agencies are using that makes up the experts on the fly. What I don't get: obviously I'm the president of the Chartered Institute of Public Relations, and we have a code of conduct. Our other industry body in the UK, the PRCA, also has a code of conduct. The problem is that our industry is not regulated, so PR agencies don't have to sign up to a code of conduct and can operate outside it. And when they operate outside it, this is what we see. So should we be regulated? That's a whole other chapter.
Doug Downs (10:35):
That's a whole other chapter. So on to our chapter two: how journalists will respond to this. I think they're going to put up walls and defense mechanisms, abandoning open calls and relying only on closed, vetted expert networks. You can only get into the network if they know who Farzana is and you've passed a screening. There are already platforms like Source E that do this, and they push for video-verified experts to cut through the noise, because journalists are just getting flooded with all of this noise. If a journalist were willing to accept pure AI input, they would just use ChatGPT themselves. They don't need us to get a quote.
Farzana Baduel (11:13):
Completely. Doug, you mentioned Source E. What is Source E? I've never heard of them.
Doug Downs (11:16):
Source E is just one of the closed networks journalists can use. It'll video-verify: it'll get on a call with Farzana, have a conversation with Farzana, a live person, and say, okay, yes, you are certified, you can join this closed network. So that's one example of where journalists are starting to turn. I think you'll just see more and more of it. So it creates a business opportunity for a smart software entrepreneur.
Farzana Baduel (11:43):
Right? I mean, it's hard enough to get hold of journalists nowadays. It's not as if they're waiting around at the newsroom landlines, and if you manage to get their mobile phone, that's really something. You have to be part of this vetted inner circle to be able to communicate with them. And of course they get hundreds and hundreds of emails. So in a way, we're shooting ourselves in the foot by firing off these AI-driven pitches full of fake experts. Yeah, I completely agree. I think journalists are just going to pull up the drawbridge and say, right, you guys can't be trusted. Should we as an industry be looking to hold people to account, because it will affect the rest of us? Even someone like myself, I would never dream of doing this, but if there are people operating in our industry like this, should we be able to say to them, you are destroying it for the rest of us, just don't do it?
Doug Downs (12:41):
But
Farzana Baduel (12:43):
There's no sort of industry body that can do it if they're not a member
Doug Downs (12:48):
For now. And the more fake noise we send out there, the more the real voices get drowned out. It's like anything else, like misinformation: the fake noise will win, because it has AI stirring it up. So then I start thinking about genuine small-town experts, whatever small town looks like. It could be a small group, it could be a niche expertise. You're going to have people who've earned it, whose personal brand is tied to this, who are a known brand in that niche expertise. I think you start to see personal branding, and we're already heading this way, become that much more important. You can trust me, I'm real. If my name's connected to this little piece of knowledge, it's real and it's authentic. And I think more and more journalists are going to turn to those folks.
Farzana Baduel (13:47):
Back in the day, when Edward Bernays was kicking around and reinventing PR for the modern age, they started working on something called astroturfing: creating these fake, illusory environments for journalists on press trips going off to South America.
Doug Downs (14:05):
Oh gosh. And
Farzana Baduel (14:05):
Pretending there was communism when there wasn't. So what you're actually seeing is the full circle of astroturfing, but now we've got AI producing these fake people with their fake expert comments in order to bolster a client's brand. And what about the reputation of that brand? If this is found out, it will harm the very brand they are, ironically, trying to build trust for with stakeholders.
Doug Downs (14:31):
Yeah, definitely. So not to pick on a brand, but off the top of my head, if Kool-Aid was caught generating fake quotes (I don't see the AI pitches as fake, I see them as AI; maybe I'm naive, but the fake quotes are what get under my skin), if that was attributed to Kool-Aid, it would have a damaging impact on Kool-Aid overall. What else about the product is fake? What else about everything they've said and done over the last 20 years, well before AI, is fake? And how can I trust this brand in an age where authenticity is that magic buzz phrase? Especially Gen Z and the millennials want to know that authenticity is part of your brand, your organizational brand, and your personal brand. So if you're caught doing this, I have to think, I'd like to think, it's extremely damaging. Another complication, Farzana, is that in the world of journos, they only have so much ability to give PR people crap for what we do, because most of them know that someday they'll probably want to become us. They'll probably want to join the PR ranks as they get into their thirties, or maybe their forties; the majority of them end up going that way to make a better living and raise their families. And so they only want to give us so much flak.
Farzana Baduel (15:57):
Yeah, it's true. And I must say, I've seen journalists coming into the PR industry for the last 15 years. I've worked with them as they've made the transition, and I do think they've improved our profession, because they brought with them that sense of purpose, that sense of a higher order, that sense of ensuring that what we put out is authentic. Is it true? Are we misleading? So I think they've actually brought a much more mission-orientated approach to public relations.
Doug Downs (16:23):
That takes us to chapter three, the future of PR and how PR teams can use AI responsibly. Fact is, we're going to use it in some cases as a replacement for things. Take writers: I have a lot of friends who are good writers, and I'm hearing from them because jobs are either in jeopardy or have been lost and are hard to win back. And it's sad, but AI is replacing some of that. So are we still a human-centered job function, or are we a blend of human oversight and AI production?
Farzana Baduel (17:05):
There are so many permutations. First of all, in the UK we've got this Make It Fair campaign. The news agencies are saying to the government, listen, can you please ensure that these AI tools scraping our content will pay us something? And if they don't, the news agencies can actually just prevent the AI tools from scraping, which means the AI tools will start scraping content from lower-quality sites, and that gives rise to misinformation and disinformation. So from a PR perspective, all of this, even the way the UK government regulates these AI tools, will have an impact on how we operate. Now, are we going to be more human-centered, or lost to automation? I think probably a bit of both. I met somebody yesterday and he told me he's creating this new AI-first agency.
(17:59):
It's a PR agency in your pocket, with little sections for award submissions, speaker pitches, writing a pitch for a client, tone of voice. I was absolutely stunned. He's building this, and he's been building it for the last two years. His prediction to me was that agencies are no longer going to be relied upon for tactical execution, because that's going to move in-house, where they'll have these tools at their fingertips. Where agencies are going to be useful is more strategic, more relationship-orientated work, because agencies have this great vantage point of working across multiple clients and can bring those learnings across. So you can start seeing shifts between agency and in-house. Agencies aren't going to dissipate overnight, but the work they do perhaps moves from tactical to strategic. And maybe that's not such a bad thing.
Doug Downs (18:54):
Good. Yeah, that actually sounds really promising. The perfect model is probably that AI does a lot of the work but has human oversight. And that sounds kind of utopian. The problem with that concept is we don't trust people. If you look at the Edelman Trust Barometer, we don't trust journalists. We don't trust business leaders. We don't trust PR people; they're just spin doctors. We don't trust people. We do trust machines
(19:27):
Unless we know the people. And then I think, this is someone just like me, or someone I understand, because they're just themselves. So again, that takes me back to how important personal branding is going to be, and organizational branding for sure, but also the humans within the organization, the ones I can look to for leadership, to understand what matters to them personally. And that's why the Astronomer CEO fallout was so dramatic. You had this rich cat committing one of the ultimate sins, having an affair, getting embarrassed on the big screen, and looking like an idiot while the two of them looked like idiots doing it. That's why that carried so well around the world: this is someone we're supposed to trust, and I have no trust in you whatsoever, and therefore I laugh at you. And I feel for the families. But that's why it became such a strong meme.
Farzana Baduel (20:29):
I think what drove that story was that it was human-centered. It was centered around our fallibility as humans, the fact that it doesn't matter whether you are a leader or a janitor of an organization: we have these human frailties, these Achilles heels. We are not perfect, and we are fascinated by each other. We were fascinated by his fall from grace, because that could have been us. That probably was us, in a different time, in a different way, because people fail. And what's really interesting about North American culture, perhaps compared to British culture, is that I love the way it handles this. They watch somebody fail spectacularly, like a Martha Stewart, and then they give them room to learn, compose themselves, and get back in the ring. Whereas I think
Doug Downs (21:15):
We love a comeback,
Farzana Baduel (21:16):
Right? Yeah. Trump, for instance, great comeback into the White House. In the UK we're a little bit sniffy, a little bit, oh yes, but in 1984 they said that. And that's what I love, actually. I love that culture. And I don't know, I know it's an American culture thing. Is it also a Canadian culture thing?
Doug Downs (21:39):
I would say so.
Farzana Baduel (21:40):
They love the comeback kids.
Doug Downs (21:41):
I hate to say America Junior, but
Farzana Baduel (21:43):
It is, I love it.
Doug Downs (21:44):
I think we like a good comeback story, if someone's got the fortitude to get back up when they get knocked down. It's a boxing idiom: it's not how hard you get knocked down, it's how quick you get up. And I think a lot of people subscribe to that story.
Farzana Baduel (21:58):
Well, human-centered or lost to automation, I think time will tell. And we are going to chart the transformation of our industry and see where it takes us.
Doug Downs (22:11):
And tools like Synapse really are a mirror. They reflect back who we are. These aren't just evil tools that are out there. It took a human to write the code. It took a human to say, let's sell it. It takes humans to buy into it. And as the bigger society, it takes humans to accept that this is part of our society. If we're all okay with that, don't blame Synapse, blame ourselves. Just look in the mirror for the fault.
Farzana Baduel (22:40):
And it's up to us to set boundaries for our profession. That makes me think about countries like Zambia that are regulating the PR industry, because they are aware of the impact of the work we do without any guardrails whatsoever. And this is a clear example of what goes wrong without guardrails in place. Here are the top three things we shared today about Synapse. Number one, balancing efficiency with ethics in AI-powered PR. We questioned the ethical implications of using AI tools that fabricate quotes, even if they boost efficiency and cut costs. Number two, the impact on journalists and industry trust. We acknowledged that AI-generated pitches can erode journalists' trust in PR, putting our credibility at risk. Number three, the future of PR and our responsibility. We agreed it's up to us to maintain ethical standards and push the industry towards more strategic, human-centered work.
Doug Downs (23:43):
I think we're going to see more and more tools like this being used and developed until somebody cries foul and creates a stink. Without a stink, without a crisis, it's just going to get worse. That's what I think. What about you?
Farzana Baduel (23:54):
Well, I mean, I only found out about this tool because of Press Gazette, and that's why it's so important, the
Doug Downs (24:00):
Media
Farzana Baduel (24:00):
Us, yeah. It's important for us PRs to support our journalist brethren to ensure they are still out there keeping a watch on this, because I would've had no idea this was happening.
Doug Downs (24:14):
If you'd like to send a human message, our contact information is in the show notes. If it's nice and happy, send it to me. If you've got a little something you want to say, send it to Farzana; she loves those. Stories and Strategies is a co-production of Curzon Public Relations, JGR Communications, and the Stories and Strategies podcast. If you liked this episode, leave a rating, possibly a review. Let me tell you what that does. It doesn't actually impact the algorithm in Apple or Spotify, though it could put you on the Apple charts. It's pure social proof. When people stumble across the podcast, they want to see how many ratings are on there; oh, only 10 ratings, they must suck. So your rating, when 10 people have left a rating, that's fantastic. It goes a long way. Thank you as always to our producers, David Olajide and Emily Page. And lastly, do us a favor: forward this episode to one friend. Thanks for listening.