In an age of rapidly exchanged – and often unchecked and unsubstantiated – information, those working in transport know better than most the difficulties this brings. There are new challenges in presenting the public with the full facts and arguments about problems and their potential solutions, and in ensuring those proposals are not unfairly attacked and misrepresented.
In recent years, many local authorities have faced particular public fury and indignation when proposing and implementing Low Traffic Neighbourhoods and other traffic management schemes. Often this anger has been fuelled by misinformation: a false alternative narrative spread online in order to mislead.
How best, then, can councils deal with such situations – and is there a proven approach and underlying methodology to apply?
Research shows you need to proceed with caution when publicly challenging misinformation.
Whereas the stated purpose of the social media platforms that facilitate this kind of debate is to connect us, let’s remember that the financial purpose of the companies behind them is to generate revenue for their shareholders by selling advertising space and audience data to digital advertisers.
The more time we all collectively spend scrolling through social media, the more money the platforms can make from advertising space. In order to maximise this time, the algorithms that recommend content to us – by placing it at the top of our news feeds – have to promote material which will draw us in.
Put simply, this tends to prioritise content that generates strong emotion. In practice, that means promoting conflict, toxicity and, quite often – intentionally or not – misinformation.
With all this going on, it is almost impossible to ignore what is being said, and essential to have an appropriate strategy in response.
While academic evidence shows that correcting misinformation with the facts is generally useful, we know that: a) it has a limited effect on behaviour change, and b) the research takes place in a highly simplified environment (i.e. a lab), whereas the real information system is far more complex. Publicly correcting a piece of misinformation can therefore have unforeseen knock-on consequences, so we must respond carefully.
The importance of this is demonstrated by an analysis from academics at Queensland University of Technology of the US Administration's response to conspiracy theories about bioweapon labs in Ukraine. In essence, their work shows that in trying to correct the misinformation, the Administration simply exposed more people to it.
Here at The Misinformation Cell at Lynn, we work closely with the UK Government Cabinet Office to implement its ‘Wall of Beliefs’ model for fighting misinformation. To use this model, you need to be able to answer three questions whenever you are deciding how to respond to a piece of mis- or disinformation:
1) What is my definition of harm? Knowing the target behaviours you are concerned about helps here.
2) Does this piece of information cause direct harm to these target behaviours – or is it just designed to sow mistrust in an audience?
3) Is this misinformed belief superficial (i.e. it can be easily replaced) or is it part of someone’s core beliefs (i.e. it can’t be easily replaced)?
Using the Wall of Beliefs matrix as a guide, the answers to these questions should inform any decision about when to respond directly, when to focus on the underlying narratives, when to focus on managing behaviours or, last but definitely not least, when to simply watch and wait.
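The triage described above can be sketched in code. This is an illustrative assumption, not the official Wall of Beliefs matrix: the function name, parameters, and the exact mapping from answers to strategies are mine, chosen to show how the three questions narrow the decision.

```python
# Minimal sketch of the three-question triage described in the article.
# The four response strategies come from the text; the exact mapping
# from answers to strategies is an illustrative assumption.

def triage(causes_direct_harm: bool, sows_mistrust_only: bool,
           belief_is_core: bool) -> str:
    """Map answers to the three Wall of Beliefs questions to a strategy."""
    if not causes_direct_harm and not sows_mistrust_only:
        # No defined harm in play: monitor rather than amplify.
        return "watch and wait"
    if causes_direct_harm and not belief_is_core:
        # Superficial belief: a direct correction can replace it.
        return "respond directly"
    if causes_direct_harm and belief_is_core:
        # Core belief: target the behaviour, not the belief itself.
        return "manage behaviours"
    # Content designed to sow mistrust: work on the narrative level.
    return "address underlying narratives"

print(triage(causes_direct_harm=True, sows_mistrust_only=False,
             belief_is_core=False))  # → respond directly
```

The point of writing it down this way is that the framework forces a deliberate choice among four strategies, rather than defaulting to a reflexive public rebuttal.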
When a controversial matter is under discussion it is, sadly, very likely that ill-intentioned parties will get involved. Their trolling and disinformation activities will have three core aims:
1) To give the appearance that fringe views are widely believed, using the ‘social proof effect’ to nudge new audiences towards their misleading beliefs.
2) To derail your messaging, moving you away from your key messages and instead forcing you to focus on rebutting theirs.
3) To drain your energy, damaging morale and sapping creativity and focus from your team.
Thankfully, aim 2 and, to a certain degree, aim 3 are within our control. If we can stay on message despite the noise from social media, and maintain perspective internally about how unrepresentative some of these determined detractors really are, then we are halfway to beating them.
To counter the ‘social proof’ that trolling creates, it is important to focus on building immunity within our audiences proactively – not reactively. When it comes to misinformation, proactivity is key because whoever ‘gets there’ first usually wins.
Sometimes the best response to a lie isn’t a fact, it’s a deeper truth. This is a mantra my team and I try to live by, paraphrased from one of President Obama’s speechwriters. It gets to one of the core truths about fighting misinformation: no fact check can compete with a compelling narrative.
The truth is that all of us mere mortals, regardless of how rational we try to be, make sense of the world through storytelling – and the disinformation spreaders understand this. Strategic communicators are beginning to understand this as well – Joe Biden’s presidential campaign team in 2020 deftly operationalised a storytelling approach to fighting disinformation.
At The Misinformation Cell, we do this in our own way, looking beyond the noise of social media through quantitative surveys to understand which damaging narratives are resonating with the public, and to understand which of our counter narratives might resonate too.
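As a rough illustration of the survey-led approach described above, the sketch below ranks narratives by mean agreement in hypothetical survey data. The narratives, the 1–5 agreement scale, and the data are invented for the example; this is not The Misinformation Cell's actual methodology.

```python
# Illustrative only: ranking which narratives resonate most, using
# made-up survey responses. Narratives and the 1-5 agreement scale
# are assumptions for the sake of the example.

from collections import defaultdict

# Each response pairs a narrative shown to a respondent with their
# agreement score (1 = strongly disagree, 5 = strongly agree).
responses = [
    ("council ignores residents", 4), ("council ignores residents", 5),
    ("scheme cuts pollution", 3), ("scheme cuts pollution", 4),
    ("LTNs are a cash grab", 2), ("LTNs are a cash grab", 1),
]

scores = defaultdict(list)
for narrative, score in responses:
    scores[narrative].append(score)

# Mean agreement per narrative, most resonant first.
resonance = sorted(
    ((sum(s) / len(s), n) for n, s in scores.items()), reverse=True
)
for mean, narrative in resonance:
    print(f"{narrative}: {mean:.1f}")
```

The practical takeaway is the ranking itself: it tells you which damaging narratives actually need countering, and which counter-narratives already have traction, independently of what is loudest on social media.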
Sometimes we think misinformation is an ‘information problem’ when it is actually a ‘relationship problem’.
The most effective misinformation isn’t the wildest lies, but the misinformation that builds upon kernels of objective truth. For example, during the Covid-19 pandemic, disinformation spreaders tried to leverage historic distrust between ethnic minority groups and health authorities to spread their harmful lies and propaganda about the safety of the vaccine.
In situations like these, corrective information won’t address the underlying truth that many communities who have been pushed to the fringes of our society have been underserved by government institutions and health authorities. You can’t reverse this with a fact check, you need to think about how to build relationships with these communities instead.
There’s a good chance that this consideration will often be relevant in transport situations – where customers or those affected will be nursing past grievances or remembering bad experiences just as you want to start a new conversation with them.
So try to understand the bigger picture. When it comes to misinformation narratives, the biggest – and easiest – mistake you can make is to assume you need to respond to something simply because it is getting traction on social media. This is exactly the reaction trolls are constantly trying to bait you into: responding to lies that need no response, because your key audiences have not actually been exposed to them.
To deal effectively with this new world of information, then, there is no substitute for quantitative attitudinal data from your key audiences. Such data has considerable other benefits in the transport world too, so it is well worth the effort to obtain and analyse.
Changing behaviours and winning people’s confidence that we are doing something to help them isn’t easy. The range of factors that influence our beliefs and behaviours is very large.
And yet by focusing on the narrow scope of what is in our control, and marshalling our data about both our audience’s underlying attitudes and what our proposals mean to them, we can direct our energies in a more disciplined and productive way.
With such an approach we should have a fighting chance of getting people on our side, despite all the distracting noise out there these days.
The Wall of Beliefs: A toolkit for understanding false beliefs and developing effective counter-disinformation strategies
Stefan Rollnick is the head of The Misinformation Cell at Lynn – a 30-strong Cardiff based consultancy specialising in campaigns, behavioural science and anti-misinformation/disinformation strategy and research. Rollnick was brought in by Lynn to set up The Misinformation Cell in September 2021 after previously working as a disinformation analyst in the Office of the First Minister of Wales. He spoke on addressing misinformation at the recent ‘20's Plenty’ event in Oxford.
TransportXtra is part of Landor LINKS
© 2023 TransportXtra | Landor LINKS Ltd | All Rights Reserved