Published Jan 09 2020

Amid an environmental catastrophe, an ongoing threat to the media landscape

The horrific tragedy of recent and ongoing Australian bushfires has highlighted the connection between threats to the natural environment and the media environment, through which misinformation and disinformation are spreading like, well, wildfire.

High-profile online provocateurs in Australia and abroad have taken the opportunity to circulate misleading claims about the causes of the devastating fires, blaming arsonists and the Greens in order to deflect concerns about climate change. Moreover, as QUT researcher Timothy Graham has discovered, a significant portion of Twitter accounts pushing disinformation about the bushfires look suspiciously like automated accounts. Amplifying these messages are right-wing print and TV outlets that take their cues from the online frenzy.

A threat to denialism

The intensity of the misinformation campaigns reflects the magnitude of the threat posed to climate change denialism by the ongoing Australian tragedy. The somewhat bitter hope that emerges from the trauma and loss is that an alarming reality may have finally caught up with ongoing attempts to obfuscate and repress it. In the face of the catastrophe, the impact of climate change might be taken seriously enough to mobilise a meaningful policy response.

The media clamour suggests a troubling alternative: that the very status of “reality” has become so compromised and reconfigured that the hope of an actionable consensus forming around it may be a vain one. This outcome would not be a particularly surprising one, since it’s the deliberate result of propaganda strategies that exploit the combined technological and commercial logics of the online media environment.

The hijacking of hope

Once upon a time, in what now seems a somewhat dimly remembered past, the promise of the World Wide Web was to facilitate democratic deliberation by fostering an informed and participatory public. The ease of information access meant that everyone (with access) would be able to inform themselves about the issues and topics that interested and affected them. The hope was that truth would emerge from the “friction-free” marketplace of ideas to confront entrenched forms of power (columnist and strategic communications expert Peter Lewis chronicles the rise and fall of this hope in his recent book).

But the entrenched powers had other plans. During the first George W. Bush administration in the US, political consultant and pollster Frank Luntz advised Republicans that they were losing the messaging battle on climate change, and if they wanted to prevent environmental regulation, they needed to foment doubt – “you need to continue to make the lack of scientific certainty a primary issue in the debate”.

Instead of trying to prove that climate change wasn’t happening, Luntz argued, they simply had to claim the science “wasn’t settled” – a now familiar refrain. This strategy had the advantage of sounding reasonable as long as what “settled” meant could be avoided. If, for example, “settled” meant that every person who claimed some kind of scientific credential would agree, then even the fact that the Earth is round would remain “unsettled”.

Sowing the seeds of doubt

The online environment lends itself to strategies that foment doubt not just because it allows anyone with access to circulate whatever misinformation or disinformation they like, but also because it provides a constant reminder that every news account is incomplete – one can always dig deeper, because the online trove of information is, for all practical purposes, limitless.

Judged against an impossible standard of completeness invoked by the seemingly infinite Web, every account is necessarily partial, and thus subject to doubt. The goal of many disinformation campaigns is to open new rabbit holes to fall into – a debate over the role played by arson leads to a comparison of arson arrests over the years, and then to one about the impact of school holidays on an available pool of young people getting into mischief. The trees keep sprouting up to obscure the forest. Debunking the plethora of often-contradictory narratives becomes a seemingly endless process.

The US defence think tank the RAND Corporation dubbed this propaganda strategy, which relies on fomenting uncertainty by multiplying competing narratives, the “firehose of falsehood”. The goal of such an approach is to prevent the emergence of a coherent counter-narrative that could challenge existing entrenched interests.

It’s also, at a more foundational level, an attempt to discredit the institutions and practices that society has developed over time to establish what counts as reliable, shared information. The impact of online disinformation campaigns isn’t simply to spread false – and perhaps easily discredited – information, but to cast doubt on all forms of mediated representation.

Although RAND associated this strategy with Russian misinformation campaigns, it’s become ubiquitous. It’s not impossible that some of the bots and humans amplifying bushfire disinformation have connections to Russia’s notorious troll campaigns, but plenty of the trolls are homegrown, and some are high-profile international provocateurs. They’ve seized on the Australian case because it taps into polarising political fault lines.

Trolling for cash

There’s an economic logic to this kind of trolling: the online economy monetises attention, and high-profile controversy earns clicks and attracts readers and viewers. Climate change deniers on YouTube will make more money because of the attention they get on this hot-button issue, just as publishers will sell more papers and networks will get more viewers.

Moreover, the pace and style of social media interaction discourage the forms of deliberation that might counter disinformation. The exchanges of online interlocutors all too often replicate the familiar skirmishes of professional opinion punditry: the goal is not to enlighten, but to entertain and attract attention.

There’s also a political logic to the “firehose of falsehood” – though this strategy might be more accurately dubbed the “firehose of falsehood, partial truths, misleading information, and, every now and then, a grain of truth”.

The demobilisation of the ability to “speak truth to power” in a coherent fashion is a fundamentally conservative (with a small “c”) strategy. It favours existing power relations by undermining evidentiary challenges. Sowing ongoing confusion about climate change, its causes and impacts, favours the fossil fuel interests that already wield both economic and political power.

The broader goal of misinformation strategies is to undermine accountability at the very moment it's most urgently needed.

About the Author

  • Mark Andrejevic

    Professor, Communications and Media Studies, Faculty of Arts

    Mark contributes expertise on the social and cultural implications of data mining and online monitoring. He writes about monitoring and data mining from a socio-cultural perspective, and is the author of three monographs and more than 60 academic articles and book chapters. His research interests encompass digital media, surveillance and data mining in the digital era. He is particularly interested in social forms of sorting and automated decision-making associated with the online economy. He believes the regulation of commercial and state access to and use of personal information is becoming an increasingly important topic.