Susan Carland is back for a second series of the What Happens Next? podcast.
The first three episodes take a closer look at fake news and how the term, which began as a critique of news considered to be inaccurate, has become one widely used to dismiss any news that you don't like.
So when Donald Trump says fake news, that doesn't mean that it's factually wrong, that means that he just doesn't like it.
In this podcast episode Susan talks to Monash School of Media, Film and Journalism experts Mark Andrejevic and Johan Lidberg to find out what fake news is doing to our society and politics, and what we can do to change it.
Susan Carland (SC): Welcome to a brand new series of What Happens Next? I'm Dr Susan Carland. This time on the podcast, we're looking at fake news. Fake news has played a role in a series of recent events, including the US elections, Australia's bushfire disaster and the Covid-19 outbreak. We're talking to experts about what happens if we don't do something to stop fake news and misinformation campaigns, what role social media platforms play and why public interest journalism is critical to democracy. We’ll also get their best tips for avoiding fake news and what we can all do to get better quality information.
[News grabs - various speakers]
SC: Australia's bushfire disaster, Russian interference in the US election and now Covid-19 have generated major misinformation campaigns led by social media and picked up by mainstream news. During the bushfires, for example, researchers at Queensland University of Technology discovered that many social media accounts pushing disinformation about the bushfires were actually bots.
And then amplifying these messages were agenda-driven print and TV outlets, taking their cues from the online frenzy. In this episode, Monash School of Media, Film and Journalism experts Mark Andrejevic and Johan Lidberg ponder the possibility of a future where verifying information becomes almost impossible.
Joining us first is media expert Mark Andrejevic.
Mark Andrejevic (MA): I'm Mark Andrejevic. I'm a Professor of Media Studies in the School of Media, Film and Journalism at Monash University, and I write about the social, cultural and political impact of digital media technology.
SC: Professor Mark Andrejevic, welcome.
MA: Thank you, it’s great to be here.
SC: When we actually talk about fake news or disinformation campaigns or even alternate facts, what do we actually mean?
MA: Well, it's interesting how that term has changed in its use, and Donald Trump probably gets some credit for that. You know, originally fake news started as a critique of news that's considered to be inaccurate: misleading, misinformation or disinformation. It's become a term that's used now to dismiss any news that you don't like. So when Donald Trump says fake news, that doesn't mean that it's factually wrong, that means that he just doesn't like it.
SC: I reject it.
MA: I reject it, I don't accept it. That, then, has become a rallying cry for his supporters. So in general I try to stay away from that term, fake news, and use misinformation or disinformation instead. You know, the danger is when you start talking about fake news, then it looks a little bit like you're talking like Trump does, in dismissing the things that you don't like or disagree with. But it originally had that quite straightforward meaning: that this is inaccurate. That's changed because of the public discourse.
SC: It's almost like no one can really agree on what's accurate anymore. What even is true news?
MA: This is the anxiety of our moment. The institutions that we relied upon, and the practices and procedures we could engage in to come to some way of adjudicating between what's true and what's not, what we all have as our shared reality and what we all dismiss, those are all in crisis. And I think there are a variety of causes for that, but certainly the online information environment is one of them. If you can go online and find all the material you want to support your counterfactual view, that helps you just put out a flood of information and say, 'Who's to decide between all of this? I'll decide what I want. I'll pick the information that I want'.
And so what's really crucial, I think, is that we're going to have to engage in that process of reinventing, rediscovering and recreating the institutions and the procedures that we need in order to adjudicate between what's accurate and what's not, what's factual and what's not.
SC: How do we do that when we are also living in a time where trust in institutions has never been lower?
MA: It's not going to be an easy task. My anxiety, or my worry, is that we're going to have to really experience what happens when you sacrifice or no longer have those institutions or practices in place. And you know, we may be headed there, right, and it's not a pretty picture. If you look historically at coming up with shared systems for adjudicating what counts as reality that we can all agree upon, those go through long-term crises and require maybe 100 or 200 years of struggle before new systems emerge. If you think about the rise of the scientific paradigm, it took place after several hundred years of warfare, in order to figure out what it might mean if we could imagine a society in which you could deliberate about reality instead of fighting about it. But that's a relatively short historical interval, and we exist under different conditions now.
SC: That's what I was going to say, it's a different system as well.
MA: Now we exist in this fully globalised world where you might try some procedure in one place, it's not going to work somewhere else, so you would end up having to imagine some global possibility of developing institutions like this. I also don't think we can go back, right. Historically, things move along, and if you say no, we want to resuscitate some shared notion of the Enlightenment or some shared universal religious way of thinking about the world, I'm not sure we can go back that way. We’re going to have to move forward instead.
SC: Yeah, move forward in the direction of something that none of us really knows what it looks like, so it's very difficult to know if we're even going in the right direction.
MA: Yeah, the direction at the moment doesn't look very good. We’re in this moment of information about Coronavirus spreading around and I look at the things that people are putting up there and I'm thinking, ‘Oh my God’.
SC: It seems like there's this human instinct in these moments to go to the almost absurd. Why do you think humans seem to have an inclination towards, quote unquote, fake news or disinformation? Why can't we calmly assess the facts?
MA: I think about what's going on in the US, and it's quite pathological there, the unwillingness to recognise the shared forms of social interdependence that make society possible. You know, the mythos of the United States is built on a version of rugged individualism which has been adapted to the contemporary logics of neoliberal individualism, and the message there is 'you are on your own, you're not interdependent on others, it's your responsibility'. And technology that reproduces or reinforces that one-sidedness, I think, leads to, on the one hand, a fantasy of individual power, and on the other hand, an anxiety about these forms of interdependence that tends to create antipathy towards others, right? And so what does that have to do with misinformation or disinformation? Well, it encourages, I think, people to see truths that they don't like or truths that they disagree with as impositions that others are placing upon them, impositions that actually curtail their autonomy or their freedom. Their freedom is to decide their world, and if you tell them 'no, you're wrong, actually, here's the evidence, here's the science, we have social systems that have developed to address your incorrect view of the world', that's described as elitism and an incursion on their autonomy and independence.
SC: Going back to the matter of disinformation, what role do you think mainstream and social media need to play in correcting this disinformation confusion?
MA: You know, probably my number one concern in response to the role that social media plays in misinformation and disinformation is to look at the economic model that drives them. What we've done is we've turned over our information environment to a fully commercialised structure whose priorities are not public service. That's not their role, right? Their role is to make money. That's what commercial entities do. And we've learned that the algorithmic systems that they use to shape our information environment are driven by imperatives that do not have public interest or the public good at heart. And this is the moment that we're encountering now, with the kind of backlash against Facebook and other social media platforms, where we realise 'wait, you're shaping our information environment'. And it's important to note that they're shaping it. You know, when you go on Google, or when you go on Facebook or Instagram, you kind of imagine that you're seeing some natural feed, but you're not. There's too much information. It has to be sculpted, and it's sculpted according to which information, based on the huge amounts of data they have, is most likely to foster engagement and sharing.
If you prioritise engagement and sharing, you're not necessarily prioritising truth, accuracy, civic information. You're not prioritising the things that you would want to prioritise in an information system for a healthy democratic society, and so I think probably step one is revisiting the economic model that sacrifices the notions of public interest and public good to purely commercial imperatives. And this is not unique to social media. I come from the US, where commercial media has been dominant for a long time, and it's interesting to see that shift. I also worked in journalism, so I've had some firsthand experience of what it means to work in a commercial media environment. I was working at a print newspaper, and the interesting thing about the print newspaper, and I think much of broadcasting here and in the US, is that there was a notion of a public interest and public good, even in those commercial platforms. They understood that they had a civic role to play as well as a commercial role. Now what's happened is, thanks to the rise of online platforms, a lot of the local media is dying out and being killed. Their commercial models don't work anymore. The local folks who were doing local coverage of people they knew and places they cared about, those people don't have the forum that they once had. Instead we have these platforms that come in, and people find online communities and connections that may not have anything to do with their local community and the people they live alongside. Maybe it's some community of interest that they find online. And maybe it's one that doesn't really care at all about their local community. Maybe it's one that's interested in, you know, getting them all excited about conspiracy theories, and that's an interesting way to spend their time. And it displaces reading about what's going on in their community, and you lose the social and civic role of the information institutions that we once had.
Also, I think it's important to note that the folks who run these tech platforms don't come out of a history or tradition of thinking about the role that news and information play in a democratic society; they come out of a startup tech culture which doesn't necessarily have any commitment to or interest in those questions. Somebody like Mark Zuckerberg, he now has one of the largest media platforms in the world - unprecedented reach globally for a media organisation - but he started out wanting to make money off an app that was basically 'rate people if they're hot or not'. It didn't start with any kind of civic commitment or some notion of what information means in a democratic society, it was 'how can you make money with a tech platform?'
SC: Well, then, how realistic is it for us to ask our social media platforms to move away from the purely commercial model of sell ads, get more information, keep people engaged and instead tell them the news and information that they need to know from trusted sources, which won't be as lucrative? Is it really realistic to expect that to happen?
MA: No, I don't think so. I think what we need to do is imagine alternatives to the economic structure that we have.
If we think about how we rebuild the institutions we need to be able to function as a society, I think one of the things we need to consider very seriously is what it would mean to take a public service tradition that's quite well developed in a country like Australia and imagine a public service social media platform. What would it mean to have a public service media organisation that was able to span the forms of media that now shape the information environment we have?
That's not an easy thing to envision, right, because we know already what the kids like, what they're on, what people find addictive. To try to say, 'Yeah, well, important civic information is going to be as engaging, and people are going to spend as much time with it as they do on TikTok' - that's a big ask, right? What you have to do at the same time is also foster a sense of civic awareness, engagement and commitment on the part of the population. So I think the media and education have to work hand in hand. I don't think it would be possible to fix things by saying, 'Well, we'll have a public service social media platform, and that will provide accurate information and everybody will go to it and pay attention to it'. No, they won't.
But what we also have to do, and this is something I think about a lot as an educator, is build a sense of civic commitment and a sense of societal interdependence that's going to make people engage with and be interested in the information that really shapes their lives, in ways that the forms of easy, dopamine-inducing entertainment on social media don't. But that's a big ask. I'm not optimistic about any of it. I think those things would have to happen in order to have a better outcome, but I don't think those things will necessarily happen unless we really are smart enough to anticipate the impact of what happens if we don't, or are unfortunate enough to encounter the impact of what happens if you don't imagine an alternative. The goal is to get enough people who can imagine the possibility of what political change might be to act - not to get everybody, but to get a balance of popular will.
SC: Mark Andrejevic, thank you very much.
MA: My pleasure.
SC: Johan Lidberg sees the role of freedom of information as crucial in this discussion.
SC: Unfortunately, due to Covid-19, we've had to adapt and do a number of these interviews by phone. So while occasionally the audio isn't as great as usual, we promise you the content is!
Johan Lidberg: My name is Associate Professor Johan Lidberg in the School of Media, Film and Journalism at Monash University and I’m also the Deputy Head of Journalism.
SC: Johan Lidberg, thanks so much for joining us. I want to start by asking you, What do you see as the connection between freedom of information and fake news? How do they relate to each other?
JL: If we have freedom of information, or access to information as we also call it in a broader sense, functioning well, so that it provides independent access to government-held information, then anyone, including the audience, the public, listeners, journalists and the opposition, has the option to verify and cross-check claims being made that they think are a bit dubious. That, I would say, is the most important connection and function of a far-reaching access-to-information system.
SC: So a sort of a fact checking capacity?
JL: Yeah. It means that if it works well, then government-held information is easily discoverable, which is a huge issue because sometimes it isn't. Then it's much easier for people to become their own fact-checkers and, quite simply, to become more media- and information-literate.
SC: Do you think it could also be that when there's reduced media freedom, people start to become a bit paranoid about the information they're getting, or decide, 'Well, I don't know how much I can trust what's being given to me, so I need to sort it out for myself'?
JL: Everyone lives incredibly busy lives, it doesn't matter what you work on or what you do. If you’ve got a family, if you've got a working life, you’ve got kids’ sport, that sort of stuff. It’s a big ask to ask people to sit down and become their own fact-checker. Some probably do, but I suspect they're in the minority.
SC: What role do you think mainstream media outlets should be playing in tackling the scourge of fake news?
JL: So the way I see it is that the future for professional journalism - and I'm a really big fan of collaborations between the audience, the public and professional journalists, I think that's really important - but we can never ask the citizen journalists and others to come up with original reporting, because again, most of them have day jobs. So a really important task in the future for professional journalism, and legacy media, is to become those verifiers and curators and make sure they absolutely do not republish stuff that they haven't verified properly. That's going to be one of the most important tasks, I find, and also a really good trust builder with their audiences.
SC: What role, then, would you say for social media? We've been hearing recently that amid all this Covid-19, with people being sent home from workplaces, Facebook, Twitter and other social media platforms have had to send most of their staff home, so they're now relying on artificial intelligence to monitor news sources. And what's been found is that sometimes they have accidentally been blocking correct news sources but letting the fake ones flourish, because it's a machine doing it. What impact does that have, and what responsibility do social media platforms have?
JL: So there is actually quite a clear way ahead here. And that is that social media platforms own up to being publishers. If they do that, they will then have to adhere to, you know, the full legal framework, the full ethical framework, that all publishers have to adhere to. Right now, they're getting around this by saying, 'Well, we're not the publisher. The people that publish on our platforms are the publishers', and that is hugely problematic, because that then means that they can pretty much let anything go. So they are suffering through the so-called 'techlash' now, which we saw after Cambridge Analytica and its role in Brexit and the 2016 election. But they have a huge responsibility and a major role, and it's really twofold. One is, you know, their role in a democratic system, to make sure that they don't undermine it the way that they do now. We clearly saw their role in both Brexit and the US 2016 election undermine the functionality of liberal democracy. The second one is financial - they are also undermining the old ad market. So they have such a huge responsibility, and they are clearly not owning up to it now. And you can see that a lot of people are bitterly disappointed in that.
SC: Why do you think fake news is proliferating now? I mean, obviously, we have the mechanics for it, people online more, we have social media but it also requires people being willing to believe it in the first place. Why is this happening now?
JL: Well, fake news, as I'm sure everyone is aware, has been around for a long, long time. Probably for as long as we’ve told stories. Fake news is just a new term for misinformation, disinformation. The main difference now, compared to say, the 1930s in Europe, is that we do have the ultrafast spread of information and it’s turbocharged by all the platforms and all the social media sharing and so on.
To answer your question, which connects back to the previous one about AI moderators and so on: if the platforms would own up to being publishers, we would see much less fake news, but it's quite tricky. The AI, I'm sure, will eventually become good editors and publishers, but they have quite a long way to go, and we could see in that example you were citing before that they aren't ready yet. So if the platforms own up to being publishers, it's going to cost them a lot of money. I'm a bit of a cynic, and I think that's why they haven't, because if they say, 'Okay, we're publishers', they're going to have to hire a lot of editorial staff to curate and be a proper publisher. And I think that's one of the reasons why it is spreading so fast - had they owned up to being publishers, we would have had much, much less of that stuff. Then why do people believe it? Again, I don't think people have time to check.
SC: How much of a problem do you think fake news is at the moment?
JL: I think it's absolutely major. The biggest problem, I think, is that the term is used by leaders who should know better. It was coined by the 'fake news-er in chief' in the White House, and he then legitimises the term, and then everyone thinks it's okay to use it, to the point where fake news goes full circle. So things that were fake news proliferate, and then it becomes real news, and then it goes around again and becomes fake news. A really good example was the inquiry into fake news in the UK, after the Brexit vote. That report actually made the recommendation, which was taken up by the UK Parliament, that the term fake news should not be used in the chamber, so they stopped using it. I think that's a really good example. And we should try to limit the use of the term fake news overall, because it is undermining the very fabric of our liberal democratic society. I think the 'liberal' in democracy is under the pump around the globe.
SC: Paint me a picture. We are 50 years into the future. No one has been able or willing to rein in fake news, and it's gotten even worse than it is now. What does the world look like?
JL: So I think it's impossible to ponder that without thinking about AI, artificial intelligence, and 50 years into the future, provided that quantum computers come online reasonably soon, which I'm pretty sure they will, that's going to be the big, big break for AI that’s going to up their capacity to learn and to process to a point that we can't even comprehend. But I'm thinking maybe AI will become so potent at reining in fake news, if we let them, that could be our best shot at getting some sort of handle on it. Another one would be, of course, that we finally managed to convince Facebook and the others that they are publishers. But if we can't rein it in, it's going to undermine, in my view, trust in pretty much everything.
SC: Johan Lidberg, thank you so much for your time today.
JL: You're welcome.
SC: Clearly, something needs to change, but in the next episode we’ll chat with people who are tackling disinformation campaigns head on. Thanks to our guests today, Mark Andrejevic and Johan Lidberg. That's it for this episode. More information on what we discussed can be found in the show notes, and if you like these episodes, please write us a review. This helps other people find the show, and you can also let us know the topics you'd like us to look at in our crystal ball for maybe the next series. We'd love to hear your suggestions. I'll catch you next time on What Happens Next?
More about this episode:
The US elections, Australia's bushfire disaster and now Covid-19 - major events plagued by online agenda-driven misinformation campaigns and the spread of fake news, amplified by mainstream media outlets with their own agendas. Monash School of Media, Film and Journalism experts Mark Andrejevic and Johan Lidberg ponder the possibility of a future in which it's impossible to verify the sources and objectivity of information, due to a lack of diversity in media ownership and ongoing resource cuts to journalism. They consider the potential effect on our democracy and our society if we don't act to prevent these campaigns from continuing to drive divisions with false information. They fear the very democratic institutions we rely on for a common understanding of the world are in a state of collapse.