As we grapple with online dangers such as cyberbullying, pornography addiction, harassment, and scams, a new and deeply unsettling threat has emerged – deepfake technology.
Disinformation has grown to pandemic proportions, driven by digital networks and social media. Understanding its mechanics, from cognitive biases to advanced digital technologies, is crucial in combating its global impact.
It’s important to approach media coverage of AI ‘breakthroughs’ such as DeepSeek with caution. Drawing lessons from the history of technology can help us avoid falling for either overly pessimistic or overly optimistic predictions about an uncertain future.
Listening to young people, not banning them from social media, is critical if we’re to effectively address the rise of deepfakes and the spread of misogyny in high schools.
While AI and robotics reshape our reality, experts explore how these emerging tools could be used to create a more equitable future – from healthcare breakthroughs to Indigenous-led innovation.
In the season nine premiere of Monash’s podcast, learn how AI, deepfakes and humanoid robots are transforming human interaction and our perception of reality.
Monash's award-winning podcast, “What Happens Next?”, returns for a ninth season that explores pressing global issues including reality in the digital age, climate change in the Indo-Pacific region and the ongoing struggle to eliminate gender-based violence.
As they improve, we’ll likely trust AI models with more and more responsibility. But if their autonomous decisions end up causing harm, our current legal frameworks may not be up to scratch.
Deepfakes are threatening privacy and security, and while detection methods using deep learning aim to combat the problem, there’s a long way to go.
One in seven Australians say they’ve engaged in tech-based workplace harassment – and it’s often designed to offend, humiliate and distress the victim.
Sexual deepfake abuse silences women, causing lasting harm, and laws to protect them are inconsistent. A global approach is vital if society truly wants to address the problem.
Deferring to AI to give us what we like further diminishes culture at a societal scale, and erodes cultural difference globally.
Social media platforms have an incentive to promote whatever gets the most attention, regardless of its authenticity, but we're more reluctant to admit that the same is true of people.
The popularity of face-swapping software has resulted in the disturbing trend of 'deepfakes' that can be used for nefarious purposes.
AFL footballer Dane Swan is the latest victim of illegal photo-sharing. It's believed one in five Australians have had images of themselves distributed without their permission – and that's most likely an underestimate.