The immense possibilities and sizeable challenges created by rapid advances in Artificial Intelligence and data science were among the most hotly discussed issues at the 2018 Monash Global Leaders’ Summit.
As digital infrastructure collects information at a rate that has enabled institutions to grow on a scale unprecedented in human history, AI is the technology needed to keep pace with the magnitude of the society we’re building. AI is becoming increasingly adept at mining information sources, interpreting meaning and making predictions, so its development at the cutting edge will shape the societies to come, how economies work and how we organise our relations with one another.
The Global Leaders heard that it is therefore important for AI to be developed not only in ways conducive to economic growth, but also in ways that take into account civil rights and commitments, and democracy itself. The challenge is to integrate AI into our society just as we have done with other valuable but dangerous technologies of the past, such as electricity and cars.
A panel of four deans from the Law, Engineering, Arts and Information Technology faculties was assembled to address the Global Leaders’ Summit on how Monash is embracing and responding to the impact and disruption that AI and data science are bringing to every sector across the world.
Professor Jon Whittle, Dean of the Faculty of Information Technology, said Monash University is ideally positioned to claim a leading role in this space. Monash, as the largest university in Australia, has a huge pool of talent. It also has the country’s only dedicated Faculty of IT, which, with more than 160 academic staff, is one of the largest in the world, capable of educating undergraduate and postgraduate students at scale.
Professor Whittle revealed plans to create a Data Futures Institute (DFI) at Monash that will focus on interdisciplinary educational programs. These programs will train the next generation of data scientists not just as computer programmers but also across a number of different domains like engineering, law and the arts.
The DFI will also, crucially, include an industry engagement centre designed to better work with industry, “identifying problems they have that AI can potentially solve with a team of people who can respond in a timely fashion to these opportunities”.
“What distinguishes Monash is that our precinct is the largest generator of scientific data in the southern hemisphere,” said Professor Whittle. “So we have the data that many institutions don’t.
“Our other advantage is the interdisciplinary angle. AI and ethics is a space that hasn’t been claimed yet across the world. There’s a huge opportunity to claim that and we have the expertise here at Monash that enables us to cut through the hype of what AI means, what it can and can’t do.”
Good data vs bad data
Professor Elizabeth Croft, Dean of the Faculty of Engineering, agreed that understanding what AI is, as well as what it does, will be crucial for future graduates.
“We’re told that these are gold rush times in terms of neural networks, in terms of recursive algorithms and in AI, and we as a university have to address that at all levels, not just chase the same shiny balls that everybody else is chasing,” she said.
“This means understanding AI and making it explainable. We need to know what it’s learning and why it’s learning that, and we need our practitioners not just to be turning the handles but to have the ability to structure the neural networks themselves. They need to know the difference between good data and bad data, and to be able to utilise it for good.”
As an example of how AI can be skewed, Professor Croft cited the power of companies like Uber to influence traffic control through the data they collect, so that affluent neighbourhoods are serviced while poor neighbourhoods are not. She believes that we don’t just need to understand how AI works but “how it can work well and not work well”.
“We have to take a moment of reality here,” she said. “Neural networks are just big optimisation algorithms that allow computations we have never been able to do before. That’s the difference. It’s speed and algorithms, not magic. Educating our students across disciplines about what this really is means that we can apply it for good.”
The ethics of AI
This is where the humanities and social sciences join the cross-disciplinary approach, according to Professor Sharon Pickering, Dean of the Faculty of Arts.
“When you think about the role of humanities and the social sciences, we’re in the business of better understanding and responding to complex social change,” she said.
“Working in an interdisciplinary fashion with colleagues across the university is something that Monash has always been good at. It’s no surprise that at the same time Monash pioneered breakthroughs in IVF, the university was also pioneering breakthroughs in bio-ethics. With AI it’s about asking a different set of questions and the challenge is to do it with a focus on ethics, governance and empathy.”
Professor Pickering believes there are incredible opportunities to advance these values on a much larger scale as the first AI generation of students begins to approach university age.
“This group is less interested in a particular institution or degree; they’re interested in experiences,” she said.
“One of our opportunities is how we wrap AI around them in a digital ecosystem that becomes the hallmark of their experience with us so when they go out into the world, regardless of what they study, they can operate ethically and generally advance the governance arrangements around AI.”
Liberty, human rights and AI
Smart, nimble legal frameworks for dealing with AI and data collection are also essential, according to Professor Bryan Horrigan, Dean of Monash’s Faculty of Law. These laws have to be built on social and ethical frameworks as well. He believes leadership in this space needs to come from universities and industry, not governments, because for data to have an impact it needs to be collected not just at scale but also across jurisdictions.
The Monash Law faculty is already implementing this approach in a project aimed at eliminating capital punishment.
“What’s really making a difference is bringing together data from across the world, doing some empirical work that shows to governments and politicians, country by country, that the population isn’t as crazy about executing people as they might think,” he said.
“Gathering intelligence and analysing it does make a huge amount of difference. It means that, just as we can eradicate diseases like dengue fever using technology, we can also do something to eradicate the death penalty.”
Duplicating human bias is one of the major issues with data science, AI and the law. Using sentencing data that may contain racial or gender biases can replicate human error at scale, so the law needs human oversight and ethical frameworks around that.
“Google, Amazon and Uber are the new business models, but the legal framework around these new businesses still needs to be developed,” he said.
“The same goes for driverless cars and drones, AI and business due diligence with banking and loans, privacy and health data, cyber crime and issues of liberty and human rights.”
Breakthroughs and disruption in this space are also going to come through research and education.
“Students are now in a situation where they can’t be just thrown in a room and taught stuff because what you could do with a student five years ago when they graduated you can now do through automation,” said Professor Horrigan. “Now they have to use human intelligence and other skills they learn from us.”
Professor Jordan Nash, Dean of the Faculty of Science, who opened the session on AI and data science, reminded the Global Leaders of the pace of change: what would have taken 20 minutes of computing time in 1990 now takes less than 20 nanoseconds.
“There’s another aspect,” he said. “With all the phones, laptops, smart watches and Fitbits in this room, there are multiple terabytes of data coming out of here about what makes global leaders tick: what you do, what you want to buy, what your political leanings are.
“If I had access to that data and no scruples, I could learn a lot about you and influence you in ways that are not fully above board. I think that the ethics of how we play into this storm of data are really, really important. We have to ask ourselves: just because we can do something, does it mean we should do it?”