Do we look smart? Are we thinking smart? Is our smartphone smarter than we are?
Laureate Professor John Grundy wants to challenge what “smart” means in relation to software design. Should the word apply to an application that’s hard to use and doesn’t take human needs into account?
He’s received a prestigious $3 million Australian Research Council Laureate Fellowship to investigate the development of “human-centric” software – systems designed to accommodate the people who use them.
“Through this generous fellowship, I hope to grow my research team in order to explore how human issues, such as age, gender, culture, language, and mental and physical challenges, impact the manufacturing process in products like smart homes and smart cities so they are a help, not a hindrance, to the people they are meant to serve,” he says.
Most software design tends to begin with “what can be done” rather than focusing on what people actually need and want, he says.
It springs from the minds of developers who typically “are 20- or 30-something-year-olds, English-speaking, well-educated men,” he explains. “They're very comfortable with technology, and they are a very small subset of our society's population. They tend to be more wealthy, for instance; otherwise they usually wouldn’t have been able to get their university degrees.
“And they tend to be, because of the way we train them at the moment, very much more focused on technology design and implementation approaches – ‘What can be done, and how to do it right’ – rather than, ‘What is the right thing to be doing? What is actually needed by the users of the software?’”
Biases beyond software design
The idea that the biases of designers influence how products are developed is not restricted to software, he says. He gives the example of car crash test dummies, which have traditionally been the size of the average male (explaining why women are more likely to be injured in a collision). And historically, most drugs have been trialled only on male subjects, because the female hormonal cycle makes testing more complex.
Professor Grundy first became aware of this “blind spot” issue while developing accounting software for a private New Zealand firm in the 1980s. A woman who worked for the company would “often point out some incredibly glaring errors we made around the design of the interface and the business logic of the systems”, but the design process being used didn’t allow the product to be modified to take her observations into account.
On one occasion, a system was being demonstrated to a client, who pointed out some “serious errors around the poor use of language in the interface, and we thought, ‘Oh this is a bit late when we're actually trying to sell the product’.”
But the real question was: “Why didn't we pick that up much earlier in the process and do something about it?”
One of his hopes for his new ARC Laureate program is “to develop techniques so that when we design, build and test the software, these human-centric issues are captured and modelled, and used all the way through the process”.
He gives the example of software designed to support elderly people at home. These clients have special needs – age-related cognitive and physical challenges, and also cultural ones, because many did not grow up with computers, or may not have English as a first language. Or they may be uncomfortable with surveillance that is well-meaning – for example, technology that can tell whether they’ve fallen, or remembered to take their medicine – but comes across as overly restrictive or pervasive.
“We often forget these complex human responses at the design and implementation stages,” he says. “We sometimes even forget about them when we test and evaluate the product. But we need to preserve that focus all the way through. And so we need techniques – new software engineering techniques – to help us do this.”
More women software designers needed
Broadening the cohort who become software designers would also help. Professor Grundy says currently only 20 to 25 per cent of IT students are women – shockingly, even fewer than when he was an undergraduate, and a statistic he’s hoping to improve – while the percentage of women software designers working in industry is lower still.
“And if you look at age, at culture, or language or educational background, again we've got a huge segment of the population that's very under-represented in the teams that build these solutions.”
At the same time, software designers are “interested in what I call smart cities, smart homes, smart buildings, smart transport, etcetera – technologies that are, by their very nature, trying to provide support to all of our very diverse community members”.
And that includes people who may not be comfortable with technology in the first place – and who may even be afraid of it.
“An amazing number of people in Australia don't have access to the internet or smartphone technologies, and people forget this,” he says. “And a significant number are not as confident with using leading edge technology as others are. It's a human-centric issue that again, we technologists often forget about.”
Bridging the digital divide
The “digital divide” describes the gap between people who can access the internet and those who can’t – often for financial reasons. How do we stop these people from being left behind in an age when government services are provided via online forms, and when computers are mandatory in classrooms?
“I've done some studies around the very different kinds of personalities people have, and how that affects them interacting with software solutions,” Professor Grundy says. “I've done work on emotions, language, culture, a little bit on age, a little bit on gender, and so on.
“And I've done a lot of industrial consulting and applied research over the last 20 years. These issues keep coming up.”
This laureate fellowship “is a chance to bring it all together and advance practical tools and techniques for the software industry, and ultimately to make software better for people”.