by David Binning

UNSW is teaching AI systems how to second-guess us

Mar 18, 2020


Researchers at the University of New South Wales are developing AI systems capable of ‘second-guessing’ humans, using data such as facial expressions and body language to assist people who are immobilised or have communication difficulties, for instance.

Dr Lina Yao, senior lecturer with UNSW’s Faculty of Engineering, is the project’s lead, overseeing development of a prototype human-machine interface designed to capture the intent behind human movement.

Using EEG (electroencephalogram) devices, it’s possible to learn and predict what a person intends to do, she explained.

“While wearing one of these devices, whenever the person makes a movement, their brainwaves are collected which we can then analyse.

“Later we can ask people to think about moving with a particular action – such as raising their right arm. So not actually raising the arm, but thinking about it, and we can then collect the associated brain waves.”

Yao’s vision is for these brain waves to be analysed and used to move devices like wheelchairs, or even to communicate a request for assistance.
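The pipeline the article describes — record EEG while a person imagines an action, learn each action’s brainwave signature, then translate a new recording into a device command — can be sketched in a few lines. This is a hedged illustration only, not UNSW’s actual system: the feature vectors, action labels and commands below are invented synthetic stand-ins, and a real brain-computer interface would extract features (e.g. band power per electrode) from raw EEG and use far more sophisticated classifiers.

```python
# Minimal sketch of a motor-imagery decoding loop (assumed pipeline,
# not the UNSW prototype). Synthetic data throughout.
import numpy as np

rng = np.random.default_rng(0)

def simulate_trials(mean, n=50):
    """Stand-in for per-trial EEG features (e.g. band power per channel)."""
    return rng.normal(loc=mean, scale=0.5, size=(n, len(mean)))

# Training phase: the person *imagines* each action while EEG is recorded.
train = {
    "raise_right_arm": simulate_trials([1.0, 3.0, 1.0, 2.0]),
    "sit_up":          simulate_trials([3.0, 1.0, 2.0, 1.0]),
}

# Learn one average signature (centroid) per imagined action.
centroids = {label: trials.mean(axis=0) for label, trials in train.items()}

def classify(features):
    """Nearest-centroid decoding of a single trial's feature vector."""
    return min(centroids,
               key=lambda lbl: np.linalg.norm(features - centroids[lbl]))

# Map decoded intent to an assistive-device command (hypothetical mapping).
COMMANDS = {
    "raise_right_arm": "move wheelchair forward",
    "sit_up": "raise bed / request assistance",
}

new_trial = simulate_trials([1.0, 3.0, 1.0, 2.0], n=1)[0]
intent = classify(new_trial)
print(intent, "->", COMMANDS[intent])
```

The point of the sketch is the shape of the loop — train on imagined movements, decode, act — rather than the classifier, which here is deliberately the simplest possible.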

“Someone in an intensive care unit may not have the ability to communicate, but if they were wearing an EEG device, the pattern in their brainwaves could be interpreted to say they were in pain or wanted to sit up, for example,” she said.

“So an intent to move or act that was not physically possible, or not able to be expressed, could be understood by an observer thanks to this human-machine interaction.”

The technology for achieving this already exists, with the UNSW team hoping to bring all the parts together into a workable system.

Yao stresses this area of AI is about creating ‘partners’ helping humans in need, as opposed to mere tools.

“What we’re doing in these early phases is to help machines learn to act like humans based on our daily interactions and the actions that are influenced by our own judgment and expectations – so that they can be better placed to predict our intentions,” she said. “In turn, this may even lead to new actions and decisions of our own, so that we establish a cooperative relationship.”

UNSW is also looking at applying AI in emergency rescue situations.

Inputting physical human data into AI systems marks an important step in teaching them how to think and act more like us. But does that mean we should start worrying about the sci-fi vision of autonomous evil computers such as HAL in Stanley Kubrick’s ‘2001: A Space Odyssey’?

Not yet, Yao reckons. Interestingly, however, she notes it might not be too long before we have robots and other intelligent machines capable of observing us in slightly disconcerting ways. She envisages AI systems built on more subtle human behaviours, including so-called micro-expressions — brief, involuntary facial movements that can betray an emotion a person is deliberately trying to conceal.

Data like these, along with body language and gestures, could enable machines to figure out, for instance, whether people belong to categories such as ‘peer’, ‘bystander’ or ‘competitor’ much faster than we could ourselves.