Researchers at the University of New South Wales are developing AI systems capable of 'second-guessing' humans, using data such as facial expressions and body language to assist people who are, for instance, immobilised or have communication difficulties.
Dr Lina Yao, a senior lecturer in UNSW's Department of Engineering, is the project lead, overseeing development of a prototype human-machine interface designed to capture the intent behind human movement.
Using EEG [electroencephalogram] devices, it is possible to learn and predict what a person intends to do, she explained.
"While wearing one of these devices, whenever the person makes a movement, their brainwaves are collected, which we can then analyse.
"Later we can ask people to think about moving with a particular action – such as raising their right arm. So not actually raising the arm, but thinking about it, and we can then collect the associated brainwaves."
Yao's vision is for these brainwaves to be analysed and used to move devices such as wheelchairs, or even to communicate a request for assistance.
"Someone in an intensive care unit may not have the ability to communicate, but if they were wearing an EEG device, the pattern in their brainwaves could be interpreted to say they were in pain or wanted to sit up, for example," she said.
"So an intent to move or act that was not physically possible, or not able to be expressed, could be understood by an observer thanks to this human-machine interaction."
The technology for achieving this already exists, and the UNSW team hopes to bring all the parts together into a workable system.
Yao stresses that this area of AI is about creating 'partners' that help humans in need, rather than mere tools.
"What we're doing in these early phases is to help machines learn to act like humans based on our daily interactions and the actions that are influenced by our
own judgment and expectations – so that they can be better placed to predict our intentions," she said. "In turn, this may even lead to new actions and decisions of our own, so that we establish a cooperative relationship."
UNSW is also looking at applying AI in emergency rescue situations.
Inputting physical human data into AI systems marks an important step in teaching them to think and act more like us. But does that mean we should start worrying about the sci-fi vision of autonomous evil computers, such as HAL in Stanley Kubrick's '2001: A Space Odyssey'?
Not yet, Yao reckons. Interestingly, however, she notes it might not be long before we have robots and other intelligent machines capable of observing us in slightly disconcerting ways. She envisages AI systems built on more subtle human behaviours, including so-called micro-expressions: brief, involuntary physical clues that betray an emotional reaction a person is deliberately trying to conceal.
Data like these, along with body language and gestures, could enable machines to work out, for instance, whether people fall into categories such as 'peer', 'bystander' or 'competitor', much faster than we could ourselves.
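The training loop Yao describes – record brainwaves while a person imagines an action, then learn to predict the intended action from new recordings – is essentially a motor-imagery classification problem. The sketch below is not the UNSW prototype: it stands in synthetic band-power feature vectors for real EEG recordings, and uses linear discriminant analysis from scikit-learn, a common baseline classifier in brain-computer interface work. All names and numbers here are illustrative assumptions.

```python
# Hypothetical sketch of intent decoding from EEG features (not the UNSW system).
# Each "trial" is an 8-channel feature vector recorded while the subject
# imagines one of two actions: "raise right arm" vs "rest".
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for per-trial EEG band-power features.
n_trials = 200
features_arm = rng.normal(loc=1.0, scale=0.5, size=(n_trials, 8))
features_rest = rng.normal(loc=0.0, scale=0.5, size=(n_trials, 8))
X = np.vstack([features_arm, features_rest])
y = np.array([1] * n_trials + [0] * n_trials)  # 1 = imagined arm raise

# Hold out a quarter of the trials to estimate decoding accuracy.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

clf = LinearDiscriminantAnalysis().fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
print(f"held-out intent-decoding accuracy: {accuracy:.2f}")
```

In a real system the feature vectors would come from preprocessed EEG signals (band-pass filtering, artifact rejection, spatial filtering) rather than a random generator, and the predicted label would drive an actuator such as a wheelchair controller or a nurse-call alert.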