Finding a ‘companion’ in robots
Newcastle researchers are working on technology that helps robots get in touch with their emotions so they can connect with their human users, whether in the home or in residential aged care.
The robot achieves this by scanning and interpreting features of its surrounding environment, including colour, facial expression and fractal dimension – a measure of visual complexity that influences how comfortable we feel in a particular environment.
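The article does not spell out how the fractal dimension is computed, but a common approach is box counting over an edge map: count how many boxes of shrinking size contain edge pixels, and take the slope of the resulting log–log relationship. The Python sketch below illustrates that idea under stated assumptions; the OpenCV and NumPy choices, the Canny thresholds and the input file are hypothetical, not the Newcastle team's published method.

```python
import cv2
import numpy as np

def fractal_dimension(gray, canny_lo=100, canny_hi=200):
    """Estimate the box-counting fractal dimension of the edge
    structure in a greyscale image (illustrative, not the team's code)."""
    edges = cv2.Canny(gray, canny_lo, canny_hi) > 0
    # Crop to a square whose side is a power of two, for clean box sizes.
    side = 2 ** int(np.log2(min(edges.shape)))
    edges = edges[:side, :side]

    box_sizes = 2 ** np.arange(int(np.log2(side)) - 1, 0, -1)  # e.g. 128..2
    counts = []
    for s in box_sizes:
        # Tile the image into s-by-s boxes; count boxes holding any edge.
        tiles = edges.reshape(side // s, s, side // s, s).any(axis=(1, 3))
        counts.append(max(int(tiles.sum()), 1))  # guard against log(0)

    # The dimension is the slope of log(count) against log(1 / box size).
    slope, _ = np.polyfit(np.log(1.0 / box_sizes), np.log(counts), 1)
    return slope

frame = cv2.imread("room.jpg", cv2.IMREAD_GRAYSCALE)  # hypothetical input
print(f"estimated fractal dimension: {fractal_dimension(frame):.2f}")
```

A blank wall yields a dimension near 1 while a densely patterned scene approaches 2, giving the robot a single number it can associate with how busy or calm a space looks.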
Project leader Aaron Wong said: “The impact of environmental sensory stimuli, such as sight, sound, smell and touch, is known to have an effect on the emotions and behaviour of people. In order for a robot to relate to humans, it must first be able to visualise and feel through the same modes of perception.
“Having the ability to sense the environment can assist robots to become friendlier in the eyes of a user, as they can better relate to how humans feel, based on the surrounding environment,” Mr Wong said.
Companion robot technology is one of several robot technologies under development at the University of Newcastle.
Mr Wong’s colleague, Associate Professor Stephan Chalup, said that by combining disciplines as diverse as neuroscience, applied mathematics, control engineering, mechatronics, computer science, software engineering, nursing and architecture, the research group is able to collaborate with industry to deliver ‘cutting-edge’ technologies.
“Through our NUbots project we are able to develop and test cutting-edge technology for wider applications such as defence, healthcare, aged care and home automation,” Associate Professor Chalup said.
Companion robots are not yet on the market, but a pilot study with a humanoid robot named DARwIn-OP has demonstrated the feasibility of the emotion-sensing software.
How ‘companion’ robots work
A webcam fitted on the robot captures footage of its surroundings at 30 frames per second.
The software then analyses aspects such as colour, patterns and facial expressions, with Mr Wong and his team teaching the robot to associate different environments with particular emotions.
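As a rough illustration of that pipeline, the Python sketch below grabs webcam frames with OpenCV, summarises colour in HSV space and detects faces with a stock Haar cascade. The 30 fps request, the thresholds and the environment-to-emotion rule are placeholder assumptions, not the team's trained model.

```python
import cv2

# Stock Haar-cascade face detector shipped with OpenCV.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)       # default webcam
cap.set(cv2.CAP_PROP_FPS, 30)   # request ~30 frames per second

for _ in range(300):            # analyse roughly 10 seconds of footage
    ok, frame = cap.read()
    if not ok:
        break

    # Colour summary: mean hue and saturation as a crude scene descriptor.
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mean_hue = float(hsv[..., 0].mean())
    mean_sat = float(hsv[..., 1].mean())

    # Facial presence in the frame.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    # Placeholder rule linking scene features to an emotion label; a real
    # system would learn this mapping from labelled examples.
    if len(faces) > 0 and mean_sat > 80:
        mood = "lively"
    elif mean_sat < 40:
        mood = "calm"
    else:
        mood = "neutral"
    print(f"hue={mean_hue:.0f} sat={mean_sat:.0f} faces={len(faces)} mood={mood}")

cap.release()
```

In practice, “teaching the robot to associate environments with emotions” would mean replacing the hand-written rule above with a classifier trained on scenes that people have labelled with the feelings they evoke.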