TY - JOUR
T1 - Development of a bionic interactive interface for Owl robot using stereo vision algorithms
AU - Rogers, James
AU - Culverhouse, Philip
AU - Wickenden, Benjamin
AU - Li, Chunxu
PY - 2020/10/11
Y1 - 2020/10/11
N2 - With the growing demand for improved quality of life, companion robots have gradually become a hotspot of application for healthy home living. In this article, a novel bionic human-robot interaction (HRI) strategy using stereo vision algorithms has been developed to imitate the animal vision system on the Owl robot. Depth information of a target is found via two methods: vergence and disparity. Vergence requires physical tracking of the target, moving each camera to align with a chosen object; through successive camera movements (saccades), a sparse depth map of the scene can be built up. Disparity, however, requires the cameras to be fixed and parallel, using the position of the target within the field of view of a stereo pair of cameras to calculate distance. As disparity does not require the cameras to move, multiple targets can be chosen to build up a disparity map, providing depth information for the whole scene. In addition, a salience model is implemented that imitates how people explore a scene. This is achieved with feature maps, which apply filtering to the scene to highlight areas of interest, for example color and edges; this is a purely bottom-up approach based on Itti and Koch's saliency model. A series of experiments has been conducted on the Plymouth Owl robot to validate the proposed interface.
AB - With the growing demand for improved quality of life, companion robots have gradually become a hotspot of application for healthy home living. In this article, a novel bionic human-robot interaction (HRI) strategy using stereo vision algorithms has been developed to imitate the animal vision system on the Owl robot. Depth information of a target is found via two methods: vergence and disparity. Vergence requires physical tracking of the target, moving each camera to align with a chosen object; through successive camera movements (saccades), a sparse depth map of the scene can be built up. Disparity, however, requires the cameras to be fixed and parallel, using the position of the target within the field of view of a stereo pair of cameras to calculate distance. As disparity does not require the cameras to move, multiple targets can be chosen to build up a disparity map, providing depth information for the whole scene. In addition, a salience model is implemented that imitates how people explore a scene. This is achieved with feature maps, which apply filtering to the scene to highlight areas of interest, for example color and edges; this is a purely bottom-up approach based on Itti and Koch's saliency model. A series of experiments has been conducted on the Plymouth Owl robot to validate the proposed interface.
U2 - 10.1002/adc2.54
DO - 10.1002/adc2.54
M3 - Article
SN - 2578-0727
JO - Advanced Control for Applications: Engineering and Industrial Systems
JF - Advanced Control for Applications: Engineering and Industrial Systems
ER -