Development of a bionic interactive interface for Owl robot using stereo vision algorithms

James Rogers, Philip Culverhouse, Benjamin Wickenden, Chunxu Li*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

With growing demands for improved quality of life, companion robots have gradually become a hotspot of application for healthy home living. In this article, a novel bionic human-robot interaction (HRI) strategy using stereo vision algorithms is developed to imitate the animal vision system on the Owl robot. Depth information for a target is obtained via two methods: vergence and disparity. Vergence requires physical tracking of the target, moving each camera to align with a chosen object, and through successive camera movements (saccades) a sparse depth map of the scene can be built up. Disparity, in contrast, requires the cameras to be fixed and parallel, using the position of the target within the field of view of a stereo camera pair to calculate distance. Because disparity does not require the cameras to move, multiple targets can be chosen to build up a disparity map, providing depth information for the whole scene. In addition, a salience model is implemented to imitate how people explore a scene. This is achieved with feature maps, which filter the scene to highlight areas of interest such as color and edges, a purely bottom-up approach based on Itti and Koch's saliency model. A series of experiments has been conducted on the Plymouth Owl robot to validate the proposed interface.
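
The disparity method summarised above follows the standard pinhole stereo relation Z = f·B/d, where f is the focal length, B the camera baseline, and d the horizontal disparity of a point between the two rectified views. The following is a minimal sketch of that computation, not the authors' implementation; the calibration constants and block-matching parameters are illustrative values, not figures from the paper or the Owl robot.

```python
import cv2
import numpy as np

# Illustrative calibration values -- assumptions, not taken from the Owl robot.
FOCAL_LENGTH_PX = 700.0   # focal length in pixels
BASELINE_M = 0.065        # distance between the two cameras in metres

def depth_map(left_gray, right_gray):
    """Compute a dense depth map from a rectified grayscale stereo pair.

    Block matching gives a disparity d (in pixels) for each pixel; the
    pinhole stereo model then gives depth Z = f * B / d.
    """
    matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    # StereoBM returns fixed-point disparities scaled by 16.
    disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
    disparity[disparity <= 0] = np.nan          # invalid / unmatched pixels
    return FOCAL_LENGTH_PX * BASELINE_M / disparity

# Example usage with two rectified grayscale frames:
# left  = cv2.imread("left.png",  cv2.IMREAD_GRAYSCALE)
# right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)
# Z = depth_map(left, right)   # depth in metres; NaN where no match was found
```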
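The bottom-up salience stage can likewise be illustrated with a much-simplified version of the Itti and Koch scheme: per-feature maps (here color contrast and edge strength, as mentioned in the abstract) are normalised and summed into a single salience map whose peak suggests the next fixation target. This is a rough sketch under those assumptions, without the multi-scale center-surround pyramids of the full model, and it is not the paper's implementation.

```python
import cv2
import numpy as np

def saliency_map(bgr_image):
    """Very simplified bottom-up salience from color contrast and edge strength."""
    img = bgr_image.astype(np.float32) / 255.0
    b, g, r = cv2.split(img)

    # Color-opponency style feature maps (red-green and blue-yellow contrast).
    rg = np.abs(r - g)
    by = np.abs(b - (r + g) / 2.0)

    # Edge feature map from the gradient magnitude of the intensity channel.
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0)
    gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1)
    edges = cv2.magnitude(gx, gy)

    # Normalise each feature map to [0, 1] and combine them with equal weight.
    def norm(m):
        return (m - m.min()) / (m.max() - m.min() + 1e-6)
    salience = (norm(rg) + norm(by) + norm(edges)) / 3.0

    # The most salient pixel is the candidate target for the next saccade.
    y, x = np.unravel_index(np.argmax(salience), salience.shape)
    return salience, (x, y)
```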
Original language: English
Journal: Advanced Control for Applications: Engineering and Industrial Systems
Publication status: Published - 11 Oct 2020
