Skill Acquisition and Controller Design of Desktop Robot Manipulator Based on Audio–Visual Information Fusion

Chunxu Li, Xiaoyu Chen, Xinglu Ma*, Hao Sun, Bin Wang

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

The development of AI and robotics has led to an explosion of research and implementations in automated systems. However, whilst commonplace in manufacturing, these approaches have had little impact on chemistry because of the difficulty of developing robot systems dexterous enough for experimental operation. In this paper, a control system for desktop experimental manipulators based on an audio-visual information fusion algorithm was designed. By being taught arm-movement skills, the robot can replace the operator in tedious and dangerous experimental work. The system is divided into two parts: skill acquisition and movement control. For the former, the visual signal was obtained through two motion-detection algorithms realized by an improved two-stream convolutional network, while the audio signal was extracted by voice AI with regular expressions. The audio and visual information were then combined to obtain high-coincidence motor skills, with a skill-acquisition accuracy above 81%. The latter employed motor control and grasping-pose recognition to achieve precise control and grasping. The system can be used for teaching and controlling chemical experiments with specific procedures; it can replace the operator in completing the experimental work while greatly lowering the programming threshold and improving efficiency.
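As a rough illustration of the fusion step the abstract describes, the sketch below parses a speech-to-text transcript with a regular expression and accepts a skill only when the spoken command agrees with the visual action label, falling back to the more confident modality otherwise. This is a minimal sketch under assumptions: the command vocabulary, the confidence threshold, and the function names (`parse_transcript`, `fuse`) are hypothetical and not taken from the paper.

```python
import re

# Hypothetical command grammar: the verbs and vessels here are
# illustrative placeholders, not the paper's actual vocabulary.
COMMAND = re.compile(
    r"(?P<action>pick up|pour|shake|place)\s+(?:the\s+)?"
    r"(?P<obj>beaker|flask|test tube)",
    re.IGNORECASE,
)

def parse_transcript(transcript):
    """Extract an (action, object) pair from a speech-to-text transcript."""
    m = COMMAND.search(transcript)
    return (m["action"].lower(), m["obj"].lower()) if m else None

def fuse(audio_cmd, visual_action, visual_confidence, threshold=0.8):
    """Keep a demonstrated skill only when the two modalities agree,
    otherwise fall back to the more confident one. The threshold is
    a placeholder, not a value reported in the paper."""
    if audio_cmd and audio_cmd[0] == visual_action:
        return audio_cmd                 # modalities agree: high-coincidence skill
    if visual_confidence >= threshold:
        return (visual_action, None)     # trust the visual classifier's label
    return audio_cmd                     # fall back to the spoken command, if any

# Example: one demonstration step with outputs from both modalities
cmd = parse_transcript("now pick up the beaker slowly")
print(fuse(cmd, "pick up", 0.92))        # -> ('pick up', 'beaker')
```

In the paper the visual label would come from the improved two-stream convolutional network and the transcript from the voice AI front end; the agreement check stands in for whatever coincidence measure the authors actually use.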
Original language: English
Pages (from-to): 772-772
Journal: Machines
Volume: 10
Issue number: 9
DOIs
Publication status: E-pub ahead of print - 6 Sept 2022
