Title: Whole-body multi-modal semi-autonomous teleoperation of mobile manipulator systems
Abstract: We propose a novel whole-body multi-modal semi-autonomous teleoperation framework for mobile manipulator systems, which consists of: 1) Motion capture and whole-body motion mapping to allow the operator to intuitively teleoperate the mobile manipulator without being constrained by the master interface, while fully exploiting whole-body dexterity; 2) Slave robot autonomous control to allow the mobile manipulator to optimally track the operator's whole-body command while accounting for master-slave kinematic dissimilarity (the slave robot's joint limits, joint velocity limits, and singularities); and 3) Visuo-haptic-vestibular feedback with an HMD (Head Mounted Display) for 3D visual information, a wearable cutaneous haptic device for manipulation force feedback, and an actuated chair for vestibular feedback to reduce HMD-induced motion sickness. The performance of the proposed framework is validated through simulations of a ROV (remotely operated vehicle) manipulator system and preliminary user studies.
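The autonomous tracking component described in item 2 must map the operator's task-space command to joint velocities while respecting joint limits, joint velocity limits, and singularities. The abstract does not give the controller's equations; a common singularity-robust approach consistent with that description is damped least-squares velocity inverse kinematics with limit clamping. The sketch below is illustrative only (the function name, damping constant, and limit-handling strategy are assumptions, not the paper's actual method):

```python
import numpy as np

def damped_ls_ik_step(J, x_dot, q, q_min, q_max, qd_max, lam=0.1):
    """One velocity-IK step (illustrative sketch, not the paper's controller).

    J      : (m, n) manipulator Jacobian
    x_dot  : (m,)   commanded task-space velocity from the operator
    q      : (n,)   current joint positions
    q_min, q_max : (n,) joint position limits
    qd_max : (n,)   joint velocity limits (symmetric)
    lam    : damping factor for singularity robustness (assumed value)
    """
    # Damped least-squares inverse: q_dot = J^T (J J^T + lam^2 I)^{-1} x_dot.
    # The damping keeps the solve well-conditioned near singularities.
    JJt = J @ J.T
    q_dot = J.T @ np.linalg.solve(JJt + (lam ** 2) * np.eye(JJt.shape[0]), x_dot)

    # Saturate to the slave robot's joint velocity limits.
    q_dot = np.clip(q_dot, -qd_max, qd_max)

    # Zero any velocity component that would drive a joint past its
    # position limit (a simple stand-in for limit-aware optimization).
    at_upper = (q >= q_max) & (q_dot > 0)
    at_lower = (q <= q_min) & (q_dot < 0)
    q_dot[at_upper | at_lower] = 0.0
    return q_dot
```

In a teleoperation loop, this step would run at the control rate, integrating `q_dot` to track the mapped whole-body command while never commanding motion a dissimilar slave kinematic structure cannot execute.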
Publication Year: 2015
Publication Date: 2015-05-01
Language: en
Type: article
Indexed In: ['crossref']
Access and Citation
Cited By Count: 18