TY - GEN
T1 - Human-robot Interaction Control by using a General Sensory Model
AU - Lyu, Shangke
AU - Cheah, Chien Chern
AU - Yu, Xiang
AU - Guo, Lei
N1 - Publisher Copyright:
© 2020 IEEE.
PY - 2020/10/9
Y1 - 2020/10/9
N2 - In recent years, there has been increasing interest in the study of human-robot interaction (HRI). In HRI tasks, the strengths of both human and robot can be utilized in task execution in a complementary way. Depending on the scenario of human-robot interaction, the interaction task requirements and the external sensory systems or configurations adopted usually differ. The majority of existing works focus on developing control methods for specific applications or with specific sensors, and few results have been presented that formulate different interaction task requirements and various sensory models in a general way. In this paper, a human-robot interaction task variable that describes various interaction task requirements in a unified way is integrated with a general sensory model obtained from an offline neural-network-based learning algorithm, so that various external sensors can be used directly in the interaction control system to provide sensory information and enhance the perception capability. We present a robot controller that combines the human-robot interaction task variable and the general sensory model, so that various human-robot interaction tasks based on various external sensors can be achieved simply by adjusting the task parameters and training the system, without modifying the sensory model or the controller. A convergence analysis of the proposed offline neural-network-based learning algorithm is given, and experimental results are presented to illustrate the performance of the proposed method.
AB - In recent years, there has been increasing interest in the study of human-robot interaction (HRI). In HRI tasks, the strengths of both human and robot can be utilized in task execution in a complementary way. Depending on the scenario of human-robot interaction, the interaction task requirements and the external sensory systems or configurations adopted usually differ. The majority of existing works focus on developing control methods for specific applications or with specific sensors, and few results have been presented that formulate different interaction task requirements and various sensory models in a general way. In this paper, a human-robot interaction task variable that describes various interaction task requirements in a unified way is integrated with a general sensory model obtained from an offline neural-network-based learning algorithm, so that various external sensors can be used directly in the interaction control system to provide sensory information and enhance the perception capability. We present a robot controller that combines the human-robot interaction task variable and the general sensory model, so that various human-robot interaction tasks based on various external sensors can be achieved simply by adjusting the task parameters and training the system, without modifying the sensory model or the controller. A convergence analysis of the proposed offline neural-network-based learning algorithm is given, and experimental results are presented to illustrate the performance of the proposed method.
UR - https://www.scopus.com/pages/publications/85098073124
U2 - 10.1109/ICCA51439.2020.9264397
DO - 10.1109/ICCA51439.2020.9264397
M3 - Conference contribution
AN - SCOPUS:85098073124
T3 - IEEE International Conference on Control and Automation, ICCA
SP - 630
EP - 635
BT - 2020 IEEE 16th International Conference on Control and Automation, ICCA 2020
PB - IEEE Computer Society
T2 - 16th IEEE International Conference on Control and Automation, ICCA 2020
Y2 - 9 October 2020 through 11 October 2020
ER -