Industrial Manipulator Based Intelligent Assist System for Human-Robot Cooperative Assembly Tasks

Introduction


In many operations, it is desirable to exploit the force capabilities of industrial robot manipulators by directly combining them with the skills and unmatched sensorimotor abilities of a human operator for complex assembly tasks. However, traditional robot reprogramming, or repeatedly switching between robot programs during a hybrid human-robot cooperative assembly scenario, is impractical. Therefore, effort should be directed towards introducing a greater degree of autonomy and intelligence into such industrial settings, for example automatic object recognition and engagement of the corresponding grasping algorithm. Presumably, this will reduce the time robot operators spend reprogramming the robots during these operations.

State-of-the-art grippers are primitive compared to today's highly capable industrial manipulators. Aiming to achieve advanced robot functionalities in grasping and holding objects of various shapes and properties, numerous multi-fingered robot hands have been developed and are available on the market. However, such advanced robot end effectors are still programmed manually for each object shape and set of properties prior to robot operation. Alternatively, in a human-robot cooperation scenario, it is desirable that the robot learns to operate on the object interactively, following the operator's movements.

The project aims to create an intelligent robot assistive system for human-robot collaborative assembly tasks in industrial settings by integrating a state-of-the-art industrial manipulator, an advanced multi-fingered robotic end effector equipped with a torque/force sensor and a range camera, and machine learning algorithms. In this system, the human operator and the robot undertake the complementary parts of an assembly task that each is best suited for. The human operator performs tasks more suitable for humans, such as delicate assembly. The robot undertakes tasks such as carrying or holding heavy mechanical parts, or tasks requiring repetitive execution. Presumably, the system will increase the capabilities of robot manipulators to perform assistive operations during various assembly tasks carried out by a human operator, and will also decrease the time operators spend programming the robots.

As part of this research, a wearable system for wireless dynamic tracking of the position and orientation of a human arm will be developed. The system will further be used to implement robot collision avoidance algorithms for close work alongside a human operator.

Current Results 

A novel low-cost 4-DOF wireless human arm motion tracker has been developed. The preliminary design utilizes a single inertial measurement unit coupled with an Unscented Kalman filter for estimating the upper arm orientation quaternion, and a potentiometer for estimating the elbow joint angle. The presented arm tracker prototype implements wireless communication with the control PC for sensor data transmission and real-time visualization using the Blender open-source 3D computer graphics software, and was verified against an Xsens MVN motion tracking system.
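The Unscented Kalman filter itself is not reproduced here; the minimal Python sketch below only illustrates the two quantities the tracker works with: gyroscope-driven propagation of the upper-arm orientation quaternion and a linear potentiometer-to-angle mapping for the elbow. All variable names, the sample rate and the ADC scaling are illustrative assumptions, not the tracker's actual implementation.

```python
# Minimal sketch (assumed names and sample values): gyro-driven propagation of the
# upper-arm orientation quaternion plus a linear potentiometer mapping for the
# elbow angle. The project's Unscented Kalman filter is not reproduced here.
import numpy as np

def quat_multiply(q, r):
    """Hamilton product of two quaternions given as [w, x, y, z]."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def integrate_gyro(q, omega, dt):
    """Propagate the orientation quaternion with a body-frame angular rate [rad/s]."""
    q_dot = 0.5 * quat_multiply(q, np.concatenate(([0.0], omega)))
    q = q + q_dot * dt
    return q / np.linalg.norm(q)          # re-normalize to stay a unit quaternion

# Example: integrate a constant 0.5 rad/s rotation for one second at 100 Hz
q = np.array([1.0, 0.0, 0.0, 0.0])        # initial upper-arm orientation
for _ in range(100):
    q = integrate_gyro(q, np.array([0.0, 0.5, 0.0]), 0.01)
print("upper-arm orientation quaternion:", q)

# Elbow joint angle from the potentiometer (assumed linear ADC-to-angle mapping)
adc_counts, adc_max, joint_range = 512, 1023, np.pi
print("elbow angle [rad]:", adc_counts / adc_max * joint_range)
```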

A real-time teleoperation system for a Universal Robots robotic arm was developed using an updated version of the previously developed human arm motion tracker together with a visualization utility. An explicit model predictive controller (EMPC) is implemented for online generation of optimal robot trajectories matching the operator's wrist position and orientation while adhering to the robot's constraints. The EMPC proved superior to open-loop and naive PID controllers in terms of accuracy and safety.

Nonlinear model predictive control (NMPC) was utilized to implement quaternion-based robot inverse kinematics, allowing online computation of optimal robot trajectories matching the operator's wrist position and orientation while adhering to the robot's workspace constraints. The NMPC is integrated into the Unity Engine and proved computationally effective for realizing online obstacle avoidance algorithms.
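The NMPC formulation used in the project is not reproduced here; the sketch below only illustrates the general receding-horizon idea under strong simplifications: a point model of the end effector in task space tracks the operator's wrist subject to a speed limit and a single linearized obstacle constraint, solved as a convex program with cvxpy. The dynamics, weights, bounds and obstacle geometry are all illustrative assumptions, not the project's controller.

```python
# Simplified receding-horizon tracking sketch (not the project's NMPC): a point
# model of the end effector follows the operator's wrist while respecting a speed
# limit and a linearized keep-out constraint around a spherical obstacle.
import numpy as np
import cvxpy as cp

dt, N = 0.05, 10                        # control period [s] and horizon length
x0 = np.array([0.4, 0.0, 0.3])          # current end-effector position [m]
wrist = np.array([0.6, 0.2, 0.4])       # tracked wrist position (setpoint)
v_max = 0.5                             # task-space speed limit [m/s]
obs_c, obs_r = np.array([0.5, 0.1, 0.35]), 0.05   # obstacle center and radius

x = cp.Variable((N + 1, 3))
u = cp.Variable((N, 3))
cost, cons = 0, [x[0] == x0]
n = (x0 - obs_c) / np.linalg.norm(x0 - obs_c)     # linearized avoidance normal
for k in range(N):
    cost += cp.sum_squares(x[k + 1] - wrist) + 0.1 * cp.sum_squares(u[k])
    cons += [x[k + 1] == x[k] + dt * u[k],        # simple integrator dynamics
             cp.norm(u[k], 2) <= v_max,           # velocity bound
             n @ (x[k + 1] - obs_c) >= obs_r]     # stay on the safe side
cp.Problem(cp.Minimize(cost), cons).solve()
print("next commanded velocity:", u.value[0])     # only the first input is applied
```

In a receding-horizon loop, only the first computed input is sent to the robot and the problem is re-solved at every control step with the latest wrist measurement.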

3D point cloud of the ALARIS lab taken by a range camera.


3D point clouds of a ball and two cups, constructed from point clouds taken by a 2D Hokuyo range finder at different positions of the sensor.
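As an illustration of how such clouds can be assembled, the sketch below lifts 2D laser scans into a common 3D frame, assuming the sensor pose (here a simple tilt angle) is known for each scan; the scan parameters and synthetic ranges are placeholders, not the actual lab data.

```python
# Sketch of assembling a 3D cloud from 2D laser scans: each scan lies in the
# sensor's x-y plane; with the sensor pose known for every scan (assumed here to
# be a tilt about the x-axis), the 2D points are lifted into a common frame.
import numpy as np

def scan_to_points(ranges, angle_min, angle_increment):
    """Convert one 2D scan (ranges + bearings) into Nx3 points in the sensor frame."""
    bearings = angle_min + angle_increment * np.arange(len(ranges))
    return np.column_stack([ranges * np.cos(bearings),
                            ranges * np.sin(bearings),
                            np.zeros(len(ranges))])

def rot_x(a):
    """Rotation matrix for a tilt of a radians about the x-axis."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

# Example: three synthetic scans (1 m returns over a 180 deg field of view)
# taken at different tilt angles and merged into one cloud
cloud = []
for tilt in np.radians([-10.0, 0.0, 10.0]):
    ranges = np.full(181, 1.0)
    pts = scan_to_points(ranges, -np.pi / 2, np.pi / 180)
    cloud.append(pts @ rot_x(tilt).T)                # sensor frame -> world frame
cloud = np.vstack(cloud)
print("assembled cloud:", cloud.shape)               # (543, 3)
```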


Project Publications

M. Rubagotti, T. Taunyazov, B. Omarali, A. Shintemirov, Model Predictive Control for Semi-Autonomous Robot Teleoperation with Obstacle Avoidance, IEEE/ASME Transactions on Mechatronics (under review)

B. Omarali, T. Taunyazov, A. Bukeyev, A. Shintemirov, Real-Time Predictive Control of an UR5 Robotic Arm Through Human Upper Limb Motion Tracking, The 2017 ACM/IEEE International Conference on Human-Robot Interaction (HRI2017), Austria, 2017 ACM DL pdf

T. Taunyazov, B. Omarali, A. Shintemirov, A Novel Low Cost 4-DOF Wireless Human Arm Motion Tracker System, 6th IEEE RAS & EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob2016), Singapore, 2016 IEEE Xplore

A. Begalinova, A. Shintemirov, Design of Embedded Gesture Recognition System for Robotic Applications, 2014 IEEE 8th International Conference on Application of Information and Communication Technologies (AICT 2014), Astana, Kazakhstan, October 2014 IEEE Xplore pdf

A. Saudabayev, Y. Khassanov, A. Shintemirov, H. A. Varol, An Intelligent Object Manipulation Framework for Industrial Tasks, The IEEE International Conference on Mechatronics and Automation (ICMA 2013), Takamatsu, Kagawa, Japan, August 2013, pp. 1709-1713. IEEE Xplore