Master's Thesis Supervision

Spring 2019 Semester

MSc in Robotics thesis projects 2019

Development of a Low-cost Reconfigurable Adaptive Robotic Gripper with Detachable Fingers

Student: Alikhan Zhilisbayev
Supervisor: Dr. Almas Shintemirov

This thesis focuses on the development of a low-cost robotic gripper with three reconfigurable, modular (detachable) underactuated fingers. The project consists of three main parts: development of Prototype I and its mechanical assembly; improvement to Prototype II with assembly and system integration; and control design and software development.

Design, Motion Planning and Control of a Skid-Steering Mobile Robot

Student: Roman Kruchinin
Supervisor: Dr. Almas Shintemirov

This master thesis presents a control framework for simultaneous localization and mapping (SLAM) and motion control of a skid-steering robot. First, an experimental mobile robot platform is presented and its limitations are discussed, followed by the design of a modified robot embedded control system. The ROS navigation stack is then applied for SLAM and for global path generation to a destination point. Finally, a control framework for global path following by a skid-steering robot is presented, based on the nonlinear model predictive control (NMPC) methodology. The framework is tested in the Webots open-source robotics simulation software.
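The skid-steering kinematics and the receding-horizon idea behind NMPC path following can be sketched as follows. This is a toy stand-in, not the thesis controller: the model is a differential-drive approximation with a placeholder track width, and the "predictive" controller simply holds each candidate wheel-speed pair constant over a short horizon and picks the pair whose predicted terminal position is closest to the goal.

```python
import math

# Simplified skid-steering kinematic model (differential-drive
# approximation; track width b and time step dt are placeholders,
# not the thesis platform's parameters).
def step(state, v_l, v_r, b=0.4, dt=0.1):
    x, y, th = state
    v = 0.5 * (v_l + v_r)        # forward velocity
    w = (v_r - v_l) / b          # yaw rate from the wheel-speed difference
    return (x + v * math.cos(th) * dt,
            y + v * math.sin(th) * dt,
            th + w * dt)

# Toy receding-horizon controller: hold each candidate wheel-speed pair
# constant over the horizon, pick the pair whose predicted terminal
# position is closest to the goal, apply it for one step, repeat.
def control(state, goal, horizon=6):
    speeds = [0.25 * i for i in range(-4, 5)]      # -1.0 ... 1.0 m/s
    best, best_cost = (0.0, 0.0), float("inf")
    for v_l in speeds:
        for v_r in speeds:
            s = state
            for _ in range(horizon):
                s = step(s, v_l, v_r)
            cost = (s[0] - goal[0]) ** 2 + (s[1] - goal[1]) ** 2
            if cost < best_cost:
                best, best_cost = (v_l, v_r), cost
    return best

state, goal = (0.0, 0.0, 0.0), (1.0, 0.5)
for _ in range(100):
    state = step(state, *control(state, goal))
```

A full NMPC formulation would instead optimize a whole input sequence subject to actuator constraints at every control step; the constant-input enumeration above only conveys the prediction-and-replan loop.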


MSc in Computer Science thesis projects 2019

Recognition of 3D Objects for a Robot Arm Using Deep Learning

Student: Sergey Soltan
Main Supervisor: Dr. M. Fatih Demirci (Dept. of Computer Science)
Co-Supervisor: Dr. Almas Shintemirov

Accurate object classification and position estimation are crucial for executing autonomous pick-and-place operations with a robot and can be realized using RGB-D sensors, which are becoming increasingly available for industrial applications. In this master thesis project we present a novel unified framework for object detection and classification using a combination of point cloud processing and deep learning techniques. The proposed model uses two streams that recognize objects from RGB and depth data separately and combines the two in later stages to classify objects. Experimental evaluation of the proposed model, including a classification accuracy comparison with previous works, demonstrates its effectiveness and efficiency, making the model suitable for real-time applications. Experiments on the publicly available Washington RGB-D object dataset show that the proposed framework has 98% fewer parameters than state-of-the-art multimodal neural networks, at the cost of an approximately 5% drop in accuracy.
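The two-stream, late-fusion idea can be sketched in a few lines. This skeleton is illustrative only: layer sizes and the random (untrained) weights are placeholders, not the thesis network; only the class count (51) matches the Washington RGB-D object categories.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

# Two-stream late-fusion classifier skeleton: each stream embeds its own
# modality independently; the embeddings are concatenated ("late fusion")
# and classified jointly. All sizes and weights are placeholders.
class TwoStream:
    def __init__(self, d_rgb, d_depth, d_emb, n_classes):
        self.W_rgb = rng.normal(0.0, 0.1, (d_rgb, d_emb))
        self.W_depth = rng.normal(0.0, 0.1, (d_depth, d_emb))
        self.W_head = rng.normal(0.0, 0.1, (2 * d_emb, n_classes))

    def forward(self, x_rgb, x_depth):
        h_rgb = np.tanh(x_rgb @ self.W_rgb)               # RGB stream
        h_depth = np.tanh(x_depth @ self.W_depth)         # depth stream
        fused = np.concatenate([h_rgb, h_depth], axis=1)  # late fusion
        return softmax(fused @ self.W_head)

# 51 classes matches the Washington RGB-D object category count.
model = TwoStream(d_rgb=512, d_depth=512, d_emb=64, n_classes=51)
probs = model.forward(rng.normal(size=(4, 512)), rng.normal(size=(4, 512)))
```

The parameter saving reported in the abstract comes from keeping the per-modality streams small relative to full multimodal networks; the fusion point is the main architectural choice.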


Spring 2017 Semester

MSc in Robotics thesis projects 2017

Intuitive Teleoperation of 6-DoF Universal Robots Manipulators in Constrained Workspace using Nonlinear Model Predictive Control

Student: Tasbolat Taynyazov
Supervisor: Dr. Almas Shintemirov
Co-supervisor: Dr. Matteo Rubagotti

This master thesis project presents a novel control framework for intuitive teleoperation of Universal Robots manipulators in a constrained environment using a human upper limb tracking system. First, novel hardware and software designs of a 7-DoF wireless human upper limb tracking system are developed. The tracking system consists of separate individual sensor units for easy mounting on a human arm and easy replacement. The presented experimental tests demonstrate the accuracy of the developed system, achieved via careful sensor calibration and data filtering. Secondly, the Nonlinear Model Predictive Control (NMPC) technique is used to formulate the UR manipulator joint velocity control loop in order to integrate the human upper limb tracking system for smooth and accurate intuitive robot teleoperation through motion tracking of the human operator. The limitations of the UR manipulators are taken into account as NMPC constraints, while the UR forward kinematics is used in the optimization function for predictive control of the robot. The performance of the proposed teleoperation framework was experimentally demonstrated on a real UR5 manipulator equipped with a Robotiq 3-finger adaptive gripper. The proposed approach demonstrated superior performance in completing pick-and-place test operations compared to executing the same task with a standard UR teach pendant.
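The UR5 forward kinematics used inside such an NMPC cost function can be computed from the standard Denavit-Hartenberg parameters published by Universal Robots; a minimal sketch:

```python
import numpy as np

# Standard DH parameters published by Universal Robots for the UR5:
# (a, d) in metres, alpha in radians, one row per joint.
DH = [
    (0.0,      0.089159,  np.pi / 2),
    (-0.425,   0.0,       0.0),
    (-0.39225, 0.0,       0.0),
    (0.0,      0.10915,   np.pi / 2),
    (0.0,      0.09465,  -np.pi / 2),
    (0.0,      0.0823,    0.0),
]

def dh_matrix(theta, a, d, alpha):
    """Homogeneous transform for one standard-DH joint."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([[ct, -st * ca,  st * sa, a * ct],
                     [st,  ct * ca, -ct * sa, a * st],
                     [0.0,      sa,       ca,      d],
                     [0.0,     0.0,      0.0,    1.0]])

def ur5_fk(q):
    """End-effector pose (4x4 transform) for a 6-vector of joint angles."""
    T = np.eye(4)
    for theta, (a, d, alpha) in zip(q, DH):
        T = T @ dh_matrix(theta, a, d, alpha)
    return T

T = ur5_fk(np.zeros(6))  # pose at the all-zero joint configuration
```

In the NMPC setting, this map is evaluated over the prediction horizon so that the cost can penalize the distance between the predicted end-effector pose and the tracked operator wrist pose.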

Mechanical Design and Kinematic Analysis of a Spherical Parallel Manipulator with Coaxial Input Shafts

Student: Iliyas Tursynbek
Supervisor: Dr. Almas Shintemirov

In this thesis, a spherical parallel manipulator with coaxial input shafts (Coaxial SPM) is studied. It belongs to the larger family of spherical parallel manipulators (SPMs) and has the special feature of unlimited roll rotation around its axis, which makes the Coaxial SPM of high interest for applications in motion control systems. First, an approach for obtaining unique forward and inverse kinematics solutions is introduced in order to relate the angular positions of the manipulator servomotors to the position and orientation of the Coaxial SPM mobile platform, and vice versa. Then, the configuration space of the manipulator is defined using a numerical procedure, in order to guarantee the absence of singularities and of collisions between the manipulator links during motion. Afterwards, the Cartesian workspace of the manipulator is generated. The results of these analyses are applied to the assembled mechanical prototype of the Coaxial SPM for experimental verification.
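The numerical singularity-scanning idea can be illustrated on a much simpler mechanism. The sketch below uses a planar 2R arm as a stand-in (the actual Coaxial SPM kinematics are considerably more involved, and the link lengths here are placeholders): configurations are sampled on a grid and flagged as singular when the smallest singular value of a finite-difference Jacobian of the forward kinematics drops toward zero.

```python
import numpy as np

# Planar 2R arm as a stand-in mechanism; L1, L2 are placeholder lengths.
L1, L2 = 1.0, 0.8

def fk(q):
    """Tip position of the planar 2R arm for joint angles q = (q1, q2)."""
    return np.array([L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1]),
                     L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])])

def jacobian(q, eps=1e-6):
    """Central finite-difference Jacobian of the FK map."""
    J = np.zeros((2, 2))
    for i in range(2):
        dq = np.zeros(2)
        dq[i] = eps
        J[:, i] = (fk(q + dq) - fk(q - dq)) / (2 * eps)
    return J

def is_singular(q, tol=1e-2):
    # Smallest singular value near zero => a lost degree of freedom here.
    return np.linalg.svd(jacobian(q), compute_uv=False)[-1] < tol

# Scan a coarse grid of configurations; for the 2R arm the fully
# stretched (q2 = 0) and fully folded (q2 = +/-pi) poses are singular.
grid = [(q1, q2) for q1 in np.linspace(-np.pi, np.pi, 9)
                 for q2 in np.linspace(-np.pi, np.pi, 9)]
singular = [q for q in grid if is_singular(np.array(q))]
```

The same grid-scan pattern extends to an SPM: replace the FK map and add link-collision checks per sample to carve out a guaranteed singularity-free, collision-free configuration space.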


Fall 2016 Semester

MSc in Mechanical Engineering thesis projects 2017

Real-Time Predictive Control of a UR5 Robotic Arm Through Human Upper Limb Motion Tracking

Student: Bukeikhan Omarali
Supervisor: Dr. Almas Shintemirov
Co-supervisor: Dr. Hazrat Ali (Dept. of Mechanical Engineering)

This thesis reports the authors’ results on developing a real-time predictive control system for a Universal Robots UR5 robotic arm through human motion capture, with a visualization environment built in the Blender Game Engine. The UR5 is a 6-DoF serial manipulator commonly used in academia and light industry. Its safe-by-design architecture comes at the cost of a rather limited API with little support for real-time operation. Motion tracking is performed by a wireless, low-cost inertial motion capture setup produced in-house. The external controller incorporates an iTaSC SDLS inverse kinematics solver and a Python-wrapped C explicit model predictive controller generated using the Multi-Parametric Toolbox. The visualization provides the user with feedback on the robot’s progress towards the target; it is planned to extend the visualization to virtual reality in the future. Tests have shown that the robot follows the operator’s wrist position and orientation with an average time lag of 0.05 s when the operator moves within the robot’s velocity and acceleration limits. When the operator moves too fast for the robot to keep up in real time, the robot catches up with the operator with little or no overshoot. The thesis results are described in a late-breaking report and demo accepted at the 12th annual ACM/IEEE International Conference on Human-Robot Interaction (HRI 2017).
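A damped least-squares iteration of the kind underlying SDLS-style inverse kinematics solvers can be sketched on a planar two-link arm. This is plain DLS with placeholder link lengths and damping, not the actual iTaSC implementation used in the thesis:

```python
import numpy as np

# Planar 2R arm; lengths loosely echo the UR5 upper-arm/forearm scale
# but are placeholders, not the robot's geometry.
L1, L2 = 0.425, 0.392

def fk(q):
    """Tip position for joint angles q = (q1, q2)."""
    return np.array([L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1]),
                     L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])])

def jac(q):
    """Analytic Jacobian of the tip position."""
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-L1 * s1 - L2 * s12, -L2 * s12],
                     [ L1 * c1 + L2 * c12,  L2 * c12]])

def dls_ik(target, q, damping=0.1, iters=200):
    """Damped least-squares IK: dq = (J^T J + lambda^2 I)^-1 J^T e."""
    for _ in range(iters):
        e = target - fk(q)                 # task-space error
        J = jac(q)
        dq = np.linalg.solve(J.T @ J + damping**2 * np.eye(2), J.T @ e)
        q = q + dq                         # damping bounds dq near singularities
    return q

q = dls_ik(np.array([0.3, 0.4]), np.array([0.1, 0.1]))
```

The damping term is what keeps the step size bounded near singular configurations, which is the practical reason selectively damped variants are favored for real-time teleoperation loops.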