ARAS Visual Robotics Research Group

The visual robotics group began its research by equipping one of our industrial robotic manipulators with a camera in order to automatically track dynamic objects within the robot's workspace. This goal was fully accomplished on our 5-DoF Mitsubishi industrial robot in an eye-in-hand configuration. The first tracking algorithm applied an extended Kalman filter (EKF) to moving objects with marked features. The group then focused on tracking featureless objects, in particular through kernel-based visual servoing methods, in which Fourier and log-polar transformations are used to compute the visual kernel of the images efficiently. Building on this, we introduced a sliding mode controller design for kernel-based visual servoing.
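
To give a flavour of how the log-polar transformation is used, the following minimal sketch (an illustration built on OpenCV and NumPy, not the group's implementation) remaps a grayscale image to log-polar coordinates, where a rotation and a scaling of the scene appear as plain shifts that can be recovered by phase correlation; the axis-to-parameter conversion assumes the default warpPolar layout.

import cv2
import numpy as np

def log_polar(img):
    # Remap the image so that rotation and scale changes become shifts
    # along the angular (vertical) and radial (horizontal) axes.
    h, w = img.shape[:2]
    center = (w / 2.0, h / 2.0)
    max_radius = min(w, h) / 2.0
    return cv2.warpPolar(img, (w, h), center, max_radius,
                         cv2.WARP_POLAR_LOG + cv2.INTER_LINEAR)

def estimate_rotation_scale(reference, current):
    # Phase correlation between the two log-polar images: the vertical shift
    # maps to rotation, the horizontal shift to a change in log-scale.
    # Both inputs are assumed to be single-channel (grayscale) images.
    lp_ref = np.float32(log_polar(reference))
    lp_cur = np.float32(log_polar(current))
    (dx, dy), _ = cv2.phaseCorrelate(lp_ref, lp_cur)
    h, w = reference.shape[:2]
    rotation_deg = 360.0 * dy / h
    scale = np.exp(dx * np.log(min(w, h) / 2.0) / w)
    return rotation_deg, scale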

The main goal is to track a target object without any guiding features such as lines or points. In the kernel-based approach, a weighted sum of the image signal, or of its Fourier transform, is used as the tracking measurement, known as the kernel measurement. The tracking error is the difference between the current and desired kernel measurements, and it serves as the input to an integral sliding mode controller. By combining the kernel measurement with sliding mode control, the resulting system outperforms conventional kernel-based visual servoing. The proposed method is implemented on the Mitsubishi industrial robot and compared with the conventional kernel-based approach for different initial conditions, and the stability of the proposed algorithm is analyzed via Lyapunov theory. Uncertainties such as image noise, image blur, and camera calibration errors can affect the stability of the algorithm and may cause the target object to partially or totally leave the image, leading to task failure. To reduce the effect of bounded uncertainties, the controller parameters are tuned automatically based on the sliding condition.
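
As an illustration of these ingredients, the sketch below (a minimal example, not the published controller) computes a Gaussian-weighted kernel measurement and applies a PI-type integral sliding mode update; the interaction matrix L, the kernel placement, the gains, and the boundary-layer saturation are assumptions made only for this example.

import numpy as np

def gaussian_kernel(shape, center, sigma):
    # Smooth weighting function over the image plane.
    ys, xs = np.mgrid[0:shape[0], 0:shape[1]]
    return np.exp(-((xs - center[0]) ** 2 + (ys - center[1]) ** 2) / (2.0 * sigma ** 2))

def kernel_measurement(image, kernels):
    # Kernel measurement: weighted sum of image intensities for each kernel.
    return np.array([np.sum(k * image) for k in kernels])

class IntegralSlidingModeController:
    # PI-type sliding surface s = e + lam * integral(e), with a boundary layer
    # (saturation) replacing the discontinuous sign() term to limit chattering.
    def __init__(self, L, lam, gain, dt, boundary=1.0):
        self.L_pinv = np.linalg.pinv(L)  # pseudo-inverse of the assumed interaction matrix
        self.lam, self.gain, self.dt, self.boundary = lam, gain, dt, boundary
        self.e_int = None

    def update(self, phi, phi_desired):
        e = phi - phi_desired                        # kernel-measurement tracking error
        if self.e_int is None:
            self.e_int = np.zeros_like(e)
        self.e_int = self.e_int + e * self.dt
        s = e + self.lam * self.e_int                # integral sliding surface
        sat = np.clip(s / self.boundary, -1.0, 1.0)  # boundary-layer approximation of sign(s)
        return -self.L_pinv @ (self.lam * e + self.gain * sat)  # camera velocity command

In practice the returned camera-frame velocity would still have to be mapped to joint commands through the robot Jacobian, and the gain would be adapted online from the uncertainty bounds, as described above.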

The other avenue examined within this research theme was the use of monocular and stereo vision for environment perception on a mobile robotic platform. This research was carried out in collaboration with our Autonomous Robotics group and is currently being pursued.

Notable Alumni

Seyed Farokh Atashzar

Postdoctoral Research Associate
Canadian Surgical Technologies and Advanced Robotics (CSTAR), Canada.

Mahya Shahbazi

Postdoc
University of Western Ontario, Canada

Collaborators

Our collaborators in this research group include Prof. Farrokh Janabi-Sharifi, Ryerson University, Canada.

Alumni

Javad Ramezanzadeh, Fatemeh Bakhshandeh, Mahsa Parsapour, Parisa Masnadi, Aida Farahani, Seyed Farokh Atashzar, Mahya Shahbazi, Sahar Sedaghati, Homa Ammari, Mehrnaz Salehian, Soheil Rayatdoost, Mohammad Reza Sadeghi, Farzaneh Sedaghat.

Related Publications

A Robust Approach Toward Kernel-Based Visual Servoing
Mahsa Parsapour and Hamid D. Taghirad
2017 5th RSI International Conference on Robotics and Mechatronics (ICRoM)
Abstract:

We introduce a robust controller for kernel-based visual servoing systems. In such systems, visual features are sums of weighted image intensities obtained via smooth kernel functions. This information, along with its derivative, is input to the controller, in which we have developed a sliding mode approach to generate system commands. In vision-based systems, image uncertainties affect the tracking performance and stability, and the target object may get out of the field of view. Unless considerable image uncertainties appear, such systems are able to track the object within the desired precision. Hence, we have investigated the effect of image noise as the main source of uncertainty, and encapsulated its characteristics in a proper representation. In order to fulfill the sliding condition, some bounds on the image uncertainty and tracking errors are considered, and the controller gains are tuned online to keep the tracking error bounded. An application of the proposed method is experimentally tested on an industrial robot.

2017, Conference, Visual Robotics
Visual Servoing Simulator by Using ROS and Gazebo
Parisa Masnadi Khiabani, Babak Sistanizadeh Aghdam, Javad Ramezanzadeh and Hamid D. Taghirad
International Conference on Robotics and Mechatronics
Abstract:

In this paper, a simulator for a five degree-of-freedom (DOF) visual servoing robot with an eye-in-hand configuration is presented. The simulator has been developed in the Robot Operating System (ROS) and Gazebo environment, and it eases the process of testing and debugging visual servoing schemes and robot controllers. Among the different methods, one of the existing image-based visual servoing schemes, image moments, has been implemented to verify the functionality and performance of the designed simulator.

2016, Conference, Visual Robotics
Kernel-based sliding mode control for visual servoing system
Mahsa Parsapour, Hamid D. Taghirad
IET Computer Vision
Abstract:

In this study, a new approach to design a controller for a visual servoing (VS) system is proposed. Kernel-measurement is used to track the motion of a featureless object which is defined as sum of weighted-image value through smooth kernel functions. This approach was used in kernel-based VS (KBVS). To improve the tracking error and expand the stability region, sliding mode control is integrated with kernel measurement. Proportional-integral-type sliding surface is chosen as a suitable manifold to produce the required control effort. Moreover, the stability of this algorithm is analysed via Lyapunov theory and its performance is verified experimentally by implementing the proposed method on a five degrees of freedom industrial robot. Through experimental results, it is shown that the performance of tracking error in the proposed method is more suitable than KBVS, for a wider workspace and when the object is placed near the boundary of the camera's field of view.

2015, Journal, Visual Robotics
Visual Tracking using Kernel Projected Measurement and Log-Polar Transformation
Fateme Bakhshande, Hamid D. Taghirad
International Journal of Robotics Theory and Applications
Abstract:

Visual servoing generally consists of control and feature tracking. A study of previous methods shows that no attempt has been made to optimize these two parts together. In the kernel-based visual servoing method, the main objective is to combine and optimize these two parts together to form an entire control loop. This is accomplished using Lyapunov theory: a Lyapunov candidate function is formed based on the kernel definition such that Lyapunov stability can be verified. The implementation is done in four degrees of freedom, and the Fourier transform is used to decouple the rotation and scale directions from the 2D translation. In the present study, a new method for scale and rotation correction is presented, in which the Log-Polar Transform is used instead of the Fourier transform for these two degrees of freedom. Tracking in four degrees of freedom is synthesized to show the visual tracking of an unmarked object. A comparison between the Log-Polar transform and the Fourier transform shows the advantages of the presented method; the KBVS scheme based on the Log-Polar transform is proposed in this paper because of its robustness, speed, and featureless properties.

2015, Journal, Visual Robotics
A 3D Sliding Mode Control Approach for Position Based Visual Servoing System
Mahsa Parsapour, Soheil RayatDoost, and Hamid D. Taghirad
Scientia Iranica
Abstract:

The performance of visual servoing systems can be enhanced through nonlinear controllers. In this paper, a sliding mode control is employed for such a purpose. The controller design is based on the outputs of a pose estimator which is implemented on the scheme of the Position-Based Visual Servoing (PBVS) approach. Accordingly, a robust estimator based on unscented Kalman observer cascading with Kalman filter is used to estimate the position, velocity and acceleration of the target. Therefore, a PD-type sliding surface is selected as a suitable manifold. The combination of the estimator and nonlinear controller provides a robust and stable structure in PBVS approach. The stability analysis is verified through Lyapunov theory. The performance of the proposed algorithm is verified experimentally through an industrial visual servoing system.

2015, Journal, Visual Robotics
Position Based Sliding Mode Control for Visual Servoing System
M. Parsapour, S. RayatDoost, and H. D. Taghirad
2013 First RSI/ISM International Conference on Robotics and Mechatronics (ICRoM)
Abstract:

This paper presents a nonlinear controller for a visual servoing system. Pose estimation is one of the fundamental issues in the position-based visual servoing (PBVS) approach, yet few studies have focused on controller synthesis under modeling uncertainty and measurement noise in the estimated position. In this research, a PD-type sliding surface is designed for tracking the target. The control signal is obtained from the sliding surface, and the stability of the algorithm is verified by Lyapunov theory. Moreover, a recently designed robust estimator based on an unscented Kalman observer (UKO) cascaded with a Kalman filter (KF) is used to estimate the pose, velocity and acceleration of the target. The combination of the implemented estimator and the proposed controller provides a stable and robust structure in PBVS. The reported experimental results verify the effectiveness of the proposed method in an industrial visual servoing system.

2013, Conference, Visual Robotics
Visual Tracking in Four Degrees of Freedom Using Kernel Projected Measurement
Fateme Bakhshande and Hamid D. Taghirad
2013 First RSI/ISM International Conference on Robotics and Mechatronics (ICRoM)
Abstract:

Visual servoing is generally composed of feature tracking and control. According to the literature, no attempt has previously been made to optimize these two parts together. In the kernel-based visual servoing method, the main objective is to combine and optimize the entire control loop. From the kernel definition, a Lyapunov candidate function is formed and the control input is computed so that Lyapunov stability can be verified. This is performed in four degrees of freedom. In the present study, the previous kernel algorithm from the literature has been implemented, and KBVS is used to track an object without any marker. This method is chosen because of its robustness, speed and featureless properties. Furthermore, in order to show the visual tracking performance, all four degrees of freedom have been synthesized. Experimental results verify the effectiveness of this method implemented for four degrees of freedom movements.

2013, Conference, Visual Robotics
Robust solution to three-dimensional pose estimation using composite extended Kalman observer and Kalman filter
H.D. Taghirad, S.F. Atashzar and M. Shahbazi
IET Computer Vision
Abstract:

Three-dimensional (3D) pose estimation of a rigid object by only one camera has a vital role in visual servoing systems, and extended Kalman filter (EKF) is vastly used for this task in an unstructured environment. In this study, the stability of the EKF-based 3D pose estimators is analysed in detail. The most challenging issue of the state-of-the-art EKF-based 3D pose estimators is the possibility of its divergence because of the measurement and model noises. By analysing the stability of conventional EKF-based pose estimators a composite technique is proposed to guarantee the stability of the procedure. In the proposed technique, the non-linear-uncertain estimation problem is decomposed into a non-linear-certain observation in addition to a linear-uncertain estimation problem. The first part is handled using the extended Kalman observer and the second part is accomplished by a simple Kalman filter. Finally, some experimental and simulation results are given in order to verify the robustness of the method and compare the performance of the proposed method in noisy and uncertain environment to the conventional techniques.

2012, Journal, Visual Robotics