ARAS Public Webinars

ARAS aims to promote knowledge among a broad and interested audience. For this reason, a monthly public webinar is organized in which the research findings of different research teams are presented as a general overview intended to be useful to the public.
#11 - ARAS Current and Future Research: Developed Products and Algorithms
Prof. Hamid D. Taghirad
Webinar Video

Abstract:

The ARAS research group originated in 1997 and is proud of its 24+ years of distinguished background and its contributions to the advancement of academic education and research in the analysis and control of dynamical systems in robotics applications. ARAS's current research themes include parallel and cable robotics, surgical robotics, and autonomous robotics. Many products and algorithms have been developed in this research group. In this presentation, an overview of these research topics is given, and the developed products and algorithms are briefly introduced.

Date: Monday, Oct. 11, 2021 (19 Mehr 1400)

Time: 17:30-19:00 (+3:30 GMT Tehran local time) | 10:30-12:00 (-4:00 GMT Canada Eastern Time Zone)

#10 - Cable-Driven Parallel Robots: Control Challenges and Solutions

Abstract:

In recent decades, the growing need of industry for high-speed robots with a large workspace has led to the development of cable-driven parallel robots (CDPRs). Having significantly lower inertia than rigid-link robots makes CDPRs a unique alternative for applications with large workspace requirements and high-speed manipulation demands. However, cables can exert force only in the pulling direction. Positive cable tension in CDPRs is guaranteed either through a passive force, such as gravity, or by using redundant cables in the structure. This challenge motivates researchers to develop control algorithms for CDPRs that maintain positive tension in all cables while achieving suitable tracking performance. This talk addresses the control challenges in CDPRs and discusses some approaches that are suitably employed for this class of robots. In particular, the first part of the talk presents CDPR kinematics, the different types of workspace, and the robot dynamics. The second part focuses on the control challenges of CDPRs, and some solutions are then introduced. In the end, ongoing and future directions in the control of CDPRs will be discussed.
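
The core constraint described above, that every cable must remain in tension, can be illustrated with a minimal sketch. Assume a hypothetical 1-DOF mass driven by two opposing cables (not one of the robots presented in the talk): the smallest tensions satisfying both the force balance and the minimum-tension bound can be found in closed form, while redundant CDPRs solve the same tension-distribution problem via optimization.

```python
def distribute_tensions(f_desired, t_min=1.0):
    """Minimal tensions for a 1-DOF mass pulled by two opposing cables.

    The cables must satisfy t1 - t2 = f_desired with t1, t2 >= t_min,
    since cables can only pull. This is the simplest instance of the
    tension-distribution problem that redundant CDPRs solve numerically.
    """
    # If f_desired is negative, cable 2 must carry extra tension so that
    # cable 1 can stay at or above its lower bound.
    t2 = max(t_min, t_min - f_desired)
    t1 = f_desired + t2
    return t1, t2
```

For example, a desired force of -3 N with a 1 N minimum tension yields t1 = 1 N and t2 = 4 N: the net force is produced while both cables stay taut.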

Date: Monday, July 26, 2021 (4 Mordad 1400)

Time: 18:00-19:30 (+4:30 GMT Tehran local time) | 9:30-11:00 (-4:00 GMT Canada Eastern Time Zone)

#9 – Safe Reinforcement Learning (SRL) Using Control Barrier Functions
Zahra Marvi
Webinar Teaser

Abstract: This talk presents a method to learn barrier-certified safe controllers for safety-critical systems while providing optimal performance, with a focus on the reinforcement learning approach. The problem is to design optimal controllers for systems with unknown dynamics through interaction, while safety specifications of the system, such as state constraints, must be satisfied. We first review the basics of reinforcement learning in control, such as the overall framework, the Bellman equation, actor/critic approximations, and sequential improvement of the controller by reducing the prediction error. Then, different types of control barrier functions and their application to restricting the states of the system within a desired safe region, and therefore guaranteeing safety, are discussed. The safe reinforcement learning problem is then formulated by means of control barrier functions to achieve safe performance. Safety, stability, and optimality of the proposed method are discussed, and finally the off-policy reinforcement learning algorithm implementing the proposed method is presented.
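
As a minimal, hypothetical illustration of a control-barrier-function safety filter (a textbook sketch, not the method presented in the talk), consider a single integrator x' = u that must stay below x_max, with barrier h(x) = x_max - x. The CBF condition h' + alpha*h >= 0 reduces to an upper bound on u, so the usual QP projection of the nominal input onto the safe set becomes a simple clamp:

```python
def cbf_safety_filter(x, u_nominal, x_max=1.0, alpha=2.0):
    # Single integrator x' = u with barrier h(x) = x_max - x >= 0.
    # The CBF condition h' + alpha*h >= 0 gives u <= alpha*(x_max - x);
    # projecting u_nominal onto this half-space is a one-sided clamp.
    return min(u_nominal, alpha * (x_max - x))
```

Far from the boundary the nominal input passes through unchanged; near the boundary the filter overrides it just enough to keep h nonnegative.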

Date & Time: Monday, May 31, 2021 (10 Khordad 1400) | 18:00-19:30 (+4:30 GMT Tehran local time), 9:30-11:00 (-4:00 GMT Canada Eastern Time Zone)

#8 – Artificial Intelligence & Haptic Technology in Intraocular Surgery Training (IFEES)


Date & Time: April 21, 2021 | 17:30 (+4:30 GMT Tehran local time) | 9:30 (-4:00 GMT Canada Eastern Time Zone)

#7 – Safe and Resilient Autonomous Navigation in Highly Dynamic Environments
Research Scientist, Department of Aeronautics and Astronautics, MIT

Abstract: Despite all recent advances in robotics and automation, building a resilient, safe, and practical autonomous system with the ability to interact with the environment efficiently and to overcome the challenges of real-world scenarios is not trivial. This talk addresses three critical challenges in building a fully autonomous system: safety, transferability, and intractability. In particular, the first part of the talk focuses on the challenges of self-driving vehicles navigating in highly dynamic environments. A transferable and scalable algorithm is introduced that incorporates the environment context to predict the motion behaviors of pedestrians in environments with a high level of uncertainty. The presented framework is also able to learn continually when data becomes available incrementally, leading to a real-time learning and inference paradigm. Furthermore, the extension of the context-based perception pipeline to multi-agent learning, such as fleets of autonomous vehicles (AVs) or smart nodes (IX), will be described. The second part of the talk demonstrates an example of an end-to-end distributed and scalable pipeline for collective transport of an unknown object by a team of robots with limited sensing. At the end, ongoing and future directions in the safety and robustness of visual autonomous navigation systems will be discussed.

Date & Time: Monday, April 19, 2021 | 18:00-19:30 (+4:30 GMT Tehran local time) | 9:30-11:00 (-4:00 GMT Canada Eastern Time Zone)

#6 – 10th Translational Ophthalmology Research Center Seminar


Intraocular surgery is a hot topic of research among researchers looking into novel areas in which haptic systems and assistive technologies would be beneficial. Since the human eye is a highly delicate organ with minuscule anatomic structures, ocular surgeries need to be performed with greater precision and manipulation capability than other surgeries. In fact, any minute mis-manipulation by the surgeon, which might be negligible in the majority of other surgical operations, might lead to disastrous complications and even blindness for the patient in ocular surgeries. This fact underlines the importance of assistive technologies in eye surgical procedures. Assistive technologies aim to give the surgeon extra manipulation capability during the operation or to help novice surgeons obtain the required skills before performing actual operations in the operating room. This presentation reviews recent breakthroughs along with new areas in which haptic systems and assistive technologies might provide a viable solution to current challenges. Furthermore, the ARAS haptic system developed for eye surgery training will be introduced as a novel system used in intraocular surgeries.

Date & Time: Friday, December 11th, 2020 | 15:00-21:00 (Tehran Local Time)

#5 – ARAS Research on Swarm Robotics in Oil Spill Monitoring and Cleanup

Oil spills are serious threats to the marine environment in an era of increasing environmental concern. Considerable effort has been made to tackle this problem; among the proposed solutions, using swarm robotics for oil spill monitoring and cleanup is a promising one. Thanks to recent progress in robotics, there are opportunities to use autonomous aerial drones for labor-intensive environment-monitoring purposes. Engaging multiple drones instead of one not only increases the robustness and scalability of the system in time-sensitive and hazardous events such as oil spills, but, in spatiotemporal oil spill cases, also increases the accuracy of measurements by sampling the environment at multiple points concurrently. Marine environmental monitoring tasks can strongly benefit from these advantages, as the monitoring areas are typically large and communication with a central control unit might not always be available. Despite these obvious advantages, more work needs to be done to understand the capabilities and limitations of autonomous systems and the resources required in the marine environment, and to ensure their acceptable use to the regulatory agencies.

In this presentation, we first review a number of aerial drones developed in ARAS, which are very promising for such applications. Then a review of swarm robotics concepts and potential research gaps is given. Finally, the application of oil spill monitoring and cleanup in the Persian Gulf is given particular attention. Since monitoring the expansion of an oil spill on the water surface is a spatiotemporal problem, we also address oil spill modeling with a Gaussian mixture model, which is based on NOAA's advanced oil spill model (GNOME). Furthermore, a cooperative control framework developed for a group of unmanned aerial vehicles (UAVs) will be introduced as a novel strategy for oil spill monitoring in the Persian Gulf.
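
As a minimal sketch of the Gaussian-mixture idea mentioned above (the component parameters here are illustrative placeholders, not values derived from GNOME), the surface concentration at a point can be evaluated as a weighted sum of isotropic 2D Gaussians:

```python
import math

def spill_concentration(x, y, components):
    # Gaussian-mixture surface-concentration model: each component is a
    # tuple (weight, mean_x, mean_y, sigma) with isotropic covariance.
    # A simplified stand-in for a mixture fitted to spill-particle data.
    total = 0.0
    for w, mx, my, sigma in components:
        d2 = (x - mx) ** 2 + (y - my) ** 2
        total += w * math.exp(-d2 / (2.0 * sigma ** 2)) / (2.0 * math.pi * sigma ** 2)
    return total
```

A drone fleet can then be steered toward regions where this modeled concentration is high, which is the sense in which the mixture model supports cooperative monitoring.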

Date & Time: Monday, Nov. 30, 2020 (10 Azar 1399) | 18:00-20:00 (+3:30 GMT Tehran local time) | 9:30-11:30  (-5:00 GMT Canada Eastern Time Zone)

#4 – ARAS Research on Haptic Technology in Intraocular Surgeries

Intraocular surgery is a hot topic of research among researchers looking into novel areas in which haptic systems and assistive technologies would be beneficial. Since the human eye is a highly delicate organ with minuscule anatomic structures, ocular surgeries need to be performed with greater precision and manipulation capability than other surgeries. In fact, any minute mis-manipulation by the surgeon, which might be negligible in the majority of other surgical operations, might lead to disastrous complications and even blindness for the patient in ocular surgeries.

This fact underlines the importance of assistive technologies in eye surgical procedures. Assistive technologies aim to give the surgeon extra manipulation capability during the operation or to help novice surgeons obtain the required skills before performing actual operations in the operating room. This presentation reviews recent breakthroughs along with new areas in which haptic systems and assistive technologies might provide a viable solution to current challenges. Furthermore, the ARAS haptic system developed for eye surgery training will be introduced as a novel system used in intraocular surgeries.

Time & Date: Monday, Nov. 2, 2020 | 18:00-19:30 (+3:30 GMT Tehran local time) | 9:30-11:00 (-5:00 GMT Canada Eastern Time Zone)

#3 – ARAS Research on Artificial Intelligence and Deep Learning Methods in Autonomous Robotics

Artificial intelligence has found its permanent place among cutting-edge research in various applications. In particular, deep learning approaches are very promising, providing optimized solutions for a variety of real-world problems. Considering the enhanced computational capability to execute these algorithms on embedded systems such as NVIDIA Jetson boards with outstanding performance, deep learning approaches are progressively employed in autonomous robotics. As a major part of each autonomous robot, a camera plays a significant role in extracting rich information about the surrounding environment.

Object detection and tracking is a necessary task for an autonomous robot to maneuver suitably in an unstructured environment. Furthermore, these methods are very promising in other applications, such as monitoring and evaluating a process. Regarding the tracking task, estimating depth is highly important for locating a 3D object in the environment. It is usually critical to have a depth map of the robot's frontal view, which may be considered a more comprehensive requirement for autonomous vehicles. In this presentation, a review of the ARAS autonomous robotics group's work on the development, implementation, and optimization of deep learning approaches, especially in image processing, is presented. We focus on applications for autonomous robots and vehicles and introduce some of our recent industrial projects.
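
Depth maps like those mentioned above are often derived from a rectified stereo pair, where depth follows the standard pinhole relation Z = f·B/d. This is a textbook formula, not a description of the ARAS pipeline, and the parameter values below are hypothetical:

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    # Rectified stereo pinhole relation: Z = f * B / d, where f is the
    # focal length in pixels, B the camera baseline in meters, and d the
    # per-pixel disparity. Applying it per pixel yields a depth map.
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```

For instance, a 100 px disparity with a 500 px focal length and a 0.2 m baseline corresponds to a depth of 1 m; smaller disparities map to more distant points.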

Date & Time: Monday, Sept. 28, 2020 | 18:00-19:30 (+4:30 GMT Tehran local time) | 9:30-11:00 (-4:00 GMT Canada Eastern Time Zone)

#2 – ARAS Research on Parallel and Cable Robotics: Products and Algorithms

Cable and parallel robotics have been gaining attention among researchers due to their unique characteristics and applications. Simple structure, high payload capacity, and agile movement are the main characteristics that distinguish cable robots from other types of manipulators for many applications such as imaging, cranes, and agriculture. The ARAS Parallel and Cable Robotics (PACR) group is focused on the development of such novel manipulators and their possible applications. Interdisciplinary research fields, such as dynamic and kinematic analysis using classic and modern approaches, development of easily deployable robots through robust controllers, implementation of novel self-calibration algorithms, and establishment of modern multi-sensor perception systems, are among the active lines of research in this group. The theoretical results of this active research group are also directly incorporated into commercial products through the spin-off and startup companies originating from the team.

The Kamalolmolk® robot is a representative of such products: a rapidly deployable edutainment cable-driven robot for calligraphy and painting (chiaroscuro) applications. Additionally, PACR exploits the simplicity of cable robots combined with SLAM and perception algorithms to create commercial inspection and imaging robots for various applications. In this webinar, the underlying concepts of such robots and the current state-of-the-art developments of the group will be presented.

Date & Time: Monday, August 31, 2020 | 18:00-19:30 (+4:30 GMT Tehran local time) | 9:30-11:00 (-4:00 GMT Canada Eastern Time Zone)

#1 – ARAS eye surgery training system: A VR/AR Approach

Virtual reality and augmented reality are attracting growing interest as training techniques in medical fields, unlocking significant benefits such as safety, repeatability, and efficiency. Furthermore, VR/AR-based simulators equipped with a haptic device can be used in medical surgery training to improve skills and reduce training time. With haptics as part of the training experience, it is observed that a 30% increase in the speed of skill acquisition and up to a 95% increase in accuracy are achieved. Six out of nine studies showed that tactile feedback significantly improved surgical skill training.

Eye surgery training is considered for VR/AR-based training in the ARAS group, as it is one of the most complex surgical procedures. The ARASH:ASiST haptic system is integrated into the eye surgery training system in conjunction with a physics simulation engine and the Unity software to visualize the simulation results in an Oculus VR headset. The hand motions of an expert surgeon are captured by the haptic system, and the motion data is later used to train surgery students' hand motions through force feedback.

In the developed eye surgery training system, two types of eye surgery are simulated, namely cataract and vitrectomy. In each type, the haptic system is used to simulate the motion of the surgical tool. The interaction of the virtual surgical tool with the 3D-modeled eye is computed through the SOFA framework. The simulation results are transferred to the Unity game engine in order to visualize them in an Oculus VR headset.
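
Force feedback in such simulators is commonly rendered with a spring-damper (penalty) contact model. The sketch below is a generic illustration of that idea with hypothetical gains; it is not the contact model used in ARASH:ASiST or SOFA:

```python
def contact_force(penetration_m, velocity_m_s, k=300.0, b=2.0):
    # Spring-damper (penalty) contact model widely used in haptic
    # rendering: force grows with tool penetration into the virtual
    # tissue and is damped by the approach velocity. Gains k (N/m) and
    # b (N*s/m) are illustrative placeholders.
    if penetration_m <= 0.0:
        return 0.0  # no contact, no force
    return max(0.0, k * penetration_m - b * velocity_m_s)
```

The resulting force is sent to the haptic device each servo cycle, which is how the trainee feels resistance when the virtual tool touches the modeled eye.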

Time & Date: Monday, August 3, 2020 | 18:00-19:30 (+4:30 GMT Tehran local time) | 9:30-11:00 (-4:00 GMT Canada Eastern Time Zone)
