The Doctoral School in Science and Engineering is pleased to invite you to Kuldeep Rambhai BARAD's defence, entitled:
Visual Intelligence for 6-DoF Robotic Grasping of Unknown Objects from Earth to Space
Supervisor: Assoc. Prof. Miguel Angel OLIVARES MENDEZ
Abstract
Autonomous robotic manipulation is fundamental to realizing robots that will assist in homes and support space operations. This dissertation addresses one of the elementary challenges in robotic manipulation: the ability of a robot to grasp the objects it must manipulate. While grasping known objects is well studied, real-world manipulation often involves unknown objects. This work proposes an object-centric approach to learning vision-based 6-DoF grasp synthesis, sim-to-real transfer, and perception for dynamic grasping. The first major contribution is GraspLDM, a generative framework that effectively learns and samples from the complex distribution of object-centric 6-DoF grasp poses for unknown objects. By combining Variational Autoencoders (VAEs) with latent diffusion models, GraspLDM retains the benefits of VAEs while overcoming challenges such as the prior-posterior gap. GraspLDM achieves superior grasp quality compared to existing generative models, while affording the flexibility of task-conditional generation that those models lack. The second contribution is Grasp-O, a modular object-centric grasping system that enables reliable transfer of simulation-trained grasp synthesis models to physical robots. The system is used to validate sim-to-real transfer of GraspLDM, demonstrating a success rate of approximately 80% on 16 unknown objects in different poses across two robotic setups. The third contribution is a novel approach to simultaneous 3D reconstruction and 6-DoF pose tracking of unknown objects to assist dynamic grasping. Using object-centric 3D Gaussian Splatting, the approach accomplishes reconstruction and tracking under a single representation that also interfaces with grasp synthesis models such as GraspLDM. The approach is validated on challenging scenarios involving spacecraft in unconstrained relative motion. Together, these contributions take a step towards general-purpose robotic manipulation that will benefit both terrestrial and space applications.