Monday, August 9, 2021 - 3:00pm to 4:30pm
Location: Virtual Presentation - ET Remote Access - Zoom
Speaker: ZHENG XU, Master's Student
Visual Tracking with Occlusion Handling in Simulated Space Environment
Peg-in-hole insertion is a task in which a robot attempts to position its end-effector at a specific location. Conventional setups align a camera with the end-effector to obtain an unoccluded view of the hole. In our setup for on-orbit satellite servicing, a base satellite attempts to insert a payload, held by a robot arm, into a socket on a client satellite using visual feedback from a camera mounted on the base satellite. A limitation of this setup is that the robot arm can block part of the camera's view, so the socket may not be directly visible. In this work, we build a simulated environment that closely reproduces the real-world setting and propose two methods for handling occlusion in this scenario, which differ in their computational requirements. The first method uses knowledge of the robot arm's pose to estimate occluded areas: the projection of the arm onto the image plane forms a mask of occluded regions, and the tracker avoids the masked regions. The second method is similar, except that the mask is generated by a learned segmentation model. Both methods are robust across various levels of occlusion. The first method has very low computational complexity and can run at high frequency; the second can operate without knowledge of the arm's specifications or pose and provides a better estimate of the occlusion. Our observations show that both methods improve tracking accuracy and reduce the rate at which the tracker loses its target due to occlusion.
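The first method described above can be sketched as follows. This is a minimal illustration, not the author's implementation: it assumes a pinhole camera model with known intrinsics `K`, represents the arm by a set of sample points in the camera frame (hypothetical inputs), dilates their projections into a binary occlusion mask, and excludes masked pixels from a simple sum-of-squared-differences tracking cost.

```python
import numpy as np

def project_points(points_3d, K):
    """Project 3-D points (N, 3), given in the camera frame, onto the
    image plane using pinhole intrinsics K (3x3). Returns (N, 2) pixels."""
    uv = (K @ points_3d.T).T            # homogeneous image coordinates
    return uv[:, :2] / uv[:, 2:3]       # perspective divide

def occlusion_mask(arm_points_3d, K, img_shape, radius=5):
    """Mark pixels near the projected arm points as occluded.
    Returns a boolean mask (True = occluded by the arm)."""
    h, w = img_shape
    mask = np.zeros((h, w), dtype=bool)
    for u, v in project_points(arm_points_3d, K):
        u0, v0 = int(round(u)), int(round(v))
        # Dilate each projected point into a square patch (crude stand-in
        # for projecting the full arm geometry).
        mask[max(0, v0 - radius):min(h, v0 + radius + 1),
             max(0, u0 - radius):min(w, u0 + radius + 1)] = True
    return mask

def masked_ssd(template, patch, mask):
    """Tracking cost: sum of squared differences over unoccluded pixels
    only, normalized by the number of valid pixels."""
    valid = ~mask
    if valid.sum() == 0:
        return np.inf                   # fully occluded: no evidence
    return ((template[valid] - patch[valid]) ** 2).sum() / valid.sum()
```

A tracker would evaluate `masked_ssd` at each candidate location, so occluded pixels never corrupt the match score; because the mask comes from forward kinematics and a single projection, the cost per frame is negligible, consistent with the low computational complexity claimed for this method.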
Howie Choset (Chair)
Matthew J. Travers
Zoom Participation. See announcement.