TY - JOUR
T1 - Development of a mobile robot for visually guided handling of material
AU - Tsay, James Tsing-Iuan
AU - Hsu, M. S.
AU - Lin, R. X.
PY - 2003/12/9
Y1 - 2003/12/9
N2 - Mobile robots frequently replace humans in handling and transporting wafer carriers on semiconductor production lines. This paper describes the construction of such a mobile robot, composed primarily of a mobile base, a robot manipulator, and a vision system. Since the guidance control system of the mobile base inevitably introduces positioning errors, this study employs an eye-in-hand vision system to provide visual information for controlling the manipulator so that it can accurately grasp stationary material during pick-and-place operations between a predefined station and the mobile robot. This work further proposes a position-based look-and-move task-encoding control strategy for the eye-in-hand vision architecture that keeps all target features within the camera's field of view throughout the visual guidance. Moreover, the manipulator can quickly approach the material and precisely position the end-effector in the desired pose. Implementing such a task requires numerous techniques, including image enhancement, edge detection, corner and centroid detection, camera model calibration, robotic hand/eye calibration, a camera with controlled zoom and focus, and a task-encoding scheme. Finally, the theoretical results for the proposed control strategy are experimentally verified on the constructed mobile robot. Specific experimental demonstrations include grasping the target object at different locations on the station and grasping the target object tilted at different angles to the station.
UR - http://www.scopus.com/inward/record.url?scp=0344033810&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=0344033810&partnerID=8YFLogxK
M3 - Conference article
AN - SCOPUS:0344033810
SN - 1050-4729
VL - 3
SP - 3397
EP - 3402
JO - Proceedings - IEEE International Conference on Robotics and Automation
JF - Proceedings - IEEE International Conference on Robotics and Automation
T2 - 2003 IEEE International Conference on Robotics and Automation
Y2 - 14 September 2003 through 19 September 2003
ER -