Authors
Michalis Logothetis, George C Karras, Shahab Heshmati-Alamdari, Panagiotis Vlantis, Kostas J Kyriakopoulos
Publication date
2018/10/1
Conference
2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
Pages
1-6
Publisher
IEEE
Description
This paper presents the design of a vision-based object grasping and motion control architecture for a mobile manipulator system. The optimal grasping areas of the object are estimated from the partial point cloud acquired by an onboard RGB-D sensor system. The reach-to-grasp motion of the mobile manipulator is handled via a Nonlinear Model Predictive Control (NMPC) scheme. The controller is formulated to allow the system to operate in a constrained workspace with static obstacles. The goal of the proposed scheme is to guide the robot's end-effector towards the optimal grasping regions while guaranteeing input and state constraints such as occlusion and obstacle avoidance, workspace boundaries, and field-of-view constraints. The performance of the proposed strategy is experimentally verified using an 8-degree-of-freedom KUKA Youbot in different reach-to-grasp scenarios.
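The abstract describes an NMPC scheme that drives the end-effector to a grasping region subject to input and state constraints. The following is a minimal sketch of such a constrained NMPC formulation, not the paper's controller: it assumes CasADi with the IPOPT solver, a simplified kinematic end-effector model x_{k+1} = x_k + dt*u_k, and placeholder values for the grasp target, a single spherical obstacle, the workspace box, and the input limits.

```python
# Minimal NMPC sketch (illustrative only, not the authors' implementation).
# Assumptions: CasADi/IPOPT, 3-DoF kinematic end-effector model, hypothetical
# grasp target, obstacle, workspace box [0,1]^3, and velocity limits.
import casadi as ca
import numpy as np

N, dt = 20, 0.1                                   # horizon length and sampling time
x_goal = np.array([0.6, 0.2, 0.3])                # hypothetical optimal grasping point
p_obs, r_obs = np.array([0.3, 0.1, 0.3]), 0.10    # hypothetical static obstacle (center, radius)

x0 = ca.SX.sym("x0", 3)                           # current end-effector position (parameter)
U = ca.SX.sym("U", 3, N)                          # end-effector velocity inputs over the horizon

cost, g = 0, []
xk = x0
for k in range(N):
    xk = xk + dt * U[:, k]                        # kinematic prediction model
    cost += ca.sumsqr(xk - x_goal) + 1e-2 * ca.sumsqr(U[:, k])
    g.append(ca.sumsqr(xk - p_obs) - r_obs**2)    # obstacle avoidance: stay outside sphere
    g.append(xk)                                  # workspace box constraint on position

nlp = {"x": ca.vec(U), "p": x0, "f": cost, "g": ca.vertcat(*g)}
solver = ca.nlpsol("solver", "ipopt", nlp,
                   {"ipopt.print_level": 0, "print_time": 0})

# Per step: obstacle margin >= 0, position inside [0,1]^3; inputs bounded by 0.2 m/s.
lbg = np.tile([0.0, 0.0, 0.0, 0.0], N)
ubg = np.tile([np.inf, 1.0, 1.0, 1.0], N)
sol = solver(x0=np.zeros(3 * N), p=[0.1, 0.5, 0.2],
             lbx=-0.2 * np.ones(3 * N), ubx=0.2 * np.ones(3 * N),
             lbg=lbg, ubg=ubg)

u_first = np.array(sol["x"]).reshape(N, 3)[0]     # receding horizon: apply only the first input
print("first NMPC input:", u_first)
```

In a receding-horizon loop, only the first optimized input is applied, the new end-effector position is measured (or estimated from the RGB-D pipeline), and the problem is re-solved with the updated parameter p; field-of-view and occlusion constraints would enter as additional entries of g.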
Total citations
[Citations-per-year chart, 2018–2024, omitted]
Scholar articles
M Logothetis, GC Karras, S Heshmati-Alamdari… - 2018 IEEE/RSJ International Conference on Intelligent …, 2018