Bibliographic Details
Title:
DexGraspNet 2.0: Learning Generative Dexterous Grasping in Large-scale Synthetic Cluttered Scenes
Authors:
Zhang, Jialiang; Liu, Haoran; Li, Danshi; Yu, Xinqiang; Geng, Haoran; Ding, Yufei; Chen, Jiayi; Wang, He
Publication Year:
2024
Collection:
Computer Science
Subject Terms:
Computer Science - Robotics; Computer Science - Computer Vision and Pattern Recognition
Description:
Grasping in cluttered scenes remains highly challenging for dexterous hands due to the scarcity of data. To address this problem, we present a large-scale synthetic benchmark encompassing 1319 objects, 8270 scenes, and 427 million grasps. Beyond benchmarking, we also propose a novel two-stage grasping method that learns efficiently from data by using a diffusion model that conditions on local geometry. Our proposed generative method outperforms all baselines in simulation experiments. Furthermore, with the aid of test-time depth restoration, our method demonstrates zero-shot sim-to-real transfer, attaining a 90.7% real-world dexterous grasping success rate in cluttered scenes.
Document Type:
Working Paper
Access URL:
http://arxiv.org/abs/2410.23004
Accession Number:
edsarx.2410.23004
Database:
arXiv
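
The description above mentions a diffusion model that denoises dexterous grasp parameters while conditioning on local scene geometry. The following is a minimal sketch of that general idea, not the authors' implementation: the grasp parameterization (translation + 6D rotation + 16 joint angles), the network sizes, the PointNet-style patch encoder, and the DDPM-style sampler are all illustrative assumptions.

```python
# Minimal sketch (assumptions, not the paper's code): a conditional diffusion
# sampler that generates grasp vectors given a local point-cloud patch feature.
import torch
import torch.nn as nn

GRASP_DIM = 3 + 6 + 16   # assumed: translation, 6D rotation, 16 hand joint angles
COND_DIM = 128           # assumed: local geometry feature size

class LocalGeometryEncoder(nn.Module):
    """Encodes a local point-cloud patch around a candidate grasp point (PointNet-style)."""
    def __init__(self, out_dim=COND_DIM):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(3, 64), nn.ReLU(), nn.Linear(64, out_dim))

    def forward(self, points):                        # points: (B, N, 3)
        return self.mlp(points).max(dim=1).values     # (B, out_dim) via max-pooling

class GraspDenoiser(nn.Module):
    """Predicts the noise added to a grasp vector, given timestep and geometry feature."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(GRASP_DIM + COND_DIM + 1, 256), nn.ReLU(),
            nn.Linear(256, 256), nn.ReLU(),
            nn.Linear(256, GRASP_DIM))

    def forward(self, g_t, t, cond):
        t_feat = t.float().unsqueeze(-1) / 1000.0     # crude timestep embedding
        return self.net(torch.cat([g_t, cond, t_feat], dim=-1))

@torch.no_grad()
def sample_grasps(denoiser, cond, steps=50):
    """DDPM-style ancestral sampling of grasp vectors conditioned on local geometry."""
    betas = torch.linspace(1e-4, 0.02, steps)
    alphas = 1.0 - betas
    alpha_bars = torch.cumprod(alphas, dim=0)
    g = torch.randn(cond.shape[0], GRASP_DIM)         # start from Gaussian noise
    for t in reversed(range(steps)):
        eps = denoiser(g, torch.full((cond.shape[0],), t), cond)
        mean = (g - betas[t] / torch.sqrt(1 - alpha_bars[t]) * eps) / torch.sqrt(alphas[t])
        g = mean + torch.sqrt(betas[t]) * torch.randn_like(g) if t > 0 else mean
    return g

# Usage: encode local patches, then sample one grasp vector per patch.
encoder, denoiser = LocalGeometryEncoder(), GraspDenoiser()
patches = torch.randn(4, 512, 3)                      # 4 local patches of 512 points each
grasps = sample_grasps(denoiser, encoder(patches))
print(grasps.shape)                                   # torch.Size([4, 25])
```

In practice the encoder would consume real depth-derived point clouds and the denoiser would be trained on the benchmark's grasp annotations; the untrained modules here only illustrate the data flow of conditioning grasp generation on local geometry.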