ScaleADFG: Affordance-based Dexterous Functional Grasping via Scalable Dataset

Sizhe Wang, Yifan Yang, Yongkang Luo*, Daheng Li, Wei Wei, Yan Zhang, Peiying Hu, Yunjin Fu, Haonan Duan, Jia Sun, Peng Wang*
Institute of Automation, Chinese Academy of Sciences, Beijing, China
IEEE Robotics and Automation Letters (RA-L) 2025

*Corresponding author: {yongkang.luo, peng_wang}@ia.ac.cn

News

Oct 20, 2025: Our paper is accepted by IEEE Robotics and Automation Letters (RA-L).


Webpage updates in progress ...

Abstract

Dexterous functional tool-use grasping is essential for effective robotic manipulation of tools. However, existing approaches face significant challenges in efficiently constructing large-scale datasets and in generalizing to the scales of everyday objects. These issues arise primarily from size mismatches between robotic and human hands and from the diversity of real-world object scales. To address these limitations, we propose the ScaleADFG framework, which consists of a fully automated dataset construction pipeline and a lightweight grasp generation network. Our dataset construction pipeline introduces an affordance-based algorithm that synthesizes diverse tool-use grasp configurations without expert demonstrations, allowing flexible object-hand size ratios and enabling robotic hands that are larger than human hands to grasp everyday objects effectively. Additionally, we leverage pre-trained models to generate extensive 3D assets and to enable efficient retrieval of object affordances. The resulting dataset comprises five object categories, each containing over 1,000 unique shapes with 15 scale variations. After filtering, it includes over 60,000 grasps for each of two dexterous robotic hands. On top of this dataset, we train a lightweight, single-stage grasp generation network with a notably simple loss design, eliminating the need for post-refinement. This demonstrates the critical importance of large-scale datasets and multi-scale object variants for effective training. Extensive experiments in simulation and on a real robot confirm that the ScaleADFG framework adapts well to objects of varying scales, enhancing functional grasp stability, diversity, and generalizability. Moreover, our network exhibits effective zero-shot transfer to real-world objects.
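To make the network side of the abstract concrete, below is a minimal, hypothetical PyTorch sketch of a single-stage grasp generation network with a deliberately simple supervised loss. The class names, feature dimensions, pose parameterization, and loss terms are illustrative assumptions for exposition only, not the released ScaleADFG implementation.

```python
# Hypothetical sketch of a single-stage grasp generation network (PyTorch).
# All names, dimensions, and loss terms are illustrative assumptions, not the
# authors' released code.
import torch
import torch.nn as nn


class GraspGenerator(nn.Module):
    """Maps an object point cloud (xyz plus per-point affordance features)
    directly to a wrist pose and hand joint configuration in one stage."""

    def __init__(self, num_joints: int = 16, feat_dim: int = 4):
        super().__init__()
        # Shared per-point encoder (PointNet-style), followed by max pooling.
        self.point_mlp = nn.Sequential(
            nn.Linear(3 + feat_dim, 64), nn.ReLU(),
            nn.Linear(64, 128), nn.ReLU(),
            nn.Linear(128, 256), nn.ReLU(),
        )
        # Single head predicting 3-D translation, 6-D rotation, joint angles.
        self.head = nn.Sequential(
            nn.Linear(256, 256), nn.ReLU(),
            nn.Linear(256, 3 + 6 + num_joints),
        )

    def forward(self, points: torch.Tensor) -> dict:
        # points: (B, N, 3 + feat_dim)
        global_feat = self.point_mlp(points).max(dim=1).values  # (B, 256)
        out = self.head(global_feat)
        return {
            "translation": out[:, :3],
            "rotation_6d": out[:, 3:9],
            "joint_angles": out[:, 9:],
        }


def grasp_loss(pred: dict, target: dict) -> torch.Tensor:
    # A simple supervised loss: L2 on wrist pose and joint angles,
    # with no post-refinement stage.
    return (
        nn.functional.mse_loss(pred["translation"], target["translation"])
        + nn.functional.mse_loss(pred["rotation_6d"], target["rotation_6d"])
        + nn.functional.mse_loss(pred["joint_angles"], target["joint_angles"])
    )


if __name__ == "__main__":
    model = GraspGenerator()
    pts = torch.randn(2, 2048, 7)  # batch of 2 point clouds with features
    tgt = {k: torch.randn_like(v) for k, v in model(pts).items()}
    loss = grasp_loss(model(pts), tgt)
    loss.backward()
    print(loss.item())
```

Under these assumptions, the large, multi-scale dataset is what allows such a plain regression objective to work without a separate refinement network.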

Real Robot Experiment

BibTeX

        
            @ARTICLE{ScaleADFG2025,
              author={Wang, Sizhe and Yang, Yifan and Luo, Yongkang and Li, Daheng and Wei, Wei and Zhang, Yan and Hu, Peiying and Fu, Yunjin and Duan, Haonan and Sun, Jia and Wang, Peng},
              journal={IEEE Robotics and Automation Letters},
              title={ScaleADFG: Affordance-based Dexterous Functional Grasping via Scalable Dataset},
              year={2025},
              pages={1-8},
              keywords={Functional Grasping; Dexterous Manipulation; Deep Learning in Grasping and Manipulation; Affordance},
              doi={10.1109/LRA.2025.3630895}
            }