---
title: FingerNet Demo
emoji: 🖐️
colorFrom: pink
colorTo: yellow
sdk: gradio
sdk_version: 5.49.1
app_file: app.py
pinned: false
license: bsd-3-clause
---

# FingerNet Demo

This is an interactive demo for FingerNet, a deep learning model for estimating the force and shape of soft robotic fingers. Users can manipulate the finger's motion as the model inputs and visualize the predicted force and shape outputs.

The demo provides two models, FingerNet and FingerNet-Surf, corresponding to different finger designs. More details about the models can be found in the [FingerNet](https://huggingface.co/asRobotics/fingernet) and [FingerNet-Surf](https://huggingface.co/asRobotics/fingernet-surf) model cards.

## Features

- Select between the two FingerNet models (FingerNet and FingerNet-Surf).
- Adjust finger motion parameters using sliders.
- Visualize the predicted force and shape in real time.

## Citation

If you use this model in your research, please cite the following papers:

```bibtex
@article{liu2024proprioceptive,
  title={Proprioceptive learning with soft polyhedral networks},
  author={Liu, Xiaobo and Han, Xudong and Hong, Wei and Wan, Fang and Song, Chaoyang},
  journal={The International Journal of Robotics Research},
  volume={43},
  number={12},
  pages={1916--1935},
  year={2024},
  publisher={SAGE Publications Sage UK: London, England},
  doi={10.1177/02783649241238765}
}
```

[arXiv:2308.08538](https://arxiv.org/abs/2308.08538)

```bibtex
@article{wu2025magiclaw,
  title={MagiClaw: A Dual-Use, Vision-Based Soft Gripper for Bridging the Human Demonstration to Robotic Deployment Gap},
  author={Wu, Tianyu and Han, Xudong and Sun, Haoran and Zhang, Zishang and Huang, Bangchao and Song, Chaoyang},
  journal={arXiv preprint arXiv:2509.19169},
  year={2025}
}
```