A Reconfigurable Data Glove for Reconstructing Physical and Virtual Grasps

Is a: Academic paper

Academic Paper attributes

arXiv ID: 2301.05821
arXiv Classification: Computer science
Publication URL: arxiv.org/pdf/2301.0...21.pdf
Publisher: ArXiv
DOI: doi.org/10.48550/ar...01.05821
Paid/Free: Free
Academic Discipline: Artificial Intelligence (AI); Computer science; Robotics; Human–computer interaction
Submission Date: February 1, 2023; February 2, 2023; January 14, 2023; January 18, 2023
Author Names: Yixin Zhu, Ziyuan Jiao, Zhenliang Zhang, Song-Chun Zhu, Zeyu Zhang, Chenfanfu Jiang, Hangxin Liu, Minchen Li

Paper abstract

In this work, we present a reconfigurable data glove design to capture different modes of human hand-object interactions, which are critical in training embodied artificial intelligence (AI) agents for fine manipulation tasks. To achieve various downstream tasks with distinct features, our reconfigurable data glove operates in three modes sharing a unified backbone design that reconstructs hand gestures in real time. In the tactile-sensing mode, the glove system aggregates manipulation force via customized force sensors made from a soft and thin piezoresistive material; this design minimizes interference during complex hand movements. The virtual reality (VR) mode enables real-time interaction in a physically plausible fashion: A caging-based approach is devised to determine stable grasps by detecting collision events. Leveraging a state-of-the-art finite element method (FEM), the simulation mode collects data on fine-grained 4D manipulation events comprising hand and object motions in 3D space and how the object's physical properties (e.g., stress and energy) change in accordance with manipulation over time. Notably, the glove system presented here is the first to use high-fidelity simulation to investigate the unobservable physical and causal factors behind manipulation actions. In a series of experiments, we characterize our data glove in terms of individual sensors and the overall system. More specifically, we evaluate the system's three modes by (i) recording hand gestures and associated forces, (ii) improving manipulation fluency in VR, and (iii) producing realistic simulation effects of various tool uses, respectively. Based on these three modes, our reconfigurable data glove collects and reconstructs fine-grained human grasp data in both physical and virtual environments, thereby opening up new avenues for the learning of manipulation skills for embodied AI agents.
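
The abstract only sketches the caging-based grasp check used in the VR mode at a high level. The following is a minimal illustrative Python sketch of one way a collision-event-driven caging check could look; the class and function names (ContactEvent, is_caging_grasp) and the thresholds are hypothetical assumptions for illustration and are not taken from the paper's implementation.

```python
# Illustrative sketch only: a simplified caging-style grasp check driven by
# collision events, loosely following the idea described in the abstract.
# All names and thresholds here are hypothetical, not the paper's method.

from dataclasses import dataclass
from typing import List
import numpy as np


@dataclass
class ContactEvent:
    """A single finger-object collision reported by the physics engine."""
    finger_id: int
    position: np.ndarray   # contact point in the object frame, shape (3,)
    normal: np.ndarray     # contact normal pointing into the object, shape (3,)


def is_caging_grasp(contacts: List[ContactEvent],
                    min_contacts: int = 3,
                    opposition_threshold: float = -0.5) -> bool:
    """Declare a rough caging grasp when enough contacts exist and at least
    one pair of contact normals opposes each other, so the object cannot
    slide out along that axis. This is a toy criterion for illustration."""
    if len(contacts) < min_contacts:
        return False
    normals = [c.normal / np.linalg.norm(c.normal) for c in contacts]
    for i in range(len(normals)):
        for j in range(i + 1, len(normals)):
            # Opposing normals produce a strongly negative dot product.
            if float(np.dot(normals[i], normals[j])) < opposition_threshold:
                return True
    return False


if __name__ == "__main__":
    # Two opposing fingertip contacts plus a supporting one: treated as stable.
    demo = [
        ContactEvent(0, np.array([0.03, 0.0, 0.0]), np.array([-1.0, 0.0, 0.0])),
        ContactEvent(1, np.array([-0.03, 0.0, 0.0]), np.array([1.0, 0.0, 0.0])),
        ContactEvent(2, np.array([0.0, 0.0, -0.02]), np.array([0.0, 0.0, 1.0])),
    ]
    print(is_caging_grasp(demo))  # True
```

In practice, the collision events would come from the VR physics engine each frame, and a grasp detected this way would let the held object follow the reconstructed hand pose until the cage is broken.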

Timeline

No Timeline data yet.

Further Resources

No Further Resources data yet.
