Onboard dynamic-object detection and tracking for autonomous robot navigation with RGB-D camera

Is a: Academic paper
Academic Paper attributes

arXiv ID: 2303.00132
arXiv Classification: Computer science
Publication URL: arxiv.org/pdf/2303.0...32.pdf
Publisher: ArXiv
DOI: doi.org/10.48550/ar...03.00132
Paid/Free: Free
Academic Discipline: Computer science; Robotics
Submission Date: July 5, 2023; February 28, 2023; November 17, 2023; November 23, 2023
Author Names: Kenji Shimada; Zhefan Xu; Yumeng Xiu; Christopher Suzuki; Xiaoyang Zhan
Paper abstract

Deploying autonomous robots in crowded indoor environments usually requires them to have accurate dynamic obstacle perception. Although plenty of previous works in the autonomous driving field have investigated the 3D object detection problem, the use of dense point clouds from a heavy Light Detection and Ranging (LiDAR) sensor and the high computation cost of learning-based data processing make those methods inapplicable to small robots, such as vision-based UAVs with small onboard computers. To address this issue, we propose a lightweight 3D dynamic obstacle detection and tracking (DODT) method based on an RGB-D camera, designed for low-power robots with limited computing power. Our method adopts a novel ensemble detection strategy, combining multiple computationally efficient but low-accuracy detectors to achieve real-time high-accuracy obstacle detection. Moreover, we introduce a new feature-based data association and tracking method that uses point-cloud statistical features to prevent mismatches. In addition, our system includes an optional and auxiliary learning-based module to enhance the obstacle detection range and dynamic obstacle identification. The proposed method is implemented on a small quadcopter, and the results show that our method achieves the lowest position error (0.11 m) and a comparable velocity error (0.23 m/s) among the benchmarked algorithms running on the robot's onboard computer. The flight experiments demonstrate that the tracking results from the proposed method allow the robot to efficiently alter its trajectory to navigate dynamic environments. Our software is available on GitHub as an open-source ROS package.
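The feature-based data association described in the abstract can be sketched as matching tracked obstacles to new detections by comparing point-cloud statistics. The sketch below uses greedy nearest-neighbor matching with a rejection threshold; all names, features, weights, and thresholds are illustrative assumptions and are not taken from the paper.

```python
# Hypothetical sketch of feature-based data association between tracked
# obstacles and new detections, assuming each obstacle is summarized by
# point-cloud statistics (centroid, bounding-box extent, point count).
# Weights and the max_cost threshold below are illustrative, not from the paper.
from dataclasses import dataclass
import math


@dataclass
class ObstacleFeatures:
    centroid: tuple    # (x, y, z) mean of the obstacle's points, meters
    extent: tuple      # (dx, dy, dz) bounding-box size, meters
    num_points: int    # number of depth points in the cluster


def feature_distance(a: ObstacleFeatures, b: ObstacleFeatures) -> float:
    """Weighted cost combining position and shape statistics."""
    pos = math.dist(a.centroid, b.centroid)
    shape = math.dist(a.extent, b.extent)
    density = abs(a.num_points - b.num_points) / max(a.num_points, b.num_points)
    return pos + 0.5 * shape + 0.2 * density


def associate(tracks, detections, max_cost=1.0):
    """Greedy nearest-neighbor association.

    Pairs each track with its cheapest unused detection; pairs whose cost
    exceeds max_cost are rejected, which is one simple way to prevent
    mismatches between dissimilar obstacles.
    Returns a list of (track_index, detection_index) pairs.
    """
    pairs, used = [], set()
    for ti, track in enumerate(tracks):
        best, best_cost = None, max_cost
        for di, det in enumerate(detections):
            if di in used:
                continue
            cost = feature_distance(track, det)
            if cost < best_cost:
                best, best_cost = di, cost
        if best is not None:
            used.add(best)
            pairs.append((ti, best))
    return pairs
```

A production tracker would typically solve the assignment globally (e.g., with the Hungarian algorithm) rather than greedily, but the rejection-by-feature-cost idea is the same.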
