Augmenting Computer Vision with IMU from Mobile Phone for Robust Robot Following of Human Leader

Authors

Baumgartner, Benjamin M.
Lowrance, Christopher J.

Issue Date

2018-10-05

Type

proceedings-article

Keywords

Computer vision, Measurement units, Educational robots, Navigation, Conferences, Mobile handsets, Robustness

Abstract

Robots are sometimes used as "mules" for carrying supplies for workers or soldiers. In such scenarios, the robot must autonomously follow the same path as the person who acts as the leader. This type of robot navigation behavior is generally referred to as leader-follower, and when the leader is a person instead of a vehicle, it is more specifically called "human-leader, robot-follower" (HLRF). Existing approaches to HLRF tend to be overly complex to implement. Computer vision offers a cost-effective solution; however, it can be problematic to rely solely on computer vision when multiple leader signatures of similar quality are detected or when the leader signature is temporarily lost. This paper investigates the effect of fusing inertial measurement unit (IMU) data from the leader's cell phone with the robot's image data to improve the robustness of HLRF. We found that the addition of IMU data from the leader's mobile phone improved the reliability of the robot's following behavior, as well as the fuzzy controller's response, improving settling time by up to 2.49 seconds.
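
The paper itself details the fusion scheme and the fuzzy controller; the Python sketch below is only a hypothetical illustration of the general idea in the abstract: using IMU data streamed from the leader's phone to gate ambiguous visual detections and to coast through short losses of the leader signature. The Detection and LeaderBearingEstimator names, the gating threshold, the blending weight, and the simplification of treating the leader's yaw rate as a proxy for bearing drift in the camera frame are assumptions made for illustration, not the authors' design.

    # Illustrative sketch only -- not the implementation from the paper.
    # Assumes the phone streams a yaw rate (rad/s) and the robot's vision
    # system reports candidate leader detections as bearings with scores.
    from dataclasses import dataclass
    from typing import List


    @dataclass
    class Detection:
        bearing_rad: float   # candidate bearing relative to the camera axis
        confidence: float    # detector score in [0, 1]


    class LeaderBearingEstimator:
        """Fuses vision-based bearings with IMU yaw rate from the leader's phone.

        Hypothetical fusion rule: predict the leader's bearing forward using the
        IMU, accept only visual candidates close to that prediction, and coast
        on the IMU prediction alone when no candidate is consistent.
        """

        def __init__(self, gate_rad: float = 0.35, blend: float = 0.7):
            self.bearing = 0.0        # current estimate of leader bearing (rad)
            self.gate_rad = gate_rad  # max mismatch between prediction and detection
            self.blend = blend        # weight given to the visual measurement

        def update(self, yaw_rate_rad_s: float, dt: float,
                   detections: List[Detection]) -> float:
            # 1) Predict: crude proxy mapping the leader's turn rate to an
            #    expected drift of the leader's bearing in the camera frame.
            predicted = self.bearing + yaw_rate_rad_s * dt

            # 2) Gate: discard candidates inconsistent with the IMU prediction.
            consistent = [d for d in detections
                          if abs(d.bearing_rad - predicted) < self.gate_rad]

            if consistent:
                # 3) Correct: blend the best-matching detection with the prediction.
                best = max(consistent, key=lambda d: d.confidence)
                self.bearing = (self.blend * best.bearing_rad
                                + (1.0 - self.blend) * predicted)
            else:
                # Vision lost or ambiguous: hold the IMU-only prediction.
                self.bearing = predicted
            return self.bearing


    if __name__ == "__main__":
        est = LeaderBearingEstimator()
        # Two similar-looking candidates; the IMU indicates the leader is turning.
        dets = [Detection(bearing_rad=0.05, confidence=0.8),
                Detection(bearing_rad=0.60, confidence=0.8)]
        print(est.update(yaw_rate_rad_s=0.5, dt=0.1, detections=dets))

In this toy example the two candidates are equally confident to the detector, but only the one consistent with the IMU-predicted bearing passes the gate; if neither passed, the estimator would simply hold the IMU-propagated bearing until vision reacquires the leader.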

Citation

B. M. Baumgartner and C. J. Lowrance, "Augmenting Computer Vision with IMU from Mobile Phone for Robust Robot Following of Human Leader," 2018 IEEE MIT Undergraduate Research Technology Conference (URTC), Cambridge, MA, USA, 2018, pp. 1-5, doi: 10.1109/URTC45901.2018.9244789.

Publisher

IEEE
