Title
A performance analysis of invariant feature descriptors in eye tracking based human robot collaboration
Author
Abstract
For eye tracking applications in Human-Robot Collaboration (HRC), it is essential for the robot to be aware of where the human gaze is located in the scene. Using feature detectors and feature descriptors, the human gaze can be projected onto the scene image, from which the robot can infer where the human is looking. The motion that occurs during collaboration may affect the performance of the descriptors. In this paper, we analyse the performance of the SIFT, SURF, AKAZE, BRISK and ORB feature descriptors in a real scene for eye tracking in HRC, where different types of variation co-exist. We use a robotic arm and two cameras to test the descriptors, instead of testing directly on eye tracking glasses, so that different accelerations can be tested quantitatively. Results show that BRISK, AKAZE and SURF are the most favourable in terms of accuracy, stability and computation time.
Language
English
Source (journal)
2019 5th International Conference on Control, Automation and Robotics (ICCAR)
Source (book)
5th International Conference on Control, Automation and Robotics (ICCAR), April 19-22, 2019, Beijing, People's Republic of China
Publication
New York: IEEE, 2019
ISBN
978-1-72813-326-3
DOI
10.1109/ICCAR.2019.8813478
Volume/pages
(2019), p. 256-260
ISI
000589413000045