Article code: 4960788 | Journal code: 1446505 | Publication year: 2017 | English article: 8-page PDF
English Title of the ISI Article
Recognizing Grabbing Actions from Inertial and Video Sensor Data in a Warehouse Scenario
Persian Translation of the Title
شناسایی اقدامات گرفتن از داده‌های سنسورهای اینرسی و ویدئویی در یک سناریوی انبار
Keywords
Related Subjects
Engineering and Basic Sciences › Computer Engineering › Computer Science (General)
English Abstract
Modern industries are increasingly adopting smart devices to aid and improve their productivity and workflow. This includes logistics in warehouses, where validation of correct items per order can be enhanced with mobile devices. Since handling incorrect orders accounts for a large part of warehouse maintenance costs, errors such as missed or wrong items should be avoided. Thus, early identification of picking procedures and of the items picked is beneficial for reducing these errors. By using data glasses and a smartwatch, we aim to reduce these errors while also enabling the picker to work hands-free. In this paper, we present an analysis of feature sets for classification of grabbing actions in the order picking process. For this purpose, we created a dataset containing inertial data and egocentric video from four participants performing picking tasks, modeled closely on a real-world warehouse environment. We extract time- and frequency-domain features from the inertial data, and color and descriptor features from the image data, to learn grabbing actions. By using three different supervised learning approaches on inertial and video data, we are able to recognize grabbing actions in a picking scenario. We show that the combination of video and inertial sensors yields an F-measure of 85.3% for recognizing grabbing actions.
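To make the pipeline described in the abstract concrete, the sketch below shows how time- and frequency-domain features can be computed over sliding windows of inertial data and fed to a supervised classifier scored with the F-measure. This is a minimal illustration, not the authors' implementation: the window length, the specific features, and the random-forest classifier (standing in for the three learning approaches compared in the paper) are all assumptions.

```python
# Illustrative sketch (not the paper's code): windowed time- and
# frequency-domain features from inertial data, a supervised classifier,
# and F-measure evaluation. Window size, features, and classifier are
# assumptions; the data below is synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split


def window_features(window):
    """Features for one window of 3-axis accelerometer data, shape (n_samples, 3)."""
    feats = []
    for axis in range(window.shape[1]):
        x = window[:, axis]
        # Time domain: mean, standard deviation, min, max
        feats += [x.mean(), x.std(), x.min(), x.max()]
        # Frequency domain: magnitude spectrum statistics and dominant bin
        spectrum = np.abs(np.fft.rfft(x))
        feats += [spectrum.mean(), spectrum.std(), float(np.argmax(spectrum))]
    return np.array(feats)


def sliding_windows(signal, length=64, step=32):
    """Split a (n_samples, 3) inertial signal into overlapping windows."""
    return [signal[i:i + length] for i in range(0, len(signal) - length + 1, step)]


# Toy usage with synthetic data standing in for the recorded sessions.
rng = np.random.default_rng(0)
signal = rng.normal(size=(5000, 3))                               # fake accelerometer stream
labels = rng.integers(0, 2, size=len(sliding_windows(signal)))    # 1 = grabbing, 0 = other

X = np.stack([window_features(w) for w in sliding_windows(signal)])
X_train, X_test, y_train, y_test = train_test_split(X, labels, test_size=0.3, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print("F-measure:", f1_score(y_test, clf.predict(X_test)))
```

In the paper's setting, the analogous video features (color and descriptor features) would be combined with these inertial features before classification; that fusion step is omitted here for brevity.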
Publisher
Database: Elsevier - ScienceDirect
Journal: Procedia Computer Science - Volume 110, 2017, Pages 16-23
Authors