Article code: 8918013
Journal code: 1642810
Publication year: 2017
English article: 16-page PDF
Full-text version: Free download
English title of the ISI article
Patient handling activity recognition through pressure-map manifold learning using a footwear sensor
Persian translation of the title
تشخیص فعالیت های دستکاری بیمار از طریق یادگیری چند منظوره با استفاده از سنسور کفش
Keywords
Patient handling activity, action manifold learning, plantar pressure, smart insole, mobile health
Related subjects
Engineering and Basic Sciences > Computer Engineering > Computer Networks and Communications
English abstract
Overexertion injuries caused by patient handling and movement activities lead to chronic pain and other physical and social impairments among the nursing workforce. Accurate recognition of patient handling activities (PHA) is the first step toward reducing injury risk for caregivers. Current practice in workplace activity recognition is neither accurate nor convenient to perform. In this paper, we propose a novel solution comprising a smart footwear device and an action manifold learning framework to address this challenge. The wearable device, called Smart Insole, is equipped with a rich set of sensors and provides an unobtrusive way to capture and characterize the action information of patient handling activities. Our proposed action manifold learning (AML) framework extracts the intrinsic signature structure by projecting raw pressure data from a high-dimensional input space to a low-dimensional manifold space. This framework not only performs dimensionality reduction but also suppresses motion artifacts, making it robust against noise and inter-class/intra-class variation in PHA recognition. To validate the effectiveness of the proposed framework, we perform a pilot study with eight subjects covering eight common activities in a nursing room. The intrinsic dimensionality of the manifold is estimated by comparing the residual variances of different dimensionality settings. The experimental results show that the overall classification accuracy reaches 86.6%. Meanwhile, the qualitative profile and load level can also be classified with accuracies of 98.9% and 88.3%, respectively.
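The dimensionality-estimation step described in the abstract (comparing residual variances across candidate manifold dimensions) can be illustrated with a generic manifold-learning sketch. The paper's exact AML projection is not given on this page, so the snippet below uses scikit-learn's Isomap as a stand-in; the function names, neighbor count, and data loader are hypothetical and only meant to show the residual-variance comparison.

# Hedged sketch: choosing a manifold dimensionality for flattened pressure-map
# frames by comparing residual variances. Isomap stands in for the paper's AML
# projection; estimate_intrinsic_dim and load_pressure_frames are illustrative names.
import numpy as np
from scipy.spatial.distance import pdist
from sklearn.manifold import Isomap

def residual_variance(geodesic_dists, embedding):
    # 1 - R^2 between graph geodesic distances and Euclidean distances in the embedding.
    d_geo = geodesic_dists[np.triu_indices_from(geodesic_dists, k=1)]
    d_emb = pdist(embedding)
    r = np.corrcoef(d_geo, d_emb)[0, 1]
    return 1.0 - r ** 2

def estimate_intrinsic_dim(X, max_dim=10, n_neighbors=12):
    # Embed at each candidate dimensionality and collect residual variances;
    # the "elbow" where the curve flattens suggests the intrinsic dimension.
    variances = []
    for d in range(1, max_dim + 1):
        iso = Isomap(n_neighbors=n_neighbors, n_components=d).fit(X)
        variances.append(residual_variance(iso.dist_matrix_, iso.embedding_))
    return variances

# Usage (hypothetical loader): each row of X is one flattened insole pressure map.
# X = load_pressure_frames()
# print(estimate_intrinsic_dim(X))

Classification would then run on the low-dimensional embedding (for example, a nearest-neighbor classifier over the embedded frames), which is where per-activity accuracies such as those reported above would be measured.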
Publisher
Database: Elsevier - ScienceDirect
Journal: Smart Health - Volumes 1–2, June 2017, Pages 77-92
Authors