Article Code | Journal Code | Year | English Article | Full Text |
---|---|---|---|---|
6695293 | 1428269 | 2018 | 8-page PDF | Free download |
English Title (ISI Article)
Convolutional neural networks: Computer vision-based workforce activity assessment in construction
Persian Title (translated)
Convolutional neural networks: computer vision-based workforce activity assessment in construction
Keywords
Activity analysis, convolutional neural networks, computer vision, construction, video interpretation
Related Subjects
Engineering and Basic Sciences
Other Engineering Disciplines
Civil and Structural Engineering
English Abstract
Computer vision approaches have been widely used to automatically recognize the activities of workers from videos. While considerable advancements have been made to capture complementary information from still frames, it remains a challenge to obtain motion between them. As a result, this has hindered the ability to conduct real-time monitoring. Considering this challenge, an improved convolutional neural network (CNN) that integrates Red-Green-Blue (RGB), optical flow, and gray stream CNNs, is proposed to accurately monitor and automatically assess workers' activities associated with installing reinforcement during construction. A database containing photographs of workers installing reinforcement is created from activities undertaken on several construction projects in Wuhan, China. The database is then used to train and test the developed CNN network. Results demonstrate that the developed method can accurately detect the activities of workers. The developed computer vision-based approach can be used by construction managers as a mechanism to assist them to ensure that projects meet pre-determined deliverables.
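The abstract describes fusing three CNN streams (RGB appearance, optical flow motion, and gray intensity) into a single activity prediction. The sketch below illustrates one common way such streams can be combined, weighted late fusion of per-class scores; the activity labels, weights, and logits are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Hypothetical activity classes for rebar installation (illustrative only).
ACTIVITIES = ["tying rebar", "carrying rebar", "idle"]

def softmax(logits):
    """Convert raw per-class scores to probabilities (numerically stable)."""
    e = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def fuse_streams(rgb_logits, flow_logits, gray_logits, weights=(0.4, 0.4, 0.2)):
    """Late fusion: weighted average of the three streams' softmax scores.
    The stream weights are assumed, not taken from the paper."""
    probs = (weights[0] * softmax(rgb_logits)
             + weights[1] * softmax(flow_logits)
             + weights[2] * softmax(gray_logits))
    return probs / probs.sum(axis=-1, keepdims=True)

# Toy classifier outputs for one video clip, one vector per stream.
rgb = np.array([2.0, 0.5, -1.0])   # appearance evidence from still frames
flow = np.array([1.5, 1.0, -0.5])  # motion evidence between frames
gray = np.array([1.0, 0.2, 0.0])   # intensity-only evidence

fused = fuse_streams(rgb, flow, gray)
print(ACTIVITIES[int(np.argmax(fused))])  # -> "tying rebar"
```

Late fusion keeps each stream's network independent, so a motion stream can compensate when still-frame appearance alone is ambiguous, which is the gap the abstract highlights.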
Publisher
Database: Elsevier - ScienceDirect
Journal: Automation in Construction - Volume 94, October 2018, Pages 282-289
Authors
Hanbin Luo, Chaohua Xiong, Weili Fang, Peter E.D. Love, Bowen Zhang, Xi Ouyang