Human Fall Detection Method Based on Key Points in Infrared Images

XU Shiwen, WANG Heng, ZHANG Hua, PANG Jie

Citation: XU Shiwen, WANG Heng, ZHANG Hua, PANG Jie. Human Fall Detection Method Based on Key Points in Infrared Images[J]. Infrared Technology, 2021, 43(10): 1003-1007.


Article Information
    About the authors:

    XU Shiwen (1994-), male, born in Chengdu, Sichuan Province, M.S. candidate. His main research interests are computer vision and image processing, and deep learning. E-mail: 1411761943@qq.com

    Corresponding author:

    WANG Heng (1971-), female, M.S., professor. Her main research interests are robotics and its applications, and automation technology. E-mail: wh839@qq.com

  • CLC number: TP391.4


  • Abstract: Existing human fall detection methods are easily affected by illumination, adapt poorly, and suffer high false detection rates in complex scenes. To address these problems, a human fall detection method for infrared images based on key-point estimation is proposed. Working on infrared images, the method effectively avoids the influence of illumination and similar factors: a neural network locates the center point of the human target and then regresses the target's attributes, such as its size and label, to produce the detection result. Infrared images of human falls under different conditions were captured with an infrared camera to build an infrared human fall dataset, on which the proposed method achieves a recognition rate above 97%. The experimental results show that the method attains high accuracy and speed for human fall detection in infrared images. (An illustrative decoding sketch is given after Table 1 below.)
  • Figure 1.  Overall network architecture of CenterNet

    Figure 2.  Infrared camera and power supply

    Figure 3.  Infrared data recording scene

    Figure 4.  Part of the human fall dataset

    Figure 5.  Human fall detection results

    Figure 6.  P-R curve of the proposed algorithm

    Table 1.  Comparison of experimental results

    Algorithm       Accuracy/%    Time/(s/frame)
    YOLOv3          96.9          0.0421
    Faster R-CNN    95.7          0.441
    Ours            98.4          0.0462
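    To make the center-point idea in the abstract concrete, below is a minimal decoding sketch in Python/NumPy. It assumes the detection network has already produced a per-class center-point heatmap plus width/height and offset maps, as in the CenterNet-style formulation the paper builds on; the function name decode_centers, the output stride of 4, and the two-class split into standing/fallen targets are illustrative assumptions, not the authors' released code.

```python
# Minimal sketch of CenterNet-style decoding (illustrative, not the authors' code).
# Assumed network outputs for one frame:
#   heatmap: (C, H, W) per-class center-point confidence in [0, 1]
#   wh:      (2, H, W) regressed box width and height at each location
#   offset:  (2, H, W) sub-pixel offset of the center point
import numpy as np

def decode_centers(heatmap, wh, offset, score_thresh=0.3, stride=4):
    """Turn center-point predictions into (x1, y1, x2, y2, score, cls) boxes."""
    C, H, W = heatmap.shape
    detections = []
    for c in range(C):                      # e.g. c=0 "standing", c=1 "fallen"
        hm = heatmap[c]
        # Keep only 3x3 local maxima of the heatmap (acts as a simple NMS).
        padded = np.pad(hm, 1, mode="constant", constant_values=-np.inf)
        neighborhood = np.stack(
            [padded[dy:dy + H, dx:dx + W] for dy in range(3) for dx in range(3)]
        )
        is_peak = hm >= neighborhood.max(axis=0)
        ys, xs = np.where(is_peak & (hm > score_thresh))
        for y, x in zip(ys, xs):
            # Map the peak back to input-image coordinates, then add the box size.
            cx = (x + offset[0, y, x]) * stride
            cy = (y + offset[1, y, x]) * stride
            w, h = wh[0, y, x] * stride, wh[1, y, x] * stride
            detections.append((cx - w / 2, cy - h / 2,
                               cx + w / 2, cy + h / 2, hm[y, x], c))
    return detections

# Quick check with random maps; a real run would feed the network's outputs.
rng = np.random.default_rng(0)
dets = decode_centers(rng.random((2, 128, 160)) * 0.5,
                      rng.random((2, 128, 160)) * 20,
                      rng.random((2, 128, 160)),
                      score_thresh=0.45)
print(len(dets), "candidate detections")
```

    In this scheme, per-class peaks in the heatmap directly give the detected targets, so no anchor enumeration or separate non-maximum-suppression stage is required.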
Publication history
  • Received:  2020-02-18
  • Revised:  2020-02-21
  • Published:  2021-10-20
