LeNet Convolutional Neural Network for Face Mask Usage Classification Using a Low-Cost Device

  • Radimas Putra Muhammad Davi Labib, Institut Teknologi Nasional Malang
  • Aryuanto Soetedjo, Institut Teknologi Nasional Malang
  • Sirojul Hadi, Universitas Bumigora
  • Prama Diptya Widayaka, Universitas Negeri Surabaya
Keywords: COVID-19, Classification, Face Mask, LeNet, Convolutional Neural Network


Background: One of the efforts to prevent the spread of the COVID-19 virus is wearing a face mask in public places. However, many people still wear masks incorrectly, and some do not wear masks at all in public. These problems call for an image-based classification system that can identify face mask usage. The system must be built on a low-cost device so that it is affordable to a wide range of users. Previous studies designed face mask classification systems using various methods, but each had limitations: the Convolutional Neural Network (CNN) method provides high accuracy but is computationally heavy and cannot run in real time on low-cost devices, while the Haar cascade method provides fast processing but is less accurate than CNN.
Objective: This article develops an image processing algorithm for classifying face mask usage on a low-cost device.
Methods: The method used was a CNN with the LeNet architecture, which is computationally light. The machine learning process used a dataset of 400 images, split 60:40 into 240 images for training and 160 images for validation.
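LeNet stays computationally light because it alternates small convolutions with poolings that shrink the feature maps quickly. The abstract does not give the paper's exact layer hyperparameters, so as a rough sketch the shape trace below uses the classic LeNet-5 values (5x5 convolutions, 2x2 poolings, 32x32 input):

```python
def conv_out(size, kernel, stride=1, pad=0):
    """Output spatial size of a convolution or pooling layer."""
    return (size + 2 * pad - kernel) // stride + 1

# Classic LeNet-5 feature-map trace (assumed hyperparameters,
# not confirmed by the abstract).
size = 32                      # input image, 32x32
size = conv_out(size, 5)       # C1: 6 filters 5x5  -> 28x28
size = conv_out(size, 2, 2)    # S2: 2x2 pooling    -> 14x14
size = conv_out(size, 5)       # C3: 16 filters 5x5 -> 10x10
size = conv_out(size, 2, 2)    # S4: 2x2 pooling    -> 5x5
flat = 16 * size * size        # flatten: 16 maps of 5x5 = 400 features
```

The 400 flattened features then feed a small stack of fully connected layers, which is why the network's total parameter count remains modest compared with deeper CNNs.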
Result: The study achieved a classification accuracy of 98.75%. Prediction on the low-cost device took an average of 0.235 seconds.
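The real-time claim follows directly from the reported latency: an average prediction time of 0.235 seconds corresponds to roughly four predictions per second when inferences run back to back.

```python
avg_latency_s = 0.235          # average prediction time reported above
fps = 1.0 / avg_latency_s      # back-to-back throughput in frames per second
print(round(fps, 2))           # about 4.26 fps
```

A throughput of about 4 fps is generally adequate for monitoring whether people entering a space are wearing masks, though it would be too slow for tracking fast motion.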
Conclusion: This research showed that the system can run in real time on a low-cost device.


How to Cite
Labib, R., Soetedjo, A., Hadi, S., & Widayaka, P. (2023). LeNet Convolutional Neural Network for Face Mask Usage Classification Using a Low-Cost Device. Jurnal Bumigora Information Technology (BITe), 5(1), 9-16. https://doi.org/10.30812/bite.v5i1.2952