DETECTION OF PNEUMONIA BY USING NINE PRE-TRAINED TRANSFER LEARNING MODELS BASED ON DEEP LEARNING TECHNIQUES
Main Authors:
Format: Article
Language: Arabic
Published: University of Information Technology and Communications, 2021-06-01
Series: Iraqi Journal for Computers and Informatics
Subjects:
Online Access: https://ijci.uoitc.edu.iq/index.php/ijci/article/view/281
Summary: Pneumonia is a serious chest disease that affects the lungs. It has become an important concern in medicine because of its rapid and intense spread, especially among people addicted to smoking. This paper presents an efficient prediction system for detecting pneumonia using nine pre-trained transfer learning models based on deep learning techniques: Inception v4, SeNet-154, Xception, PolyNet, ResNet-50, DenseNet-121, DenseNet-169, AlexNet, and SqueezeNet. The dataset in this study consisted of 5856 chest X-rays, divided into 5216 images for training and 624 for testing. In the training phase, the images were pre-processed by resizing them to the same dimensions to reduce complexity and computation. They were then forwarded to the nine models to extract features and classify each image as normal or pneumonia. The nine models achieve accuracies of 98.72%, 98.94%, 98.88%, 98.72%, 96.2%, 94.69%, 96.29%, 95.01%, and 96.10%, respectively, in the order listed above. We found that the SeNet-154 model gave the best result, with an accuracy of 98.94% and a validation loss of 0.018103. A comparison with earlier studies shows that the proposed method outperforms previously reported approaches.
ISSN: 2313-190X, 2520-4912
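The summary above outlines a standard transfer-learning pipeline: resize the chest X-rays to a common size, feed them to a pre-trained backbone, and fine-tune a binary normal-vs-pneumonia classifier on the train/test split. The sketch below illustrates that pipeline for one of the nine backbones (DenseNet-121, via torchvision); the directory layout, 224x224 input size, batch size, learning rate, and epoch count are illustrative assumptions, not values reported in the paper.

```python
# Minimal transfer-learning sketch (assumptions: paths, 224x224 inputs,
# batch size, learning rate, and epoch count are NOT from the paper).
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Resize every chest X-ray to the same dimensions, as in the paper's
# preprocessing step, then convert to tensors.
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),  # assumed input size
    transforms.ToTensor(),
])

# Assumed directory layout: chest_xray/{train,test}/{NORMAL,PNEUMONIA}/*.jpeg
train_set = datasets.ImageFolder("chest_xray/train", transform=preprocess)
test_set = datasets.ImageFolder("chest_xray/test", transform=preprocess)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)
test_loader = DataLoader(test_set, batch_size=32)

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Load ImageNet weights and replace the classifier head with a
# two-class output (normal vs. pneumonia).
model = models.densenet121(weights=models.DenseNet121_Weights.DEFAULT)
model.classifier = nn.Linear(model.classifier.in_features, 2)
model = model.to(device)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)  # assumed LR

# Fine-tune for a few epochs (epoch count is an assumption).
for epoch in range(3):
    model.train()
    for images, labels in train_loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()

# Evaluate accuracy on the held-out test split (624 images in the paper).
model.eval()
correct = 0
with torch.no_grad():
    for images, labels in test_loader:
        images, labels = images.to(device), labels.to(device)
        correct += (model(images).argmax(dim=1) == labels).sum().item()
print(f"Test accuracy: {correct / len(test_set):.4f}")
```

The same loop applies to the other eight backbones by swapping the model constructor and its classifier head; the paper compares the resulting test accuracies to select SeNet-154 as the best performer.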