Optimised CNN in conjunction with efficient pooling strategy for the multi‐classification of breast cancer


Bibliographic Details
Main Authors: Shallu Sharma, Rajesh Mehra, Sumit Kumar
Format: Article
Language: English
Published: Wiley 2021-03-01
Series: IET Image Processing
Online Access:https://doi.org/10.1049/ipr2.12074
Description
Summary: Tissue analysis using histopathological images is one of the most prevalent, and most challenging, tasks in the treatment of cancer. Clinical assessment of tissue becomes very difficult because high variability across magnification levels makes it harder for a pathologist to distinguish the benign and malignant stages of cancer. One possible way to address this challenging situation is an advanced machine learning approach. Hence, a convolutional neural network (CNN) architecture is proposed to create an automated system for magnification-independent multi-classification of breast cancer histopathological images. This automated system offers high productivity and consistency in diagnosing the eight different classes of breast cancer from a balanced BreakHis dataset. The system utilises an efficient training methodology to learn discriminative features from images at different magnification levels. Data augmentation techniques are also employed to overcome the problem of overfitting. Additionally, the performance of the CNN architecture has been improved significantly by adopting an appropriate pooling strategy and optimisation technique. On this basis, accuracies of 80.76%, 76.58%, 79.90%, and 74.21% are achieved at magnifications of 40X, 100X, 200X, and 400X, respectively. The proposed model outperforms handcrafted approaches, with an average accuracy of 80.47% at the 40X magnification level.
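The abstract attributes part of the performance gain to "an appropriate pooling strategy" but does not specify which one. As an illustrative sketch only (not the authors' implementation), the snippet below contrasts the two standard choices, max pooling and average pooling, on a single-channel feature map in plain NumPy; the function name `pool2d` and its parameters are hypothetical:

```python
import numpy as np

def pool2d(x, size=2, stride=2, mode="max"):
    """Pool a 2D single-channel feature map.

    mode="max" keeps the strongest activation in each window,
    mode="avg" averages the window (smoother, less sparse response).
    """
    h, w = x.shape
    oh = (h - size) // stride + 1
    ow = (w - size) // stride + 1
    out = np.empty((oh, ow), dtype=float)
    for i in range(oh):
        for j in range(ow):
            win = x[i * stride:i * stride + size,
                    j * stride:j * stride + size]
            out[i, j] = win.max() if mode == "max" else win.mean()
    return out

# Toy 4x4 feature map pooled down to 2x2 by each strategy.
feat = np.arange(16, dtype=float).reshape(4, 4)
max_out = pool2d(feat, mode="max")   # [[ 5.,  7.], [13., 15.]]
avg_out = pool2d(feat, mode="avg")   # [[ 2.5,  4.5], [10.5, 12.5]]
```

In a CNN, such a layer is applied per channel between convolutional stages to shrink spatial resolution; which variant works better is an empirical choice the paper evaluates for its architecture.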
ISSN: 1751-9659 (print)
1751-9667 (online)