Kapur’s Entropy for Underwater Multilevel Thresholding Image Segmentation Based on Whale Optimization Algorithm

Bibliographic Details
Main Authors: Zheping Yan, Jinzhong Zhang, Zewen Yang, Jialing Tang
Format: Article
Language: English
Published: IEEE 2021-01-01
Series: IEEE Access
Subjects:
Online Access: https://ieeexplore.ieee.org/document/9127465/
Description
Summary: Multilevel thresholding is an effective and indispensable technique for image segmentation that has attracted extensive attention in recent years. However, multilevel thresholding methods suffer from drawbacks such as high computational complexity and low segmentation accuracy. This paper therefore proposes a whale optimization algorithm (WOA) based on Kapur’s entropy method to solve the image segmentation problem. The WOA effectively balances exploration and exploitation, avoiding premature convergence and locating the global optimal solution. To verify the segmentation performance of the WOA, a series of experiments on underwater images from the experimental pool of Harbin Engineering University is conducted, and the segmentation results are compared with those of the BA, FPA, MFO, MSA, PSO and WWO algorithms by maximizing the fitness value of Kapur’s entropy method. The fitness value, peak signal-to-noise ratio (PSNR), structural similarity index (SSIM), execution time and Wilcoxon’s rank-sum test are used to evaluate the overall performance of each algorithm. The experimental results reveal that the WOA outperforms the comparison algorithms, with higher segmentation accuracy, a better segmentation effect and stronger robustness. In addition, the feasibility and efficiency of the WOA are verified.
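As context for the abstract above: Kapur's method selects thresholds that maximize the sum of the entropies of the resulting gray-level classes of the image histogram. In the paper this maximization is performed by the WOA; the sketch below instead uses an exhaustive single-threshold search purely to illustrate the fitness function. It is a minimal illustration, not the paper's implementation — the function names and the toy bimodal histogram are assumptions introduced here.

```python
import numpy as np

def kapur_entropy(hist, thresholds):
    """Kapur's entropy fitness for a threshold set.

    hist: normalized grayscale histogram (sums to 1).
    thresholds: sorted bin indices splitting the gray range into classes.
    Returns the sum of per-class entropies; higher is better.
    """
    bounds = [0] + sorted(thresholds) + [len(hist)]
    total = 0.0
    for lo, hi in zip(bounds[:-1], bounds[1:]):
        p = hist[lo:hi]
        w = p.sum()            # probability mass of this class
        if w <= 0:
            continue           # empty class contributes no entropy
        q = p[p > 0] / w       # within-class probabilities
        total += -np.sum(q * np.log(q))
    return total

# Toy bimodal histogram: two flat modes around gray levels 40-60 and 180-200.
hist = np.zeros(256)
hist[40:60] = 1.0
hist[180:200] = 1.0
hist /= hist.sum()

# Exhaustive search for the single threshold maximizing Kapur's entropy;
# the optimum lands between the two modes.
best_t = max(range(1, 256), key=lambda t: kapur_entropy(hist, [t]))
```

For K thresholds the search space grows combinatorially, which is why the paper applies a metaheuristic (the WOA) to maximize this same fitness instead of enumerating candidates.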
ISSN: 2169-3536