Applications of U-net to Cerebrospinal Fluid Segmentation in Magnetic Resonance Imaging

Bibliographic Details
Main Authors: FAN, KAI-CHIEH, 范凱傑
Other Authors: FU, JA-CHIH
Format: Others
Language: zh-TW
Published: 2019
Online Access:http://ndltd.ncl.edu.tw/handle/yy99k8
Description
Summary: Master's === National Yunlin University of Science and Technology === Department of Industrial Engineering and Management === 107 === In the clinical management of Spontaneous Intracranial Hypotension (SIH), the literature has established that diagnosis requires observing the Cerebrospinal Fluid (CSF) in Magnetic Resonance Imaging (MRI) by qualitative or quantitative methods. Because the stage of recovery is closely related to CSF volume, quantitative methods must account for the accuracy of CSF segmentation; accurate segmentation is therefore an important issue. This study uses U-Net, a deep-learning semantic segmentation model originally designed for segmentation of biomedical images, which makes it well suited to this task, and compares it with Global Entropy, K-means, Otsu, and Watershed. The MRI data are divided into spinal and brain sets, each with raw images and ground-truth segmentations. The spinal data, provided by TVGH, comprise 8 samples (8,018 images); the brain data, from the BrainWeb database, comprise 18 noise-combination samples (3,312 images). We modified and refined U-Net and trained it separately on the spinal and brain data. The spinal model achieves a segmentation performance (IoU) of 0.9159, an improvement of 0.0372 over K-means, the best of the traditional algorithms; the brain model achieves an IoU of 0.9919, an improvement of 0.0147 over K-means.
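
The reported scores use IoU (Intersection over Union): the ratio of the overlapping area to the combined area of a predicted mask and its ground-truth mask. Below is a minimal Python sketch of this metric, assuming binary NumPy masks; the function name and example arrays are illustrative, not taken from the thesis.

    import numpy as np

    def iou(pred, truth):
        """Intersection over Union of two binary masks (illustrative sketch)."""
        pred, truth = pred.astype(bool), truth.astype(bool)
        union = np.logical_or(pred, truth).sum()
        if union == 0:
            return 1.0  # both masks empty: treat as perfect agreement
        return np.logical_and(pred, truth).sum() / union

    # Hypothetical 2x3 masks: 3 pixels overlap, 4 pixels in the union
    pred  = np.array([[0, 1, 1],
                      [0, 1, 0]])
    truth = np.array([[0, 1, 1],
                      [1, 1, 0]])
    print(iou(pred, truth))  # 0.75

An IoU of 1.0 means the predicted CSF region matches the ground truth exactly, so the thesis's spinal (0.9159) and brain (0.9919) scores indicate near-complete overlap with the manual segmentations.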