Utilizing Deep Neural Networks for object-based image classification on hyperspectral images
Master's thesis === National Chung Hsing University === Department of Civil Engineering === 106 === Hyperspectral images combine imagery with spectral information. Each pixel contains hundreds of bands, which greatly helps the classification accuracy of ground objects but slows processing. In the t...
Main Authors: Yi-Chin Hung 洪奕瑾
Other Authors: Ming-Der Yang 楊明德
Format: Others
Language: zh-TW
Published: 2018
Online Access: http://ndltd.ncl.edu.tw/handle/p943uv
id |
ndltd-TW-106NCHU5015064 |
record_format |
oai_dc |
spelling |
ndltd-TW-106NCHU50150642019-05-16T01:24:29Z http://ndltd.ncl.edu.tw/handle/p943uv Utilizing Deep Neural Networks for object-based image classification on hyperspectral images 利用深度神經網路於高光譜影像物件式分類 Yi-Chin Hung 洪奕瑾 Master's thesis, National Chung Hsing University, Department of Civil Engineering, academic year 106. Hyperspectral images combine imagery with spectral information. Each pixel contains hundreds of bands, which greatly helps the classification accuracy of ground objects but slows processing. Traditional pixel-based classification also rarely captures the spatial correlation between objects. To address these problems, this study classifies hyperspectral images by combining object-based analysis with a Deep Neural Network (DNN). The research data are the Indian Pines and Salinas images, acquired by the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS), together with their ground-truth data. First, a Minimum Noise Fraction (MNF) transform separates the noise in the data and retains the informative components, reducing the amount of computation; the spectral values are then used in Object-Based Image Analysis (OBIA) to exploit spatial correlation. This approach suits high-resolution, complex images, because each object groups many pixels and can be classified quickly. Simple Linear Iterative Clustering (SLIC) segments the image into objects of varying size and compactness based on the spatial and spectral information, and a deep neural network finally classifies the resulting objects. Applying a DNN to object-based hyperspectral classification effectively mitigates the salt-and-pepper effect with less computing time than pixel-based classification, especially when the image area is large. The classification accuracy and kappa value of the object-based classification are higher than those of the pixel-based classification. The proposed method is verified on the Salinas image with a classification accuracy of up to 94.62% and less computation time. Ming-Der Yang 楊明德 2018 thesis (學位論文) 58 zh-TW |
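The abstract above describes a three-step pipeline: MNF noise separation, SLIC object segmentation, then DNN classification. The record does not include the thesis's implementation, so the following is only an illustrative NumPy sketch of the first step, an MNF transform with the noise covariance estimated from adjacent-pixel differences (a common approximation; all function and variable names here are assumptions, not the author's code):

```python
import numpy as np

def mnf(cube, n_components):
    """Minimum Noise Fraction: a noise-whitened PCA.

    cube: (rows, cols, bands) hyperspectral image.
    Noise covariance is approximated from differences
    between horizontally adjacent pixels.
    """
    r, c, b = cube.shape
    X = cube.reshape(-1, b).astype(float)
    X -= X.mean(axis=0)

    # Estimate noise as half the covariance of neighbor differences.
    noise = (cube[:, 1:, :] - cube[:, :-1, :]).reshape(-1, b).astype(float)
    sigma_n = np.cov(noise, rowvar=False) / 2.0
    sigma = np.cov(X, rowvar=False)

    # Whiten by the noise covariance, then diagonalize the signal
    # covariance; components come out ordered by signal-to-noise ratio.
    ln = np.linalg.cholesky(sigma_n)
    li = np.linalg.inv(ln)
    evals, evecs = np.linalg.eigh(li @ sigma @ li.T)
    order = np.argsort(evals)[::-1]
    W = li.T @ evecs[:, order[:n_components]]
    return (X @ W).reshape(r, c, n_components)
```

Keeping only the leading components discards the noise-dominated bands, which is the "reduce the amount of computation" step the abstract refers to.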
collection |
NDLTD |
language |
zh-TW |
format |
Others |
sources |
NDLTD |
description |
Master's thesis === National Chung Hsing University === Department of Civil Engineering === 106 === Hyperspectral images combine imagery with spectral information. Each pixel contains hundreds of bands, which greatly helps the classification accuracy of ground objects but slows processing. Traditional pixel-based classification also rarely captures the spatial correlation between objects. To address these problems, this study classifies hyperspectral images by combining object-based analysis with a Deep Neural Network (DNN).
The research data are the Indian Pines and Salinas images, acquired by the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS), together with their ground-truth data. First, a Minimum Noise Fraction (MNF) transform separates the noise in the data and retains the informative components, reducing the amount of computation; the spectral values are then used in Object-Based Image Analysis (OBIA) to exploit spatial correlation. This approach suits high-resolution, complex images, because each object groups many pixels and can be classified quickly. Simple Linear Iterative Clustering (SLIC) segments the image into objects of varying size and compactness based on the spatial and spectral information, and a deep neural network finally classifies the resulting objects. Applying a DNN to object-based hyperspectral classification effectively mitigates the salt-and-pepper effect with less computing time than pixel-based classification, especially when the image area is large. The classification accuracy and kappa value of the object-based classification are higher than those of the pixel-based classification. The proposed method is verified on the Salinas image with a classification accuracy of up to 94.62% and less computation time.
|
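The record does not specify the thesis's network architecture or training settings, so the layer size, learning rate, and feature choice below are assumptions. As an illustration of the final classification step, a minimal one-hidden-layer softmax network over per-object features (for example, each segmented object's mean spectrum) could be sketched as:

```python
import numpy as np

def train_dnn(X, y, hidden=32, epochs=500, lr=0.1, seed=0):
    """Minimal one-hidden-layer softmax classifier (illustrative only).

    X: (n_objects, n_features) feature vector per segmented object.
    y: (n_objects,) integer class labels.
    Returns a predict function mapping features to class labels.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    k = int(y.max()) + 1
    W1 = rng.normal(0, 0.1, (d, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.1, (hidden, k)); b2 = np.zeros(k)
    Y = np.eye(k)[y]                                # one-hot targets
    for _ in range(epochs):
        H = np.maximum(X @ W1 + b1, 0)              # ReLU hidden layer
        Z = H @ W2 + b2
        P = np.exp(Z - Z.max(axis=1, keepdims=True))
        P /= P.sum(axis=1, keepdims=True)           # softmax
        G = (P - Y) / n                             # cross-entropy gradient
        gW2 = H.T @ G; gb2 = G.sum(axis=0)
        GH = (G @ W2.T) * (H > 0)                   # backprop through ReLU
        gW1 = X.T @ GH; gb1 = GH.sum(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2
    return lambda Xq: np.argmax(np.maximum(Xq @ W1 + b1, 0) @ W2 + b2, axis=1)
```

Classifying one label per object, rather than per pixel, is what removes the salt-and-pepper effect the abstract mentions: all pixels in a segment receive the same class.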
author2 |
Ming-Der Yang |
author_facet |
Ming-Der Yang Yi-Chin Hung 洪奕瑾 |
author |
Yi-Chin Hung 洪奕瑾 |
spellingShingle |
Yi-Chin Hung 洪奕瑾 Utilizing Deep Neural Networks for object-based image classification on hyperspectral images |
author_sort |
Yi-Chin Hung |
title |
Utilizing Deep Neural Networks for object-based image classification on hyperspectral images |
title_short |
Utilizing Deep Neural Networks for object-based image classification on hyperspectral images |
title_full |
Utilizing Deep Neural Networks for object-based image classification on hyperspectral images |
title_fullStr |
Utilizing Deep Neural Networks for object-based image classification on hyperspectral images |
title_full_unstemmed |
Utilizing Deep Neural Networks for object-based image classification on hyperspectral images |
title_sort |
utilizing deep neural networks for object-based image classification on hyperspectral images |
publishDate |
2018 |
url |
http://ndltd.ncl.edu.tw/handle/p943uv |
work_keys_str_mv |
AT yichinhung utilizingdeepneuralnetworksforobjectbasedimageclassificationonhyperspectralimages AT hóngyìjǐn utilizingdeepneuralnetworksforobjectbasedimageclassificationonhyperspectralimages AT yichinhung lìyòngshēndùshénjīngwǎnglùyúgāoguāngpǔyǐngxiàngwùjiànshìfēnlèi AT hóngyìjǐn lìyòngshēndùshénjīngwǎnglùyúgāoguāngpǔyǐngxiàngwùjiànshìfēnlèi |
_version_ |
1719174886315786240 |