Automatic Orchid Bottle Seedling Image Feature Extraction and Measurement based on Deep Mask Regions Convolutional Neural Networks
Master's === National Cheng Kung University === Department of Electrical Engineering === 107 === This thesis develops automatic image feature extraction and measurement algorithms for orchid bottle seedlings, based on the mask region-based convolutional neural network (Mask R-CNN), to extract the important growth features of orchid bottle seedlings and reach t...
Main Authors: | Jing-Lune Yang (楊景倫) |
Other Authors: | Jeen-Shin Wang (王振興) |
Format: | Others |
Language: | zh-TW |
Published: | 2019 |
Online Access: | http://ndltd.ncl.edu.tw/handle/b4s94c |
id | ndltd-TW-107NCKU5442160 |
record_format | oai_dc |
spelling |
ndltd-TW-107NCKU5442160 2019-10-26T06:24:16Z http://ndltd.ncl.edu.tw/handle/b4s94c Automatic Orchid Bottle Seedling Image Feature Extraction and Measurement based on Deep Mask Regions Convolutional Neural Networks 基於深度遮罩式區域卷積神經網路之自動化蘭花瓶苗影像表徵萃取與計算 Jing-Lune Yang 楊景倫 Master's thesis, National Cheng Kung University, Department of Electrical Engineering, academic year 107. This thesis develops automatic image feature extraction and measurement algorithms for orchid bottle seedlings, based on the mask region-based convolutional neural network (Mask R-CNN), to extract the important growth features of orchid bottle seedlings and reach the goal of precise cultivation. To train and test the Mask R-CNN, orchid bottle seedling images were captured from different viewing angles at an orchid plantation factory in southern Taiwan. The collected images were first labeled with the outer contours of structures such as leaves and roots; these contours are called masks. The labeled images were then distorted to increase the diversity of the training and testing data. Finally, these images and their corresponding masks served as the ground truth for network training. The Mask R-CNN-based detection algorithm extracts the features of orchid bottle seedlings, including leaves, roots, green root tips, white root tips, yellow leaves, and green leaves, effectively and automatically. Ten Mask R-CNN models were constructed for performance comparison: residual network (ResNet) backbones of different depths (ResNet-26, ResNet-41, ResNet-50, ResNet-101, and ResNet-152), each combined with either a fully convolutional network (FCN) or a U-Net head. The experimental results show that ResNet-101-UNet outperforms the other models, with the highest average precision (AP) for feature extraction at 77.89% and a training time of 199 ms/image.
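As a rough, hypothetical sketch (not the thesis's actual pipeline), the distortion step described above must apply the identical geometric transform to an image and its label mask, so that the mask remains valid ground truth for the distorted image. A minimal NumPy version using only flips and 90-degree rotations:

```python
import numpy as np

def augment_pair(image, mask, rng):
    """Apply the same random flip/rotation to an image and its label mask,
    keeping the mask aligned with the distorted image."""
    if rng.random() < 0.5:                    # random horizontal flip
        image, mask = image[:, ::-1], mask[:, ::-1]
    k = int(rng.integers(0, 4))               # random 90-degree rotation
    image, mask = np.rot90(image, k), np.rot90(mask, k)
    return image.copy(), mask.copy()

rng = np.random.default_rng(0)
img = np.arange(12).reshape(3, 4)             # toy "image"
msk = (img % 2).astype(np.uint8)              # toy "mask" derived from it
aug_img, aug_msk = augment_pair(img, msk, rng)
assert aug_img.shape == aug_msk.shape         # geometry stays consistent
assert (aug_msk == aug_img % 2).all()         # mask still matches pixels
```

In practice the thesis's distortions may include elastic or photometric changes as well; the invariant illustrated here is simply that image and mask must be transformed together.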
In addition to feature extraction, a feature measurement algorithm was developed to measure features such as the number of leaves and the length, width, and area of each leaf from the orchid bottle seedling images detected by the Mask R-CNN models. The experimental results show that the average percentage error of leaf area measurement is 16.47 ± 6.41%, mainly due to shading or occlusion by other leaves and to leaf curling, while the average percentage error of root length measurement is 7.28 ± 3.01%. The overall average errors of the feature measurements were satisfactory, validating the effectiveness of the proposed methods for feature extraction of orchid bottle seedlings. In the future, we hope these algorithms can be applied in the orchid plantation industry to achieve precise cultivation of orchid bottle seedlings. Jeen-Shin Wang 王振興 2019 Degree thesis (學位論文) ; 69 ; zh-TW |
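The error figures quoted above (mean ± standard deviation of the percentage error over the test images) can be reproduced for any set of measurements with a few lines; the leaf-area readings below are made-up placeholders, not the thesis's data:

```python
import numpy as np

def avg_percentage_error(measured, reference):
    """Mean and (population) standard deviation of the absolute
    percentage error of measured values vs. manual references."""
    measured = np.asarray(measured, dtype=float)
    reference = np.asarray(reference, dtype=float)
    err = np.abs(measured - reference) / reference * 100.0
    return err.mean(), err.std()

# hypothetical leaf-area readings (px^2) vs. manual reference values
mean_err, std_err = avg_percentage_error([95, 110, 80], [100, 100, 100])
print(f"{mean_err:.2f} ± {std_err:.2f} %")   # → 11.67 ± 6.24 %
```

The same computation, applied per feature class (leaf area, root length), yields summary figures of the form reported in the abstract.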
collection | NDLTD |
language | zh-TW |
format | Others |
sources | NDLTD |
author2 | Jeen-Shin Wang |
author_facet | Jeen-Shin Wang Jing-Lune Yang 楊景倫 |
author | Jing-Lune Yang 楊景倫 |
spellingShingle | Jing-Lune Yang 楊景倫 Automatic Orchid Bottle Seedling Image Feature Extraction and Measurement based on Deep Mask Regions Convolutional Neural Networks |
author_sort | Jing-Lune Yang |
title | Automatic Orchid Bottle Seedling Image Feature Extraction and Measurement based on Deep Mask Regions Convolutional Neural Networks |
title_short | Automatic Orchid Bottle Seedling Image Feature Extraction and Measurement based on Deep Mask Regions Convolutional Neural Networks |
title_full | Automatic Orchid Bottle Seedling Image Feature Extraction and Measurement based on Deep Mask Regions Convolutional Neural Networks |
title_fullStr | Automatic Orchid Bottle Seedling Image Feature Extraction and Measurement based on Deep Mask Regions Convolutional Neural Networks |
title_full_unstemmed | Automatic Orchid Bottle Seedling Image Feature Extraction and Measurement based on Deep Mask Regions Convolutional Neural Networks |
title_sort | automatic orchid bottle seedling image feature extraction and measurement based on deep mask regions convolutional neural networks |
publishDate | 2019 |
url | http://ndltd.ncl.edu.tw/handle/b4s94c |
work_keys_str_mv | AT jingluneyang automaticorchidbottleseedlingimagefeatureextractionandmeasurementbasedondeepmaskregionsconvolutionalneuralnetworks AT yángjǐnglún automaticorchidbottleseedlingimagefeatureextractionandmeasurementbasedondeepmaskregionsconvolutionalneuralnetworks AT jingluneyang jīyúshēndùzhēzhàoshìqūyùjuǎnjīshénjīngwǎnglùzhīzìdònghuàlánhuāpíngmiáoyǐngxiàngbiǎozhēngcuìqǔyǔjìsuàn AT yángjǐnglún jīyúshēndùzhēzhàoshìqūyùjuǎnjīshénjīngwǎnglùzhīzìdònghuàlánhuāpíngmiáoyǐngxiàngbiǎozhēngcuìqǔyǔjìsuàn |
_version_ | 1719279465434972160 |