Neural Net Classification and Low Distortion Reconstruction to Halftone Images
Main Authors: | Yu, Che Sheng 游哲生 |
---|---|
Other Authors: | Pao-Chi Chang 張寶基 |
Format: | Others |
Language: | zh-TW |
Published: | 1997 |
Online Access: | http://ndltd.ncl.edu.tw/handle/84929851886191587108 |
id |
ndltd-TW-085NCU00442065 |
record_format |
oai_dc |
spelling |
ndltd-TW-085NCU004420652015-10-13T17:59:41Z http://ndltd.ncl.edu.tw/handle/84929851886191587108 Neural Net Classification and Low Distortion Reconstruction to Halftone Images 半色調影像類神經分類與低失真重建技術 Yu, Che Sheng 游哲生 Master's === National Central University === Department of Electrical Engineering === 85 === The objective of this thesis is to reconstruct gray-level images from halftone images. We develop high-performance halftone reconstruction methods for several commonly used halftone techniques. For better reconstruction quality, image classification based on halftone techniques is placed before the reconstruction process so that the reconstruction can be fine-tuned for each halftone technique. The classification is based on a simplified one-dimensional correlation of halftone images and is processed with neural networks. The classification method reached 100% accuracy in our experiments. For image reconstruction, we develop the least mean square (LMS) adaptive filter method, the minimum mean square error (MMSE) method, and a hybrid method. The hybrid method yields the best reconstructed image quality and high processing speed. In addition, the LMS method generates optimal image masks which can then be applied to the MMSE and hybrid methods to set up optimal reconstruction tables. Pao-Chi Chang 張寶基 1997 學位論文 ; thesis 114 zh-TW |
collection |
NDLTD |
language |
zh-TW |
format |
Others |
sources |
NDLTD |
description |
Master's === National Central University === Department of Electrical Engineering === 85 === The objective of this thesis is to reconstruct gray-level images from halftone images. We develop high-performance halftone reconstruction methods for several commonly used halftone techniques. For better reconstruction quality, image classification based on halftone techniques is placed before the reconstruction process so that the reconstruction can be fine-tuned for each halftone technique. The classification is based on a simplified one-dimensional correlation of halftone images and is processed with neural networks. The classification method reached 100% accuracy in our experiments. For image reconstruction, we develop the least mean square (LMS) adaptive filter method, the minimum mean square error (MMSE) method, and a hybrid method. The hybrid method yields the best reconstructed image quality and high processing speed. In addition, the LMS method generates optimal image masks which can then be applied to the MMSE and hybrid methods to set up optimal reconstruction tables. |
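The record does not spell out how the simplified one-dimensional correlation features are computed, so the following is only a minimal sketch of that idea, assuming a per-row normalized autocorrelation over a small number of lags; the function name, lag count, and normalization are illustrative assumptions, not taken from the thesis. In practice, the resulting feature vector would be fed to a small neural-network classifier that identifies the halftoning technique before reconstruction.

```python
import numpy as np

def halftone_1d_correlation(halftone: np.ndarray, max_lag: int = 16) -> np.ndarray:
    """Average normalized 1-D autocorrelation of the image rows, lags 1..max_lag."""
    x = halftone.astype(np.float64)
    x -= x.mean(axis=1, keepdims=True)        # remove the per-row mean
    energy = (x * x).sum(axis=1) + 1e-12      # per-row signal energy
    feats = []
    for lag in range(1, max_lag + 1):
        num = (x[:, :-lag] * x[:, lag:]).sum(axis=1)
        feats.append((num / energy).mean())   # normalize and average over rows
    return np.asarray(feats)

# Toy usage: different halftoning techniques (ordered dither, error diffusion,
# ...) leave different correlation signatures, which is what a neural-net
# classifier of this kind can exploit.
rng = np.random.default_rng(0)
toy_halftone = (rng.random((64, 64)) > 0.5).astype(np.uint8)
print(halftone_1d_correlation(toy_halftone, max_lag=8))
```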
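Likewise, the record only names the LMS adaptive filter method; a minimal sketch of what LMS-based halftone reconstruction could look like is given below, assuming a square pixel window, a binary halftone input, and gray levels scaled to [0, 1]. The window size, step size, and function names are hypothetical and are not the optimal masks or reconstruction tables derived in the thesis.

```python
import numpy as np

def lms_train_mask(halftone: np.ndarray, original: np.ndarray,
                   win: int = 5, mu: float = 0.005) -> np.ndarray:
    """Adapt a win x win reconstruction mask with the LMS update rule."""
    r = win // 2
    w = np.zeros(win * win)                          # filter coefficients
    h = np.pad(halftone.astype(np.float64), r, mode="edge")
    g = original.astype(np.float64) / 255.0          # target gray level in [0, 1]
    for i in range(original.shape[0]):
        for j in range(original.shape[1]):
            patch = h[i:i + win, j:j + win].ravel()  # halftone window around (i, j)
            err = g[i, j] - w @ patch                # prediction error
            w += mu * err * patch                    # LMS coefficient update
    return w.reshape(win, win)

def lms_reconstruct(halftone: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Apply the trained mask to estimate the gray-level image."""
    win = mask.shape[0]
    r = win // 2
    h = np.pad(halftone.astype(np.float64), r, mode="edge")
    out = np.empty(halftone.shape, dtype=np.float64)
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = (h[i:i + win, j:j + win] * mask).sum()
    return np.clip(out * 255.0, 0.0, 255.0).astype(np.uint8)
```

Per the abstract, masks obtained by the LMS method are then used to set up the MMSE and hybrid reconstruction tables, with the hybrid method reported to give the best reconstructed image quality and processing speed.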
author2 |
Pao-Chi Chang |
author |
Yu, Che Sheng 游哲生 |
title |
Neural Net Classification and Low Distortion Reconstruction to Halftone Images |
publishDate |
1997 |
url |
http://ndltd.ncl.edu.tw/handle/84929851886191587108 |