Statistical Process Monitoring of the Tennessee Eastman Process Using Parallel Autoassociative Neural Networks and a Large Dataset
In this article, the statistical process monitoring problem of the Tennessee Eastman process is considered using deep learning techniques. This work is motivated by three limitations of existing work on this problem. First, although deep learning has been used extensively for process monitoring, in the majority of existing studies the neural networks were trained in a supervised manner, assuming that normal/fault labels were available. However, this is not always the case in real applications. Thus, in this work, autoassociative neural networks, which are trained in an unsupervised fashion, are used. Second, the typical dataset used for monitoring the Tennessee Eastman process comprises only a small number of data samples, which can be highly limiting for deep learning. The dataset used in this work is 500 times larger than the typically used dataset and is large enough for deep learning. Lastly, an alternative neural network architecture, called parallel autoassociative neural networks, is proposed to decouple the training of different principal components. The proposed architecture is expected to address the co-adaptation issue of fully connected autoassociative neural networks.
Main Authors: | Seongmin Heo, Jay H. Lee |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2019-07-01 |
Series: | Processes |
Subjects: | process monitoring; nonlinear principal component analysis; parallel neural networks; autoassociative neural network; big data |
Online Access: | https://www.mdpi.com/2227-9717/7/7/411 |
id |
doaj-e8ecf485643d460ea8d3883fcb83a473 |
---|---|
record_format |
Article |
spelling |
Seongmin Heo and Jay H. Lee, Department of Chemical and Biomolecular Engineering, Korea Advanced Institute of Science and Technology (KAIST), 291 Daehak-ro, Yuseong-gu, Daejeon 34141, Korea. Processes (MDPI AG), ISSN 2227-9717, 2019-07-01, vol. 7, no. 7, art. 411. DOI: 10.3390/pr7070411. |
An extensive case study is designed and performed to evaluate the effects of the following neural network settings: neural network size, type of regularization, training objective function, and training epoch. The results are compared with those obtained using linear principal component analysis, and the advantages and limitations of the parallel autoassociative neural networks are illustrated.
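The parallel idea described in the abstract, training each principal component's subnetwork independently rather than co-adapting all components inside one fully connected autoassociative network, can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the branches here are linear single-unit autoencoders (the paper's branches are nonlinear multilayer networks), the component branches are decoupled by deflating the data between fits, and all function names (`train_branch`, `fit_parallel`, `spe`) are hypothetical. The squared prediction error (SPE, or Q statistic) at the end is the standard reconstruction-based monitoring statistic used with both PCA and autoassociative networks.

```python
import numpy as np

def train_branch(X, epochs=2000, lr=0.1, seed=0):
    """Fit one single-bottleneck-unit branch x -> t -> x_hat by gradient
    descent on the mean squared reconstruction error.  A linear stand-in
    for one nonlinear autoassociative subnetwork."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w_enc = rng.normal(scale=0.1, size=d)   # encoder weights
    w_dec = rng.normal(scale=0.1, size=d)   # decoder weights
    for _ in range(epochs):
        t = X @ w_enc                       # component scores, shape (n,)
        E = np.outer(t, w_dec) - X          # reconstruction error, (n, d)
        g_dec = t @ E / n                   # gradient w.r.t. decoder
        g_enc = X.T @ (E @ w_dec) / n       # gradient w.r.t. encoder
        w_dec -= lr * g_dec
        w_enc -= lr * g_enc
    return w_enc, w_dec

def fit_parallel(X, n_components=2):
    """Train one branch per component; each branch is fit on its own,
    on data deflated by the previous branches, so the components never
    co-adapt inside a single network."""
    branches, R = [], X.copy()
    for k in range(n_components):
        w_enc, w_dec = train_branch(R, seed=k)
        branches.append((w_enc, w_dec))
        R = R - np.outer(R @ w_enc, w_dec)  # deflate before next branch
    return branches

def spe(X, branches):
    """Squared prediction error (Q statistic) per sample: the squared
    norm of what the fitted components fail to reconstruct."""
    R = X.copy()
    for w_enc, w_dec in branches:
        R = R - np.outer(R @ w_enc, w_dec)
    return (R ** 2).sum(axis=1)
```

On normal operating data the SPE stays near the noise floor; a disturbance that pushes samples off the learned component subspace inflates it, which is the fault-detection signal compared against a control limit.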
collection |
DOAJ |
issn |
2227-9717 |
topic |
process monitoring nonlinear principal component analysis parallel neural networks autoassociative neural network big data |