Development of an activated sludge process control strategy using Bayesian and Markovian decision theory
Little use has been made of activated sludge wastewater treatment plant monitoring data to assist in process control. These data consist of discrete measurements of process variables and continuously monitored parameters whose characteristics make statistical analysis difficult.
Main Author: | Vassos, Troy David |
---|---|
Language: | English |
Published: | University of British Columbia, 2010 |
Online Access: | http://hdl.handle.net/2429/27211 |
id |
ndltd-UBC-oai-circle.library.ubc.ca-2429-27211 |
record_format |
oai_dc |
spelling |
ndltd-UBC-oai-circle.library.ubc.ca-2429-27211 2018-01-05T17:44:01Z Development of an activated sludge process control strategy using Bayesian and Markovian decision theory Vassos, Troy David Applied Science, Faculty of; Civil Engineering, Department of; Graduate 2010-08-08T23:11:42Z 2010-08-08T23:11:42Z 1986 Text Thesis/Dissertation http://hdl.handle.net/2429/27211 eng For non-commercial purposes only, such as research, private study and education. Additional conditions apply; see Terms of Use: https://open.library.ubc.ca/terms_of_use. University of British Columbia |
collection |
NDLTD |
language |
English |
sources |
NDLTD |
description |
Little use has been made of activated sludge wastewater treatment plant monitoring data to assist in process control. These data consist of discrete measurements of process variables and continuously monitored parameters whose characteristics make statistical analysis difficult. The use of these data in developing control strategies has also been restricted by the inadequacy of process models and by the non-steady-state behaviour of most treatment facilities. Consequently, most successful process models are stochastic in nature and plant-specific.
The recent adoption of computers has increased the volume of monitoring data, since parameters can now be recorded continuously. Although computers can also be used to analyze these data, operators have little incentive to improve data quality because few process control techniques exist to exploit the information gathered.
This research applies decision theory concepts, involving probability and utility estimation, to the problem of activated sludge bulking caused by filamentous microorganisms. Both single-state Bayesian and multi-state Markov policy-iteration techniques are applied heuristically to operations data from a conventional activated sludge plant with chronic sludge bulking. Six years of monitoring data were used to establish an optimal sludge bulking control strategy based on a combination of aeration basin dissolved oxygen concentration and food-to-microorganism ratio settings. Single-state Bayesian decision analysis was first applied to establish outcome probabilities, in terms of sludge volume index (SVI) levels and effluent suspended solids (SS) concentrations, for the two control alternatives. Based on utility matrices obtained from three plant operators and on theoretical considerations, four optimal control policies were calculated. To obtain control policies that depend on the current bulking state, a Markov technique was then applied to the same database, with utilities defined by SVI level and control alternative. This technique yielded optimal policy recommendations more consistent with bulking theory than the single-state Bayesian approach; a sketch of the Bayesian expected-utility step follows below.
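As a rough illustration of the single-state Bayesian step, the sketch below computes expected utilities for two control alternatives over discretized SVI outcome states and recommends the alternative with the highest expected utility. All probabilities and utility values here are invented placeholders, not values estimated in the thesis, which derived them from six years of plant data and operator utility matrices.

```python
# Minimal sketch of single-state Bayesian decision analysis, assuming
# placeholder outcome probabilities and utilities.

# Discretized outcome states: sludge volume index (SVI) ranges, mL/g.
OUTCOMES = ["SVI < 100", "100 <= SVI < 200", "SVI >= 200"]

# P(outcome | control alternative): hypothetical conditional probabilities.
P_OUTCOME = {
    "raise_DO":  [0.50, 0.35, 0.15],   # raise aeration basin dissolved oxygen
    "adjust_FM": [0.40, 0.40, 0.20],   # adjust food-to-microorganism ratio
}

# Operator utility of each outcome under each alternative (hypothetical).
UTILITY = {
    "raise_DO":  [1.0, 0.5, 0.0],
    "adjust_FM": [1.0, 0.6, 0.1],
}

def expected_utility(alternative: str) -> float:
    """E[utility] = sum over outcomes of P(outcome | alt) * utility(outcome, alt)."""
    return sum(p * u for p, u in zip(P_OUTCOME[alternative], UTILITY[alternative]))

if __name__ == "__main__":
    for alt in P_OUTCOME:
        print(f"{alt}: E[utility] = {expected_utility(alt):.3f}")
    print("Recommended:", max(P_OUTCOME, key=expected_utility))
```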
The Markov policy-iteration technique was then adapted into a process control strategy in which the probability (transition) matrix can be modified by experimental findings, yielding a dynamic control strategy. The model was applied to a simulated low dissolved oxygen bulking condition, and policy adjustments were examined as a means of increasing the model's rate of convergence. Although the optimal policy vector was relatively insensitive to the structure of the utility matrix, the rate of convergence was affected. A policy adjustment technique, which allows the decision maker to provide input to the model, can eliminate the convergence problem; difficulties in convergence can occur when the utility matrix defined by the operator conflicts with the optimal decision policy. |
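For the multi-state side, the sketch below shows Markov policy iteration with the transition matrix re-estimated from observed transition counts, which is one way to realize the "modified by experimental findings" idea. The discounted formulation, the three bulking states, and all counts and utilities are assumptions chosen for illustration, not the thesis's calibrated model.

```python
import numpy as np

# Sketch of Markov policy iteration over bulking states, with transition
# probabilities derived from running counts of observed state transitions.
STATES = ["non-bulking", "incipient bulking", "bulking"]  # hypothetical SVI-based states
ACTIONS = ["raise_DO", "adjust_FM"]
GAMMA = 0.9  # discount factor (assumption; simplifies policy evaluation)

# counts[a][i, j]: how often state i moved to state j under action a (placeholder data).
counts = {
    "raise_DO":  np.array([[8., 2., 0.], [3., 5., 2.], [1., 4., 5.]]),
    "adjust_FM": np.array([[7., 2., 1.], [2., 5., 3.], [1., 3., 6.]]),
}

# utility[a][i]: reward for taking action a in state i (placeholder data).
utility = {
    "raise_DO":  np.array([1.0, 0.3, -0.5]),
    "adjust_FM": np.array([0.9, 0.4, -0.3]),
}

def transition_matrix(a: str) -> np.ndarray:
    """Normalize transition counts row-wise into probabilities."""
    c = counts[a]
    return c / c.sum(axis=1, keepdims=True)

def observe(a: str, i: int, j: int) -> None:
    """Fold a newly observed transition (i -> j under a) into the counts,
    so the next policy-iteration run uses updated probabilities -- the
    'dynamic' element of the control strategy."""
    counts[a][i, j] += 1

def policy_iteration() -> list[str]:
    n = len(STATES)
    policy = [ACTIONS[0]] * n
    while True:
        # Policy evaluation: solve (I - gamma * P_pi) v = r_pi for the value vector.
        P = np.array([transition_matrix(policy[i])[i] for i in range(n)])
        r = np.array([utility[policy[i]][i] for i in range(n)])
        v = np.linalg.solve(np.eye(n) - GAMMA * P, r)
        # Policy improvement: pick the action maximizing the one-step lookahead.
        new_policy = [
            max(ACTIONS, key=lambda a: utility[a][i] + GAMMA * transition_matrix(a)[i] @ v)
            for i in range(n)
        ]
        if new_policy == policy:
            return policy  # policy is stable: converged
        policy = new_policy

if __name__ == "__main__":
    print("Initial policy: ", dict(zip(STATES, policy_iteration())))
    # Simulate new evidence, e.g. incipient bulking worsening under adjust_FM:
    observe("adjust_FM", 1, 2)
    observe("adjust_FM", 1, 2)
    print("Updated policy: ", dict(zip(STATES, policy_iteration())))
```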
author |
Vassos, Troy David |
title |
Development of an activated sludge process control strategy using Bayesian and Markovian decision theory |
publisher |
University of British Columbia |
publishDate |
2010 |
url |
http://hdl.handle.net/2429/27211 |