Dynamic Programming and Hamilton–Jacobi–Bellman Equations on Time Scales

The Bellman optimality principle for a stochastic dynamic system on time scales is derived, which includes continuous time and discrete time as special cases. At the same time, the Hamilton–Jacobi–Bellman (HJB) equation on time scales is obtained. Finally, an example is employed to illustrate our...
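As a hedged sketch of the setting (not the paper's exact formulation), the deterministic dynamic programming principle on a time scale can be written with the delta integral; the value function V, running cost L, and admissible control set are illustrative assumptions. It reduces to the classical continuous-time and discrete-time statements for the standard choices of the time scale.

% Hedged sketch: deterministic dynamic programming principle on a time scale \mathbb{T}.
% V is a value function, L a running cost; names are illustrative, not the paper's notation.
\[
  V(t, x) \;=\; \inf_{u \in \mathcal{U}[t, s)}
  \left\{ \int_{t}^{s} L\bigl(\tau, x(\tau), u(\tau)\bigr)\,\Delta\tau
          \;+\; V\bigl(s, x(s)\bigr) \right\},
  \qquad s \in \mathbb{T},\; s \ge t.
\]
% For \mathbb{T} = \mathbb{R} the delta integral is the ordinary integral (continuous time);
% for \mathbb{T} = \mathbb{Z} it is a finite sum (discrete time), recovering the usual Bellman recursion.

Passing to the limit in this principle is what yields an HJB-type equation; the paper's stochastic version on general time scales will differ in its precise form.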


Bibliographic Details
Main Authors: Yingjun Zhu, Guangyan Jia
Format: Article
Language: English
Published: Hindawi-Wiley 2020-01-01
Series: Complexity
Online Access: http://dx.doi.org/10.1155/2020/7683082