Approximate Dynamic Programming Using Bellman Residual Elimination and Gaussian Process Regression
This paper presents an approximate policy iteration algorithm for solving infinite-horizon, discounted Markov decision processes (MDPs) for which a model of the system is available. The algorithm is similar in spirit to Bellman residual minimization methods. However, by using Gaussian process regression...
Main Authors: Bethke, Brett M. (Contributor); How, Jonathan P. (Contributor)
Other Authors: Massachusetts Institute of Technology. Department of Aeronautics and Astronautics (Contributor)
Format: Article
Language: English
Published: Institute of Electrical and Electronics Engineers, 2010-10-05T19:42:03Z
Online Access: Get fulltext
Similar Items

- Approximate Dynamic Programming Using Bellman Residual Elimination and Gaussian Process Regression
  by: How, Jonathan P., et al.
  Published: (2010)
- Approximate dynamic programming using model-free Bellman Residual Elimination
  by: Bethke, Brett M., et al.
  Published: (2011)
- Kernel-based approximate dynamic programming using Bellman residual elimination
  by: Bethke, Brett (Brett M.)
  Published: (2010)
- An Approximate Quadratic Programming for Efficient Bellman Equation Solution
  by: Jianmei Su, et al.
  Published: (2019-01-01)
- Agent capability in persistent mission planning using approximate dynamic programming
  by: Bethke, Brett M., et al.
  Published: (2010)