| Article Code | Journal Code | Publication Year | English Article | Full-Text Version |
|---|---|---|---|---|
| 5130094 | 1378658 | 2017 | 34-page PDF | Free download |

We consider a classical finite-horizon optimal control problem for continuous-time pure jump Markov processes described by means of a rate transition measure depending on a control parameter and controlled by a feedback law. For this class of problems, the value function can often be described as the unique solution to the corresponding Hamilton-Jacobi-Bellman equation. We prove a probabilistic representation for the value function, known as a nonlinear Feynman-Kac formula. It relates the value function to a backward stochastic differential equation (BSDE) driven by a random measure and with a sign constraint on its martingale part. We also prove existence and uniqueness results for this class of constrained BSDEs. The connection of the control problem with the constrained BSDE uses a control randomization method recently developed by several authors. This approach also allows us to prove that the value function of the original non-dominated control problem coincides with the value function of an auxiliary dominated control problem, expressed in terms of equivalent changes of probability measures.
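As a point of reference, the following is a minimal sketch of the Hamilton-Jacobi-Bellman equation typically associated with such a finite-horizon problem, stated for a cost-minimization convention and assuming a controlled rate transition measure λ(x, a, dy) on the state space E, a control set A, a running cost f, and a terminal cost g; this notation is illustrative and is not taken from the paper.

```latex
% Generic finite-horizon HJB equation for a controlled pure jump Markov process
% (illustrative notation, not the paper's):
%   lambda(x,a,dy) = controlled rate transition measure on the state space E,
%   A = control set, f = running cost, g = terminal cost, T = finite horizon.
\[
\begin{aligned}
&\partial_t v(t,x)
  + \inf_{a \in A}\left\{ \int_E \bigl( v(t,y) - v(t,x) \bigr)\,\lambda(x,a,\mathrm{d}y)
  + f(t,x,a) \right\} = 0,
  && (t,x) \in [0,T) \times E, \\
&v(T,x) = g(x), && x \in E.
\end{aligned}
\]
```

The nonlinear Feynman-Kac formula described in the abstract then identifies this value function with the first component of the solution to the constrained BSDE driven by a random measure.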
Journal: Stochastic Processes and their Applications - Volume 127, Issue 5, May 2017, Pages 1441-1474