| Article ID | Journal | Published Year | Pages | File Type |
|---|---|---|---|---|
| 4645444 | Applied Numerical Mathematics | 2011 | 11 | |
Abstract
We propose an iterative method for pricing American options under jump-diffusion models. A finite difference discretization is performed on the partial integro-differential equation, and the American option pricing problem is formulated as a linear complementarity problem (LCP). Jump-diffusion models include an integral term, which causes the resulting system to be dense. We develop an iteration that solves these LCPs efficiently and prove its convergence. Numerical examples with Kou's and Merton's jump-diffusion models show that the resulting iteration converges rapidly.
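As background for the kind of problem the abstract describes, a discretized American-option LCP has the form x ≥ g, Ax ≥ b, (x − g)ᵀ(Ax − b) = 0, where g is the payoff vector. A standard baseline solver for such LCPs is projected SOR (PSOR); the sketch below is a generic illustration of that technique, not the paper's proposed iteration (which additionally handles the dense integral term). The function name `psor` and the parameters `omega`, `tol`, and `max_iter` are assumptions for this sketch.

```python
import numpy as np

def psor(A, b, g, omega=1.2, tol=1e-10, max_iter=10000):
    """Projected SOR for the LCP: x >= g, A x >= b, (x - g)^T (A x - b) = 0.

    A is assumed symmetric positive definite (as for an implicit
    finite-difference pricing step); omega in (0, 2) is the relaxation
    parameter. This is a generic baseline, not the paper's method.
    """
    n = len(b)
    x = np.maximum(g, 0.0).astype(float)
    for _ in range(max_iter):
        x_old = x.copy()
        for i in range(n):
            # Gauss-Seidel residual using already-updated components x[:i]
            r = b[i] - A[i, :i] @ x[:i] - A[i, i + 1:] @ x[i + 1:]
            # Relaxed update, projected onto the constraint x[i] >= g[i]
            x[i] = max(g[i], (1.0 - omega) * x[i] + omega * r / A[i, i])
        if np.linalg.norm(x - x_old, np.inf) < tol:
            break
    return x
```

The projection `max(g[i], ...)` inside the sweep is what distinguishes PSOR from plain SOR: wherever the constraint binds, the iterate is clamped to the payoff, and elsewhere the linear system is solved as usual.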
Related Topics
Physical Sciences and Engineering
Mathematics
Computational Mathematics
