Article ID: 9657021
Journal: Journal of Algorithms
Published Year: 2005
Pages: 16 Pages
File Type: PDF
Abstract
This paper describes the first algorithm to compute the greatest common divisor (GCD) of two n-bit integers using a modular representation for the intermediate values U, V and also for the result. It is based on a reduction step, similar to one used in the accelerated algorithm [T. Jebelean, A generalization of the binary GCD algorithm, in: ISSAC '93: International Symposium on Symbolic and Algebraic Computation, Kiev, Ukraine, 1993, pp. 111-116; K. Weber, The accelerated integer GCD algorithm, ACM Trans. Math. Softw. 21 (1995) 111-122] when U and V are close to the same size, which replaces U by (U − bV)/p, where p is one of the prime moduli and b is the unique integer in the interval (−p/2, p/2) such that b ≡ UV⁻¹ (mod p). When the algorithm is executed on a bit common CRCW PRAM with O(n log n log log log n) processors, it takes O(n) time in the worst case. A heuristic model of the average case yields O(n/log n) time on the same number of processors.
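The reduction step described in the abstract can be illustrated with a minimal sketch (Python 3.8+, not the authors' implementation; the helper name reduction_step and the sample values are assumptions, and p is taken to be an odd prime not dividing V):

import math

def reduction_step(U, V, p):
    # b is the unique integer in (-p/2, p/2) with b = U * V^(-1) (mod p),
    # so U - b*V is divisible by p and the division below is exact.
    b = (U * pow(V, -1, p)) % p   # representative in [0, p); requires gcd(V, p) = 1
    if b > p // 2:
        b -= p                    # shift into (-p/2, p/2)
    assert (U - b * V) % p == 0
    return (U - b * V) // p

# One reduction shrinks U while preserving gcd(U, V), since p is coprime to V.
U, V, p = 987654321, 123456789, 13
U_new = reduction_step(U, V, p)
assert math.gcd(U_new, V) == math.gcd(U, V)   # both equal 9 here

Because U − bV ≡ 0 (mod p), the quotient (U − bV)/p is an integer of roughly the same bit length as U minus the bit length of p, so repeating the step steadily reduces the operands while leaving the GCD unchanged (up to factors of the chosen prime moduli).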
Related Topics
Physical Sciences and Engineering · Computer Science · Computational Theory and Mathematics