I know the Miller–Rabin primality test is probabilistic. However I want to use it for a programming task that leaves no room for error.
Can we assume that it is correct with very high probability if the input numbers are 64-bit integers (i.e. long long in C)?
There are efficient deterministic variants of the MR test for 64-bit values that do not rely on the GRH; their base sets have been exhaustively verified by exploiting GPUs and other known results.
I've listed the pertinent sections of a C program I wrote that tests the primality of any 64-bit value (n > 1), using Jaeschke's and Sinclair's bases for the deterministic MR variant. It makes use of gcc's and clang's __int128 extended type for exponentiation; if that is not available, an explicit routine is required. Maybe others will find this useful. Note that the MR (sprp) test is slightly modified to pass a value on an iteration where the base is a multiple of the candidate, as mentioned in the 'remarks' section of the website.
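A minimal sketch of what such a routine can look like, assuming Sinclair's 7-base set {2, 325, 9375, 28178, 450775, 9780504, 1795265022} (verified to decide primality for all n < 2^64) and the base-is-a-multiple-of-the-candidate modification mentioned above:

```c
#include <stdint.h>
#include <stdbool.h>

/* One strong-probable-prime (sprp) round for odd n > 2,
   with n - 1 = d * 2^s and d odd. __int128 holds the
   128-bit intermediate products. */
static bool sprp(uint64_t n, uint64_t a, uint64_t d, int s)
{
    a %= n;
    if (a == 0)                 /* base is a multiple of n: pass */
        return true;
    uint64_t x = 1;
    for (uint64_t e = d; e; e >>= 1) {      /* x = a^d mod n */
        if (e & 1)
            x = (uint64_t)((__uint128_t)x * a % n);
        a = (uint64_t)((__uint128_t)a * a % n);
    }
    if (x == 1 || x == n - 1)
        return true;
    for (int r = 1; r < s; r++) {
        x = (uint64_t)((__uint128_t)x * x % n);
        if (x == n - 1)
            return true;
    }
    return false;
}

/* Deterministic for all 64-bit n, given the verified base set. */
bool is_prime_u64(uint64_t n)
{
    if (n < 2 || n % 2 == 0)
        return n == 2;
    uint64_t d = n - 1;
    int s = 0;
    while ((d & 1) == 0) { d >>= 1; s++; }
    static const uint64_t bases[7] =
        {2, 325, 9375, 28178, 450775, 9780504, 1795265022};
    for (int i = 0; i < 7; i++)
        if (!sprp(n, bases[i], d, s))
            return false;
    return true;
}
```

The even case and n < 2 are handled up front, so the sprp rounds only ever see odd n > 2.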
Update: while this has fewer base tests than Niklas' answer, it's important to note that the bases:
{3, 5, 7, 11, 13, 17, 19, 23, 29}
provide a cheap test that allows us to eliminate candidates exceeding 29 * 29 = 841, simply by using the GCD. For n > 29 * 29, we can clearly eliminate any even value as composite. The product of the small primes, (3 * 5 * 7 * 11 * 13 * 17 * 19 * 23 * 29) = 3234846615, fits nicely in a 32-bit unsigned value. A gcd(n, 3234846615) is a lot cheaper than an MR test! If the result is not 1, then n > 841 has a small factor. Mertens' theorem suggests that this simple gcd(u64, u64) test eliminates ~68% of all odd candidates (as composites). If you're using M-R to search for primes (randomly or incrementally), rather than for just a one-off test, this is certainly worthwhile!

Your computer is not perfect; it has a finite probability of failing in such a way as to produce an incorrect result to a calculation. Provided the probability of the M-R test giving a false result is much less than the probability of some other computer failure, you are fine. There is no reason to run the M-R test for fewer than 64 iterations (a 1 in 2^128 chance of error). Most composites will fail in the first few iterations, so only the actual primes will be thoroughly tested. Use 128 iterations for a 1 in 2^256 chance of error.
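The gcd(n, 3234846615) pre-filter described earlier can be sketched as follows. Note it is a quick composite filter, not a primality test: candidates that survive it still need the full M-R test.

```c
#include <stdint.h>
#include <stdbool.h>

/* Plain Euclidean gcd on 64-bit values. */
static uint64_t gcd_u64(uint64_t a, uint64_t b)
{
    while (b) {
        uint64_t t = a % b;
        a = b;
        b = t;
    }
    return a;
}

/* 3 * 5 * 7 * 11 * 13 * 17 * 19 * 23 * 29 -- fits in 32 bits. */
static const uint64_t SMALL_PRIME_PRODUCT = 3234846615ULL;

/* For odd n > 841: true means n certainly has a factor <= 29,
   so the expensive M-R test can be skipped entirely. */
bool has_small_odd_factor(uint64_t n)
{
    return gcd_u64(n, SMALL_PRIME_PRODUCT) != 1;
}
```

A composite like 31 * 37 slips through the filter, which is fine: the filter only exists to cheaply reject the majority of composites before the full test runs.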
Miller–Rabin is indeed probabilistic, but you can trade accuracy for computation time arbitrarily. If the number you test is prime, it will always give the correct answer. The problematic case is when a number is composite but is reported to be prime. We can bound the probability of this error using the formula on Wikipedia: if you select k different bases randomly and test them, the error probability is less than 4^(-k). So even with k = 9, you only get about a 4 in a million chance of being wrong, and with k = 40 or so it becomes ridiculously unlikely.

That said, there is a deterministic version of Miller–Rabin that relies on the correctness of the generalized Riemann hypothesis. For n up to 2^64, it has also been verified directly (no GRH needed) that it is enough to check a = 2, 3, 5, 7, 11, 13, 17, 19, 23. I have a C++ implementation online, instantiating the template for unsigned 64-bit ints, which was field-tested in lots of programming contests. PowerMod and MultiplyMod are just primitives to multiply and exponentiate under a given modulus, using square-and-{multiply,add}.

In each iteration of Miller-Rabin you need to choose a random base. If you are unlucky, that base fails to reveal certain composites. A small example: 341 = 11 * 31 is composite, yet 2^341 mod 341 = 2, so it passes the plain Fermat test to base 2 (the strong test happens to catch 341, but every single base has strong pseudoprimes of its own, e.g. 2047 = 23 * 89 for base 2). But the test guarantees that it only lets a composite pass with probability < 1/4 per iteration. So if you run the test 64 times with different random bases, the probability drops below 2^(-128), which is enough in practice.
You should take a look at the Baillie–PSW primality test. While it may have false positives, none are known, and according to Wikipedia it has been verified that no composite number below 2^64 passes the test. So it should fit your requirements.
For n < 2^64, it is possible to perform strong-pseudoprime tests to the seven bases 2, 325, 9375, 28178, 450775, 9780504, and 1795265022 and completely determine the primality of n; see http://miller-rabin.appspot.com/.
A faster primality test performs a strong-pseudoprime test to base 2 followed by a Lucas pseudoprime test. It takes about 3 times as long as a single strong-pseudoprime test, so is more than twice as fast as the 7-base Miller-Rabin test. The code is more complex, but not dauntingly so.
I can post code if you're interested; let me know in the comments.