Prime Numbers


8.2 Random-number generation

is based on the primitive polynomial (mod 2)

    x^18 + x^5 + x^2 + x + 1.

(A polynomial over a finite field F is primitive if it is irreducible and if a root is a cyclic generator for the multiplicative group of the finite field generated by the root.) If one has a "current" bit x_{-1}, and labels the previous 17 bits x_{-2}, x_{-3}, ..., x_{-18}, then the shifting logic appropriate to the given polynomial is to form a new bit x_0 according to the logic

    x_0 = x_{-18},
    x_{-5} = x_{-5} ∧ x_0,
    x_{-2} = x_{-2} ∧ x_0,
    x_{-1} = x_{-1} ∧ x_0,

where "∧" is the exclusive-or operator (equivalent to addition in the even-characteristic field). Then all of the indices are shifted so that the new x_{-1}—the new current bit—is the x_0 from the above operations. An explicit algorithm is the following:

Algorithm 8.2.7 (Simple and fast random-bit generator). This algorithm provides seeding and random functions for a random-bit generator based on the polynomial x^18 + x^5 + x^2 + x + 1 over F_2.

1. [Procedure seed]
   seed() {
       h = 2^17;              // 100000000000000000 binary.
       m = 2^0 + 2^1 + 2^4;   // Mask is 10011 binary.
       Choose starting integer seed x in [1, 2^18 - 1];
       return;
   }
2. [Function random returning 0 or 1]
   random() {
       if((x & h) ≠ 0) {      // The bitwise "and" of x, h is compared to 0.
           x = (((x ∧ m) << 1) | 1) mod 2^18;   // Exclusive-or the taps, shift, set new current bit to 1.
           return 1;
       }
       x = x << 1;            // New current bit is 0; the taps are unchanged.
       return 0;
   }
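In Python, the 18-bit register can be held in a single integer; the following is a sketch, with the update step completed according to the shifting logic stated above (the function and variable names are mine, not the book's):

```python
# Sketch of Algorithm 8.2.7 in Python. The 18-bit register is one integer,
# with the oldest bit x_{-18} in the high (2^17) position and the current
# bit x_{-1} in the low position.
H = 1 << 17                          # h = 2^17: selects the oldest bit x_{-18}
M = (1 << 0) | (1 << 1) | (1 << 4)   # m = 10011 binary: taps x_{-1}, x_{-2}, x_{-5}
MOD = 1 << 18                        # the register is kept to 18 bits

def seed(value):
    """Any starting state in [1, 2^18 - 1] works; the all-zero state is a fixed point."""
    assert 1 <= value < MOD
    return value

def random_bit(x):
    """One generator step; returns (new_state, output_bit)."""
    if x & H:
        # x0 = 1: exclusive-or the taps with x0, shift all indices, new current bit is 1.
        return (((x ^ M) << 1) | 1) % MOD, 1
    # x0 = 0: the exclusive-ors change nothing; just shift in a 0 bit.
    return (x << 1) % MOD, 0
```

Because the polynomial is primitive, the 2^18 - 1 nonzero states form a single cycle, so the bit stream has period 262143 regardless of the seed.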

Chapter 8 THE UBIQUITY OF PRIME NUMBERS

to suitable random numbers. But if one lifts the requirement of statistically testable randomness as it is usually invoked, there is quite another way to use random sequences. It is to these alternatives—falling under the rubric of quasi-Monte Carlo (qMC)—that we next turn.

8.3 Quasi-Monte Carlo (qMC) methods

Who would have guessed, back in the times of Gauss, Euler, and Legendre, say, that primes would attain some practical value in the financial-market analysis of the latter twentieth century? We refer here not to cryptographic uses—which certainly do emerge whenever money is involved—but to quasi-Monte Carlo science, which, loosely speaking, is a specific form of Monte Carlo (i.e., statistically motivated) analysis. Monte Carlo calculations pervade the fields of applied science.

The essential idea behind a Monte Carlo calculation is to sample some large continuous (or even discrete, if need be) space—in doing a multidimensional integral, say—with random samples. Then one hopes that the "average" result is close to the true result one would obtain with the uncountable samples theoretically at hand. It is intriguing that number theory—in particular prime-number study—can be brought to bear on the science of quasi-Monte Carlo (qMC). The techniques of qMC differ from those of traditional Monte Carlo in that one does not seek expressly random sequences of samples. Instead, one attempts to provide quasirandom sequences that do not, in fact, obey the strict statistical rules of randomness, but instead have certain uniformity features attendant on the problem at hand.

Although it is perhaps overly simplistic, a clear way to envision the difference between random and qMC is this: random points, when dropped, can be expected to exhibit "clumps" and "gaps," whereas qMC points generally avoid each other to minimize clumping and tend to occupy previous gaps.
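A tiny experiment makes the clump-and-gap contrast concrete. The base-2 van der Corput sequence used below is one standard quasirandom construction, chosen here purely for illustration (it is not described in this excerpt):

```python
import random

def van_der_corput(i):
    """i-th term of the base-2 van der Corput sequence in [0, 1)."""
    x, f = 0.0, 0.5
    while i:
        x += f * (i & 1)   # reflect the binary digits of i about the radix point
        i >>= 1
        f *= 0.5
    return x

def max_gap(points):
    """Largest gap between consecutive sorted points."""
    s = sorted(points)
    return max(b - a for a, b in zip(s, s[1:]))

n = 256
rng = random.Random(123)
random_pts = [rng.random() for _ in range(n)]
quasi_pts = [van_der_corput(i) for i in range(n)]
# The first 2^k van der Corput points are exactly the lattice {i / 2^k},
# so their largest gap is 1/256, while random points leave much larger gaps.
```

Comparing `max_gap(random_pts)` with `max_gap(quasi_pts)` shows the quasirandom points filling the interval evenly, exactly the gap-occupying behavior described above.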
For these reasons qMC points can be—depending on the spatial dimension and precise posing of the problem—superior for certain tasks such as numerical integration, min–max problems, and statistical estimation in general.

8.3.1 Discrepancy theory

Say that one wants to know the value of an integral over some D-dimensional domain R, namely

    I = ∫ ··· ∫_R f(x) d^D x,

but there is no reasonable hope of a closed-form, analytic evaluation. One might proceed in Monte Carlo fashion, by dropping a total of N "random" vectors x = (x_1, ..., x_D) into the integration domain, then literally adding up the corresponding integrand values to get an average, and then multiplying by the measure of R to get an approximation, say I', for the exact integral I. On the general variance principles of statistics, we can expect the error to
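The Monte Carlo prescription just described (average N random integrand samples, then scale by the measure of R) can be sketched generically; the integrand, domain, and sample count below are arbitrary illustrative choices:

```python
import random

def mc_integrate(f, bounds, n, seed=1):
    """Monte Carlo estimate of the integral of f over the box given by bounds."""
    rng = random.Random(seed)
    measure = 1.0
    for lo, hi in bounds:          # measure of the rectangular domain R
        measure *= hi - lo
    total = 0.0
    for _ in range(n):
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        total += f(x)               # add up the integrand values
    return measure * total / n      # average value times the measure of R

# Illustrative 2-dimensional case: f(x, y) = x*y over the unit square,
# whose exact integral is 1/4.
approx = mc_integrate(lambda v: v[0] * v[1], [(0.0, 1.0), (0.0, 1.0)], 100_000)
```

With N = 100000 samples the estimate lands within roughly 0.001 of 1/4, consistent with the 1/sqrt(N) error behavior discussed next.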
