Prime Numbers
8.2 Random-number generation

is based on the primitive polynomial (mod 2)

    x^18 + x^5 + x^2 + x + 1.

(A polynomial over a finite field F is primitive if it is irreducible and if a root is a cyclic generator for the multiplicative group of the finite field generated by the root.) If one has a “current” bit x_{-1}, and labels the previous 17 bits x_{-2}, x_{-3}, ..., x_{-18}, then the shifting logic appropriate to the given polynomial is to form a new bit x_0 according to the logic

    x_0 = x_{-18},
    x_{-5} = x_{-5} ∧ x_0,
    x_{-2} = x_{-2} ∧ x_0,
    x_{-1} = x_{-1} ∧ x_0,

where “∧” is the exclusive-or operator (equivalent to addition in the even-characteristic field). Then all of the indices are shifted so that the new x_{-1}, the new current bit, is the x_0 from the above operations. An explicit algorithm is the following:

Algorithm 8.2.7 (Simple and fast random-bit generator). This algorithm provides seeding and random functions for a random-bit generator based on the polynomial x^18 + x^5 + x^2 + x + 1 over F_2.

1. [Procedure seed]
   seed() {
       h = 2^17;             // 100000000000000000 binary.
       m = 2^0 + 2^1 + 2^4;  // Mask is 10011 binary.
       Choose starting integer seed x in [1, 2^18 - 1];
       return;
   }
2. [Function random returning 0 or 1]
   random() {
       if((x & h) != 0) {    // The bitwise “and” of x, h is compared to 0.
           x = ((x ∧ m) * 2 + 1) mod 2^18;
           return 1;
       }
       x = 2x;
       return 0;
   }
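The shifting logic above translates directly into C. This is a sketch, not the book's reference implementation; the register packing chosen here (bit 17 holds the oldest bit x_{-18}, bit 0 the current bit x_{-1}) is one concrete layout consistent with the masks h and m, and the names `seed`/`random_bit` are this sketch's own:

```c
#include <stdint.h>

/* 18-bit Galois-style LFSR for x^18 + x^5 + x^2 + x + 1 over F_2.
   Assumed layout: bit 17 = oldest bit x_{-18}, bit 0 = newest bit x_{-1}. */
#define H        (1u << 17)                      /* high-bit test, 2^17        */
#define M        ((1u << 4) | (1u << 1) | 1u)    /* tap mask, 10011 binary     */
#define REG_MASK ((1u << 18) - 1u)               /* keep the register 18 bits  */

uint32_t x = 1;  /* register state; any value in [1, 2^18 - 1] is a valid seed */

void seed(uint32_t s) {
    x = s & REG_MASK;
    if (x == 0) x = 1;   /* the all-zero state is fixed and must be avoided */
}

int random_bit(void) {
    if (x & H) {
        /* Outgoing bit x_0 = 1: flip the taps, shift, and feed x_0 back in. */
        x = (((x ^ M) << 1) | 1u) & REG_MASK;
        return 1;
    }
    x <<= 1;             /* outgoing bit x_0 = 0: a plain shift suffices */
    return 0;
}
```

Because the polynomial is primitive, the state sequence visits all 2^18 - 1 nonzero register values before repeating, so the bit stream has period 262143.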
to suitable random numbers. But if one lifts the requirement of statistically testable randomness as it is usually invoked, there is quite another way to use random sequences. It is to these alternatives, falling under the rubric of quasi-Monte Carlo (qMC), to which we next turn.

8.3 Quasi-Monte Carlo (qMC) methods

Who would have guessed, back in the times of Gauss, Euler, Legendre, say, that primes would attain some practical value in the financial-market analysis of the latter twentieth century? We refer here not to cryptographic uses, which certainly do emerge whenever money is involved, but to quasi-Monte Carlo science, which, loosely speaking, is a specific form of Monte Carlo (i.e., statistically motivated) analysis. Monte Carlo calculations pervade the fields of applied science.

The essential idea behind Monte Carlo calculation is to sample some large continuous (or even discrete, if need be) space, in doing a multidimensional integral, say, with random samples. Then one hopes that the “average” result is close to the true result one would obtain with the uncountable samples theoretically at hand. It is intriguing that number theory, in particular prime-number study, can be brought to bear on the science of quasi-Monte Carlo (qMC). The techniques of qMC differ from traditional Monte Carlo in that one does not seek expressly random sequences of samples. Instead, one attempts to provide quasirandom sequences that do not, in fact, obey the strict statistical rules of randomness, but instead have certain uniformity features attendant on the problem at hand.

Although it is perhaps overly simplistic, a clear way to envision the difference between random and qMC is this: random points when dropped can be expected to exhibit “clumps” and “gaps,” whereas qMC points generally avoid each other to minimize clumping and tend to occupy previous gaps. For these reasons qMC points can be, depending on the spatial dimension and precise posing of the problem, superior for certain tasks such as numerical integration, min–max problems, and statistical estimation in general.
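A concrete instance of this gap-filling behavior (an illustration added here, not taken from the text) is the base-2 van der Corput sequence, one of the simplest quasirandom constructions: the n-th point is obtained by reflecting the binary digits of n about the binary point. Replacing base 2 by other prime bases, one coordinate per prime, yields the Halton sequences that connect qMC to the primes.

```c
/* Base-2 van der Corput radical inverse: reflect the binary digits of n
   about the binary point, mapping n = (b_k ... b_1 b_0)_2 to the point
   0.b_0 b_1 ... b_k in [0, 1). Each new point lands in one of the largest
   gaps left by its predecessors, rather than clumping at random. */
double vdc2(unsigned n) {
    double v = 0.0, denom = 1.0;
    while (n) {
        denom *= 2.0;
        v += (double)(n & 1u) / denom;  /* next reflected binary digit */
        n >>= 1;
    }
    return v;
}
```

The first few points are 1/2, 1/4, 3/4, 1/8, 5/8, 3/8, 7/8, ...: each successive point bisects a largest remaining gap, which is exactly the uniformity feature the prose above describes.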
8.3.1 Discrepancy theory

Say that one wants to know the value of an integral over some D-dimensional domain R, namely

    I = ∫ ··· ∫_R f(x) d^D x,

but there is no reasonable hope of a closed-form, analytic evaluation. One might proceed in Monte Carlo fashion, by dropping a total of N “random” vectors x = (x_1, ..., x_D) into the integration domain, then literally adding up the corresponding integrand values to get an average, and then multiplying by the measure of R to get an approximation, say I', for the exact integral I. On the general variance principles of statistics, we can expect the error to
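The recipe just described, drop N random vectors, average the integrand, scale by the measure of R, is only a few lines of code. In this sketch the domain R is the unit square and the integrand f(x, y) = xy is a hypothetical stand-in (exact value I = 1/4); `mc_integrate` is this sketch's own name, and the standard-library `rand` merely stands in for a real random-vector source:

```c
#include <stdlib.h>

/* Hypothetical integrand over the unit square; exact integral is 1/4. */
double f(double x, double y) { return x * y; }

/* Plain Monte Carlo estimate I' = (measure of R) * average of f over N
   random sample points; here R = [0,1]^2, so the measure is 1. */
double mc_integrate(unsigned long N) {
    double sum = 0.0;
    for (unsigned long i = 0; i < N; i++) {
        double x = rand() / (RAND_MAX + 1.0);  /* uniform in [0, 1) */
        double y = rand() / (RAND_MAX + 1.0);
        sum += f(x, y);
    }
    return sum / (double)N;
}
```

With N on the order of 10^5 the estimate typically lands within about one part in a thousand of 1/4, consistent with the statistical error scale discussed next.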