Lazarus
Programming => General => Topic started by: MarkMLl on April 04, 2020, 03:22:04 pm

Can anybody point me at a prime-number test adequate for 32-bit integers? That's small enough that trial division would probably be adequate, but I'm interested if anybody has anything better. I don't want to have to redo this, so would prefer an "is" rather than a "probably is".
I note https://surface.syr.edu/cgi/viewcontent.cgi?article=1159&context=eecs_techreports from Per Brinch Hansen, but it's probably overkill since he's talking about ~150-digit numbers.
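For what it's worth, trial division up to the square root is already an exact "is", not a "probably is", for 32-bit inputs: the divisor never exceeds 65535, so the worst case is a few tens of thousands of divisions. A minimal sketch (in C rather than Pascal, purely for illustration):

```c
#include <stdbool.h>
#include <stdint.h>

/* Deterministic primality test for 32-bit unsigned integers by trial
   division up to sqrt(n).  The trial divisor never exceeds 65535, so
   this is exact, not probabilistic. */
bool is_prime_u32(uint32_t n)
{
    if (n < 2)       return false;
    if (n < 4)       return true;        /* 2 and 3 */
    if (n % 2 == 0)  return false;
    for (uint32_t d = 3; (uint64_t)d * d <= n; d += 2)
        if (n % d == 0)
            return false;
    return true;
}
```

The 64-bit widening in the loop condition avoids overflow of d*d near the top of the 32-bit range.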
MarkMLl

Look up in this table?
http://www.umopit.ru/CompLab/primes32eng.htm

Ouch :) OK, something that can be contained in a simple program and doesn't require web access :)
Actually, one valid approach would probably be a table of the distance between primes, i.e. rather than having a table 3, 5, 7, 11, 13... store 2, 2, 4, 2... converting trial division to additive iteration.
After all, who would notice an extra ½ GB or so in a program these days?
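A sketch of that gap idea (again in C for illustration; the gap values shown are just a hand-written prefix of the real table). Since gaps between consecutive primes below 2^32 can exceed 255 but half-gaps reportedly cannot, a full table could store gap/2 per entry in a single byte, cutting the table to a quarter of the 4-bytes-per-prime size:

```c
#include <stdbool.h>
#include <stdint.h>

/* Gap-table idea: instead of storing the primes 2, 3, 5, 7, 11, 13, ...
   store the distances between them.  The values below are a tiny
   hand-written prefix for illustration only; a real table would be
   generated by a sieve. */
static const uint8_t gaps[] = {1, 1, 2, 2, 4, 2, 4, 2, 4, 6, 2};
/* primes reconstructed by summing: 2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31 */

/* Trial division driven by the gap table: the next divisor comes from
   an addition rather than a table of full-width primes. */
bool divisible_by_small_prime(uint32_t n)
{
    uint32_t p = 1;                      /* gaps[0] == 1 takes us to 2 */
    for (unsigned i = 0; i < sizeof gaps; i++) {
        p += gaps[i];
        if (p != n && n % p == 0)
            return true;
    }
    return false;
}
```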
(Somewhat later:)
There's some useful-looking stuff at https://rosettacode.org/wiki/Category:PrimTrial
MarkMLl

Ouch :) OK, something that can be contained in a simple program and doesn't require web access :)
Actually, one valid approach would probably be a table of the distance between primes, i.e. rather than having a table
3, 5, 7, 11, 13... store 2, 2, 4, 2... converting trial division to additive iteration.
You can download the complete table of 32bit integers and it fits in about 1GB when converted to binary.
So why would you need trial division at all? It is just a binary search in the array.
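The lookup itself is then trivial: with the full table (around 203 million primes below 2^32) sorted in an array, primality is one binary search. A sketch in C, with a tiny stand-in table:

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

/* Stand-in for the real sorted table of all ~203 million 32-bit primes. */
static const uint32_t primes[] = {2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31};

/* Binary search over a sorted prime table: n is prime iff it appears. */
bool is_prime_by_table(uint32_t n, const uint32_t *tab, size_t count)
{
    size_t lo = 0, hi = count;           /* half-open range [lo, hi) */
    while (lo < hi) {
        size_t mid = lo + (hi - lo) / 2;
        if (tab[mid] < n)
            lo = mid + 1;
        else if (tab[mid] > n)
            hi = mid;
        else
            return true;
    }
    return false;
}
```

At ~203 million entries that is at most 28 comparisons per query.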

I think Marco's suggestion has merit.
If I faced that problem, I'd use the table Marco suggested and an "array[dword] of bit" and use the primes in the table to set their corresponding bit to 1.
After that, it is a direct access to the array element to determine if a number is prime or not. In addition to that, it would provide a reasonably quick way of determining the prime "floor" and "ceil" of any number that is not a prime.
The bit array would occupy about half a gigabyte, but you wouldn't need to have it all in memory if you sectioned the array into equal-sized ranges: 1024 ranges would create sections of less than 1 MB each. Each section could be a resource loaded on demand.
Just some thoughts...
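The bit-array lookup can be sketched like this (in C for illustration). One bit per 32-bit integer is 2^32 bits = 512 MB in total; here a small trial-division pass fills the bits up to an arbitrary LIMIT, standing in for the real build-once (or load-from-file) step, after which every query is a single bit test:

```c
#include <stdbool.h>
#include <stdint.h>
#include <string.h>

#define LIMIT 1000u                      /* stand-in for 2^32 */
static uint8_t bits[(LIMIT + 7) / 8];    /* one bit per integer */

static void set_bit(uint32_t n)  { bits[n >> 3] |= (uint8_t)(1u << (n & 7)); }
static bool test_bit(uint32_t n) { return (bits[n >> 3] >> (n & 7)) & 1u; }

/* Build the table once; in the real scheme this would come from the
   downloaded prime list or a sieve, not per-number trial division. */
void build_bit_table(void)
{
    memset(bits, 0, sizeof bits);
    for (uint32_t n = 2; n < LIMIT; n++) {
        bool prime = true;
        for (uint32_t d = 2; d * d <= n; d++)
            if (n % d == 0) { prime = false; break; }
        if (prime)
            set_bit(n);
    }
}
```

Finding the prime "floor" or "ceil" of a composite is then just scanning the bits downward or upward from it.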

Can anybody point me at a prime-number test adequate for 32-bit integers? That's small enough that trial division would probably be adequate, but I'm interested if anybody has anything better. I don't want to have to redo this, so would prefer an "is" rather than a "probably is".
I note https://surface.syr.edu/cgi/viewcontent.cgi?article=1159&context=eecs_techreports from Per Brinch Hansen, but it's probably overkill since he's talking about ~150-digit numbers.
MarkMLl
How high do you need to go?
I've been playing with this and have two routines: one that does trial division up to the square root, and one that uses a segmented sieve.
The first routine can get prime numbers up to 100,000,000 (100 mil) in less than 9 minutes, on a 5th generation mobile i5 processor.
The second routine can get prime numbers up to 10,000,000,000 (10 bil) in less than 2 minutes on the same system.
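For reference, the core of the sieve approach looks like this (a plain, unsegmented Sieve of Eratosthenes in C; the segmented variant applies the same crossing-off one cache-sized window at a time, which is what makes limits like 10 billion feasible):

```c
#include <stdbool.h>
#include <stdint.h>
#include <stdlib.h>

/* Plain Sieve of Eratosthenes: returns a malloc'd array of all primes
   below `limit` and stores how many were found in *count. */
uint32_t *sieve_primes(uint32_t limit, uint32_t *count)
{
    bool *composite = calloc(limit, sizeof *composite);
    /* Cross off multiples of each surviving p, starting at p*p. */
    for (uint32_t p = 2; (uint64_t)p * p < limit; p++)
        if (!composite[p])
            for (uint64_t m = (uint64_t)p * p; m < limit; m += p)
                composite[m] = true;
    uint32_t n = 0;
    for (uint32_t i = 2; i < limit; i++)
        if (!composite[i]) n++;
    uint32_t *primes = malloc(n * sizeof *primes);
    uint32_t k = 0;
    for (uint32_t i = 2; i < limit; i++)
        if (!composite[i]) primes[k++] = i;
    free(composite);
    *count = n;
    return primes;
}
```

The unsegmented form needs one flag per candidate, which is why the segmented version wins once the limit outgrows RAM and cache.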

The first routine can get prime numbers up to 100,000,000 (100 mil) in less than 9 minutes, on a 5th generation mobile i5 processor.
The second routine can get prime numbers up to 10,000,000,000 (10 bil) in less than 2 minutes on the same system.
That's quite good. The real question is, how quickly does he need to establish primality.
Maybe I'm missing something, but how come the routine that goes up to 10 billion takes a fourth or fifth of the time of the routine that only goes up to 100 million? Sounds like the second algorithm is a lot better than the first one.

Hi!
The lookup table from Marcov's reply #1 is 773 MB unpacked.
It loads from file into an array of DWord in around a second from an SSD with BlockRead.
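That load time is plausible because the whole file can be read in one bulk call, the C equivalent of BlockRead being fread. A sketch, assuming the primes are stored as raw 32-bit words in a file (the name "primes32.bin" here is just a placeholder):

```c
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

/* Bulk-load a table of raw 32-bit words (e.g. "primes32.bin", a
   hypothetical filename) into memory in a single read, the fread
   analogue of Pascal's BlockRead.  Returns NULL on failure. */
uint32_t *load_prime_table(const char *path, size_t *count)
{
    FILE *f = fopen(path, "rb");
    if (!f) return NULL;
    fseek(f, 0, SEEK_END);
    long size = ftell(f);                /* file size in bytes */
    rewind(f);
    uint32_t *tab = malloc((size_t)size);
    *count = fread(tab, sizeof *tab, (size_t)size / sizeof *tab, f);
    fclose(f);
    return tab;
}
```

Note this assumes the file was written with the same endianness as the machine reading it.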
Winni

You can do it yourself, in case one day you decide to change the high limit:
https://rosettacode.org/wiki/Sieve_of_Eratosthenes
Go to the above link and scroll down the page to find the Free Pascal solutions.
Get one, then alter it to write the resulting table(s) of your size to a file (maybe compressed by hand);
the app using the table just has to read it back.

There are also FAST assembly versions of the sieve algorithm for many CPUs, so you can implement it with Free Pascal in ASM mode while keeping it multi-platform.

https://primesieve.org/

The lookup table from Marcov's reply #1 is unpacked 773 MB.
Winni
Using an array of bits, it's only 512 MB and, as a bonus, a very fast lookup.

Can anybody point me at a primenumber test adequate for 32bit integers?
How high do you need to go?
32-bit, as stated. I've used a couple of routines from the Rosetta link I posted, which appear to work adequately.
MarkMLl

So why would you need trial division at all. It is just binsearch in the array.
To avoid the overhead of a few hundred MB which might only be needed once.
MarkMLl

To avoid the overhead of a few hundred Mb which might only be needed once.
MarkMLl
In that case, you could have the array of bits in a separate file which is loaded only when needed and unloaded afterwards. That would give you reasonable speed and reasonably efficient memory utilization since you'd free the array once you no longer need it.