zorba1 writes "Manindra Agrawal et al. of the Indian Institute of Technology Kanpur CS department have released a most interesting paper today. It presents an algorithm that determines whether a number is prime in polynomial time. While I haven't gone through the presentation in detail, it looks like a promising, albeit non-optimized, solution to the famous PRIMES in P problem."

No, because finding primes was classified as NP; 3-SAT and the others are NP-complete, which are a "superclass" of NP problems.

you have it backwards (0)

Anonymous Coward | about 12 years ago | (#4023301)

NP-complete problems form a subclass of NP problems.

Re:p=np? (1, Informative)

Anonymous Coward | about 12 years ago | (#4023319)

If by "superclass", you mean problems that are (thought to be) harder than the other set of problems, then yes. But technically speaking, NP-complete problems actually form a "sub"class of NP problems. And yes, PRIMES is not _known to_ be an NP-complete problem, so this doesn't really affect the complexity of 3-SAT directly.

Mathematicians have been pondering that problem for decades. If this works out, we'll have a fresh perspective on the universe (remember the pendulum, and algorithmic theories of the universe).

Big consequences related to encryption (-1, Interesting)

Anonymous Coward | about 12 years ago | (#4023184)

Well, encryption based on the multiplication of large primes, anyway.

Re:Big consequences related to encryption (5, Funny)

Well, encryption based on the multiplication of large primes, anyway.

Yeah... that step in key generation where you check whether a candidate key is prime or not will now be performed with 100% confidence instead of that annoying 99.999999999999999% confidence we used to have.

-a

Re:Big consequences related to encryption (2, Insightful)

Of course, there's still a larger chance of a cosmic ray flipping a bit in the processor or memory than of the math being wrong, anyway. ;)

This may seem like a strange question, but isn't a non-prime that passes the 99.99999999999% check just as good as a prime in encryption? I mean, seriously, is anyone really going to notice it's not prime? Sure, they could accidentally stumble on the 'wrong' factors while trying to decode it, but, seriously, halving the time to decode your message isn't a huge mistake, considering we're talking on the order of centuries at least. ;)

Consider a simple public key encryption algorithm based on the fact proved in any beginning number theory book that for primes p, q

a^x = a (mod pq) if x = 1 (mod m)

where m = (p - 1)(q - 1)

Now choose your favorite number f and use the Euclidean algorithm to efficiently find a number g such that

fg = 1 (mod m)

You may have to try another value of f if the Euclidean algorithm terminates before reaching 1, but it won't take many guesses. Now publish the number f and the modulus pq as your public key and keep g private.

Someone sends text t to you by sending t^f (mod pq). Now you just raise that message to the power g and reduce mod pq to recover the original text. (This follows immediately by combining the above statements.)

Finally, I'll get to the point. This algorithm is simply busted if p and q are not prime, because t^fg will not equal t mod pq unless you are very lucky. In fact, if you want to add a bunch of nines to your percentage certainty, just encrypt and decrypt a sample message text and verify that it works.
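
A quick sketch of this kind of textbook scheme in Python (illustrative only: the tiny primes, the favorite number f = 17, and the message t = 42 are all made-up values, and the ciphertext is reduced mod the public modulus pq; real keys use primes hundreds of digits long):

```python
# Variable names follow the comment above: f is the public exponent,
# g the private one, m = (p - 1)(q - 1).
p, q = 61, 53          # assumed prime -- the whole scheme rests on that
n = p * q              # the public modulus pq
m = (p - 1) * (q - 1)

f = 17                 # favorite number; must satisfy gcd(f, m) == 1
g = pow(f, -1, m)      # extended-Euclid modular inverse: f*g == 1 (mod m)

t = 42                        # the "text" to send, as a number < n
cipher = pow(t, f, n)         # sender computes t^f mod pq
recovered = pow(cipher, g, n) # receiver raises to g and reduces mod pq
assert recovered == t         # works because fg == 1 (mod m) and p, q are prime
```

The final assert is exactly the "encrypt and decrypt a sample message" sanity check the comment suggests: if p or q were secretly composite, it would almost certainly fail.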

Re:Big consequences related to encryption (0)

Anonymous Coward | about 12 years ago | (#4023304)

You don't get it - the whole RSA algorithm is based on the difficulty of factoring large numbers. If you can determine whether a number is prime in polynomial time, you can break RSA in polynomial time. The constant factor might of course still be impractically large, but such a result would be sensational if true.
Now, on the other hand, I would rate the probability that this article holds up at 0.1%....

Re:Big consequences related to encryption (3, Insightful)

Uh, no. Just because you can tell whether a number is prime in polynomial time, doesn't mean you can find the factors of a number in polynomial time, as I understand it.

Re:Big consequences related to encryption (0)

Anonymous Coward | about 12 years ago | (#4023366)

plzdiekthx

Re:Big consequences related to encryption (1, Funny)

Anonymous Coward | about 12 years ago | (#4023388)

That "proof" would have killed my CS professor.

Re:Big consequences related to encryption (5, Insightful)

Thank you. I'm glad someone finally pointed out that we already have a classical (as opposed to quantum) probabilistic algorithm for determining primality. Every other fool on this board is running around wearing his/her tin hat and shouting about RSA being defunct. All this does is push primality testing from the BPP complexity class into the P complexity class. It is significant in the sense that it weakens the argument for BPP being larger than P.

Of course, we also have a polynomial-time algorithm for prime factorization (Shor's Algorithm). It's just that it requires a quantum computer, which is difficult to build. So far, the biggest number factored is 15... 1024 bit keys will be safe for a while yet. I believe it's 15 - 20 years until they're broken, if Moore's Law holds for quantum computers in terms of maximum number of qubits possible (so far, it roughly has, but then, we're only at about 7 qubits).

prime (-1, Troll)

Anonymous Coward | about 12 years ago | (#4023196)

bool isprime(p)
  for i = 2 to sqrt(p)
    if p mod i == 0
      return false
    endif
  endfor
  return true
endfunc

If I'm not mistaken, that algo is O(n), thus polynomial, thus in P. But for very large p, that algo is impractical.
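
For reference, the same trial-division idea as runnable Python (a straightforward sketch; `math.isqrt` avoids floating-point trouble with large p):

```python
import math

def isprime(p):
    # Trial division, exactly as in the pseudocode above:
    # try every i from 2 up to sqrt(p).
    if p < 2:
        return False
    for i in range(2, math.isqrt(p) + 1):
        if p % i == 0:
            return False   # found a divisor: not prime
    return True
```

As the replies below point out, the loop runs about sqrt(p) times, which is exponential in the number of digits of p.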

Re:I always thought is was in P (0)

Anonymous Coward | about 12 years ago | (#4023214)

congrats, you found a worse algorithm

Re:I always thought is was in P (0)

Anonymous Coward | about 12 years ago | (#4023215)

Not quite true. You are performing a considerable number of operations when you calculate mod i, which means that you're performing it in exponential time.

Re:I always thought is was in P (0)

Anonymous Coward | about 12 years ago | (#4023227)

Er, I would call a simple divide "considerable".

Re:I always thought is was in P (5, Informative)

Anonymous Coward | about 12 years ago | (#4023226)

For P, it has to be polynomial in the size of _the input_. The input size here is log(n), since it requires log(n) bits to represent n. log(n)^12 hence is polynomial (which I believe their algo guarantees), whereas sqrt(n) is not.

What are you referring to as your input size n in O(n)? The number p? The correct size n is the number of digits of p. That algorithm is definitely non-polynomial for that (correct) n.

What category? Unfortunately, there's no mathematics Nobel Prize. I realize that they consider applications, but I don't see many in physics, chemistry, medicine, economics, peace, or literature.

What category? Unfortunately, there's no mathematics Nobel Prize. I realize that they consider applications, but I don't see many in physics, chemistry, medicine, economics, peace, or literature.

The Nobel Prize for literature of course.

Re:Nobel Prize Time (0)

Anonymous Coward | about 12 years ago | (#4023250)

how long until the algorithm gets implemented in something similar to primenet?

http://www.mersenne.org/primenet/

mmmmmh...massively parallel distributed O(n) prime number calculator...

The technical definition is kinda long and complex, but in essence it's like this. Given a problem of some size n, a polynomial time algorithm is guaranteed to give a solution in time proportional to a polynomial of n. If a polynomial-time algorithm exists that solves a problem, then the problem is said to be in polynomial time.

To give an example, say you've got a list of numbers and you want to know the sum. That can be done in linear time - ie, the time taken is proportional to the length of the list of numbers. The size of the problem (n) is defined by the length of the list and the time taken (T) is as follows: T = c1 * n + c0, where c1 and c0 are some fixed constants. The formula for T is a polynomial, and so the problem "LIST-SUM" is in polynomial time. It would still be in polynomial time if the formula for T was a polynomial with n^2, n^3, n^50 terms in it, or even terms like n^1.5 (because as n grows very large an n^1.5 term will always be smaller than an n^2 term).
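
The LIST-SUM example is easy to make concrete (a sketch; one addition per element means the operation count is c1*n + c0, with n the list length):

```python
def list_sum(nums):
    # One addition per element: the operation count is
    # proportional to len(nums), i.e. linear time.
    total = 0
    for x in nums:
        total += x
    return total
```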

Showing you an example of something outside polynomial time is a little more difficult, but some standard examples are SAT (the satisfiability problem) or the travelling-salesman problem, which you can read about in any book on the subject.

an NP-complete (NP=non-polynomial) problem is one that can be solved, but takes about 8*age_of_universe time to solve. To get around this, approximation algorithms are used, but these can never give a 100% guarantee of finding the correct solution, nor will they necessarily provide the same solution if executed on the same data twice.

a polynomial-time problem is one that can be solved within our lifetimes, guarantees 100% accuracy, and will always generate the same solution for the same data.

there's a LOT more to it. The book Intro to Algorithms has a good chapter on the topic of NP-completeness, which will explain the intricate and gory details.

ie, it can be completed by a nondeterministic machine in polynomial time. The main problem with NP algorithms is that there aren't any nondeterministic machines around. (A nondeterministic machine can attempt all paths to a conclusion at once, whereas a deterministic machine can only try one at a time.)

Re:What's Polynomial Time? (0)

Anonymous Coward | about 12 years ago | (#4023350)

NP stands for non-deterministic polynomial rather than non-polynomial.

Ignore the parent post, since it is wrong. The previous poster did a much better job of explaining the concept of polynomial time.

An NP-complete problem does not take 8 times the age of the universe to solve. This completely missed the point. Every P or NP problem can be expressed in terms of a variable "n", which represents the input size. There are many practical problems where the best-known P algorithm is slower than the best NP algorithm for typical values of n. However, computational theory tells us that as n increases, the P algorithm will eventually beat the NP one.
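
That crossover can be demonstrated directly. A sketch with made-up constants: a polynomial algorithm costing 10^6 * n^3 steps versus an exponential one costing 2^n steps:

```python
def crossover():
    # Find the first input size n at which the (heavily handicapped)
    # polynomial cost 10**6 * n**3 drops below the exponential cost 2**n.
    n = 1
    while 10**6 * n**3 >= 2**n:
        n += 1
    return n
```

Despite the million-to-one handicap, the polynomial algorithm wins for every n beyond a fairly modest threshold.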

Out of interest, will this finding have any impact on the effectiveness of present day cryptography?

Re:Cryptography (0)

Anonymous Coward | about 12 years ago | (#4023255)

No. As the abstract also describes, there exists a randomized algorithm for doing primality testing in polynomial time (and is probably much faster than this algorithm as of now). This is of great theoretical interest, but unless it leads to an algorithm for factoring, probably does not have any direct impact on crypto.

Out of interest, will this finding have any impact on the effectiveness of present day cryptography?

Probably not. While it is possible that this research could lead to results in speeding up factoring, a faster algorithm for determining whether a number is prime is not going to compromise the security of RSA.

Your RSA key pair is derived from 2 large primes. The way we generate keys is to randomly test large random numbers to see if any of them are prime. Ergo, we must already have an efficient formula for determining if a number is prime or not.

FYI, the most commonly used algorithm is Euler's formula. Euler's formula doesn't actually tell you if a number is prime, but it will usually give a non-zero output if the number is not prime, so if you run it enough times with different inputs, you can be 99.99999% sure that a number is prime. However, a small percentage of numbers are "pseudoprimes" -- numbers that are not prime but which will also satisfy Euler's formula. Therefore, after you discover a candidate prime, you should use a different (slower) formula to double-check.
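
As a sketch of how such a probabilistic check works, here is the simple Fermat test, a close relative of the Euler criterion the comment describes (the function name and round count are mine):

```python
import random

def probably_prime(n, rounds=20):
    # Fermat's little theorem: for prime n, a**(n-1) % n == 1 for every a
    # coprime to n. Each random base a that passes adds another "nine" of
    # confidence; composites (apart from rare pseudoprimes) fail quickly.
    if n < 4:
        return n in (2, 3)
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        if pow(a, n - 1, n) != 1:
            return False   # definitely composite
    return True            # probably prime
```

This is why, as the comment says, a passing result should be double-checked: some composites ("pseudoprimes", e.g. Carmichael numbers) fool this test for many bases.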

Since this is fairly common knowledge among geeks who use encryption, I'm somewhat surprised that so many people here jumped to the same conclusion you did.

-a

sigs (0)

Anonymous Coward | about 12 years ago | (#4023379)

I was about to mod you up... but then I read your sig. Anyone with a sig about moderation needs to be bitchslapped.

They could have easily taken over the infrastructure of a modernized computer-bent, encryption-shielded society such as the US or Japan. If that is indeed the case, these guys deserve a Nobel Peace Prize for giving this powerful tool to all and not using it as a weapon of war.

This does, however, sound unlikely. Any mathematicians out there care to comment?

They could have easily taken over the infrastructure of a modernized computer-bent, encryption-shielded society such as the US or Japan.

Primality testing and factorization are not one and the same. It is possible to know that a number is not prime without knowing its factors. Breaking encryption requires factoring the product of two huge primes (it is already known that the number you're trying to factor is NOT prime, so Primes being in P is more or less useless by itself for this particular application), and factorization has yet to be shown to be in P.

I am by no means a heavy-duty math cruncher or cypherpunk, but how exactly is this going to affect number theory and factoring? I don't know of any advanced prime number search algorithms, but the Sieve of Eratosthenes (did I get that right?) solves this in NP time. (Each number is checked to see if it is evenly divisible by an earlier prime, and if none is found, add it to the list of primes; lather, rinse, repeat.) If primes can be found in P time, finding the first 50 prime numbers would take the same time as finding the first 50 three-hundred-digit primes.

While that may not be thrilling at first, let's use the RSA contest for money as an example. We get a 200-digit decimal number, which is the product of exactly two prime numbers. We know then that:
1. We only need to find one prime to easily find the other.
2. The digits in the factors can total no more than 200 digits.
3. One of the factors contains less than 100 digits.

Start at 10^100 and count down using this algorithm, and you'll find it in P time instead of NP time. It'll still take forever, literally and figuratively, but wouldn't it take significantly less time than before?

Cracking any specific key-length is P, but cracking RSA in general remains NP, since that method requires checking a number of potential primes proportional to 2^(N/2) or so where N is the key size.

Pssst. Your point 3 is wrong. One of the factors contains less than or exactly 100 digits.

Now wouldn't you feel silly if they both were exactly 100-digit primes and you spent all that time looking at 99-digit primes and below? ;)

Re:Crypto repercusions? (1, Funny)

Anonymous Coward | about 12 years ago | (#4023383)

> I am by no means a heavy duty math cruncher or cypherpunk . . .
> Start at 10^100 and count down using this algorythm

You do realize that there are 999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999 numbers between 10^100 and 10^99, don't you?
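
Python's arbitrary-precision integers make the magnitude easy to check (the exact count depends on whether you include the endpoints, but the order of magnitude is the point):

```python
# Candidates visited when counting down from 10**100 to 10**99.
count = 10**100 - 10**99
assert count == 9 * 10**99
assert len(str(count)) == 100   # a 100-digit count of candidates
```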

Bah.... so what? I have developed an algorithm that will determine if any number less than a google is prime in O(1). Above a google it degrades pretty fast, though.

google v. [common] To search the Web using the Google search engine,
`www.google.com'. Google is highly esteemed among hackers for its
significance ranking system, which is so uncannily effective that many
users consider it to have rendered other search engines effectively
irrelevant. The name `google' has additional flavor for hackers because
most know that it was copied from a mathematical term for ten to the
hundredth power, famously first uttered by a mathematician's infant
child.

---------

googol
n : a cardinal number represented as 1 followed by 100 zeros
(ten raised to the power of a hundred)

They'll have had it for decades as soon as they have time to read through it and edit all the past documentation they have. :)

primes (0)

Anonymous Coward | about 12 years ago | (#4023266)

OK, so you can tell if something is a prime pretty quick. Does this help you factor large numbers pretty quick? It's been a while... or else what is the point...

no (0)

Anonymous Coward | about 12 years ago | (#4023286)

It doesn't let you factor large numbers. This paper is almost purely of academic interest.

No major implications for crypto as far as I see.. (0)

I haven't seen the paper yet (slashdotted, go fig), but I'm guessing this doesn't have a big bearing on crypto systems or factoring. As the poster wrote, the algorithm is not optimized, and I imagine that it's _very_ inefficient. While P is faster than NP (for the most part), P != Fast... this algorithm could be polynomial in the length of the input and still be exceedingly slow. Don't buy that quantum cryptography PCI card yet :-)

They've figured out how to determine if a given number is prime in P time. That, by itself, doesn't break modern public key cryptographic systems. In order to do that, you would need to be able to factor a given number into its two prime factors in P time (a different problem).

For those of you wondering about the implications for cryptography, this does not imply that composite numbers can be factored in polynomial time. This algorithm is simply a primality test -- that is, it tells you whether or not a number has any proper divisors (in polynomial time), but it doesn't tell you what these divisors actually are. Determining whether a number is prime has always been considerably easier than finding the prime factorization.

In fact, for schemes like RSA -- where the key is the product of two large primes -- we already know that the number is composite, by definition, so a more efficient primality test doesn't give us any new information.

Cheers, IT

Re:Factoring might still be NP (0)

Anonymous Coward | about 12 years ago | (#4023356)

You mean Factoring might still be *NP-Hard*. It is quite obviously in NP (a class that encompasses all polynomial time problems, and as such, not quite as interesting as the class of NP-hard or NP-complete problems).

What is 'x' and how is 'q' calculated? (4, Interesting)

From looking at the algo, I can't figure out what 'x' (or maybe it's a chi) is? Can someone help? I've looked it over, but couldn't find a definition of it. I'm also assuming that the 'if (r is prime)' line is a recursive call to itself? Also, how do we determine 'q' the 'largest prime factor of r-1' ? Another recursive call to get the factors? I must admit, I'm kind of lost by the algo, but it's still interesting.

It's funny when I read the comments, and I see all kinds of stuff that reminds me of my Discrete Structures class (we did the P and NP stuff at the end)...

Makes me wonder what this means for computer theory, but if you think about it, polynomial time can still be slow for very large n with very big powers... although not as bad as an exponential with large n's (assuming you go out far enough that the exponential will grow faster than the polynomial)

I'm dead tired and will look at the paper in the morning. But right now I have a problem with step 6: "Let q be the largest prime factor of r-1" Won't getting q boost the thing back into power n complexity?

I took a second look, and I'm pretty sure about it. To complete this algorithm for n, for every prime r < n, you will need to find the largest prime factor of r-1.

The while condition, r < n, is a red herring. The loop will exit when it hits the inner break. It's guaranteed to finish in O(log^6 n) iterations (or so they say -- I haven't verified the proof personally!).

There are 2 different problems:
1) Determining if a number is prime [is 909 prime?]
2) Determining the factors of a number [what are the factors of 909?]

This article claims to be able to solve problem 1 in Polynomial time.

However, problem 2 is MUCH harder, and that is the one which will break cryptography as we know it. This article does not claim to solve problem 2, so we're safe for now.

Re:Implications (0)

Anonymous Coward | about 12 years ago | (#4023359)

This is in fact incorrect. Back in the 80s a poly-time algorithm for factoring was given, assuming a poly-time primality test.

There have been comments as to the relation of this finding to the strength of modern prime-dependent crypto algorithms. Based on my understanding, won't this increase the strength of crypto, not decrease it?

Prime-based encryption schemes (RSA, etc.) are based on the complexity of factoring large numbers into primes, not the ability to determine if a number is prime or not.

Incidentally, many implementations make use of pseudo-primes rather than actual primes, as the ability to validate primeness is (or was) cumbersome. This should give implementations the ability to ensure key-pair values are actual primes, which would strengthen the resulting encryption.

helps RSA key generation - NOT factoring (1, Informative)

Anonymous Coward | about 12 years ago | (#4023332)

There seems to be a lot of confusion in the previous posts. This algorithm has nothing to do with factoring or breaking RSA encryption. It has everything to do with making it easier and more reliable to generate RSA keys.

Generating an RSA key pair requires choosing two very large numbers and making sure that they are both primes. This is very time consuming, and the best current algorithms only tell you that this number is a prime with 99.99...% probability (exact value depends on the number of iterations).

An efficient algorithm that was not probabilistic would be a very good thing.

We already have probabilistic algorithms that can tell whether a number is prime or not in polynomial time to any degree of certainty you wish.

All this would mean is that now instead of verifying that a number is prime with a (1-10^-10) level certainty in polynomial time, it could now be done with certainty, so there would be no revolution in cryptology, as some other posters suggest.

I wonder if this algorithm can be incorporated into the GIMPS software to locate Mersenne primes more quickly. Then again, they are probably using other tricks to speed up the search that are based on properties unique to a Mersenne prime. Does anyone have any solid input as to whether this might be useful to those folks? Because obviously I'm just speculating here.

The famous result by Miller 1976 (and independently rediscovered(?) by Rabin 1980) already did that. The only difference is that their algorithm was in RP (randomized polynomial). Namely, if the algorithm says it is prime it might be wrong (with probability half, say), and if it says that the number is not prime, then it is not prime for sure.

Now, if you have a number n, you run this algorithm, say, 20*log(n) times. If the algorithm says on all executions that it is prime, you know damn sure it is. If it says it isn't, you are sure it isn't. There is a ridiculously tiny probability that if the algorithm claims that it is prime in all executions, it is still not prime. This probability is so small that it can be essentially ignored. Now, random bits are cheap nowadays, so this is quite satisfactory. This is in fact the algorithm that turned the RSA cryptosystem into a practical and useful algorithm, because suddenly finding primes became easy.

To break RSA, and become really famous, one has to come up with a polynomial time algorithm for factoring. It might even be that RSA can be broken without factoring, but this is still an open question (I think).

Ahh, and BTW. Polynomial time means polynomial time in the size of the input. So if the number is n, the size of the input is O(log(n)), and the running time needs to be O( (log(n))^(O(1)) ).
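
In Python terms, a quick illustration of "size of the input":

```python
# "Polynomial time" means polynomial in the size of the input: the number
# of bits (or digits) needed to write n down, roughly log2(n) -- not n itself.
n = 10**100
assert len(str(n)) == 101     # decimal digits
assert n.bit_length() == 333  # bits, about log2(10**100) = 332.2
```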

Ok. End of boredom.

huh? (0)

Anonymous Coward | about 12 years ago | (#4023378)

This result, if true, is very interesting from a theory standpoint.

As far as practice, it's fairly irrelevant. Probabilistic primality testing can be done with a constant number of test rounds, with bounded error.

The Miller-Rabin test [mit.edu] will tell you if a number is prime with at most 1/4 probability of error. That sounds ridiculous, but the catch is that you can iterate it using a random parameter. Do the test twice and your probability drops to 1/16. Do it fifteen times and your chances of being wrong are about one billionth.

If you're truly paranoid, do it 50 times. That'll bring the error rate of the algorithm magnitudes below the error rate of your hardware.
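
A sketch of the iterated Miller-Rabin test described above (the function name and default round count are mine; each round uses an independent random base, so the bounds compound exactly as the comment says):

```python
import random

def miller_rabin(n, rounds=15):
    # One round errs with probability at most 1/4, so `rounds` independent
    # random bases push the error below 4**-rounds -- about one in a
    # billion for rounds=15, as noted above.
    if n < 4:
        return n in (2, 3)
    if n % 2 == 0:
        return False
    # Write n - 1 as d * 2**s with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False   # a is a witness: n is definitely composite
    return True            # probably prime
```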

...that in practice it doesn't really matter whether a problem is in P or NP when the P solution is 100000000000000000*n while the NP solution could be 0.00000000000001*(2^n) - the complexity class is a nice thing, but not everything.

## Re:Nobel Prize Time (0)

## Anonymous Coward | about 12 years ago | (#4023219)

Read more here:

http://mathworld.wolfram.com/FieldsMedal.h

## Re:Nobel Prize Time (2, Informative)

## RackinFrackin (152232) | about 12 years ago | (#4023229)

If I'm reading this correctly, we've got a nearly guaranteed winner of the Nobel Prize here.

This would be more in line for a Fields Medal than a Nobel Prize.

## Re:Nobel Prize Time (0)

## Anonymous Coward | about 12 years ago | (#4023233)

## Re:Nobel Prize Time (0)

## Anonymous Coward | about 12 years ago | (#4023239)

## Re:Nobel Prize Time (2)

## EvanED (569694) | about 12 years ago | (#4023248)

## Re:Nobel Prize Time (1)

## orz (88387) | about 12 years ago | (#4023279)

What category? Unfortunately, there's no mathematics Nobel Prize. I realize that they consider applications, but I don't see many in physics, chemistry, medicine, economics, peace, or literature.

The Nobel Prize for literature, of course.

## Re:Nobel Prize Time (0)

## Anonymous Coward | about 12 years ago | (#4023250)

http://www.mersenne.org/primenet/

mmmmmh...massively parallel distributed O(n) prime number calculator...

## Re:Nobel Prize Time (1)

## amnesty (69314) | about 12 years ago | (#4023251)

Justin

## for the sake of our eyes (5, Informative)

## fatmav (148996) | about 12 years ago | (#4023221)

http://www.cse.iitk.ac.in/primality.ps [iitk.ac.in]

## What's Polynomial Time? (2, Interesting)

## Stephen VanDahm (88206) | about 12 years ago | (#4023223)

Steve

## Re:What's Polynomial Time? (2)

## orz (88387) | about 12 years ago | (#4023247)

T < N ^ k + a

for some value (can be any finite value) of k and a.

## Oops. HTML (2)

## orz (88387) | about 12 years ago | (#4023259)

T < N ^ k + a

for some values (can be any finite value) of k and a.

Basically, it's a statement about how well an algorithm scales to REALLY large numbers.

## arg. I did it again (3, Informative)

## orz (88387) | about 12 years ago | (#4023269)

They're saying that the time T necessary to determine whether or not an N-digit number is prime satisfies this equation:

T is less than N ^ k + a

for some values (can be any finite value) of k and a.

Basically, it's a statement about how well an algorithm scales to REALLY large numbers.

## Re:arg. I did it again (1, Informative)

## Anonymous Coward | about 12 years ago | (#4023287)

## say again (0)

## Anonymous Coward | about 12 years ago | (#4023263)

## Re:What's Polynomial Time? (5, Informative)

## Goonie (8651) | about 12 years ago | (#4023307)

For a problem of size n, a polynomial time algorithm is guaranteed to give a solution in time proportional to a polynomial of n. If a polynomial-time algorithm exists that solves a problem, then the problem is said to be in polynomial time.

To give an example, say you've got a list of numbers and you want to know the sum. That can be done in linear time, ie, the time taken is proportional to the length of the list of numbers. The size of the problem (n) is defined by the length of the list and the time taken (T) is as follows: T = c1 * n + c0, where c1 and c0 are some fixed constants. The formula for T is a polynomial, and so the problem "LIST-SUM" is in polynomial time. It would still be in polynomial time if the formula for T was a polynomial with n^2, n^3, or n^50 terms in it, or even terms like n^1.5 (because as n grows very large an n^1.5 term will always be smaller than an n^2 term).

Showing you an example of something outside polynomial time is a little more difficult, but some standard examples are SAT (the satisfiability problem) or the travelling-salesman problem, which you can read about in any book on the subject.
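The LIST-SUM example above can be sketched in a few lines of Python (the name `list_sum` is just illustrative, not from the post):

```python
def list_sum(numbers):
    """Linear time: one pass over the list, constant work per element."""
    total = 0
    for x in numbers:  # loop body runs exactly len(numbers) times
        total += x
    return total

# T = c1 * n + c0: doubling the list length roughly doubles the work.
print(list_sum([3, 1, 4, 1, 5]))  # 14
```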

## Re:What's Polynomial Time? (2, Informative)

## jeffy124 (453342) | about 12 years ago | (#4023321)

an NP-complete (NP=non-polynomial) problem is one that can be solved, but takes about 8*age_of_universe time to solve. To get around this, approximation algorithms are used, but these can never give a 100% guarantee of finding the correct solution, nor are they guaranteed to give the same solution if run on the same data twice.

a polynomial-time problem is one that can be solved within our lifetimes, with 100% accuracy guaranteed, and always generates the same solution for the same data.

there's a LOT more to it. The book Intro to Algorithms has a good chapter on the topic of NP-completeness, which will explain the intricate and gory details.

## Re:What's Polynomial Time? (2, Informative)

## platipusrc (595850) | about 12 years ago | (#4023348)

NP actually stands for "nondeterministic polynomial", ie, it can be completed by a nondeterministic machine in polynomial time. The main problem with NP algorithms is that there aren't any nondeterministic machines around. (A nondeterministic machine can attempt all paths to try to reach a conclusion at once whereas a deterministic machine can only try one at a time.)

## Re:What's Polynomial Time? (0)

## Anonymous Coward | about 12 years ago | (#4023350)

## Ignore parent (2)

## God! Awful (181117) | about 12 years ago | (#4023355)

An NP-complete problem does not take 8 times the age of the universe to solve. This completely missed the point. Every P or NP problem can be expressed in terms of a variable "n", which represents the input size. There are many practical problems where the best-known P algorithm is slower than the best NP algorithm for typical values of n. However, computational theory tells us that as n increases, the P algorithm will eventually beat the NP one.

-a

## Re:What's Polynomial Time? (1)

## MrCreosote (34188) | about 12 years ago | (#4023339)

## Cryptography (1)

## jesterzog (189797) | about 12 years ago | (#4023225)

Out of interest, will this finding have any impact on the effectiveness of present day cryptography?

## Re:Cryptography (0)

## Anonymous Coward | about 12 years ago | (#4023255)

## Re:Cryptography (3, Informative)

## God! Awful (181117) | about 12 years ago | (#4023324)

Out of interest, will this finding have any impact on the effectiveness of present day cryptography?

Probably not. While it is possible that this research could lead to results in speeding up factoring, a faster algorithm for determining whether a number is prime is not going to compromise the security of RSA.

Your RSA key pair is derived from 2 large primes. The way we generate keys is to randomly test large random numbers to see if any of them are prime. Ergo, we must already have an efficient formula for determining if a number is prime or not.

FYI, the most commonly used algorithm is Euler's formula. Euler's formula doesn't actually tell you if a number is prime, but it will usually give a non-zero output if the number is not prime, so if you run it enough times with different inputs, you can be 99.99999% sure that a number is prime. However, a small percentage of numbers are "pseudoprimes" -- numbers that are not prime but which will also satisfy Euler's formula. Therefore, after you discover a candidate prime, you should use a different (slower) formula to double-check.

Since this is fairly common knowledge among geeks who use encryption, I'm somewhat surprised that so many people here jumped to the same conclusion you did.

-a
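A sketch of the repeat-until-confident idea described above, using the simple Fermat test as a stand-in (not necessarily what the poster means by "Euler's formula"): if n is prime, then a^(n-1) ≡ 1 (mod n) for every base a, so a composite usually betrays itself under a random base.

```python
import random

def fermat_probably_prime(n, rounds=20):
    """Repeat the Fermat check with random bases; any failing base
    proves n composite, while repeated passes make primality very likely."""
    if n < 4:
        return n in (2, 3)
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        if pow(a, n - 1, n) != 1:
            return False  # a is a witness: n is definitely composite
    return True  # probably prime (Carmichael numbers like 561 can fool this)

print(fermat_probably_prime(101))  # True: 101 is prime
print(fermat_probably_prime(100))  # False
```

The Carmichael caveat in the comment is exactly the post's warning about pseudoprimes: candidates that pass should be double-checked with a stronger test.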

## sigs (0)

## Anonymous Coward | about 12 years ago | (#4023379)

## If this is true... (1)

## Jon Howard (247978) | about 12 years ago | (#4023228)

They could have easily taken over the infrastructure of a modernized computer-bent, encryption-shielded society such as the US or Japan. If that is indeed the case, these guys deserve a Nobel Peace Prize for giving this powerful tool to all and not using it as a weapon of war.

This does, however, sound unlikely. Any mathematicians out there care to comment?

## Re:If this is true... (5, Informative)

## ipfwadm (12995) | about 12 years ago | (#4023311)

They could have easily taken over the infrastructure of a modernized computer-bent, encryption-shielded society such as the US or Japan.

Primality testing and factorization are not one and the same. It is possible to know that a number is not prime without knowing its factors. Breaking encryption requires factoring the product of two huge primes (it is already known that the number you're trying to factor is NOT prime, so PRIMES being in P is more or less useless by itself for this particular application), and factorization has yet to be shown to be in P.

## Crypto repercusions? (2, Interesting)

## Toodles (60042) | about 12 years ago | (#4023235)

While that may not be thrilling at first, let's use the RSA contest for money as an example. We get a 1024-bit number containing 200 digits in decimal form, which is the product of exactly two prime numbers. We know then that:

1. We only need to find one prime to easily find the other.

2. The digits in the factors can total no more than 200 digits.

3. One of the factors contains less than 100 digits.

Start at 10^100 and count down using this algorithm, and you'll find it in P time instead of NP time. It'll still take forever, literally and figuratively, but wouldn't it take significantly less time than before?

Toodles
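A toy illustration of point 1 above, with a small semiprime in place of a 200-digit one (naive trial division here is a stand-in for illustration, not the paper's algorithm, and is hopeless at real key sizes):

```python
def smallest_prime_factor(n):
    """Trial division: fine for toy numbers, infeasible at RSA sizes."""
    i = 2
    while i * i <= n:
        if n % i == 0:
            return i
        i += 1
    return n  # n itself is prime

n = 101 * 103  # a toy "RSA modulus": product of exactly two primes
p = smallest_prime_factor(n)
q = n // p     # finding one prime factor immediately gives the other
print(p, q)    # 101 103
```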

## Re:Crypto repercusions? (1)

## Eu4ria (110578) | about 12 years ago | (#4023277)

## I don't think so. (2)

## orz (88387) | about 12 years ago | (#4023309)

## Re:Crypto repercusions? (1)

## DavidTC (10147) | about 12 years ago | (#4023325)

3. One of the factors contains less than or exactly 100 digits.

Now wouldn't you feel silly if they both were exactly 100-digit primes and you spent all that time looking at 99-digit primes and below? ;)

## Re:Crypto repercusions? (1, Funny)

## Anonymous Coward | about 12 years ago | (#4023383)

.

.

.

>Start at 10^100 and count down using this algorythm

You do realize that there are 9999999999999999999999999999999999999999999999999

## Oh yeah? Well... (0, Offtopic)

## God! Awful (181117) | about 12 years ago | (#4023243)

-a

## Re:Oh yeah? Well... (3, Funny)

## braindead (33893) | about 12 years ago | (#4023336)

I have developed an algorithm that will determine if any number less than a google is prime in O(1). Above a google it degrades pretty fast, though.

Aha... is that based on a precomputed sieve with 10^100 elements, by any chance? ;-)

(it's googol, by the way. Google.com is obviously not prime, based on the fact that they haven't IPO'd yet)

## Re:Oh yeah? Well... (0)

## Anonymous Coward | about 12 years ago | (#4023349)

## Re:Oh yeah? Well... (2)

## Kwikymart (90332) | about 12 years ago | (#4023357)

`www.google.com'. Google is highly esteemed among hackers for its significance ranking system, which is so uncannily effective that many users consider it to have rendered other search engines effectively irrelevant. The name `google' has additional flavor for hackers because most know that it was copied from a mathematical term for ten to the hundredth power, famously first uttered by a mathematician's infant child.

---------

googol

n : a cardinal number represented as 1 followed by 100 zeros (ten raised to the power of a hundred)

There is a HUGE difference between the two.

## Primes Are In P?? (0, Offtopic)

## dbretton (242493) | about 12 years ago | (#4023249)

Inquiring minds want to know!

## If this turns out to be true... (2, Interesting)

## angramainyu (139131) | about 12 years ago | (#4023253)

## Re:If this turns out to be true... (1)

## jasonditz (597385) | about 12 years ago | (#4023385)

## primes (0)

## Anonymous Coward | about 12 years ago | (#4023266)

## no (0)

## Anonymous Coward | about 12 years ago | (#4023286)

## No major implications for crypto as far as I see.. (0)

## bbuda (168824) | about 12 years ago | (#4023270)

## Cryptography (1)

## Erpo (237853) | about 12 years ago | (#4023283)

## Factoring might still be NP (5, Informative)

## IntelliTubbie (29947) | about 12 years ago | (#4023293)

Knowing that a number is composite doesn't tell you what these divisors actually are. Determining whether a number is prime has always been considerably easier than finding the prime factorization.

In fact, for schemes like RSA -- where the key is the product of two large primes -- we already know that the number is composite, by definition, so a more efficient primality test doesn't give us any new information.

Cheers,

IT

## Re:Factoring might still be NP (0)

## Anonymous Coward | about 12 years ago | (#4023356)

## What is 'x' and how is 'q' calculated? (4, Interesting)

## Mr. Sketch (111112) | about 12 years ago | (#4023297)

## Funny (2, Informative)

## Alizarin Erythrosin (457981) | about 12 years ago | (#4023298)

Makes me wonder what this means for computer theory, but if you think about it, polynomial time can still be slow for very large n with very big powers... although not as bad as an exponential with large n (assuming you go out far enough that the exponential grows faster than the polynomial).

Kudos to the team that discovered this
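A quick numeric check of that growth claim: an exponential like 2^n starts out far below a polynomial like n^10, but eventually overtakes it for good.

```python
def crossover(power=10, base=2):
    """Smallest n >= 2 at which base**n exceeds n**power."""
    n = 2
    while base ** n <= n ** power:
        n += 1
    return n

# 2^n is tiny compared to n^10 at first (2^10 = 1024 vs 10^10 at n = 10),
# yet from the crossover point on, the exponential wins forever.
print(crossover())  # 59
```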

## I'm dead tired (3, Interesting)

## Henry V .009 (518000) | about 12 years ago | (#4023315)

"Let q be the largest prime factor of r-1"

Won't getting q boost the thing back into power n complexity?

## Re:I'm dead tired (2)

## Henry V .009 (518000) | about 12 years ago | (#4023354)

## Re:I'm dead tired (1)

## Henry V .009 (518000) | about 12 years ago | (#4023372)

Thank you, lameness filter. I think. I am tired and could have missed it.

## Re:I'm dead tired (1)

## epictetus (5123) | about 12 years ago | (#4023373)

## Re:I'm dead tired (1)

## Henry V .009 (518000) | about 12 years ago | (#4023389)

## Re:I'm dead tired (1)

## epictetus (5123) | about 12 years ago | (#4023362)

## If you're confused by "P" and "NP".... (5, Informative)

## Erpo (237853) | about 12 years ago | (#4023318)

## Implications (5, Informative)

## davemarmaros (598966) | about 12 years ago | (#4023322)

1) Determining if a number is prime [is 909 prime?]

2) Determining the factors of a number [what are the factors of 909?]

This article claims to be able to solve problem 1 in Polynomial time.

However, problem 2 is MUCH harder, and that is the one which will break cryptography as we know it. This article does not claim to solve problem 2, so we're safe for now.
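The distinction shows up even with the post's own example of 909: a primality test returns only yes/no, while factoring recovers the divisors. Both sketches below use naive trial division, which is fine for a three-digit number but not how either problem is attacked at scale.

```python
def is_prime(n):
    """Problem 1: a yes/no answer only."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def factorize(n):
    """Problem 2: the full list of prime factors -- strictly more information."""
    factors, i = [], 2
    while i * i <= n:
        while n % i == 0:
            factors.append(i)
            n //= i
        i += 1
    if n > 1:
        factors.append(n)
    return factors

print(is_prime(909))   # False
print(factorize(909))  # [3, 3, 101]
```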

## Re:Implications (0)

## Anonymous Coward | about 12 years ago | (#4023359)

## Impact on Crypto is Postive (2, Insightful)

## Majestik (101669) | about 12 years ago | (#4023330)

Prime-based encryption schemes (RSA, etc.) are based on the complexity of factoring large numbers into primes, not the ability to determine if a number is prime or not.

Incidentally, many implementations make use of pseudo-primes rather than actual primes, as the ability to validate primeness is (or was) cumbersome. This should give implementations the ability to ensure key pair values are actual primes, which would strengthen the resulting encryption.

## helps RSA key generation - NOT factoring (1, Informative)

## Anonymous Coward | about 12 years ago | (#4023332)

Generating an RSA key pair requires choosing two very large numbers and making sure that they are both primes. This is very time consuming, and the best current algorithms only tell you that a number is prime with 99.99...% probability (the exact value depends on the number of iterations).

An efficient algorithm that was not probabilistic would be a very good thing.

## Nothing revolutionary (3, Interesting)

## CmdrSam (136754) | about 12 years ago | (#4023340)

All this would mean is that now instead of verifying that a number is prime with a (1-10^-10) level certainty in polynomial time, it could now be done with certainty, so there would be no revolution in cryptology, as some other posters suggest.

--Sam L-L

## gimps (1)

## sc00p18 (536811) | about 12 years ago | (#4023352)

## We already knew that... (5, Informative)

## FalafelXXX (598968) | about 12 years ago | (#4023367)

Now, if you have a number n, you run this algorithm, say, 20*log(n) times. If the algorithm says on every execution that it is prime, you know damn sure it is. If it says it isn't, you are sure it isn't. There is a ridiculously tiny probability that the algorithm claims n is prime in all executions when it is still not prime. This probability is so small that it can be essentially ignored. Now, random bits are cheap nowadays, so this is quite satisfactory. This is in fact the algorithm that turned the RSA crypto system into a practical and useful algorithm, because suddenly finding primes became easy.

To break RSA, and become really famous, one has to come up with a polynomial time algorithm for factoring. It might even be that RSA can be broken without factoring, but this is still an open question (I think).

Ahh, and BTW: polynomial time means polynomial time in the size of the input. So if the number is n, the size of the input is O(log(n)), and the running time needs to be O((log(n))^O(1)).

Ok. End of boredom.

## huh? (0)

## Anonymous Coward | about 12 years ago | (#4023378)

## Half empty? (1)

## legomad (596194) | about 12 years ago | (#4023381)

## Primality testing has never been hard (5, Informative)

## sasami (158671) | about 12 years ago | (#4023384)

As far as practice, it's fairly irrelevant. Probabilistic primality testing can be done in constant time with bounded error.

The Miller-Rabin test [mit.edu] will tell you if a number is prime with at most 1/4 probability of error. That sounds ridiculous, but the catch is that you can iterate it using a random parameter. Do the test twice and your probability drops to 1/16. Do it fifteen times and your chances of being wrong are about one billionth.

If you're truly paranoid, do it 50 times. That'll bring the error rate of the algorithm magnitudes below the error rate of your hardware.

---

Dum de dum.
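A minimal sketch of the Miller-Rabin test described above, in its standard textbook form (not taken from the linked notes):

```python
import random

def miller_rabin(n, rounds=15):
    """Each round wrongly passes a composite with probability at most 1/4,
    so 15 independent rounds bound the error by 4**-15, about 1e-9."""
    if n < 4:
        return n in (2, 3)
    if n % 2 == 0:
        return False
    # write n - 1 = d * 2**s with d odd
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)  # the random parameter
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False  # a witnesses that n is definitely composite
    return True  # prime with overwhelming probability

print(miller_rabin(97))  # True: primes pass every round
```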

## Don't forget...! (1)

## Libor Vanek (248963) | about 12 years ago | (#4023396)