
D-Wave Large-Scale Quantum Chip Validated, Says USC Team

timothy posted about a year ago | from the or-they-don't dept.

Hardware 141

An anonymous reader writes "A team of scientists says it has verified that quantum effects are indeed at work in the D-Wave processor, the first commercial quantum optimization computer processor. The team demonstrated that the D-Wave processor behaves in a manner that indicates that quantum mechanics has a functional role in the way it works. The demonstration involved a small subset of the chip's 128 qubits, but in other words, the device appears to be operating as a quantum processor."


141 comments


intel dead? (-1)

Anonymous Coward | about a year ago | (#44138513)

siiiiiiiiiiii

Re:intel dead? (0)

Anonymous Coward | about a year ago | (#44139383)

I'm pretty sure that Intel CPUs still kick the shit out of this thing performance-wise. Quantum computing may one day become the thing, but for now it's just a novelty.

Re:intel dead? (2, Informative)

Shavano (2541114) | about a year ago | (#44139503)

It won't become the thing for general computing use. There are specific applications where quantum operations can compute faster, but if it's a matter of what computers are normally used for, standard digital computing hardware is the thing.

That said, quantum processor cores may become an accessory you can buy for your computer, complete with the software needed to set up quantum optimization problems, and high end scientific workstations might have them built in some day.

I'm waiting (-1)

Anonymous Coward | about a year ago | (#44138519)

people to die and things to explode due to "quantum" fail computing... they just won't admit they're approaching it with the wrong interpretations...

It Still Doesn't Mean Much... (3, Interesting)

tibit (1762298) | about a year ago | (#44138567)

Yeah, quantum effects are directly noticeable in the way it operates. Yeah, yeah, whatever. The whole deal isn't about that. It's about whether those quantum effects are actually useful for something. Like, um, making it usefully faster than classical computers. I would be very happy even if they had shown "just" polynomial running time improvements, say executing an O(N^3) algorithm in O(N^2) time. Even that would be a big deal. Somehow, I'm very skeptical that anything of the sort will ever be shown for this particular architecture. I would so like to be wrong on that.
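A quick back-of-the-envelope sketch of what even that "modest" polynomial improvement would buy; the step counts are hypothetical and constants are ignored entirely:

    # Hypothetical step counts for an O(N^3) vs. an O(N^2) algorithm.
    # Purely illustrative; ignores constant factors.
    for n in (10, 100, 1000, 10_000):
        print(f"N={n:>6}: N^3 = {n**3:.1e} steps, N^2 = {n**2:.1e} steps, ratio = {n}x")
    # At N = 10,000 the O(N^2) machine does the same job in 10,000x fewer steps.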

Re:It Still Doesn't Mean Much... (1)

jamesh (87723) | about a year ago | (#44138623)

Yeah, quantum effects are directly noticeable in the way it operates. Yeah, yeah, whatever. The whole deal isn't about that. It's about whether those quantum effects are actually useful for something. Like, um, making it usefully faster than classical computers. I would be very happy even if they had shown "just" polynomial running time improvements, say executing an O(N^3) algorithm in O(N^2) time. Even that would be a big deal. Somehow, I'm very skeptical that anything of the sort will ever be shown for this particular architecture. I would so like to be wrong on that.

That's where I'm at right now too. I wonder if the future will see my point of view the same way it sees those who said people could never travel fast on a steam train because the air would be sucked out of the cabin...

Re:It Still Doesn't Mean Much... (1)

Anonymous Coward | about a year ago | (#44138631)

The D Wave computer has demonstrated it's ability to solve optimization problems incredibly fast, and that is incredibly useful for a lot of companies and scientists.

'incredibly' (-1)

Anonymous Coward | about a year ago | (#44138711)

You keep using that word, I do not think it means what you think it means.

incredibly /inˈkredəblē/
Adverb

        Used to introduce a statement that is hard to believe; strangely: "incredibly, he was still alive".

Synonyms
unbelievably

Re:'incredibly' (2, Informative)

Anonymous Coward | about a year ago | (#44138833)

Fixed that for you, you left out the first/primary definition as shown below...

incredibly
Adverb

        1. To a great degree; extremely: "incredibly brave".
        2. Used to introduce a statement that is hard to believe; strangely: "incredibly, he was still alive".

Synonyms
unbelievably

Re:'incredibly' (0)

Anonymous Coward | about a year ago | (#44139059)

do you know what the word credible means... yeah english evolves... or devolves. whatever

Re: 'incredibly' (0)

Anonymous Coward | about a year ago | (#44139183)

yeah well the results aren't great either so i guess you fail twice

Re:It Still Doesn't Mean Much... (3, Insightful)

Shavano (2541114) | about a year ago | (#44139529)

No it hasn't.

Re:It Still Doesn't Mean Much... (0)

Anonymous Coward | about a year ago | (#44140045)

Can you demonstrate your ability to master the apostrophe?

Re:It Still Doesn't Mean Much... (2)

amaurea (2900163) | about a year ago | (#44140775)

Really? I thought it was 12,000 times slower [archduke.org] than a normal computer when solving the one problem it does best, while costing roughly that many times more than said normal computer. That isn't exactly "incredibly fast" or "incredibly useful", is it? Scientists aren't too happy about it either, because the science, if it exists, is not being published.

Re:It Still Doesn't Mean Much... (1)

gl4ss (559668) | about a year ago | (#44140897)

The D Wave computer has demonstrated it's ability to solve optimization problems incredibly fast, and that is incredibly useful for a lot of companies and scientists.

no it hasn't, even this report just says it's potentially possible for it to solve them faster than a traditional cpu.

the article is a little light on details, but it just says that it uses some quantum effect in some way. you know what that means? it means that technically it's not a _total_ fraud (bang for buck it's still a fraud though).

Re:It Still Doesn't Mean Much... (1, Funny)

Anonymous Coward | about a year ago | (#44138661)

Scientists prove Intel silicon chip conducts electricity!

A team of scientists says it has verified that electrical effects are indeed at work in the Intel processor. The team demonstrated that the Intel processor behaves in a manner that indicates that electrons have a functional role in the way it works. The demonstration involved a small subset of the chip's silicon traces, but in other words, the device appears to be operating as a circuit.

Scientists prove abacus beads are mobile!

A team of scientists says it has verified that bead sliding is indeed at work in the abacus. The team demonstrated that the abacus behaves in a manner that indicates that beads moving back and forth have a functional role in the way it works. The demonstration involved a small subset of the abacus's wooden beads, but in other words, the device appears to be operating as an arithmetic aid.

Re:It Still Doesn't Mean Much... (1)

Anonymous Coward | about a year ago | (#44138749)

I always get confused by this O notation. But why would a quantum computer reduce the O notation? If you don't change the algorithm, the processor would still take n or n^2 or n^3, no?

From what I understood, a quantum computer would just make the time each "n" takes quite small. But I never thought it would make an O(n^3) algorithm run in O(n^2) time at all.

Re:It Still Doesn't Mean Much... (3, Informative)

tibit (1762298) | about a year ago | (#44139155)

You can't fight an exponential or even polynomial complexity merely by reducing constant factors. It doesn't matter what the constant factor is. All it takes is bumping, say, RSA from 4096 to 16384 bits. That's all you need to beat any conceivable reduction in the constant factor. Just think about it.
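A rough numeric sketch of that argument, with made-up machine speeds (only the ratio matters), showing why a constant-factor win loses to exponential growth:

    # A million-fold constant-factor speedup vs. exponential problem growth.
    slow_ops_per_sec = 1e9      # ordinary machine (hypothetical rate)
    fast_ops_per_sec = 1e15     # a million times faster, same algorithm
    for bits in (64, 80, 96):
        work = 2.0 ** bits      # exponential cost in the problem size
        print(f"{bits} bits: slow = {work / slow_ops_per_sec:.1e} s, "
              f"fast = {work / fast_ops_per_sec:.1e} s")
    # Adding just 20 bits multiplies the work by ~10^6, which cancels the
    # entire million-fold constant-factor advantage.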

Re:It Still Doesn't Mean Much... (4, Informative)

firewrought (36952) | about a year ago | (#44139205)

Why would a quantum computer reduce the O notation?

Because it's running in multiple worlds simultaneously? It's not just using 1's and 0's but superpositions of the two that are effectively in both states at once. Heh... I really don't understand this stuff, but the big deal about quantum computing is that it will make some previously intractable (e.g., non-polynomial) problems accessible to us. All problems in complexity class BQP [wikipedia.org] become, essentially, polynomial on a quantum computer. If you've got enough qubits, among other things.

Re:It Still Doesn't Mean Much... (1)

Anonymous Coward | about a year ago | (#44139517)

A quantum computer can solve public key encryption in O(1) while it takes classical computers O(N^3). Which is the difference between minutes and billions of years.

Re:It Still Doesn't Mean Much... (3, Informative)

Anonymous Coward | about a year ago | (#44139855)

Pedantic nitpick: quantum computers cannot break public-key (RSA) encryption in O(1) time; for a modulus N the time complexity is O(log(N)^3).
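To see how gently O(log(N)^3) grows, here is a toy relative-cost table; the unit cost is fictional, only the scaling is the point:

    # Shor's algorithm scales roughly with the cube of the modulus bit length.
    base = 1024 ** 3    # arbitrary unit: cost of a 1024-bit modulus
    for bits in (1024, 2048, 4096, 16384):
        print(f"{bits:>5}-bit modulus: ~{bits**3 / base:.0f}x the 1024-bit cost")
    # Going from 1024 to 4096 bits costs ~64x more work: polynomial scaling,
    # versus the super-polynomial blowup a classical factoring attack faces.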

Re:It Still Doesn't Mean Much... (1)

Anonymous Coward | about a year ago | (#44138945)

One small step dude. Maybe one day it will lead to a standard quantum computer. But like searching for life outside our planet, going to the moon was pretty damn cool and so is this.

Re:It Still Doesn't Mean Much... (2)

cavreader (1903280) | about a year ago | (#44139355)

Exactly! I am glad just to know that someone is actually working on projects like this. It's not just another generation of current CPU technology it is something new and in time they will either master the technology or abandon the technology if things don't work out. But either way it is just nice to know there are people skilled and dedicated enough to attempt these advances.

Re:It Still Doesn't Mean Much... (1)

Behrooz Amoozad (2831361) | about a year ago | (#44140229)

You just made the only analogy that could make me think this chip is made by J.J. Abrams.

Re:It Still Doesn't Mean Much... (0)

Anonymous Coward | about a year ago | (#44138959)

I would be very happy even if they had shown "just" polynomial running time improvements, say executing an O(N^3) algorithm in O(N^2) time.

This is, sadly, a very common misconception even among people who otherwise understand big-O notation very well. In the real world, there is no such thing as inferring the big-O behavior of an algorithm based on measurements. Big-O is a notation that concerns only what happens for infinitely large input (to be precise, arbitrarily large input). Whatever input you've got in the real world, it's not infinite. Your input is also unlikely to be worst-case and big-O is about worst case behavior unless you specify otherwise, though this is a lesser concern as it is at least possible to construct worst-case input for a given algorithm. In contrast, it's impossible to get your input to be infinite and then run a finite experiment. Big-O is relevant to the real world in certain ways, yes, but not in the way that you think. No empirical observation will allow you to determine the asymptotic complexity of any implementation of any algorithm.

Re:It Still Doesn't Mean Much... (1)

tibit (1762298) | about a year ago | (#44139163)

No empirical observation will allow you to determine the asymptotic complexity of any implementation of any algorithm.

Said someone who never tried such empirical observations. You're silly.

Re:It Still Doesn't Mean Much... (1)

ardor (673957) | about a year ago | (#44139247)

He is actually right. By definition, this cannot give you a priori data. And that is what big O is - a priori information.
In practice, with measurements, you can take a pretty good guess as to what the complexity is, but you can never know if it is actually correct. This is a common mistake that people make, and can easily cause them to draw incorrect conclusions.
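That "pretty good guess" usually means fitting a slope on a log-log plot of measured runtimes; a minimal sketch, with the caveat baked in that the fit only describes the range you actually measured:

    # If t(n) ~ c * n^k, then log t = log c + k * log n, so the growth
    # exponent k is the slope on a log-log plot of timings.
    import math, time

    def work(n):            # stand-in workload, roughly quadratic
        s = 0
        for i in range(n):
            for j in range(n):
                s += i ^ j
        return s

    sizes, times = [200, 400, 800], []
    for n in sizes:
        t0 = time.perf_counter()
        work(n)
        times.append(time.perf_counter() - t0)

    # Crude two-point slope estimate; expect ~2 here, but only for this range.
    k = math.log(times[-1] / times[0]) / math.log(sizes[-1] / sizes[0])
    print(f"estimated exponent ~ {k:.2f}")   # a guess, never a proof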

Re: It Still Doesn't Mean Much... (0)

Anonymous Coward | about a year ago | (#44139265)

Wow you have no fucking clue. Big-O is about time complexity for an input size; it's upper-bound (not tight upper bound) in the worst case, but this doesn't mean infinite. It means as the input size (n) increases, the number of operations will increase, proportionally, less than or equal to the function given for O (for some minimum input size). It has nothing to do with infinite size or whatever. (It's also not a tight bound so any O(n**2) is for instance also O(n**3) but that's not particularly useful.)

Re: It Still Doesn't Mean Much... (0)

Anonymous Coward | about a year ago | (#44139659)

Wow you have no fucking clue. Big-O is about time complexity for an input size; it's upper-bound (not tight upper bound) in the worst case, but this doesn't mean infinite. It means as the input size (n) increases, the number of operations will increase, proportionally, less than or equal to the function given for O (for some minimum input size). It has nothing to do with infinite size or whatever. (It's also not a tight bound so any O(n**2) is for instance also O(n**3) but that's not particularly useful.)

You got it all wrong. Big-O is indeed about the tight upper bound, and the complexity of the input size. And as the number of operations increase, you bet your ass that it will be particularly useful. Oh you bet your ass.

Re: It Still Doesn't Mean Much... (1)

SnowZero (92219) | about a year ago | (#44140635)

You got it all wrong. Big-O is indeed about the tight upper bound, and the complexity of the input size. And as the number of operations increase, you bet your ass that it will be particularly useful. Oh you bet your ass.

GP is being an ass, and doesn't seem to understand what "asymptotic complexity" means. However, you are incorrect about big-O, which does not need to be a tight bound. You're thinking of big-theta. Wikipedia has a concise summary:
    https://en.wikipedia.org/wiki/Big_theta#Family_of_Bachmann.E2.80.93Landau_notations [wikipedia.org]
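For reference, the definitions being argued over, written out:

    f(n) in O(g(n))      iff  there exist c > 0 and n_0 such that f(n) <= c*g(n) for all n >= n_0
    f(n) in Omega(g(n))  iff  there exist c > 0 and n_0 such that f(n) >= c*g(n) for all n >= n_0
    f(n) in Theta(g(n))  iff  f(n) in O(g(n)) and f(n) in Omega(g(n))

So big-O alone is only an upper bound (n^2 is in O(n^3)); the tight, two-sided bound is big-theta.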

Re:It Still Doesn't Mean Much... (1)

FatdogHaiku (978357) | about a year ago | (#44139053)

They're still working on that small issue of knowing only "where exactly the data is" or "what the data is"... on the plus side all you need do to flip a bit is observe it (but beware of infinite recursion). Personally I'm looking forward to the "Schrodinger Class" of processor... in spite of the strict No Refunds policy for open boxes.

Re:It Still Doesn't Mean Much... (1)

Hentes (2461350) | about a year ago | (#44139881)

Theoretically, it should be able to find the minimum of a set of numbers in O(N^0.5) instead of O(N). This is faster than a CPU, but likely slower than an equivalently priced GPU cluster.
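Rough query counts behind that O(N^0.5) claim (it comes from Grover-style search; the numbers below are illustrative, with constants omitted):

    # Classical minimum-finding inspects all N items; a Grover-style quantum
    # search needs on the order of sqrt(N) queries.
    import math
    for N in (10**6, 10**9, 10**12):
        print(f"N = {N:.0e}: classical ~{N:.0e} queries, quantum ~{math.sqrt(N):.0e}")
    # A GPU cluster can still win on price because each classical query is far
    # cheaper and massively parallel -- the parent's price/performance point.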

Re:It Still Doesn't Mean Much... (1)

gweihir (88907) | about a year ago | (#44140107)

Indeed. That they have not strongly indicates that they cannot, because this thing is not actually useful.

The question is (2, Interesting)

Anonymous Coward | about a year ago | (#44138571)

Can it help crack today's cryptosystems, in what way, and how fast?

If it can, then someone is already doing it and we need to act.

Re:The question is (1)

Anonymous Coward | about a year ago | (#44138649)

No. It cannot. It can't do anything even close to as well as conventional semiconductors.

The point is that this might one day have the potential to be more than electrical circuits, but for now it's really just an interesting research project.

Re:The question is (1)

Bengie (1121981) | about a year ago | (#44139525)

Boeing and NASA are using D-Wave computers to crunch very specific types of data almost 10,000 times faster. A little more than just "research".

Re:The question is (1)

gweihir (88907) | about a year ago | (#44140125)

10,000x is in the range of what specialized chips give you over general-purpose computers. You could get it a bit cheaper with classical chips, but nobody is doing that, as it is still not worthwhile.

Re:The question is (4, Informative)

MightyYar (622222) | about a year ago | (#44138707)

Wrong kind of quantum computer. This does quantum annealing [wikipedia.org] .

Re:The question is (4, Interesting)

WaywardGeek (1480513) | about a year ago | (#44138743)

Not too surprisingly, when a large US military contractor became a major purchaser of D-Wave equipment, all the company claims about being able to factor large integers vanished. D-Wave was going to have a blog series on it. I looked at its architecture carefully, and yes, if the D-Wave machine has low enough noise, then a 512-qubit D-Wave machine should be able to factor integers close to 500 bits long. The next bigger machine could tackle 2,000-bit integers. The machine seems almost perfectly suited to this problem. The trick is dealing with noise. No one at D-Wave claims that their machine is perfectly coherent all the time during the annealing process. If 1 of the 512 qubits suddenly drops out of quantum coherence, it will still act like a normal simulated-annealing element until it re-enters coherence. Is noise like that enough to throw off detection of that one minimum solution? I don't know. I do feel that quantum effects will have a positive impact up to some temperature, after which it will just act like a normal annealing machine. I think there will be a phase change at some temperature where, instead of qubits occasionally dropping out of coherence and just adding some noise to the solution, so many will be out of coherence that the chip can't function at a chip-wide quantum level, and there will be no chance of finding that minimum-energy solution.

Re:The question is (3, Interesting)

WaywardGeek (1480513) | about a year ago | (#44138869)

I just went googling for my old posts about how to do integer factorization with D-Wave. Guess what? GONE! I thought I'd posted it in enough hard-to-scrub places... Anyway, all this machine does is minimize an energy equation. I found somebody who had integer factorization coded as an energy equation as a sum of squared terms, but the D-Wave machine does that naturally, and you don't need to square anything. I've got a lot going on at work, my mother is being sued, and I'm doing some genetics stuff. Do I really need to go back and recreate the integer-factoring equation?
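The general shape of that encoding is easy to sketch. This is a toy reconstruction under the sum-of-squares framing mentioned above, not the lost post or D-Wave's actual formulation, and it uses a classical random search where the real machine would anneal:

    # Toy "factoring as energy minimization": E(p, q) = (N - p*q)^2 is zero
    # exactly when p*q = N, so the global minimum is a factorization. A real
    # D-Wave embedding would encode p and q in binary qubits and reduce the
    # polynomial to pairwise couplings; this just shows the energy landscape.
    import random

    def factor_by_energy(N, trials=200_000):
        best = (float("inf"), None)
        for _ in range(trials):
            p = random.randrange(2, int(N ** 0.5) + 1)
            q = N // p
            e = (N - p * q) ** 2          # energy: squared residual
            best = min(best, (e, (p, q)))
            if e == 0:                    # ground state reached: p*q == N
                break
        return best

    print(factor_by_energy(143))          # -> (0, (11, 13)) at the ground state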

Re:The question is (5, Funny)

BradleyUffner (103496) | about a year ago | (#44139027)

I just went googling for my old posts about how to do integer factorization with D-Wave. Guess what? GONE!

That's what you get for observing them.

Re:The question is (0)

nextekcarl (1402899) | about a year ago | (#44139983)

You can either know the exact equation or its exact location on the internet, but the uncertainty principle clearly says you can't know both at the same time. We obviously know which he chose now.

Re:The question is (1)

Anonymous Coward | about a year ago | (#44139039)

Yes. Obviously you do. And you need to post it everywhere again. Duh.

Re:The question is (1)

WaywardGeek (1480513) | about a year ago | (#44139877)

Google this: dwave integer factorization New Scientist

Do you see all the "New Scientist" links near the top? Click on one of them. Of course you don't see it. These links started to fade to obscurity while I was writing this short response. If you do find one, you'll find the link goes nowhere.

Re:The question is (1)

WaywardGeek (1480513) | about a year ago | (#44139885)

By the way, the title of the New Scientist article should be "Controversial quantum computer beats factoring record"

Re:The question is (1)

WaywardGeek (1480513) | about a year ago | (#44139951)

Aw crud... it only factored 143. I factored 300+ bit numbers with custom algorithms in Python, which only sounds impressive until you find out what others have done. Still... why are links to integer factorization by the D-Wave machine being removed from Google results?

Re:The question is (0)

Anonymous Coward | about a year ago | (#44140093)

Again, you should re-write and publish these algorithms. Submit them to the EFF next time.

We'd all love to see them.

Re:The question is (0)

Anonymous Coward | about a year ago | (#44140141)

Because Google has turned to crap?

Sometime in the past 2 or 3 years its search quality began diminishing significantly. And that applies to both the computer science stuff I research and the legal stuff.

I would try Bing, but I suspect the better response is to stop relying on search engines so much, or "the cloud" in general, and make sure I have my own house in order regarding data management and preservation.

Re:The question is (0)

Anonymous Coward | about a year ago | (#44140421)

But but but! They said the military is 30 years ahead of current technology!
Fucking lying ass hippies!

Re:The question is (1)

gweihir (88907) | about a year ago | (#44140113)

No. So far everything points to this device not actually being able to do anything useful faster than classical computers.

Re:The question is (0)

Anonymous Coward | about a year ago | (#44140245)

Seems already solved pretty well, see this:

https://github.com/exaexa/codecrypt [github.com]

Imagine a computer with this Quantum processor (0)

Spy Handler (822350) | about a year ago | (#44138589)

and a Quantum Fireball hard drive... mind boggles

Re:Imagine a computer with this Quantum processor (1)

binarylarry (1338699) | about a year ago | (#44139599)

Quantum CPU + Quantum Fireball HD = HADOUKEN!

Was anyone really surprised by this? (1)

Anonymous Coward | about a year ago | (#44138593)

I think everyone with even remotely entry-level knowledge of the topic pretty much knew this.

It was doing things that no classical computer its size could do in any reasonable time.
Those benchmarks not too far back especially proved this fact.

I guess it's good that it's now 100% confirmed so the morons can shut the hell up about it.
Looking forward to seeing what their new 512-qubit system can do (other than make encryption useless within a human lifetime).

Re:Was anyone really surprised by this? (0)

Anonymous Coward | about a year ago | (#44138639)

yes i am playing doom on it right now, it's awful fast
the pixels are everywhere

Re:Was anyone really surprised by this? (1)

Empiric (675968) | about a year ago | (#44138681)

(other than make encryption useless within a human lifetime)

Not sure about that. Though qubits are great for prime factorization (the one-way function on which mainstream cryptography relies, and which breaks if it is no longer one-way in practical terms), I'm not sure it would help for, say, one-time pads or chained-XOR encryption methods (which, notably, though trivially simple to implement, IIRC immediately disqualified an encryption system from being legally exportable). I think in those cases the quantum algorithm ends up finding not the actual message out of all the possible messages the data could represent, but all the possible messages the message could have been.

I'll now await correction from an actual specialist in the field...

Re:Was anyone really surprised by this? (4, Informative)

timeOday (582209) | about a year ago | (#44138771)

Well, I can tell you that no amount of computation will help against a one-time pad. That would be essentially the same as decrypting an empty sheet of paper. There is no information in either half of an OTP pair; only in the difference between the halves.
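The standard way to see this: for any intercepted ciphertext, there is a key that "decrypts" it to any message of the same length, so the ciphertext alone carries no information about which message was sent. A minimal sketch:

    # One-time pad: c = m XOR k. Given only c, every equal-length plaintext
    # is possible, because k' = c XOR m' "decrypts" c to m'.
    import os

    message = b"ATTACK AT DAWN"
    key     = os.urandom(len(message))
    cipher  = bytes(m ^ k for m, k in zip(message, key))

    # An attacker holding only `cipher` can "derive" any plaintext at all:
    fake     = b"RETREAT AT TEA"
    fake_key = bytes(c ^ f for c, f in zip(cipher, fake))
    assert bytes(c ^ k for c, k in zip(cipher, fake_key)) == fake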

Re:Was anyone really surprised by this? (0)

Anonymous Coward | about a year ago | (#44140183)

There are known quantum algorithms to crack RSA, DSA, and similar. In other words, there are known quantum solutions to prime factorization (RSA) and discrete logarithms (DSA).

However, cryptographers know plenty of other public-key algorithms for which there are no known quantum solutions that can dent the computational complexity. I'm unsure if any of those algorithms are provably impervious, though. I'm not a mathematician, just a security engineer. But rest assured that if ubiquitous quantum computers appeared overnight, there are practical alternatives to RSA that could be used, at least in the short-term.

As for secret-key algorithms such as block ciphers, quantum computers don't help much. There's no mathematical structure there that has a quantum short-cut.

Re:Was anyone really surprised by this? (1)

MightyYar (622222) | about a year ago | (#44138687)

Why would a quantum annealer help break encryption? Isn't that a different field of quantum problem (factoring)?

Re:Was anyone really surprised by this? (1)

gweihir (88907) | about a year ago | (#44140143)

Indeed it is. A quantum annealer is not a very useful thing, and it is not really faster than classical computers optimized for this.

Re:Was anyone really surprised by this? (1)

Anonymous Coward | about a year ago | (#44138723)

You mean the benchmarks where a classical computer was faster? And okay, it's 'quantum'. Shor's algorithm doesn't run on a quantum annealer... the marketing department of the company that sells them is less optimistic than you.

What are you, a quantum fanboy?

Re:Was anyone really surprised by this? (0)

Anonymous Coward | about a year ago | (#44138889)

No, I mean the 439-qubit benchmark just recently that absolutely destroyed classical computers.
Mere seconds, compared to over half an hour.

You are way behind, Son.

Re:Was anyone really surprised by this? (4, Interesting)

the gnat (153162) | about a year ago | (#44139081)

No, I mean the 439-qubit benchmark just recently that absolutely destroyed classical computers. Mere seconds, compared to over half an hour.

That was a terrible benchmark. They measured performance against possibly the most inefficient algorithm possible (using a third-party implementation) - not even remotely doing the same type of computations. That was where the "3600-fold" improvement came from. Some other computer scientists spent a bit of time optimizing an algorithm (also annealing, I think) for conventional computers in response, with the eventual result that their implementation was faster than the D-Wave. Which makes the entire effort sound like $10 million to avoid writing better software in the first place.

It vaguely reminds me of all of the GPU benchmarks I've seen where single-precision floating-point performance on the GPU is compared to double-precision performance on the CPU. Except orders of magnitude worse.

Re:Was anyone really surprised by this? (1)

Chuckstar (799005) | about a year ago | (#44139281)

But it doesn't matter what the times were for one specific run of the calculation. The question is how the two algorithms scale.

I saw a blog somewhere where the guy claimed the improved classical algorithm scales at the same rate as the quantum annealing algorithm, meaning no gain for D-Wave. But there wasn't any kind of proof in that post, just a claim.

Re:Was anyone really surprised by this? (0)

Anonymous Coward | about a year ago | (#44139831)

There are a few issues here.

First, it is impossible to prove asymptotic time complexity of an algorithm (or special purpose machine) by simply running it a bunch of times, and there is nothing (at least from what I have seen) to suggest that DWave has published enough information to properly (that is, mathematically) prove that their machine works at a better complexity class than what is achievable by algorithms written specifically for those same problems on a classical computer.

Second, if you insist on relying on the ugly kludge of trying to prove better asymptotic run time by simply using benchmarks, you're still in rather a bad situation, since DWave has not published any benchmarks that are significantly better than what can be achieved by a *competently chosen* algorithm on a classical computer. And since they have only given one of their machines to the university (USC) which has been basically carrying the torch for them in academia for quite some time, no one else -- particularly none of the academics who have been vocally critical of their claims -- is able to directly or indirectly use such a machine to make (or unmake, as might still be the outcome) their case.

Finally, the reality is simply that DWave was eminently happy to claim that the supposed 3600x performance increase proved that their machine was doing Something Special; now that it has been established to basically anyone who doesn't have his head up his very own ass that the claim was wildly overblown, they are trying to fall back on claims that on some unspecified larger problem they will perform better. Given that the proper comparison (meaning the one that chose its algorithm wisely) showed virtually no difference between a ~$10,000,000 DWave machine and a ~$2000 PC, one cannot help but wonder what sort of problem size would actually be required to see a cost benefit from the DWave and if said problem size is feasible (given other issues like storage and memory constraints) and useful.

I do hope that DWave's efforts bear some fruit; it would be disappointing if it turns out to be without any practical use, and even more disappointing if it turns out to have been a hoax (particularly given the rather impressive credentials of some involved with the company). However, this entire situation seems to underscore one of the major benefits of doing research -- particularly basic research -- in academia with publications all along the way. If DWave had been publishing anywhere near as much information as is expected from their counterparts working on quantum computing in academia, there would be far less reason at this point to consider that they are going down a dead-end road, or that they are mistaken about their machine's properties, or that (worst of all possibilities) they are deliberately misrepresenting the inner workings of their machine.

Re:Was anyone really surprised by this? (1)

Icegryphon (715550) | about a year ago | (#44138729)

I have video of the Vesuvius chip that Google and NASA are working with for AI.

http://www.youtube.com/watch?v=_Wlsd9mljiU [youtube.com]

Re:Was anyone really surprised by this? (0)

Anonymous Coward | about a year ago | (#44139771)

"I have video of the Vesuvius Chip that Google and Nasa are working with for AI. "

Didn't Vesuvius erupt and kill everybody?

Re:Was anyone really surprised by this? (1)

gweihir (88907) | about a year ago | (#44140135)

Sorry, but absolutely nothing has been confirmed and the moron is you. If you want to see "quantum effects at work", have a look at any LED. This does not mean anything and the wording is carefully designed to obscure that fact.

Great Scott! (5, Funny)

Anonymous Coward | about a year ago | (#44138595)

Great... now the NSA can record everything we do *and* everything we don't do in all possible parallel universes... Welp, the analog world was nice while it lasted I guess.

-- stoops

Re:Great Scott! (1)

Anonymous Coward | about a year ago | (#44138651)

1. This isn't a general quantum computer. It's a "quantum annealer".
2. Not all classical encryption is necessarily vulnerable to quantum computing.

Re:Great Scott! (2)

gweihir (88907) | about a year ago | (#44140155)

For block ciphers, the effective key length is halved. For example, AES-256 remains completely secure even against a working general quantum computer. For AES-128, a quantum computer would need a lot more than 128 qubits, and it would still effectively have to break 64 bits. But constant factors do matter, and there is reason to believe general quantum computers (if they ever work) will not be able to do many steps per second.

RSA is a bit different; it could be in trouble. But there is always dlog crypto, and AFAIK quantum computers do not help against that.
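The arithmetic behind that halving, assuming a Grover-style key search and a deliberately generous made-up machine speed:

    # Grover search takes on the order of 2^(k/2) iterations for a k-bit key,
    # which is why quantum attacks "halve" a cipher's effective key bits.
    rate = 1e12                   # hypothetical quantum ops/second (generous)
    seconds_per_year = 3.15e7
    for k in (128, 256):
        years = 2.0 ** (k / 2) / rate / seconds_per_year
        print(f"AES-{k}: ~2^{k // 2} iterations ~ {years:.1e} years at {rate:.0e} op/s")
    # AES-128's residual 64-bit security falls if quantum ops ever get that fast
    # and cheap; AES-256's residual 128 bits stays out of reach regardless.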

Re:Great Scott! (0)

Anonymous Coward | about a year ago | (#44140299)

Just FYI...

- The effect of Grover search (the reason for "doubling security bits") with its O(n^0.5) search time is not that dramatic at all -- for brute-force attacks you still need a decision tool that decides whether the experimentally deciphered plaintext is the text you're looking for, AND you will certainly meet more than one valid (say, well-formatted and correct English) plaintext. So it doesn't really break it; you still need some good information about what the plaintext should contain.

- All number-theoretic problems of this kind are reduced to polynomial time by Shor's algorithm, which has been extended to solve dlog as well and works on any group, regardless of whether it is made of numbers or elliptic-curve points. This breaks original RSA, DSA, ElGamal, Diffie-Hellman, and EC-based stuff like ECRSA/ECDSA.

But hey, there already are quantum-intractable practical cryptosystems :) see
http://pqcrypto.org/ [pqcrypto.org]
https://github.com/exaexa/codecrypt [github.com]
etc...

This could be huge (1)

cold fjord (826450) | about a year ago | (#44138637)

If this really works, it could be huge. One of the interesting things about quantum computing is that there has been a fair amount of algorithm development done for quantum computers even though they are barely out of the concept stage.

A bit dated but nice general background article on quantum computers:
The Quantum Computer [rice.edu]

Re:This could be huge (1)

Anonymous Coward | about a year ago | (#44138865)

This is not a "Quantum Computer" in that sense of the word.

The device easily finds "rest states" of qubits. It's basically a specialty ASIC that performs a few steps of a few different algorithms very well. (Imagine taking blobs of clay, shaping them into any shapes you like, and dropping them on the ground, or through a sheet/material with holes in it BEFORE they hit the ground.)

That's basically all the thing does: impose states onto atoms (correct me if I'm wrong, I believe the qubits are molecules in D-Wave's approach), then let them settle close to attractors based on their energy / spin / electrical shape.

If you can find a use for the thing, it does it well. If you can't, join the crowd.
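The "shape it and let it settle" picture is essentially annealing; a minimal classical sketch of that settling process, for intuition about what the hardware is claimed to do natively:

    # Minimal simulated annealing on an Ising-style energy E(s) = sum(h_i * s_i):
    # flip spins, always accept downhill moves, sometimes accept uphill while hot.
    import math, random

    def anneal(h, steps=20_000, t_start=5.0, t_end=0.01):
        spins = [random.choice((-1, 1)) for _ in h]
        for step in range(steps):
            t = t_start * (t_end / t_start) ** (step / steps)  # cooling schedule
            i = random.randrange(len(h))
            de = -2 * h[i] * spins[i]        # energy change from flipping spin i
            if de <= 0 or random.random() < math.exp(-de / t):
                spins[i] = -spins[i]
        return spins, sum(hi * si for hi, si in zip(h, spins))

    # Ground state has spins[i] = -sign(h[i]); the cooled system settles there.
    print(anneal([0.5, -1.0, 2.0, -0.3]))    # -> ([-1, 1, -1, 1], -3.8)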

Re:This could be huge (1)

cold fjord (826450) | about a year ago | (#44139187)

Thank you for that clarification.

Re:This could be huge (2)

the gnat (153162) | about a year ago | (#44139151)

there has been a fair amount of algorithm development done for quantum computers even though they are barely out of the concept stage

As the AC above me notes, most of those algorithms won't run on this particular computer. Building a more general-purpose quantum computer is vastly more difficult - this is not even remotely my field of expertise, but from what I've read it has something to do with error-correction. D-Wave is essentially taking a huge shortcut to end up with a vastly less powerful (but probably still unique) technology. It's possible that this will turn out to have been a wise course; the best-case scenario is that their system is successful enough within its limited domain to promote more aggressive development of a more conventional machine - either by an expanded D-Wave or someone else with deep enough pockets.

Re:This could be huge (1)

cold fjord (826450) | about a year ago | (#44139199)

I see. That is helpful. Thank you for that clarification.

Re:This could be huge (1)

gweihir (88907) | about a year ago | (#44140161)

That is the basic ingredient of any good scam: "if this really works, it could be huge". Then use enough obfuscation that even "experts" are confused, and you can sell the most pathetic things at a high price.

2 Standard Questions to Evaluate any tech (1)

flappinbooger (574405) | about a year ago | (#44138673)

1) Can I run Linux on it?

2) Can I mine bitcoin with it?

Re:2 Standard Questions to Evaluate any tech (0)

Anonymous Coward | about a year ago | (#44138807)

3) Imagine a Beowulf cluster of those bad boys!

Re:2 Standard Questions to Evaluate any tech (4, Funny)

guruevi (827432) | about a year ago | (#44138849)

1) Yes
2) No
--next calculation--
1) No
2) Yes

Re:2 Standard Questions to Evaluate any tech (1)

wonkey_monkey (2592601) | about a year ago | (#44138883)

If 1) then 2). Inelegant choice of questions.

Re:2 Standard Questions to Evaluate any tech (0)

Anonymous Coward | about a year ago | (#44138963)

1) Nope. It requires a co-processor to run even trivial code. I guess you could use the D-Wave to find the... umm... lowest... yeah, I got nothing here.
2) Nope*.

*= Though in theory you could SOMEWHAT use qubits to follow the SHA curve, it wouldn't really be any better than a really tiny ASIC. I'm not an engineer, so this number is pulled from my posterior, but I expect once ASICs get down to the 5-7nm range doing SHA, you're getting close to the limits of classical "solve the lowest energy state". In THEORY (and with MILLIONS of qubits) you could skip right to this phase, for SOME algorithms. (Though the people that have D-Wave's 128-256 qubit units have yet to even MODEL simple algorithms on the things... so...)

Re:2 Standard Questions to Evaluate any tech (0)

Anonymous Coward | about a year ago | (#44139005)

Can it run Crysis?

Re:2 Standard Questions to Evaluate any tech (0)

Anonymous Coward | about a year ago | (#44139491)

If they really do have quantum computers that can do NP-hard things in reasonable amounts of time, this could kill Bitcoin along with all sorts of other encryption.

What about maintenance? (5, Funny)

jennatalia (2684459) | about a year ago | (#44138675)

Are we going to need quantum mechanics to work on these chips and computers?

Re:What about maintenance? (0)

Anonymous Coward | about a year ago | (#44140373)

For compiler design you will. Or for assembly as well. Once someone writes a C compiler for it I'll be interested.

Re:What about maintenance? (0)

Anonymous Coward | about a year ago | (#44140379)

Or I should say

For compiler design you will. Or for assembly as well. Once someone writes a C compiler for it you won't even need to know the name "Albert Einstein".


"appears to be" (0)

Anonymous Coward | about a year ago | (#44138803)

device appears to be operating as a quantum processor

But are you sure? In a quantum world, can you really be sure of anything?

Actually... (2)

Black Parrot (19622) | about a year ago | (#44138821)

the device appears to be operating as a quantum processor

Maybe it both is and isn't, until you have a look at it.

Re:Actually... (1)

VortexCortex (1117377) | about a year ago | (#44139203)

the device appears to be operating as a quantum processor

Maybe it both is and isn't, until you have a look at it.

Or, we exist in the universe where it appears to be operating as a quantum processor, and in another universe right next door it doesn't, and instead you're making this joke about Sigurdur Thordarson existing as a superposition of WikiLeaks employee and FBI informant.

Quantum Annealing (1)

Anonymous Coward | about a year ago | (#44139063)

I think this is the same group I read about in Scott Aaronson's blog post last month: D-Wave: Truth finally starts to emerge [scottaaronson.com] . There is indirect evidence that the D-Wave machine is actually doing quantum annealing rather than classical annealing, which is a great accomplishment, but quantum computing is still a long way from being practical. And the D-Wave machine is no faster than classical simulated annealing running on a much cheaper normal computer.

quantum mechanics has a functional role... (1)

alienzed (732782) | about a year ago | (#44139253)

Is there anything in the universe in which quantum mechanics does not have a functional role?

I don't get it. (2)

xyourfacekillerx (939258) | about a year ago | (#44139297)

I don't want to pay $32 USD for the paper. Am I the only one who can't figure out what they proved and how? The paper's abstract doesn't help much to balance the media's interpretation.

Re:I don't get it. (4, Informative)

the gnat (153162) | about a year ago | (#44139595)

I am pretty sure that this 7-month-old arXiv preprint [arxiv.org] corresponds to the Nature Communications [nature.com] paper. The titles and author lists are identical, but the abstract deviates, so who knows what changes it went through in revision (I don't have access to the official paper either, even at the university where I work). But presumably it covers the same ground, and it looks like all of the figures from the official paper are in the preprint.

(Yo, fuck Nature Publishing Group.)

SOMEONE EXPLAIN THIS (0)

Anonymous Coward | about a year ago | (#44139687)

Someone explain what this thing is, what it does, how it works, etc. in a way that a five year old could understand.

TIA

We've been waiting! (1)

poofmeisterp (650750) | about a year ago | (#44139711)

And it does nothing. And everything. It defines what you want it to do; technically it's already done it.

I'll pay $903,845,908,435 for one!

I'll bet (0)

Anonymous Coward | about a year ago | (#44139811)

If they ship Windows 8 on it, no one will ever use it.
