
Record-Low Error Rate For Qubit Processor

samzenpus posted more than 3 years ago | from the top-of-the-class dept.

Hardware 66

An anonymous reader writes "Thanks to advances in experimental design, physicists at the National Institute of Standards and Technology have achieved a record-low probability of error in quantum information processing with a single quantum bit (qubit) — the first published error rate small enough to meet theoretical requirements for building viable quantum computers. 'One error per 10,000 logic operations is a commonly agreed upon target for a low enough error rate to use error correction protocols in a quantum computer,' said Kenton Brown, who led the project as a NIST postdoctoral researcher. 'It is generally accepted that if error rates are above that, you will introduce more errors in your correction operations than you are able to correct. We've been able to show that we have good enough control over our single-qubit operations that our probability of error is 1 per 50,000 logic operations.'"
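For scale, the two rates in the summary compare directly: with an independent per-gate error probability p, a run of N gates succeeds with probability (1 - p)^N. A quick illustrative sketch (the two rates are from the summary; the independence assumption is mine):

```python
def success_probability(p, n_gates):
    """Chance that n_gates consecutive gates all succeed, given an
    independent per-gate error probability p."""
    return (1 - p) ** n_gates

threshold = 1 / 10_000  # the commonly cited fault-tolerance target
achieved = 1 / 50_000   # the error rate reported by the NIST group

print(success_probability(threshold, 10_000))  # ~0.37
print(success_probability(achieved, 10_000))   # ~0.82
```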


Progress! (1)

iluvcapra (782887) | more than 3 years ago | (#37269058)

Another few orders of magnitude and they might approach vacuum tube-levels of reliability.

Re:Progress! (2)

NoNonAlphaCharsHere (2201864) | more than 3 years ago | (#37269098)

One error per bit per 50,000 logic operations should be accurate enough for non-technical people.

Re:Progress! (1)

flargleblarg (685368) | more than 3 years ago | (#37269328)

One error in 640,000 ought to be enough for anyone.

Re:Progress! (0)

Anonymous Coward | more than 3 years ago | (#37272314)

One error in 640,000 ought to be enough for anyone.

655,360.
As much as people might mock the idea of the kibibyte, the fact is that kilo=1000, kilobyte=1024 is completely nonsensical (and causes silly mistakes as we just saw).

Re:Progress! (1)

onefineline (1981168) | more than 3 years ago | (#37359838)

Considering it was a joke, I don't think it was that big of a deal that there was a "mistake." In fact, people might not have gotten the joke if the author had written it the way you did, since it would have been kind of ruined. =/

Re:Progress! (1)

davester666 (731373) | more than 3 years ago | (#37275910)

19 times out of 20, they'll get 1 error in 50,000....

Re:Progress! (2, Informative)

Anonymous Coward | more than 3 years ago | (#37269356)

Most early computing errors were caused by memory (not RAM, as early technologies weren't random access). The shift from mercury delay lines to magnetic cores saw a several-orders-of-magnitude drop in error rates, and a corresponding increase in the viability of general-purpose computing.

Re:Progress! (1)

jellomizer (103300) | more than 3 years ago | (#37274182)

I think a lot of us have forgotten how bad old computers were with hardware errors, and how much the environment could affect them. My old Amstrad CPC1512 used to crash programs or show odd input on the screen whenever I turned on the fan.

Re:Progress! (1)

msheekhah (903443) | more than 3 years ago | (#37270936)

Better than Pentium II math?

Re:Progress! (2)

Nefarious Wheel (628136) | more than 3 years ago | (#37271948)

And of course you remember the joke -- why did they call it the Pentium instead of the 586? They added 100 to 486 and got 585.939434521165242233345, which wouldn't fit on the package.

More Errors (0)

Anonymous Coward | more than 3 years ago | (#37269092)

I always made a lot more errors than that when I played Qubit.

Re:More Errors (1)

MobileTatsu-NJG (946591) | more than 3 years ago | (#37270296)

You'll make fewer errors if you only jump on the disc when Coily is right behind you.

Pffff! (0, Troll)

Ecuador (740021) | more than 3 years ago | (#37269106)

One error per 10,000? They have some serious catching up to do, Intel had managed just 1 error in 9 billion over 15 years ago! And the later Pentiums were probably even better than that!

Re:Pffff! (4, Insightful)

EraserMouseMan (847479) | more than 3 years ago | (#37269214)

"They have some catching up to do"

Yea, that's the whole point of their efforts.

Re:Pffff! (1)

JoshuaZ (1134087) | more than 3 years ago | (#37269346)

You don't need such low error correction rates though. The key is that if the error rates are low enough you can then use clever error correcting mechanisms. But if the error rate from stray particles and other issues causing you to repeatedly lose quantum entanglement is too high then you can't use clever algorithms to deal with these problems. But if you have an error rate below about 1 in every 10,000 operations then you can use the good stuff. Note that by the entire nature of quantum computers even if we have practical ones it is unlikely that they will have an error rate in the ballpark of 1 in a billion. It would be nice if we could get that rate but it seems to be unlikely.
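The threshold idea has a simple classical analogue: a three-copy repetition code with majority vote turns a physical error rate p into a logical rate of roughly 3p², which is an improvement only when p is small enough. A toy simulation (purely illustrative; real quantum codes must also tolerate noisy correction circuitry, which is where the 1-in-10,000 figure comes from):

```python
import random

def noisy(bit, p):
    """Return the bit, flipped with probability p."""
    return bit ^ (random.random() < p)

def logical_error_rate(p, trials=100_000):
    """Empirical logical error rate of a 3-copy repetition code
    decoded by majority vote."""
    errors = 0
    for _ in range(trials):
        copies = [noisy(0, p) for _ in range(3)]
        if sum(copies) >= 2:          # majority vote decodes wrongly
            errors += 1
    return errors / trials

p = 0.01
print(p, "->", logical_error_rate(p))  # roughly 3*p**2 = 3e-4, well below p
```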

Re:Pffff! (0)

Anonymous Coward | more than 3 years ago | (#37270448)

Talk about redundancy. The structure of what you said was basically: If A then we can do B. But if not A then we can't do B. But if A we can do B. It is unlikely to have D. It's nice to have D but it is unlikely.

Re:Pffff! (0)

Anonymous Coward | more than 3 years ago | (#37272282)

Well, at least it indicates that he read and understood the content of the slashdot article summary, and can explain it in his own words. It is not for nothing that slashdotters maintain they are smarter than the average idiot.

Re:Pffff! (0)

KiloByte (825081) | more than 3 years ago | (#37269354)

Er, what? 1 error in 9 billion bits is around ten errors per second. A computer with such an error rate is worthless unless you build your algorithms specifically to handle that. Only if you prove the errors are completely random you can put three chips next to each other and vote.

Re:Pffff! (1)

Anonymous Coward | more than 3 years ago | (#37269410)

Only if you prove the errors are completely random you can put three chips next to each other and vote.

That's not how error detection and correction is performed.
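For instance, classical memory protection typically uses parity-based codes (SECDED ECC is a Hamming variant) rather than triplicate-and-vote. A minimal Hamming(7,4) sketch, which corrects any single flipped bit:

```python
def hamming_encode(d):
    """Encode 4 data bits [d1,d2,d3,d4] into a 7-bit Hamming(7,4) codeword."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming_decode(c):
    """Locate and fix a single flipped bit, then return the 4 data bits."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3   # 1-based position of the bad bit, 0 if none
    if syndrome:
        c = c[:]
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]

word = hamming_encode([1, 0, 1, 1])
word[4] ^= 1                      # inject a single-bit error
print(hamming_decode(word))       # [1, 0, 1, 1] -- error corrected
```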

Re:Pffff! (0)

Ecuador (740021) | more than 3 years ago | (#37269362)

I hate to have to explain jokes, but maybe some slashdotters are too young to remember the Pentium FDIV bug?

Re:Pffff! (1)

modmans2ndcoming (929661) | more than 3 years ago | (#37269460)

12.00000000001

Re:Pffff! (5, Insightful)

Anonymous Coward | more than 3 years ago | (#37269476)

Sometimes the reason why no one laughs is not because they didn't get the joke.

Re:Pffff! (1)

martas (1439879) | more than 3 years ago | (#37275270)

I thought it was funny... When I got the joke, that is.

Uncertainty (4, Funny)

Knave75 (894961) | more than 3 years ago | (#37269114)

The problem is that once you know what the error is, you don't know where the error is.

I mean, once you know where the error is, you don't know what the error is.

I mean, err... I'm not sure.

Re:Uncertainty (1)

NoNonAlphaCharsHere (2201864) | more than 3 years ago | (#37269146)

You can only observe the error by making it.

Re:Uncertainty (1)

msauve (701917) | more than 3 years ago | (#37269228)

I thought I made an error once, but I was mistaken.

Re:Uncertainty (1)

AngryDeuce (2205124) | more than 3 years ago | (#37269466)

Are you sure you were looking in the right place?

Re:Uncertainty (1)

stevelinton (4044) | more than 3 years ago | (#37269402)

In fact you correct the error by observing it.

Re:Uncertainty (0)

Anonymous Coward | more than 3 years ago | (#37269490)

To paraphrase Alan Kay: The best way of observing the error is to invent it.

Re:Uncertainty (1)

russryan (981552) | more than 3 years ago | (#37269874)

It means the cat is dead.

Re:Uncertainty (0)

Anonymous Coward | more than 3 years ago | (#37272080)

It means the cat is dead.

You just made a cat!

Re:Uncertainty (0)

Anonymous Coward | more than 3 years ago | (#37274704)

Dos Equis Spokesperson:

Quantum mechanics revert to HIS singularity, ensuring HE is always correct

Stay thirsty my friends..

Re:Uncertainty (1)

tom17 (659054) | more than 3 years ago | (#37275242)

I have to try that some time. Is it actually as good as the ad makes it sound?

Is it a pils? I hate pils, I like Helles.

Re:Uncertainty (1)

tom17 (659054) | more than 3 years ago | (#37275256)

Preferably Augustiner. Oh how I miss my Augie :(

Quantum computers already on the market (0)

Anonymous Coward | more than 3 years ago | (#37269184)

If this "breakthrough" only just now made quantum computers practical, then how are quantum computers already commercially available [physorg.com] ?

Re:Quantum computers already on the market (2)

JoshuaZ (1134087) | more than 3 years ago | (#37269290)

It isn't at all clear that D-Wave's system is using any sort of quantum entanglement at all. D-Wave has had a long history of massive hype. See e.g. http://www.scottaaronson.com/blog/?p=431 [scottaaronson.com] . It isn't at all clear that D-Wave's commercial system can do any of the things that we expect a quantum computer to do, like, say, factor integers using Shor's algorithm http://en.wikipedia.org/wiki/Shor's_algorithm [wikipedia.org] . It seems that D-Wave has made a fast computer, but there's very little evidence that it is actually using quantum processes any more than a normal computer does. You could call your laptop a quantum computer because quantum mechanics determines how its transistors function, and you might be close to what D-Wave is claiming. The key is whether there are entangled qubits that we can get information from, and D-Wave has shown little indication of that. They have had a handful of research papers that sort of point in that direction, but it is very hard to separate the hype from what they've actually done.

Still a long way away (4, Informative)

JoshuaZ (1134087) | more than 3 years ago | (#37269250)

An important thing to recognize is that most of this experiment was done with a single qubit. Practical quantum computing will need this sort of error rate across thousands of qubits. The good news is that the methodology they used looks very promising. They used microwave beams rather than lasers to manipulate the ions. I think this has been suggested before, but this may be the first successful use of it. As TFA discusses, this drastically reduces the error rate as well as the rate of stray ions.

We are starting to move towards the point where quantum computers may be practical. But we're still a long way off. In the first few years of the last decade a few different groups successfully factored 15 as 3*5 using a quantum computer. (15 is the smallest number which is non-trivial to factor using a quantum computer, since the fast factoring algorithm for quantum computers - Shor's algorithm - requires an odd composite number that is not a perfect power. It is easy to factor a perfect kth power by looking instead at the kth root. And factoring an even number is easily reduced to factoring an odd number. So 15 is the smallest interesting case where the quantum aspects of the process matter.) Those systems used a classical NMR system http://en.wikipedia.org/wiki/Nuclear_magnetic_resonance_(NMR)_quantum_computing [wikipedia.org] which has since been seen as too limited. There are now a lot of different ideas for other approaches that will scale better, but so far they haven't been that successful.
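The division of labor in Shor's algorithm is worth seeing concretely: the quantum computer's only job is finding the multiplicative order r of a random a mod N; everything else is classical. A sketch with the order found by brute force, which is exactly the step a quantum machine does exponentially faster (function names are mine):

```python
from math import gcd

def order(a, n):
    """Multiplicative order of a mod n -- the step Shor's algorithm
    speeds up via the quantum Fourier transform."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical_part(n, a):
    """Classical pre/post-processing around the order-finding step."""
    r = order(a, n)
    if r % 2:                      # need an even order; retry with new a
        return None
    y = pow(a, r // 2, n)
    if y == n - 1:                 # trivial square root of 1; retry
        return None
    return sorted((gcd(y - 1, n), gcd(y + 1, n)))

print(shor_classical_part(15, 7))  # [3, 5]
```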

One important thing to realize is that quantum computers will not magically solve everything. They can do a few things quite quickly, such as factoring large numbers. But they can't, for example, solve NP-complete problems to the best of our knowledge, and it is widely believed that NP-complete problems cannot be solved in polynomial time on a quantum computer. That is, it is believed that BQP is a proper subset of NP. Unfortunately, right now we can't even show that BQP is a subset of NP, let alone that it is a proper subset. Factoring big integers is useful mainly to a small number of number theorists and a large number of other people who want to break cryptography. There are a few other cryptographic systems that can also be broken more easily by a quantum computer, but there's not that much else. However, that is changing, and people are getting a better and better understanding of what can be done with quantum computers. A lot of the work has involved clever uses of quantum computers to quickly calculate quantities related to Fourier series. Moreover, once we get even the most marginally useful quantum computers, there will be a lot more incentive to figure out what sorts of practical things can be done with them.

So the upshot is that these are still a long way off, but they are coming. The way things looked in the late 1990s and early 2000s, it was reasonable to think the technical difficulties would never let them become practical. They are still a long way from practical, but right now it doesn't look like there are any fundamental physical barriers, and it looks like in the long run the problems that do exist will be solved.

Re:Still a long way away (0)

Anonymous Coward | more than 3 years ago | (#37269522)

Not really contradicting anything you said, but I thought Grover's algorithm [wikimedia.org] allowed for a useful speed-up for NP-complete problems, even though it did not make them polynomial.
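Right - Grover's algorithm needs about (π/4)√N oracle queries versus roughly N/2 expected classically, so brute-forcing an NP-complete problem over 2^n candidates drops from ~2^n to ~2^(n/2): a genuine speed-up, but still exponential. A quick sketch of the standard formulas (not from the parent post):

```python
import math

def grover_queries(n_items):
    """Approximate oracle queries for Grover search: (pi/4) * sqrt(N)."""
    return math.ceil(math.pi / 4 * math.sqrt(n_items))

def classical_queries(n_items):
    """Expected queries for unstructured classical search: N / 2."""
    return n_items // 2

# Quadratic, not exponential, speed-up:
print(classical_queries(1_000_000), grover_queries(1_000_000))  # 500000 786
```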

Re:Still a long way away (0)

Dishevel (1105119) | more than 3 years ago | (#37269562)

I am just wondering how long till I get an Intel Core9Q 10 Core CPU with 8 regular 64bit cores and 2 quantum cores.
Is it even possible to have what would basically amount to a quantum co processor?
A great GP computer with the added goodness of quantum computing when needed?

Re:Still a long way away (1)

DigiShaman (671371) | more than 3 years ago | (#37269590)

That's assuming it's even possible to place a QC die in the same CPU package. If the hardware needed to manipulate a quantum chip is complex enough, it may come in the form of a PCIe card at best. Otherwise, it could take the form of a completely separate break-out box using a Thunderbolt interface.

Re:Still a long way away (1)

drolli (522659) | more than 3 years ago | (#37270158)

I hope you are joking.

No qc which is in sight will fit on a pci card anytime soon.

Re:Still a long way away (1)

MightyYar (622222) | more than 3 years ago | (#37271344)

No qc which is in sight will fit on a pci card anytime soon.

That's okay, I have an ISA slot open.

Mixing QC and GP CPUs at different temps (2)

billstewart (78916) | more than 3 years ago | (#37271828)

Chances are pretty good that your Quantum Computer will be running at liquid helium temperatures, maybe 4 Kelvin or so. Your general purpose CPU won't. There have been projects to run CPUs at liquid-nitrogen temperatures, and that already tends to get into mechanical difficulties; you're probably not going to be running your overclocked Xeon down at 4K.

Also, the quantum computer isn't likely to be something you're pumping a lot of data through - you're more likely to set it up, have it magically give you a probably-correct answer, and feed that answer to another computer that figures out if it's actually correct and then does something with it. For instance, if you're using the QC as an oracle to factor large numbers, you'll have it give you the result, then let your general-purpose machine multiply the factors together to find out if they give the right result, and then you'll use a general-purpose machine to rip off the bank account whose private key you just cracked.
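The check step is trivially cheap compared to the factoring itself, which is what makes this oracle model workable even with a noisy quantum device. A sketch (function name and numbers are mine):

```python
def verify_factoring(n, factors):
    """Cheap classical check of a (possibly wrong) quantum answer:
    every factor must be a non-trivial divisor and the product must be n."""
    product = 1
    for f in factors:
        if f <= 1 or n % f:
            return False
        product *= f
    return product == n

# A probably-correct answer from the hypothetical quantum oracle:
print(verify_factoring(105, [3, 5, 7]))   # True
print(verify_factoring(105, [3, 5, 11]))  # False -- ask the oracle again
```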

Re:Still a long way away (2)

hweimer (709734) | more than 3 years ago | (#37269878)

An important thing to recognize is that most of this experiment was done with a single qubit. Practical quantum computing will need to have this sort of error rate for thousands of qubits.

I'd take any quantum computer with 50 qubits and get a Nobel prize for beating the shit out of all current supercomputers simulating quantum systems like high-temperature superconductors, quark bound states such as proton and neutrons, or quantum magnets. Also, keep in mind that Rainer Blatt's group recently succeeded in demonstrating entanglement between 14 qubits [arxiv.org] in a similar setup. And for quantum simulations, the error rates probably don't have to be crazily low anyway, it turns out that such errors typically correspond to a nonzero temperature for the simulated system. So if this effective temperature is low enough that you can see interesting quantum physics, you are still in business.
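For context on why ~50 qubits would already beat classical simulation: storing a full n-qubit statevector takes 2^n complex amplitudes. A back-of-envelope sketch (16 bytes per amplitude assumed, i.e. complex128):

```python
def statevector_bytes(n_qubits, bytes_per_amplitude=16):
    """Memory needed to hold a full n-qubit statevector
    (2**n complex amplitudes, complex128 assumed)."""
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (14, 30, 50):
    print(f"{n} qubits: {statevector_bytes(n) / 2**30:.6f} GiB")
# 14 qubits fit anywhere; 30 qubits need 16 GiB; 50 qubits need 16 PiB.
```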

Re:Still a long way away (1)

Anonymous Coward | more than 3 years ago | (#37271770)

You are correct; ion trap systems are in principle scalable, but the task is very challenging (probably much harder than scaling, say, a superconductor-based system). But it is much more than this. The operations needed to manipulate a single qubit are significantly different from the operations needed to interact two qubits. (There is no need to directly interact three or more qubits, since such gates can be built up out of two-qubit operations, much like the two-bit AND and one-bit NOT are universal for classical circuits.)

For example, in a typical liquid NMR quantum system, two-qubit gate error rates are 50x greater than the one-qubit gate error rates. In a superconducting qubit system, the ratio is more like 15x, and with qubits based on photons, the ratio is 100x! (Photons are easy to manipulate singly but terribly hard to interact.)

In the best trapped ion systems, two-qubit gate error rates are only 1.5x larger than the one-qubit gate error rates. However, the mechanisms for implementing two-qubit gates are very different from those for the one-qubit gates, so improvements to one-qubit gate errors may not directly help two-qubit gates.

Still very promising. With these error rates, a few dozen qubits are enough to start solving interesting problems (interesting to physicists at least), getting beyond the classical supercomputers of today or the foreseeable future. Give me 30,000 qubits (or maybe fewer) with these error rates, and RSA-1024 is cracked wide open.
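To see what those ratios mean in practice, one can estimate the chance that a mixed circuit runs error-free under each one. The per-platform ratios are from the parent post; the assumption that every platform reaches NIST's 2e-5 single-qubit rate is mine, and it is generous to everyone but the ions:

```python
def circuit_success(p1, ratio, n1, n2):
    """Probability that a circuit with n1 one-qubit gates and n2
    two-qubit gates runs error-free, assuming independent errors
    and a two-qubit gate error of ratio * p1."""
    return (1 - p1) ** n1 * (1 - ratio * p1) ** n2

p1 = 2e-5  # NIST's reported 1-in-50,000 single-qubit error rate
for name, ratio in [("trapped ion", 1.5), ("superconducting", 15),
                    ("liquid NMR", 50), ("photonic", 100)]:
    print(f"{name:>15}: {circuit_success(p1, ratio, 5000, 5000):.4f}")
```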

Is it just me, or... (1)

gman003 (1693318) | more than 3 years ago | (#37269256)

Am I the only one who has difficulty thinking of quantum computers as things that actually exist and do calculations? It's like my brain has placed "quantum computer" firmly into the category "things that are theoretically possible but unable to be built with current technology", and refuses to change it, even to "things that exist in the lab but won't be commercially viable for decades outside classified government work".

Re:Is it just me, or... (2)

betterunixthanunix (980855) | more than 3 years ago | (#37269312)

The problem is that people are not generally aware of what a quantum computer would be useful for. Why should I care if there is a quantum computer sitting under my desk? How do I benefit from quantum algorithms?

There are indeed tangible benefits to quantum computing, beyond just attacking public key cryptosystems. As an example, quantum computers can speed up certain search algorithms, which is one of the promised commercial applications of a quantum computer.

Personally, I put quantum computers in the same technological category as fusion power. The world would be an exciting place if we had cold fusion, and we are just a few steps from having it...but those steps are measured in light years and involve practical and theoretical challenges that are hard to address. I have no doubt that some day, we will answer those questions and build fusion reactors and quantum computers, but I get the feeling that that day is pretty far off.

Re:Is it just me, or... (0)

Anonymous Coward | more than 3 years ago | (#37269406)

Well, John Layman will buy it for the same reason he buys an iPhone...the hype.

Re:Is it just me, or... (1)

Baloroth (2370816) | more than 3 years ago | (#37269432)

That's because as of now it still is in the "theoretically possible but unable to be built." Keep in mind this new technique is for one (1) qubit. You need more (at least 8? Or do quantum computers work that differently from normal ones) to do anything practical. And it only meets the theoretical requirement, once you use ECC. Previously, it wasn't accurate enough that you could count on the ECC to be performed right. Making a quantum computer, even in the lab, is a few years off yet.

Or so I think. I don't really understand quantum computing yet, so I could be wrong.

Re:Is it just me, or... (0)

Anonymous Coward | more than 3 years ago | (#37270194)

Quantum computers work in a totally different way from regular computers. They are more like analog computers that you set up to simulate a system and then read off the answer. The difference from an analog computer is that the answer is an integration over all possible solution paths.

Re:Is it just me, or... (1)

gman003 (1693318) | more than 3 years ago | (#37270862)

"Existing but not able to do anything practical" is still a pretty big difference from "Could exist but nobody's built one yet". It's like the early airplanes - they existed, but they weren't exactly useful for anything besides proving that heavier-than-air aircraft could work.

Sounds good (1)

Dunbal (464142) | more than 3 years ago | (#37269308)

show that we have good enough control over our single-qubit operations that our probability of error is 1 per 50,000 logic operations.

They forgot to add: "we calculated this probability on our quantum computer"

Operation and number dependent? (0)

Anonymous Coward | more than 3 years ago | (#37269416)

Depends on the operation. For 4195835/3145727 100% error rate was acceptable ;)

Well, what is it? (0)

Anonymous Coward | more than 3 years ago | (#37269420)

Is the probability of an error 1 in 50,000, is the error rate probably 1 in 50,000, or is the error rate 1 in 50,000?

A standard Open-Source Quantum Computing Language (1)

TheSync (5291) | more than 3 years ago | (#37269686)

What we really need is a "standardized" open-source quantum computing language so that we can develop and exchange quantum algorithms to prepare for the day when quantum computers are real.

Right now we have the QCL [tuwien.ac.at] language, QCF [sourceforge.net] for Matlab/Octave, and the Cove framework [youtube.com] that could be used with any language, but it looks like there is really only a C# implementation right now.

None of these have really taken hold as a "standard" though, and probably elements of all of them could be brought together in something multi-platform and all-inclusive.
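Meanwhile, nothing stops anyone from prototyping in plain code: a statevector simulator is only a few lines, and it's roughly what those frameworks wrap. A toy sketch (not the API of QCL, QCF, or Cove) that applies a Hadamard gate and samples a measurement:

```python
import math
import random

def hadamard(state, qubit, n_qubits):
    """Apply H to one qubit of an n-qubit statevector
    (a list of 2**n_qubits real/complex amplitudes)."""
    h = 1 / math.sqrt(2)
    out = state[:]
    for i in range(len(state)):
        if not (i >> qubit) & 1:      # visit each amplitude pair once
            j = i | (1 << qubit)
            out[i] = h * (state[i] + state[j])
            out[j] = h * (state[i] - state[j])
    return out

def measure(state):
    """Sample a basis state with probability |amplitude|**2."""
    r, acc = random.random(), 0.0
    for i, amp in enumerate(state):
        acc += abs(amp) ** 2
        if r < acc:
            return i
    return len(state) - 1

state = hadamard([1.0, 0.0], 0, 1)    # |0> into an equal superposition
print(state)                           # both amplitudes ~0.707
print(measure(state))                  # 0 or 1, each with probability 1/2
```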

Re:A standard Open-Source Quantum Computing Langua (1)

vux984 (928602) | more than 3 years ago | (#37269700)

What do you get when you have THREE competing standards and you try to take elements of them to make something multi-platform and all inclusive?

FOUR competing standards

Re:A standard Open-Source Quantum Computing Langua (1)

the linux geek (799780) | more than 3 years ago | (#37269776)

Yet another person who has no idea how standards actually work, but is apparently literate enough to read XKCD.

Re:A standard Open-Source Quantum Computing Langua (1)

vux984 (928602) | more than 3 years ago | (#37270006)

Yet another person who has no idea how standards actually work, but is apparently literate enough to read XKCD.

Do tell. How do you think they really work?

Re:A standard Open-Source Quantum Computing Langua (0)

Anonymous Coward | more than 3 years ago | (#37271324)

"There can be only one." is the motto. It is why Linux has never caught on outside of the techno-geek crowd. It is much easier if everything is uniform and doesn't change. And you don't have to worry about hundreds of different hardware configurations.

Re:A standard Open-Source Quantum Computing Langua (0)

Anonymous Coward | more than 3 years ago | (#37273944)

It is why Linux has never caught on outside of the techno-geek crowd.

...and servers, and HPC, and embedded, and smartphones, and basically everything except for desktop computers, yeah.

Re:A standard Open-Source Quantum Computing Langua (1)

Archwyrm (670653) | more than 3 years ago | (#37270038)

Clearly your sense of humor does not fully implement the standard.

Re:A standard Open-Source Quantum Computing Langua (1)

Nefarious Wheel (628136) | more than 3 years ago | (#37271982)

I like standards. I believe everybody should have a set.

They aren't errors! (1)

scorp1us (235526) | more than 3 years ago | (#37271744)

They are just answers to questions you haven't asked yet.

Re:They aren't errors! (1)

Lithdren (605362) | more than 3 years ago | (#37275344)

Clearly we need to make a larger quantum computer to calculate the questions we haven't asked yet, to make sense of all these answers that dont make sense yet.

Perhaps something with a biological matrix..quick, get the mice!

Excellent news! (1)

ath1901 (1570281) | more than 3 years ago | (#37273950)

Now if someone could just implement Shor's algorithm, all 1-bit encryption algorithms will be rendered obsolete!

Ob. Imagine a Beowulf cluster of these... (1)

Tastecicles (1153671) | more than 3 years ago | (#37276644)

...I'll wait.
