
IBM Touts Quantum Computing Breakthrough

timothy posted more than 2 years ago | from the or-they-don't-it's-indeterminate dept.


Lucas123 writes "IBM today claimed to have reduced error rates and retained the integrity of quantum mechanical properties in quantum bits, or qubits, long enough to perform a gate operation, opening the door to new microfabrication techniques that allow engineers to begin designing a quantum computer. While still a long way off, the creation of a quantum computer would mean data processing power would be exponentially increased over what is possible with today's silicon-based computing."


132 comments


So (-1)

Anonymous Coward | more than 2 years ago | (#39184323)

How many African Americans helped to invent this?

None, of course.

Re:So (0, Redundant)

Agent Z5q (144666) | more than 2 years ago | (#39184439)

I have become increasingly disappointed with Slashdot’s lack of good content. This particular story about quantum computing is actually a pretty good one, but they're becoming fewer and farther between. So, I’ve reached the point where I’m going to delete my Slashdot bookmark. Any suggestions for a good replacement tech news website?

Re:So (1)

garaged (579941) | more than 2 years ago | (#39184663)

I don't know anything better, and you will be disappointed with scientific journals too, so get used to it.

Re:So (1)

vlm (69642) | more than 2 years ago | (#39184671)

Any suggestions for a good replacement tech news website?

You'll have to clarify your definition. If you want "tech news" as in news about "hard science" tech, there are places like arxiv.org, or PLOS for bio stuff. My best guess for IT-type primary sources is maybe the debian-announce mailing list or the daily SANS ISC diary? There are no primary-source places that I'm aware of with social media type features, not /., and certainly not on G+ or whatever.

If by "tech news" you mean news about other tech news sites, if you prefer a weekly format thats "this week in tech" and some other twit and rev3 shows and/or at least some /. articles. One important filter consideration is I don't want to go to a site exclusively populated with "txt speak" and/or total noobs. "Hay guys I just heard of this unix port 4 my pc its called linux has any1 else hear herd of it????!?". Sorry I don't want to see that. I haven't been a noob to this computer thing since 1981 or to the internet since the (very) late 80s.

Re:So (4, Funny)

HaZardman27 (1521119) | more than 2 years ago | (#39184889)

has any1 else hear hurd of it????!?

FTFY.

Re:So (1)

Yobgod Ababua (68687) | more than 2 years ago | (#39185789)

I've always been partial to The Register (www.theregister.co.uk) which is snarky and British (and the home of the BOFH) and to Ars Technica (www.arstechnica.com) which tries to focus on actually writing interesting articles about news rather than just linking to things.

There is often some overlap between them and /., but it's not as bad as you'd expect.

Re:So (0)

Anonymous Coward | more than 2 years ago | (#39186031)

Except that the Register is even more populist than Slashdot and written by mediocre hacks - Slashdot at least doesn't pretend to be staffed by journalists, merely linking elsewhere. And it's only "home" to the BOFH in the sense that the Aussie sold out to them ("he's a BOFH, what do you expect?").

Ars technica has the occasional highly knowledgeable contributor but usually gives me the uncomfortable sense of an editorial belief that the more you write, the more insightful and relevant your piece is.

There are three honest sources of STEM news:
(1) Peer-reviewed journals;
(2) progress reports by practitioners;
(3) Chatty, informal summaries of press releases, word of mouth and items in (1) and (2).

Items in (3) require you to either go to (1) or (2), or examine the subject yourself, to obtain an in-depth analysis.

Outlets which try to position themselves between (2) and (3) are a waste of time, also likely to reinforce any of your existing biases since they're never neutral.

Re:So (1)

Yobgod Ababua (68687) | more than 2 years ago | (#39186261)

"...also likely to reinforce any of your existing biases since they're never neutral."

I thought we were talking about science content? Why would biases be relevant?
Independently reproducible results rule; soundbites drool.

Re:So (1)

jones_supa (887896) | more than 2 years ago | (#39186515)

Real World Technologies [realworldtech.com] is worth mentioning too. It's a high-quality, rarely updated site involving mostly CPU/GPU/APU type stuff.

Re:So (1)

forkfail (228161) | more than 2 years ago | (#39188825)

Maybe it's more a reflection of the changes in the tech industry than in the site.

Re:So (0)

Anonymous Coward | more than 2 years ago | (#39187523)

Just one and he kept shouting things about peanuts.

And people say .... (3, Funny)

JimCanuck (2474366) | more than 2 years ago | (#39184327)

And people keep telling me IBM isn't innovative and cutting edge anymore. [/Sarcasm]

Re:And people say .... (0)

Anonymous Coward | more than 2 years ago | (#39184355)

Is that actually sarcastic?

Re:And people say .... (2)

The_Crisis (2221344) | more than 2 years ago | (#39184401)

Yes, Sheldon...sarcasm.

Re:And people say .... (1)

sexconker (1179573) | more than 2 years ago | (#39186473)

Yes, Sheldon...sarcasm.

Bazinga.

Re:And people say .... (0)

Anonymous Coward | more than 2 years ago | (#39184359)

Times change; they kinda dropped the ball with the "PCs will never catch on" thing.

Re:And people say .... (2)

Haxagon (2454432) | more than 2 years ago | (#39184423)

Cloud computing seems to suggest otherwise. I'm sure it was cheaper to get out of PCs before the general PC profit margins decrease dramatically as people move toward dumb terminals once again. It's not entirely a good thing.

Re:And people say .... (1)

forkfail (228161) | more than 2 years ago | (#39188877)

It may be cyclic.

We started with mainframes, then went to PCs: from somebody else having your computing resources to you having them.

While the cloud may be inherently distributed, it is like the mainframe approach insofar as your interface is much like a mainframe interface, and someone else again has your computing resources.

Ironically, quantum computing might well cycle things back to the PC model, insofar as you might no longer need the power of the cloud to get the job done; it might be feasible to have adequate, even massive, power in your desktop box again.

Re:And people say .... (0)

Anonymous Coward | more than 2 years ago | (#39184405)

Well, we'll see. It won't be until I see a "Deepak Chopra Guide to Quantum Computing" that I'll be inclined to think that this is truly a breakthrough.

Re:And people say .... (3, Funny)

Gilmoure (18428) | more than 2 years ago | (#39185079)

I wonder what animal will go on the cover of O'Reilly Quantum Computing In a Nutshell; a cat in a box?

Re:And people say .... (2)

vikingpower (768921) | more than 2 years ago | (#39185349)

Reminds me of the t-shirt a colleague of mine wears. "Wanted - Schrödinger's cat. Dead and alive".

Re:And people say .... (2)

forkfail (228161) | more than 2 years ago | (#39188899)

I wonder what animal will go on the cover of O'Reilly Quantum Computing In a Nutshell; a cat in a box?

You won't know till you open the book.

Re:And people say .... (5, Insightful)

Anonymous Coward | more than 2 years ago | (#39184567)

The depressing thing is that you will never see anything like this out of Apple. Billions of dollars in reserves and no "Jobs Labs".

Re:And people say .... (1, Insightful)

Anonymous Coward | more than 2 years ago | (#39184983)

No, but Apple has been pretty focused on making technology cool and even desirable to the masses. While perhaps not as interesting to you as quantum computing, it's certainly important, and something that IBM was never able to do.

Re:And people say .... (1)

bws111 (1216812) | more than 2 years ago | (#39185027)

Wait, you mean International Business Machines doesn't make things for the masses? Who'd a thunk it?

Re:And people say .... (0)

Anonymous Coward | more than 2 years ago | (#39185189)

Wait, you mean International Business Machines doesn't make things for the masses? Who'd a thunk it?

Wait, you mean Apple Computer didn't make computers out of apples? Who'd a thunk it?

Re:And people say .... (5, Insightful)

Darth Snowshoe (1434515) | more than 2 years ago | (#39185011)

THIS, like times a million. The NYTimes this weekend had an excellent article on the history of Bell Labs (the laser, the transistor, communications satellites, etc). HP, whatever else you may think of them, supported the pure research lab which brought forth the memristor. IBM can point to things such as this, its various efforts to simulate a brain, and Watson. Google, bless their souls, is pushing for automated driving (this may not sound like it's in the same league, until you realize the consequences for everybody who drives or rides in an auto).

Where is the pure research at Apple? Do they think they can get by on just making better UIs, for the rest of forever? Are they at all part of a larger community?

Re:And people say .... (3, Interesting)

BitZtream (692029) | more than 2 years ago | (#39185223)

BASF, we don't make the things you use.

We make the things you use BETTER.

That was the commercial I remember for several years.

It's not always about making cutting-edge, front-page-news breakthroughs; sometimes it's just about refining something until it's just right, after someone else made the breakthrough and then forgot about it because they moved on to the next shiny thing.

Both kinds of people/businesses are useful and needed, well, at least until this utopian dream you have becomes reality and everyone works for the common good anyway.

BASF (1)

Candyban (723804) | more than 2 years ago | (#39186185)

BASF, we don't make the things you use.
We make the things you use BETTER.

That was the commercial I remember for several years.

I suppose they couldn't live up to this promise any more since they changed it to We don't just make chemicals, we make chemistry [youtube.com]

Re:And people say .... (1)

Anonymous Coward | more than 2 years ago | (#39185227)

No, they think they can get by on just making better UIs and using third-party or licensed hardware in shiny cases put together by suicidal chinese slaves. Also, suing their competitors to death when actual science prevails over the macfaggotry of the jobsian cult.

Re:And people say .... (0)

Anonymous Coward | more than 2 years ago | (#39185529)

As long as they can sue you for having a Design/UI with same colors and rounded edges ;)

Re:And people say .... (5, Interesting)

steelfood (895457) | more than 2 years ago | (#39185555)

Apple doesn't do technological research. Instead, they pour all of that money into usage research, so that they can design an improved user experience.

It's not necessarily a bad thing. There's a place for both the technological side, and the usability side. Most tech companies focus on the technology side while neglecting the usability, which is why so much technology ends up unusable by laymen.

Microsoft actually does a lot of usability research too. But the difference between Microsoft and Apple is that Apple has (or had) someone steering the ship. They're a top-down dictatorship-style management house. Microsoft is more about internal competition to see who wins out. They're more of a survival-of-the-fittest, cream-of-the-crop-rises-to-the-top type of management house.

Re:And people say .... (3, Informative)

Darth Snowshoe (1434515) | more than 2 years ago | (#39185839)

But Apple SHOULD do technological research. Because it provides a long-term competitive edge for them, and because it's the right thing to do. Corporations, like people, live in a larger society, culture (and nation), and they benefit from those things. Apple would not exist were it not embedded in the Silicon Valley culture emanating from Stanford and Berkeley. Apple should give something back. Maybe Steve would not understand this, but surely Woz would.

Yeah, iPhones are great, but honestly, ten years from now, we'll be on to a newer, better UI (glasses, brain implants, holodecks, or whatever.) It turns out we're still using lasers and transistors and communications satellites, all invented by Bell Labs in the 60s.

Here, I'm pasting the best bit from the NYTimes/Bell Labs article, written by Jon Gertner;

"But what should our pursuit of innovation actually accomplish? By one definition, innovation is an important new product or process, deployed on a large scale and having a significant impact on society and the economy, that can do a job (as Mr. Kelly once put it) “better, or cheaper, or both.” Regrettably, we now use the term to describe almost anything. It can describe a smartphone app or a social media tool; or it can describe the transistor or the blueprint for a cellphone system. The differences are immense. One type of innovation creates a handful of jobs and modest revenues; another, the type Mr. Kelly and his colleagues at Bell Labs repeatedly sought, creates millions of jobs and a long-lasting platform for society’s wealth and well-being."

The whole article is here (paywall yadda-yadda)
http://www.nytimes.com/2012/02/26/opinion/sunday/innovation-and-the-bell-labs-miracle.html?pagewanted=all [nytimes.com]

Re:And people say .... (0)

Anonymous Coward | more than 2 years ago | (#39186173)

Instead, they pour all of that money into usage research, so that they can design an improved user experience.

Nonsense, they have close to $100 billion in cash that they aren't pouring into anything.

Re:And people say .... (1)

msobkow (48369) | more than 2 years ago | (#39186333)

Bullshit.

Apple invests in PATENTING UI components so they can SUE companies and people who use them.

IBM, on the other hand, sponsored, developed, published, and GAVE AWAY the Common User Interface Standard.

Apple has no intent on sharing anything with anyone. They want to OWN the market. All markets. And any device that makes the mistake of using a common sense gesture, icon, or interface that anyone with a functioning brain cell could have come up with.

Apple is a pimply leech on the ass of computing, sucking away for all it's worth, and giving NOTHING back.

Re:And people say .... (0)

Anonymous Coward | more than 2 years ago | (#39188737)

You can't tell me that Apple dropping a measly $1 billion over several years on a pure R&D lab isn't a good idea, even just to get some people in there thinking outside the box. They have around $90 billion sitting in the bank collecting interest!

I'll predict it now. Apple will stagnate not from its user-experience devices, or from the massive user-usage data it has and the money it acquires from licensing said data, but from the singularity that lies at the intersection of personal historical data and their physical presence in the now. They're bound by external variables. Namely, cellular bandwidth, battery life, and handheld computational power. AFAIK, and please correct me if I'm wrong, they aren't actively developing improvements on any of these fronts, in house.

Re:And people say .... (1)

geoffrobinson (109879) | more than 2 years ago | (#39185687)

It's not a zero-sum game. This would only be a problem if Apple was the only company that ever will exist. Let them do what they do well and let IBM do what they do well.

Re:And people say .... (-1)

Anonymous Coward | more than 2 years ago | (#39185817)

"Bell Labs (the laser, the transistor, communications satellites, etc)"

Um, NASA invented all these things, to go to the fucking moon. Because nothing is more important, and there were no intelligent people before NASA.

Re:And people say .... (0)

Anonymous Coward | more than 2 years ago | (#39187315)

"Better UIs"??? What the hell are you smoking, and where do you live that it's legal??

Re:And people say .... (1)

mcgrew (92797) | more than 2 years ago | (#39188773)

Why is that so depressing? Apple is design and marketing, not engineering or research. And they do a damned good job of it, too -- they do usually have the best designs if not the best engineering (e.g., iPhone antenna).

Re:And people say .... (0)

Anonymous Coward | more than 2 years ago | (#39186909)

Clearly, they know a little bit.

another misleading quantum computing article (5, Insightful)

Anonymous Coward | more than 2 years ago | (#39184373)

1) Repeated news about being able to perform some operation with a tiny number of qubits does not make it likely that a useful quantum computer of practical size can be built;

2) It wouldn't mean data processing power would be "exponentially increased", but that certain algorithms could be executed asymptotically faster.

QC remains a second-rate branch of mathematics for computer science types who don't want to apply themselves to less glamorous problems in the more mature and challenging fields of classical computing. For engineers, it's still at the nuclear fusion stage: kinda just possible in the right conditions, but under no conditions shown useful.

Re:another misleading quantum computing article (1)

rhook (943951) | more than 2 years ago | (#39187423)

That is just the kind of attitude that holds back progress.

Liberate your phone! (-1)

Anonymous Coward | more than 2 years ago | (#39184377)

Uninstall Android NOW!

Quantum computing (-1)

Anonymous Coward | more than 2 years ago | (#39184413)

Back in the early nineties I had a couple of Quantum hard drives....

Exponentially? (1, Informative)

drooling-dog (189103) | more than 2 years ago | (#39184417)

data processing power would be exponentially increased over what is possible with today's silicon-based computing.

Please, please, please stop misusing the word "exponentially". It just means that something is increasing (or declining) at a constant rate, which is practically the opposite of what is meant here.

Re:Exponentially? (0)

Anonymous Coward | more than 2 years ago | (#39184443)

Eponetially means that is is at a constantly increasing rate, which is what is happening here. Your Ignorance is showing.

Re:Exponentially? (0)

Anonymous Coward | more than 2 years ago | (#39184673)

Like a quadratic then? I always wondered what 'eponetially' meant.

Re:Exponentially? (1)

drooling-dog (189103) | more than 2 years ago | (#39186993)

Your Ignorance is showing.

Well, somebody's certainly is. Suppose that something grows continuously at a constant rate of 10% per year. That is exponential growth, because it fits the relation x = e^(0.1*n). Try it.
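A quick numeric sketch of that point (illustrative Python, not part of the original comment): for x = e^(0.1*n), the year-over-year ratio is the same every year, which is exactly what "growing at a constant rate" means, even though the absolute yearly increase keeps getting bigger.

```python
import math

def x(n, rate=0.1):
    """Value after n years of continuous growth at a constant relative rate."""
    return math.exp(rate * n)

# The ratio between consecutive years is constant (~1.105), i.e. a constant
# growth *rate*, while the absolute increase grows without bound.
for n in range(5):
    print(n, round(x(n), 4), "ratio to next year:", round(x(n + 1) / x(n), 4))
```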

Re:Exponentially? (0)

Anonymous Coward | more than 2 years ago | (#39188517)

That's 10% on last year's principal + 10% on last year's interest, you stupid fuck. The slope of e^(.1n) is .1*e^(.1*n), which is not constant. Go back to the wiki and start at y=mx+b before you make yourself look like a fool again.

Re:Exponentially? (-1)

Anonymous Coward | more than 2 years ago | (#39184503)

Wow you are #@$&ing dumb.

Re:Exponentially? Yes (4, Informative)

JoshuaZ (1134087) | more than 2 years ago | (#39184515)

Actually, this is a correct use. Some algorithms on quantum computers are exponentially faster than the best known classical algorithms. For example, estimating a Gauss sum http://en.wikipedia.org/wiki/Gauss_sum [wikipedia.org] scales exponentially in time classically, but the most efficient quantum algorithms are bounded by a polynomial. So exponential speedup is a valid use of the term here.
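To make the gap concrete, here's a purely illustrative Python comparison; the particular exponents (2^n versus n^3) are invented for the example and are not the actual complexities of Gauss-sum estimation.

```python
# Toy comparison of an exponential operation count with a polynomial one.
# The formulas are placeholders chosen only to show how fast the gap opens up.
for n in (10, 20, 40, 80):
    exponential_cost = 2 ** n
    polynomial_cost = n ** 3
    print(f"n={n:3d}  exponential: {exponential_cost:.2e}  polynomial: {polynomial_cost:.2e}")
```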

Re:Exponentially? (4, Informative)

vlm (69642) | more than 2 years ago | (#39184557)

The whole discussion is fubar.

First of all, the derivative of e to the x (the "exponential function") is e to the x. Yeah, that's true: the derivative is the same as the function itself. Welcome to 1st semester calculus, kids. Not a constant, and I'm not even sure what "constantly increasing" means mathematically, although if AC meant it's linear, that's a bucket of fail too.

The next fubar is that quantum computing doesn't provide a magic exponential speedup. There is a page-length summary on the wikipedia, but it should come as No Surprise Whatsoever to anyone in CS that different algorithm designs inherently have different big-O behavior, and magically sprinkling quantum pixie dust doesn't change that: some algos are linear, some poly, some constant, some exponential, and all quantum computing does is shuffle where some of them belong. Solving for X where X+1=2 is not gonna change much; factoring into primes is going to change quite a bit. Some of the most interesting problems are polynomial time, not exponential, in quantum computing. http://en.wikipedia.org/wiki/Quantum_computer#Potential [wikipedia.org]

Re:Exponentially? (2)

Curunir_wolf (588405) | more than 2 years ago | (#39185261)

Wouldn't this be a game-changer for encryption, though (if they can actually make it work, that is)? I mean, brute-force decryption seems like exactly the kind of computational task that a quantum computer could easily handle. So a brute-force attack on a key that might take hundreds of years on a current supercomputer could be done in a few minutes. No password would be safe from any organization with access to that kind of computing power. Or am I misunderstanding the potential?

Re:Exponentially? (2)

vlm (69642) | more than 2 years ago | (#39186419)

Wouldn't this be a game-changer for encryption, though (if they can actually make it work, that is)? I mean, brute-force decryption seems like exactly the kind of computational task that a quantum computer could easily handle. So a brute-force attack on a key that might take hundreds of years on a current supercomputer could be done in a few minutes. No password would be safe from any organization with access to that kind of computing power. Or am I misunderstanding the potential?

Not necessarily, no. For any crypto app you can come up with some formula where you chunk in the number of bits and it spits out how long it takes to crack it. It has entirely to do with how the design scales. Double the input to a linear algo and that number takes twice as long. Most (good) crypto is exponential, so triple the number of bits and the cracking time doesn't just triple, it gets raised to the third power. The deal is that with quantum computing, some crypto scales polynomially instead of exponentially.

What no one wants to talk about is what if the quantum solution for a given sample takes 100 years... that's a bucket of fail even if it only grows polynomially. It is completely freaking useless: sure, it scales nicely, double the number of bits and it's just great that the time only increases to 200 years, but no one can wait the first 100 years anyway, so...

This is also the strongly minority belief I hold about P=NP: I'm willing to accept there might be a poly solution to a supposedly exponential problem, but the constant factor for even a trivial instance might be something like a factorial number of universe-lifetimes, so the solution theoretically exists (which is why it hasn't been disproven yet) and also has no real-world effect.

Much like time dilation at highway speeds: it exists, but it's irrelevant at that scale.
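To put toy numbers on that constant-factor point, here's a rough Python sketch; every cost below is invented for illustration and does not model any real classical or quantum machine.

```python
# A "polynomial" method with a huge fixed overhead vs. an "exponential"
# method with a tiny per-step cost. All numbers are made up.
HUGE_CONSTANT = 1e15    # hypothetical overhead per unit of work (poly method)
TINY_CONSTANT = 1e-9    # hypothetical cost per step (exponential method)

def poly_seconds(n):
    return HUGE_CONSTANT * n ** 2

def exp_seconds(n):
    return TINY_CONSTANT * 2 ** n

for n in (32, 64, 128):
    print(f"n={n:3d}  poly: {poly_seconds(n):.2e} s  exp: {exp_seconds(n):.2e} s")

# The polynomial method does win eventually, but by the time it does,
# both running times are far beyond anything anyone can wait for.
```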

Re:Exponentially? (0)

Anonymous Coward | more than 2 years ago | (#39186437)

Depends on the form of encryption you use; not all encryption methods are vulnerable to Fourier-sampling attacks the way RSA is. I seem to recall seeing a story here a year or so back claiming the McEliece cryptosystem to be immune to this attack?

Re:Exponentially? (1, Troll)

drooling-dog (189103) | more than 2 years ago | (#39185931)

Oy... The rate is constant, meaning that the increase is in constant proportion to the value of the function at any given time. That's why calculations of continuous compound growth take exponential form, and it's a result of e^x being its own derivative, as you point out.

Of course neither the OP nor I was talking about the computational order ("Big-O") of a quantum algorithm, because no specific algorithm was under discussion. If such algorithms were typically exponential in N - i.e., O(e^N) - that wouldn't be very encouraging news, would it? The word "exponentially" was clearly used to mean a large, discontinuous increase - a quantum leap, if you will - and not the kind of smooth, fixed-rate growth that it implies in reality.

But, whatever... It's a peeve of mine.

Re:Exponentially? (1)

NoNonAlphaCharsHere (2201864) | more than 2 years ago | (#39184565)

It could have been worse. They could have said "quantum".

Oooooo! (1)

Grindalf (1089511) | more than 2 years ago | (#39184435)

Never mind the banter, where's my nano Mac? Show us your sugar cube sized z-series mainframe! :0)

Re:Oooooo! (1)

mcgrew (92797) | more than 2 years ago | (#39185651)

Show us your sugar cube sized z-series mainframe! :0)

The Raspberry Pi isn't much bigger than a large sugar cube, and it's more powerful than any mainframe from 1960.

economist article more interesting (3, Informative)

rgbrenner (317308) | more than 2 years ago | (#39184481)

The Economist had an interesting article a couple of days ago... at least it's interesting if you don't really know the details of quantum computing:

Quantum computing: An uncertain future [economist.com]

Each extra qubit in a quantum machine doubles the number of simultaneous operations it can perform. It is this which gives quantum computing its power. Two entangled qubits permit four operations; three permit eight; and so on. A 300-qubit computer could perform more concurrent operations than there are atoms in the visible universe.
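The arithmetic behind that last sentence is easy to check; a one-off Python sketch, using the commonly quoted (and very rough) figure of ~10^80 atoms in the observable universe:

```python
# 300 entangled qubits are described by 2**300 amplitudes, which exceeds
# the rough ~10**80 estimate for atoms in the observable universe.
states = 2 ** 300
atoms = 10 ** 80
print(f"2^300 is roughly 10^{len(str(states)) - 1}")  # about 10^90
print("more amplitudes than atoms:", states > atoms)  # True
```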

Re:economist article more interesting (0)

Anonymous Coward | more than 2 years ago | (#39184967)

The Economist is one of the few good magazines left. Love their stuff.

And two picoseconds later... (0)

Anonymous Coward | more than 2 years ago | (#39184593)

The software goons find a way to bring even that hardware to a crawl...

Bad explanation of qubit superposition (3, Informative)

Gary van der Merwe (831179) | more than 2 years ago | (#39184599)

Quote from article:

A qubit, like today's conventional bit, can have two possible values: a 0 or a 1. The difference is that a bit must be a 0 or 1, and a qubit can be a 0, 1, or a superposition of both. "Suppose you take 2 qubits. You can be in 00, 01, 10, and 11 at the same time. For 3 qubits you can be in 8 states at the same time (000, 001, 111, etc.). For each qubit you double the number of states you can be in at the same time. This is part of the reason why a quantum computer could be much more powerful," Ketchen said.

I find that to be a terrible explanation. What he said, "For each qubit you double the number of states you can be in at the same time," is also true for normal bits. Huh? Here is a better explanation: http://en.wikipedia.org/wiki/Qubit [wikipedia.org]
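For what it's worth, here is a minimal sketch (plain numpy, no quantum library, and obviously a real quantum state can't simply be printed out like this) of the distinction the parent is drawing: a classical register is exactly one basis state, while a quantum register is described by amplitudes over all of them at once.

```python
import numpy as np

# A classical 2-bit register IS one of the four basis states (here "10"),
# written as a one-hot vector over the basis 00, 01, 10, 11.
classical = np.array([0, 0, 1, 0])

# A 2-qubit register is described by four amplitudes; in an equal
# superposition all four are nonzero at the same time.
quantum = np.array([0.5, 0.5, 0.5, 0.5])   # |amplitude|^2 sums to 1

print("measurement probabilities:", np.abs(quantum) ** 2)  # 0.25 each
```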

Re:Bad explanation of qubit superposition (1)

Anonymous Coward | more than 2 years ago | (#39184769)

Actually, it is correct. An additional normal bit doubles the number of *possible* states. An additional (entangled) qubit doubles the "number of states you can be in at the same time" (with emphasis being on "at the same time"), which is a colloquial description of doubling the dimension of the state space.

Re:Bad explanation of qubit superposition (2)

Hatta (162192) | more than 2 years ago | (#39186499)

No. With regular bits, you can only be in one state at once. Adding a bit doubles - 1 the number of states you are not in.

What really IS important: (1)

lexa1979 (2020026) | more than 2 years ago | (#39184625)

does it run GNU/Linux ?

Re:What really IS important: (0)

Anonymous Coward | more than 2 years ago | (#39184927)

How many levels of Doom? How about the Flight Simulator?

Re:What really IS important: (0)

Anonymous Coward | more than 2 years ago | (#39189083)

I for one look forward to Quake 2030 on a 4Quintaherz quantum processor. :-)

This does _not_ imply scalability! (5, Interesting)

gweihir (88907) | more than 2 years ago | (#39184641)

For conventional computers, as soon as you have "and" and "not" in gate-form, you can do everything, as you can just connect them together. For quantum computers that is not true, as all elements performing the complete computation need to be entangled the whole time.

IMO, there is now reason to believe that the real-world scalability of quantum computers is so bad that it negates any speed advantage. It seems the complexity of building a quantum computer that can do computations on inputs of size n is at least high-order polynomial or maybe exponential in n. That would explain why no significant advances have been made in keeping larger quantum computing elements entangled in the last 10 years or so and no meaningful sizes have been reached.

Keep in mind that, for example, to break RSA 2048, you have to keep > 2048 bits entangled while doing computations on them. And you cannot take smaller elements and combine them; the whole > 2048 bits representing the input must all be entangled with each other, or the computation does not work.

Re:This does _not_ imply scalability! (3, Informative)

vlm (69642) | more than 2 years ago | (#39184869)

There's a nice wiki page with pages and pages of detailed explanation of what this post is talking about.

http://en.wikipedia.org/wiki/Quantum_decoherence [wikipedia.org]

Here's a nice analogy for quantum computing... it's a magic old-fashioned analog computer with serious reliability and I/O issues. Imagine at the dawn of the computer era you wanted to simulate the statics of a large railroad bridge. In 8 bits it would take a very long time, 16 bits much longer... And to prevent rounding-error propagation you have issues. So why not simulate it with a thundering herd of analog opamps which will "instantly" solve the bridge's static loads? OK cool, other than all the opamps must work perfectly the entire time you take a measurement, which with vacuum tubes is questionable and with qubits maybe impossible. The other problem is that if you want 32-bit accuracy, now your proto-computer engineer needs to build a 32-bit A/D converter to connect to your analog computer... good luck... This is not a perfect quantum computing analogy, but pretty close in many regards.

There is a bad trend in computer science to assume "all computers and algorithm programming problems are about the same," which they historically have been, but are not in the real world. Given two roughly identical algorithms and problems on two roughly identical computers, the smaller big-O wins every time, more or less. It is a huge mistake to carry that thinking across widely different architectures... OK, so factoring is exponential on classical computers and everyone ignores I/O because that's constant with a normal bus design, or at worst linear. OK, so factoring is poly on quantum computers, hooray for us... whoops, looks like I/O might go exponential, and the constant factor might be years/decades just to get the thing working.

The way to keep secure with a classical computer is to pick an algorithm that big O scales such that it can't be broken in this universe. The way to keep secure with a quantum adversary is to pick a key size that seems to make it an engineering impossibility to build a quantum computer, even if by some miracle a quantum computer could solve it in poly time if only it could somehow be built.

Re:This does _not_ imply scalability! (1)

gweihir (88907) | more than 2 years ago | (#39188101)

I like your analog computer analogy. Maybe for those that are not into electronics: building a working 32-bit A/D converter (i.e. one that has 32 bits of accuracy) is pretty much impossible; even at 24 bits the lower bits are only noise from several different noise sources. And opamps are pretty noisy too when you get to that precision level. 16...20 bits is about the practical limit unless you do things like supercooling, and even then you only gain a few bits.

I also completely agree on the countermeasures. And if the exponent on the "polynomial time" is large enough, then there is no problem at all. The polynomial vs. exponential mantra is convenience, not necessity, for crypto. If you have, say, x^2 for users and x^6 for attackers, you can build practical public key crypto on this, especially if the x^2 is runtime but the x^6 is the complexity of the hardware you need to build.

Re:This does _not_ imply scalability! (0)

Anonymous Coward | more than 2 years ago | (#39185043)

Science has to keep trying, I guess, but seriously...

I get the feeling that this is one of those areas which lives in the same neighborhood as perpetual motion. I mean, if the whole, "measure it and it's no longer true" thing is accurate, then doesn't that kinda imply something?

But then, I know for a fact I don't clearly understand what the hell they're even trying to do. Quantum mechanics might as well be voodoo witchcraft afaic.

Re:This does _not_ imply scalability! (2, Insightful)

Anonymous Coward | more than 2 years ago | (#39185149)

Another way of explaining this is that in order to take advantage of the exponential speed-up of quantum computing in practical applications, you need exponentially better management of entanglement and decoherence effects, which turns out to be a very difficult engineering problem. People keep proposing different models for quantum computing hoping that if they do these operations in solid state rather than via NMR, or in Bose-Einstein condensates, or using exotic pseudo-particles, or other means that the entanglement management and decoherence issues will become tractable. To the best of my knowledge, nobody has yet come up with an approach that really addresses the underlying issue.

I don't really think there's any way to *prove* that it's impossible to do this though, which is why people will keep banging their heads into the problem for some time. Maybe they'll come up with something, or Quantum Computing will become computer science's fusion (a suck of funding and effort that keeps dragging out for decade after decade).

Welcome H(I)A(B)L(M) (0)

na1led (1030470) | more than 2 years ago | (#39184683)

I wonder if IBM will be upgrading Watson with a Quantum computer brain. Won't be long now before they invent HAL.

I miss IBM PC's (1)

thaiceman (2564009) | more than 2 years ago | (#39184823)

I miss the days when IBM actually made PC's they were always rock solid. You could beat someone to death with one of there laptops and after wiping the blood off it it would still work...

Re:I miss IBM PC's (1)

vlm (69642) | more than 2 years ago | (#39184913)

I miss the days when IBM actually made PC's they were always rock solid. You could beat someone to death with one of there laptops and after wiping the blood off it it would still work...

Model M keyboard with the steel backplate and buckling springs. Still use mine with a PS/2-to-USB converter thing (not an adapter, a more expensive converter). Lack of a Windows key didn't bother me until I switched to the "awesome" window manager, which likes to use that key as a control key. Bummer.

Re:I miss IBM PC's (1)

thaiceman (2564009) | more than 2 years ago | (#39184995)

I didn't even think about the keyboards, which reminds me that I have a few in the basement hooked up to various machines. I don't get to use them as much as I used to because I use my laptop for all of my daily-driver stuff. If you ever break it (not likely) you can pick up a replacement from Unicomp: http://www.pckeyboard.com/ [pckeyboard.com] They are expensive but it's the only way to get a new Model M these days, unless you come across an unopened IBM-branded one somewhere (in which case it's worth a small fortune).

Re:I miss IBM PC's (1)

vlm (69642) | more than 2 years ago | (#39185289)

Why, I'll be... a brand new 104-key type M... that means a windoze key to drive the "awesome" window manager with. I may have to retire my old PS/2 type M...

They're not expensive, they're only a hundred bucks. If they're as good as a real type M, your grandkids will be using them, which works out to "about a can of soda per month". Expensive is something like an all-plastic "gamer's keyboard" for $30 that only lives for 6 months before keys start sticking (true anecdotal story).

Re:I miss IBM PC's (0)

Anonymous Coward | more than 2 years ago | (#39185163)

I miss the days when IBM actually made PC's they were always rock solid. You could beat someone to death with one of there laptops and after wiping the blood off it it would still work...

What's wrong with using a simple axe ?

Re:I miss IBM PC's (0)

Anonymous Coward | more than 2 years ago | (#39185193)

I miss the days when people actually knew the difference between there and their.

Please enlighten me : Quantum computers & MWI (1)

Altesse (698587) | more than 2 years ago | (#39185117)

OK, IANAP, but, like many slashdotters, I am interested in all things science and especially quantum mechanics. Please explain, if you may, this contradiction, because I've been unable to find a good explanation in anything I've read so far.

If we consider the many-worlds interpretation to be viable, from what I understand:
- when a scientist starts up the very first quantum computer for the first time -- say, a big 250-qubit computer -- and tests it against a big cypher or whatever, 2^250 universes will participate in the process
- after the quantum collapse, the unique solution will be found, the cypher will be cracked, and OUR scientist in OUR world will open a bottle of champagne and congratulate their team

... Does this mean that, in 2^250 - 1 universes, the scientist will commit suicide, or get fired (because obviously, the other solutions are uncorrect)?

Re:Please enlighten me : Quantum computers & M (1)

Altesse (698587) | more than 2 years ago | (#39185239)

Edit : the other solutions are incorrect
(In the other worlds, I'm better at learning foreign languages).

Re:Please enlighten me : Quantum computers & M (1)

JoshuaZ (1134087) | more than 2 years ago | (#39186039)

No. Quantum computing works whether or not MWI is correct. And it doesn't have to do with quantum suicide. In an MWI situation, the vast majority of universes will get the same (correct) result. Essentially, the different universes cooperate with each other before they split off. This isn't quite correct (in MWI there aren't really discrete universes but rather parts of a continuum, and there are a lot of other subtleties involved).

Re:Please enlighten me : Quantum computers & M (1)

FrangoAssado (561740) | more than 2 years ago | (#39188323)

That's not how quantum computers work, despite what you might have read in science popularization articles. Quantum algorithms don't work by running a classical algorithm on "all possibilities at once." That wouldn't work, because of the contradiction you described -- once you measure the result, all the other "possibilities" go away.

Quantum algorithms work by not only solving the problem, but also shifting the probabilities of the qubits in such a way that, when you measure them, you get a very high probability of measuring the "right" answer. For example: in the quantum part of Shor's algorithm [wikipedia.org], you start with a lot of qubits that have 50% probability of being 0 and 50% of being 1, and after doing the computation, you end up with the qubits having a very high probability of being in the "correct" value. This works because the probability of being right can be made as high as you want -- you can set things up so that the probability of being wrong is the same as that of an asteroid hitting the computer while it's calculating, for example.

That's why it's very hard to come up with quantum algorithms. For a lot of problems, it's doubtful that it's even possible to have a quantum algorithm that's much better than classical algorithms. For example, most quantum computer scientists think that quantum computers will never be able to solve NP-complete problems much faster than classical computers.

Pre-emptive strike against wtf is a QC (5, Informative)

mathimus1863 (1120437) | more than 2 years ago | (#39185143)

I took a class on Quantum computing, and studied many specific QC algorithms, so I know a little bit about them.

Quantum computers are not super-computers. On a bit-for-bit (or qubit-for-qubit) scale, they're not necessarily faster than regular computers; they just process info differently. Since information is stored in a quantum "superposition" of states, as opposed to a deterministic state as in regular computers, qubits exhibit quantum interference when mixed with other qubits. Typically, your qubit starts in 50% '0' and 50% '1', and thus when you measure it, you get a 50% chance of it being one or the other (and then it assumes that state). But if you don't measure, and instead push it through quantum circuits allowing it to interact with other qubits, you get the quantum phases to interfere and cancel out. If you are damned smart (as I realized you have to be, to design QC algorithms), you can figure out creative ways to encode your problem into qubits, and use the interference to cancel out the information you don't want and leave the information you do want.

For instance, some calculations will start with the 50/50 qubit above, and end with 99% '0' and 1% '1' at the end of the calculation, or vice versa, depending on the answer. Then you've got a 99% chance of getting the right answer. If you run the calculation twice, you have a 99.99% chance of measuring the correct answer. However, the details of the circuits which perform quantum algorithms are extremely non-intuitive to most people, even those who study them. I found it to require an amazing degree of creativity to figure out how to leverage quantum interference constructively.
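The "run it twice" arithmetic works because, for a problem like factoring, a candidate answer can be verified cheaply, so the whole procedure only fails if every run comes back wrong. A quick sketch, assuming a 99% per-run success rate as in the example above:

```python
# Probability of overall success after repeating a 99%-reliable quantum
# computation, assuming each run is independent and answers are verifiable.
p_wrong_per_run = 0.01

for runs in (1, 2, 3):
    p_fail = p_wrong_per_run ** runs
    print(f"{runs} run(s): overall success probability = {1 - p_fail:.6f}")
```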

But what does this get us? Well, it turns out that quantum computers can run anything a classical computer can, and such algorithms can be written identically if you really wanted to, but doing so gets the same results as the classical computer (i.e. same order of growth). But the smart people who have been publishing papers about this for the past 20 years have been finding new ways to combine qubits, taking advantage of the nature of certain problems (usually deep, pure-math concepts), to achieve better orders of growth than are possible on a classical computer. For instance, factoring large numbers is difficult on classical computers, which is why RSA/PGP/GPG/PKI/SSL is secure. Its order of growth is e^( n^(1/3) ). It's not quite exponential, but it's still prohibitive. It turns out that Shor figured out how to get it to n^2 on a quantum computer (which is the same order of growth as decrypting with the private key on a classical computer!). Strangely, trying to guess someone's encryption key, normally O(n) on classical computers (where n is the number of possible encryption keys), is only O(sqrt(n)) on QCs using Grover's algorithm. Weird (but sqrt(n) is still usually too big).

There's a vast number of other problems for which efficient quantum algorithms have been found. Unfortunately, a lot of these problems aren't particularly useful in real life (except to the curious pure mathematician). A lot of them are better, but not phenomenally so. For example, verifying that two sparse matrices were multiplied correctly has order of growth n^(7/3) on a classical computer, n^(5/3) on a quantum computer. You can find a pretty extensive list by googling "quantum algorithm zoo." But the reality is that "most" problems we face in computer science do not benefit from quantum computers. In those cases, they are no better than a classical computer. But for problems like integer factorization, bringing the compute requirements down to polynomial time isn't just faster: it makes a problem solvable that wasn't before.

Unfortunately [for humanity], there is no evidence yet that quantum computers will solve NP-complete problems efficiently. Most likely, they won't. So don't get your hopes up about solving the traveling salesman problem any time soon. But there is still a lot of cool stuff we can do with them. In fact, the theory is so far ahead of the technology that we're anxiously waiting for breakthroughs like this, so we can start plugging problems through known algorithms.

Re:Pre-emptive strike against wtf is a QC (1)

msheekhah (903443) | more than 2 years ago | (#39185635)

You can't design quantum algorithms to solve classical computer problems faster?

Re:Pre-emptive strike against wtf is a QC (0)

Anonymous Coward | more than 2 years ago | (#39185901)

You can design quantum algorithms to solve problems faster. Simply implementing a classical algorithm on a quantum computer will not provide ANY speed improvement (in fact, it will almost certainly run much slower, since classical computers are so fast now). "Classical computer problems" is not a terribly meaningful phrase.

Re:Pre-emptive strike against wtf is a QC (1)

Tyler Durden (136036) | more than 2 years ago | (#39185799)

Your explanation was awesome. Thank you.

Re:Pre-emptive strike against wtf is a QC (1)

na1led (1030470) | more than 2 years ago | (#39185807)

What about Quantum Entanglement? Being able to communicate instantaneously across any distance, which would be beneficial to probes exploring deep space.

Re:Pre-emptive strike against wtf is a QC (4, Informative)

JoshuaZ (1134087) | more than 2 years ago | (#39186001)

Entanglement doesn't allow you to communicate information. The following analogy may help. Imagine two coins that, whenever they are both flipped, end up either both heads or both tails, but you can't control which way they come up. So if you separate the two coins, you can use them to get a shared source of randomness, which you can use for some useful things (like cryptography), but you can't use it to communicate.
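A toy simulation of that coin analogy (illustrative only; the statistics of real entangled measurements are richer than a shared classical coin): both parties always see the same random bit, which is handy as a shared secret, but neither side chooses the outcome, so no message is carried.

```python
import random

def entangled_flip():
    """Both 'coins' land the same way, but the outcome is random and unchosen."""
    outcome = random.choice(["heads", "tails"])
    return outcome, outcome   # Alice's result, Bob's result

for _ in range(5):
    alice, bob = entangled_flip()
    print(f"Alice: {alice:5s}  Bob: {bob:5s}  (always match, never chosen)")
```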

Re:Pre-emptive strike against wtf is a QC (1)

dAzED1 (33635) | more than 2 years ago | (#39187325)

Sort of. If both sides can instantly know a rapidly changing value, that value can be used to both encrypt and compress communication. The fact that it isn't controllable doesn't mean that it can't be used for such things.

good explanation (1)

globaljustin (574257) | more than 2 years ago | (#39187913)

"On a bit-for-bit (or qubit-for-qubit) scale, they're not necessarily faster than regular computers, they just process info differently."

Thank you. I have been trying and failing (in tweets @DrEpperly) to explain the concept you describe very succinctly. I have a telecommunications background so we just think of it as having two channels...sort of like the old 'dual-mode' phones...

When you get published saying this please send me a link ;)

Re:Pre-emptive strike against wtf is a QC (1)

FrangoAssado (561740) | more than 2 years ago | (#39188937)

That was a great explanation.

Just one small nitpick: when you talk about factorization, you use "n" for the number of bits, and when you talk about guessing an encryption key, you use "n" for the number of possible keys, which makes things a little unnecessarily confusing. I'd change the second one to also use number of bits -- so it would be O(2^n) on classical computers and O(2^(n/2)) for quantum computers. This way it's also easier to see that the square root (i.e., the factor of 1/2 in the exponent) doesn't buy a lot.
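Rough numbers behind that nitpick, as a quick sketch: halving the exponent is a real improvement on paper, but 2^(n/2) is still astronomical at common key sizes, which is why the usual hedge against Grover-style search is simply to double the key length.

```python
# Brute-force key search: ~2**n trials classically vs. ~2**(n/2) with Grover.
for n in (64, 128, 256):
    print(f"{n}-bit key: classical ~ {2**n:.1e} trials, "
          f"Grover ~ {2**(n // 2):.1e} trials")
```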

IBM layoffs (1)

Anonymous Coward | more than 2 years ago | (#39185567)

This news story appears the day after IBM laid off a number of engineers in STG (Systems and Technology Group, the part of the company that works on operating systems and hardware like Power, blades, Z, etc.).

Not that IBM would be attempting to deflect any negative news stories, which might otherwise range from the very tight-lipped control over the number of employees let go, to forbidding those employees from talking to the press on pain of losing their severance pay, to the current number of employees in the US, to the brain drain of departing engineers impacting product creation.

Stay classy IBM.

HD Decryption Orders (0)

Anonymous Coward | more than 2 years ago | (#39185631)

As soon as the 'decryption orders' stop and Truecrypt encrypted volumes are being entered into evidence decrypted, IBM will be on to something.

"...properties in quantum bits or quibits" (1)

SomePoorSchmuck (183775) | more than 2 years ago | (#39185673)

I always thought the contraction was "qubit" for "Quantum Bit".
Is "quibit" an accepted variant spelling, and, if so, where does the extra letter "i" come from?

how misleading! (0)

Anonymous Coward | more than 2 years ago | (#39186925)

I hate how there are all these articles about how they finally have these breakthroughs! There were over a dozen alerts for IBM this morning and this record-breaking discovery. DWAVE systems of BURNABY BRITISH COLUMBIA CANADA! has had a functional quantum computer chipset for over a year! They are already building chips at 500+ qubits, with their first machine being sold to LOCKHEED MARTIN for $10 MILLION. They are already establishing a quantum computing cloud to allow developers and organizations access to perform specific calculations. All these computer companies either don't read the news or are just flagrantly disregarding any level of honest disclosure. It's disgusting! It's a huge breakthrough, AGREED. Made by another company!!!! Get over it! TRY AND CATCH UP ALL YOU LIKE.. GOOD FOR YOU! BUT DON'T TRY AND FOOL PEOPLE INTO THINKING YOU ARE THE PIONEER AND INDUSTRY LEADER WHEN YOU ARE NOT!

today's arXiv on quantum computation (0)

Anonymous Coward | more than 2 years ago | (#39188019)

Everybody who is interested in advances in quantum computation should have a look at this publication [ucsb.edu].
It was uploaded to arXiv today and shows another implementation of Shor's algorithm in a four-qubit system.
Although their fidelity is not as high as the one claimed by IBM, I do think that their technology is a little bit more advanced.

Quantum Computing the new fusion (1)

jweller13 (1148823) | more than 2 years ago | (#39188607)

Promises of quantum computers seem to be suffering from the fusion syndrome. Fusion has been "only 20 years away" for the last 40 years. :P