
Separating Hope From Hype In Quantum Computing

timothy posted more than 3 years ago | from the semi-un-non-deterministic dept.


pgptag writes "This talk by Dr. Suzanne Gilbert (video) explains why quantum computers are useful, and also dispels some of the myths about what they can and cannot do. It addresses some of the practical ways in which we can build quantum computers and gives realistic timescales for how far away commercially useful systems might be."

109 comments

Post with unknown state (5, Funny)

NoSleepDemon (1521253) | more than 3 years ago | (#33497822)

Upon observation, this post has collapsed into the first post state.

Re:Post with unknown state (-1, Troll)

Anonymous Coward | more than 3 years ago | (#33498162)

Upon observation, this post has collapsed into the first post state.


Re:Post with unknown state (1, Informative)

Anonymous Coward | more than 3 years ago | (#33498632)

The direct link appears to be:

http://blip.tv/file/get/Telexlr8-vbSuzanneGildertOnQuantumComputingInTeleplaceSeptember4640.flv

question: (1)

Pojut (1027544) | more than 3 years ago | (#33497830)

Would this sort of thing ever become useful for personal use? Or is Quantum Computing strictly a commercial endeavor?

Re:question: (5, Funny)

tom17 (659054) | more than 3 years ago | (#33497954)

Definitely commercial-only. The world only needs five quantum computers.

Re:question: (3, Insightful)

mcgrew (92797) | more than 3 years ago | (#33498324)

It appears that the moderators don't know any history. You're obviously making a joke based on the observation in the early 1950s that "the worldwide market for computers is about ten." It's funny now, but then computers weren't very useful for anybody without huge number crunching and database needs and multi-million dollar budgets. At the time, a computer took an entire building to house, and a whole lot of personnel to operate. The most powerful computer in existence was less powerful than a singing Hallmark card.

So the joke's on the mods, who actually believe it. Of course, right now the worldwide market is zero, since they haven't actually constructed one yet. If and when they accomplish the feat, it's possible that in the future all computers will be quantum computers. I doubt I'll live long enough to see it (I'm not young any more). [kuro5hin.org]

That link will give the mods a little computer history if they're interested.

Re:question: (1)

Bigjeff5 (1143585) | more than 3 years ago | (#33498814)

You've got to give these things time, my man. This is Slashdot. Just an hour after your post (and about an hour and 40 minutes after the original) it is up to +5, Funny, where it belongs.

Re:question: (1)

mcgrew (92797) | more than 3 years ago | (#33501178)

It was a +4 insightful when I made the post; it's now +5 funny, 50% insightful and 50% funny according to the "score" link.

Re:question: (4, Informative)

houghi (78078) | more than 3 years ago | (#33498992)

The quote, debunked: http://en.wikipedia.org/wiki/Thomas_J._Watson#Famous_misquote [wikipedia.org]

Some facts:
1) The misquote is "I think there is a world market for maybe five computers,"
2) The story had already been described as a myth in 1973
3) Correct quote: "IBM had developed a paper plan for such a machine and took this paper plan across the country to some 20 concerns that we thought could use such a machine. [...] But, as a result of our trip, on which we expected to get orders for five machines, we came home with orders for 18."

Re:question: (2, Funny)

Ihmhi (1206036) | more than 3 years ago | (#33500320)

Geez, thanks for ruining a good meme with facts. Next thing you know we'll find out all those cats have been misquoted time and time again.

Re:question: (0)

Anonymous Coward | more than 3 years ago | (#33501818)

O Hai theys talks abut mez!

Re:question: (1)

mcgrew (92797) | more than 3 years ago | (#33501092)

Thank you for the correction. The wiki article is interesting; this line "All these early quotes are questioned by Eric Weiss, an Editor of the Annals of the History of Computing in ACS letters in 1985" caught my eye - Harry Houdini's [wikipedia.org] real name was Eric Weiss, according to biographies I've read, although wikipedia says "Harry Houdini was born as Erik Ivan Weisz (he would later spell his birth name as Ehrich Weiss)". It's magic!

As you say, Watson didn't say it, but from the wiki article:

I went to see Professor Douglas Hartree, who had built the first differential analyzers in England and had more experience in using these very specialized computers than anyone else. He told me that, in his opinion, all the calculations that would ever be needed in this country could be done on the three digital computers which were then being built -- one in Cambridge, one in Teddington, and one in Manchester. No one else, he said, would ever need machines of their own, or would be able to afford to buy them.

(quotation from an article by Lord Bowden; American Scientist vol 58 (1970) pp 43-53); cited on Usenet.[13] The misquote is itself often misquoted, with fifty computers instead of five.

Re:question: (1)

tibit (1762298) | more than 3 years ago | (#33499154)

I have an offtopic question. ENIAC is your contemporary, and it was used to do a bunch of numerical calculations. How did it compare to what Feynman and the technologically-apt teenagers under his direction did for the Manhattan Project? Does anyone know a rough order of magnitude of multiplies-and-adds that both projects had to go through, and the time it took? IIRC, Feynman's boys figured out pretty much every basic contemporary CPU/GPU design trick (pipelining, interleaving/scheduling, speculative execution, you name it they'd done it), even stuff that is not in production (error correction with pipeline stalling and reexecution that would work for, say, SEUs). The boys did it all on multi-colored punch cards.

Re:question: (1)

philipborlin (629841) | more than 3 years ago | (#33499294)

Of course, right now the worldwide market is zero, since they haven't actually constructed one yet.

Except that in the video she clearly talks about several quantum computers that have been built and have actually solved problems.

Re:question: (1)

arth1 (260657) | more than 3 years ago | (#33499402)

That link starts out with "ENIAC, the first electronic programmable computer", and goes downhill from there. Sure, let's forget Colossus Mark I and II.

The rest of the article is a jump between family anecdotes and quite limited personal experiences that I do not think "will give the mods a little computer history". Where are TI, Fairchild and Motorola? Or HP? Were the DEC PDPs not worth mentioning? Didn't anything happen between 1974 and 1982 that deserved more than a single combined sentence? To me, it seems you missed the start of the home computer revolution, as well as most of what happened in business computers while you were in the army?

Re:question: (1)

mcgrew (92797) | more than 3 years ago | (#33500646)

It's a little computer history, not an exhaustive history. Note the title is "Growing up with computers"; it's a personal chronicle to give a little insight to younger folks.

Re:question: (0)

Anonymous Coward | more than 3 years ago | (#33500402)

(I'm not young any more). [kuro5hin.org]

That link will give the mods a little computer history if they're interested.

Unfortunately the "history" is, as so often, written entirely from the US point of view.

ENIAC was *not* the world's first electronic digital computer, that was Colossus, built by the British Post Office Research Unit and designed by Tom Flowers. It was completed in December 1943 and went into service (i.e., unlike ENIAC, it was actually ready before the war ended) in February 1944.

The world's first electronic, digital *stored program* computer was the Manchester (UK) BABY, which ran its first program in June 1948.

The world's first production dspc was the EDSAC, built at Cambridge University by Maurice ("Mr. Microcode") Wilkes; it ran its first program in March 1949.

The US did no better than get the world's third dspc.

The world's 4th dspc, BTW, was built in Australia.

It's called "Not Invented Here" syndrome, people.

Re:question: (1)

mcgrew (92797) | more than 3 years ago | (#33500908)

Colossus was secret until long after the fact. And I haven't reread that piece in years (I'd rather read other people's writing, I can't learn from my own), but didn't I mention that the first computer I bought was the TS-1000, designed by Clive Sinclair, a Brit?

Of course a first-person chronicle written by an American will have an American bias, just as a first-person chronicle written by an Australian will have an Australian bias.

Re:question: (2, Interesting)

Marxist Hacker 42 (638312) | more than 3 years ago | (#33505406)

Actually, based on TFA, I'd say we're more likely to see a multi-core processor with some quantum and some classic cores. Kind of like the old floating point co-processors, or going back still further, the TI-99/4A architecture which was made up of a CPU with dedicated video, audio, and peripheral co-processors.

and each of those.. (0)

Anonymous Coward | more than 3 years ago | (#33504106)

..quantum computer will never need any more than 640 qubits!
The presentation was ok, but something anyone could have put together using internet resources alone, so meh.
Also, crappy multi-video feed, and general crappy implementation of quantum computing, crappy world really

Perhaps (2, Funny)

sycodon (149926) | more than 3 years ago | (#33497990)

Maybe in the future, a Quantum Computer running Windows x.x will be able to harness its power to show the contents of a folder in less than the 30 seconds it takes now.

Re:Perhaps (3, Funny)

Whalou (721698) | more than 3 years ago | (#33498040)

On the contrary, observing the content of a folder would change its state.


I'm not a quantum physics expert and I don't play one on television

And if I did, the show would have been canceled.

Re:Perhaps (1)

CarpetShark (865376) | more than 3 years ago | (#33505908)

On the contrary, observing the content of a folder would change its state.

I know windows 2.0 was a while back, but the good news is that they're finally going to get this bug fixed for Windows 8.

Re:Perhaps (5, Funny)

daveime (1253762) | more than 3 years ago | (#33498098)

If you'd categorized your porn collection properly, it wouldn't need to all be in one folder :-(

Re:Perhaps (1)

ByOhTek (1181381) | more than 3 years ago | (#33498104)

It takes less than 30 seconds now, for those of us running on hardware more advanced than an 80286.

Re:Perhaps (2, Funny)

sycodon (149926) | more than 3 years ago | (#33498228)

Here we are having fun and you have to go throw your superior hardware in our face.

I bet you're real fun at parties.

Re:Perhaps (1)

toastar (573882) | more than 3 years ago | (#33498366)

It takes less than 30 seconds now, for those of us running on hardware more advance than an 80286

It takes more than 30 seconds for a 486 to index a TB of porn... And that's even with turbo.

Re:Perhaps (1)

Killall -9 Bash (622952) | more than 3 years ago | (#33498554)

It feels like 30 seconds when you have a badass computer at home with 4 cores, 8 GB of RAM, and an SSD... and then you go to work. 5 seconds of explorer refreshing seems like 5 minutes.

Re:Perhaps (1)

owlstead (636356) | more than 3 years ago | (#33500252)

Yeah, but I solved that by removing all the porn from my computer at work.

Re:Perhaps (0)

Anonymous Coward | more than 3 years ago | (#33498252)

Obviously someone is incapable of using a computer correctly.

Re:Perhaps (0)

Anonymous Coward | more than 3 years ago | (#33499066)

It's just practice for foreplay, dude.

Re:Perhaps (0)

Anonymous Coward | more than 3 years ago | (#33500242)

Thirty seconds for foreplay? Get real.

Waay too long.

You mean like.. (0)

pablo_max (626328) | more than 3 years ago | (#33497842)

"I hope this thing is really fast running my Beowulf cluster" or "oh man, these things will run a Beowulf cluster 10000x faster than today's machines! "

Re:You mean like.. (1, Insightful)

Anonymous Coward | more than 3 years ago | (#33498348)

I think that either English is not your first language, or you don't know what a Beowulf cluster is.

Re:You mean like.. (1)

Bigjeff5 (1143585) | more than 3 years ago | (#33498838)

You obviously don't know what a Beowulf cluster is.

The joke is "imagine a Beowulf cluster of those!" for a reason.

The quantum computers wouldn't run your Beowulf cluster, they would be your Beowulf cluster.

And the first ones will probably be slow as shit anyway (but catch up much faster than current tech).

Re:You mean like.. (1)

maxwell demon (590494) | more than 3 years ago | (#33501056)

The quantum computers wouldn't run your Beowulf cluster, they would be your Beowulf cluster.

Unless you're running a Beowulf cluster emulator on them, of course.

Oops...thought this was about Obama (3, Funny)

bricko (1052210) | more than 3 years ago | (#33497844)

Oops. Thought this thread title was about Obama... sorry.

Re:Oops...thought this was about Obama (0)

GrumblyStuff (870046) | more than 3 years ago | (#33501766)

Nah, it's about Fox News. By balancing out mostly truth with mostly fiction, their audience doesn't know the exact state of the union. Because they don't know, they don't languish into depression and become unproductive. However, given their politics, if they did become unproductive there, we would be better off and more productive overall. It doesn't bother me since on a long enough timeline, everything collapses into one state.

Video? (1, Funny)

Anonymous Coward | more than 3 years ago | (#33497846)

What video? There's no video on that page, only a huge blank gap sponsored by Adobe.

Re:Video? (3, Funny)

bhartman34 (886109) | more than 3 years ago | (#33497858)

Is that on the iPad or iPod Touch? :)

Re:Video? (-1, Offtopic)

Anonymous Coward | more than 3 years ago | (#33498204)

The video that everyone but people using inferior devices or OSes can see.

Re:Video? (-1, Offtopic)

Anonymous Coward | more than 3 years ago | (#33498676)

I'm not going to run a fucking plug-in that's going to waste 80%+ of my CPU just to play a godamn video. Adobe's code is totally inefficient and I won't help them boost their stupid "installed userbase" if I can help it.

Re:Video? (0, Troll)

SETIGuy (33768) | more than 3 years ago | (#33498738)

Too embarrassed to say you're reading this from an iPad?

Re:Video? (-1, Redundant)

Anonymous Coward | more than 3 years ago | (#33498780)

You know... because it was busy doing something else productive? Not using an app because it's CPU-intensive for a few hours is ridiculous. You are a baby. Your crying is ignored and the rest of us are enjoying the video.

Re:Video? (1)

bhartman34 (886109) | more than 3 years ago | (#33500484)

Let's assume that you're correct that Flash would use up 80% of your CPU. What else are you planning on doing while you're watching a 2-hour+ video on quantum computing and quantum mechanics? I mean, okay, you could compile code with those clock cycles, I suppose, but other than some automated task (which will still putter along while you watch the video, by the way), what would you need the CPU for in the meantime? I highly doubt you'd get anything out of the video if you tried to play Call of Duty 4 in another window while you watched it.

Clock cycles aren't an irreplaceable resource, either. Once the video ends, you get them back. Applications borrow them, they don't steal them.

Can someone post a _real_ summary (0)

Anonymous Coward | more than 3 years ago | (#33497850)

For those who don't currently have 1.5 hours to WTFV?

Y'know, like _what_ is she actually saying, rather than just the fact that she happens to be saying something.

Re:Can someone post a _real_ summary (1)

CarpetShark (865376) | more than 3 years ago | (#33506074)

It's well worth watching, considering that this is the future and the summary is comprehensive and all. However, the basic upshot is:

* some myths have built up around quantum computing, such as that classical crypto will be made obsolete. [ I'm not sure what her point was here; she seems to dismiss most of these, but then never really goes back to it. Her other details seem to support these "myths" more than debunk them ]
* quantum computers fit reality better than classical computers, therefore we need them to understand and model our physical universe, almost regardless of how much "better" they are than classical computers in other ways [ Though this seems largely a performance thing to me ]
* building quantum computers is progressing; the company she works for (DWave) is producing 128-qubit chips now [ though they're specialised ]
* building stuff for quantum computers (such as refrigeration units to supercool them) is progressing in parallel [ supercoolers are available as desktop devices, will probably fit in a PC case like a PSU soon ]
* the chips they're building at DWave are specialised for pattern recognition and other energy reduction (simulated/quantum annealing) problems
* she expects quantum computing to be based around specialised co-processors (like GPUs, physics cards etc.) rather than general purpose CPUs
* there are lots of different ways to build quantum computers, each with their own pros and cons. We're nowhere near a standardised architecture (like Von Neumann) yet, for quantum computing.
* Google has worked with DWave on an image recognition project using their chip, and it's now performing better on their quantum chip(s?) than on Google's previous hardware
* they used BOINC to generate known-good test results to compare with their quantum chip
* BOINC took ~1000 hours on ~1000 computers to do what their quantum chip does in something between a few seconds and a few minutes
* QC is still facing some major hurdles, not only of engineering and science, but of funding too.
* funding is the main obstacle
* one of the big funding problems is building a quantum computer in small labs/foundries that rivals the established, trillion-dollar industry of classical computing. Since foundries work better in large scale, it's hard to compete and prove the worth of QC
* she doesn't believe that academia has the necessary funding/project-time structure to allow non-commercial research/development at a higher level than the fundamental concepts
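The "energy reduction (simulated/quantum annealing)" problems mentioned above can be illustrated with a purely classical toy. The sketch below runs simulated annealing on a tiny ferromagnetic Ising chain; the instance, couplings, and cooling schedule are all illustrative assumptions, not D-Wave's actual hardware or workload:

```python
import math
import random

def anneal_ising(J, steps=20000, seed=1):
    """Simulated annealing on an Ising chain with couplings J: a classical
    toy for the energy-reduction problems annealing machines target."""
    rng = random.Random(seed)
    n = len(J) + 1
    energy = lambda s: -sum(J[i] * s[i] * s[i + 1] for i in range(n - 1))
    s = [rng.choice((-1, 1)) for _ in range(n)]
    best, best_e = list(s), energy(s)
    for t in range(steps):
        temp = max(0.01, 2.0 * (1 - t / steps))  # linear cooling schedule
        i = rng.randrange(n)
        before = energy(s)
        s[i] = -s[i]                             # propose a single spin flip
        delta = energy(s) - before
        if delta > 0 and rng.random() >= math.exp(-delta / temp):
            s[i] = -s[i]                         # reject uphill move
        elif energy(s) < best_e:
            best, best_e = list(s), energy(s)
    return best, best_e

# a 5-spin ferromagnetic chain: the ground states are "all spins aligned"
state, e = anneal_ising([1, 1, 1, 1])
```

For this instance the minimum energy is -4 (all four bonds satisfied); a quantum annealer attacks the same kind of energy landscape by tunneling between configurations rather than hopping thermally.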

Who is going to watch this? (4, Insightful)

Zontar_Thing_From_Ve (949321) | more than 3 years ago | (#33497856)

We can't even get people to read the articles referenced in submissions. It's wildly optimistic to expect us to watch a video that is over 2 hours long.

This is begging for an "executive summary" from anyone who has time to watch it, if there is such a person.

Re:Who is going to watch this? (4, Informative)

JoshuaZ (1134087) | more than 3 years ago | (#33497940)

I don't have time to watch this right now, but if I have to make a guess, the primary points are going to be about the common misconceptions about quantum computers. The most common such belief seems to be that a quantum computer can solve NP-complete problems in polynomial time. This is false, although some problems which are believed to be in NP but not in P are solvable with a quantum computer. The most prominent example is integer factoring, since the difficulty of factoring large integers is something many cryptosystems (such as RSA) depend on. There's probably also some discussion of why consciousness probably has nothing to do with any quantum effects in the human brain, because structures there are generally too warm and too large to sustain meaningful quantum entanglement.

Re:Who is going to watch this? (2, Funny)

BergZ (1680594) | more than 3 years ago | (#33498572)

Warm, wet, and squishy doesn't seem to be a limiting factor on quantum mechanical behavior anymore: Untangling the Quantum Entanglement Behind Photosynthesis [sciencedaily.com] .

Re:Who is going to watch this? (0)

Anonymous Coward | more than 3 years ago | (#33498774)

A better title would be "How to get a bigger grant by saying your biochemistry experiment is an example of quantum entanglement." So is any atom or molecule with two or more electrons.

Allow me to expand on the complexity theory parts (1)

jonaskoelker (922170) | more than 3 years ago | (#33505976)

The most common such belief seems to be the belief that a quantum computer can solve NP-complete problems in polynomial time.

Allow me to expand a bit on that.

There's a complexity class known as BQP which is defined to be what quantum computers can do in polynomial time (hence the Q and P; the B is for Bounded error probability, i.e. algorithms succeed with probability at least 2/3; if you want better: repeat and take majority voting).
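The "repeat and take majority voting" trick is easy to quantify. A small sketch of the standard amplification argument (not from the talk, just textbook probability):

```python
from math import comb

def majority_success(p_single, runs):
    """Probability that a strict majority of `runs` independent trials
    succeed, each succeeding with probability p_single (runs odd)."""
    need = runs // 2 + 1
    return sum(comb(runs, k) * p_single**k * (1 - p_single)**(runs - k)
               for k in range(need, runs + 1))

# a 1/3 error rate per run shrinks exponentially under majority voting
single = majority_success(2/3, 1)
amplified = majority_success(2/3, 101)
```

With 101 repetitions the failure probability drops below 0.1%, which is why the constant 2/3 in BQP's definition is arbitrary: any constant success probability strictly above 1/2 amplifies the same way.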

It is known that BQP contains P and BPP (randomized poly-time turing machines), and is contained within PSPACE (which contains NP).

It is conjectured that P != NP and that BQP contains some but not all of (NP minus P), and is also not contained in NP.

Now, let's talk about factoring. It is known to be in NP (if I show you a candidate factoring, you can multiply it together rapidly and check whether I told the truth; you can even check primality in polynomial time). It's also in coNP---well, depending on how you turn factoring into a yes/no question. I lean towards "given n and k (in binary), does n have a non-trivial factor less than k". Then you can extract such a factor by binary search. In that case, it's easy to check proofs that a number _doesn't_ have a non-trivial factor less than k: give the full factorization as the proof, verify the factorization as the check, and also check that no prime factor is less than k.
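The binary-search extraction described above fits in a few lines. In this sketch a naive trial-division routine stands in for the yes/no solver; any decision procedure answering "does n have a nontrivial factor below k?" would slot into the same interface:

```python
def has_factor_below(n, k):
    """Decision oracle: does n have a nontrivial factor d with 1 < d < k?
    (Naive stand-in; the point is the interface, not the implementation.)"""
    return any(n % d == 0 for d in range(2, min(k, n)))

def extract_factor(n):
    """Recover n's smallest nontrivial factor using only yes/no answers,
    via binary search for the smallest threshold k that gets a 'yes'."""
    if not has_factor_below(n, n + 1):
        return None                    # no nontrivial factor: n is prime
    lo, hi = 3, n + 1                  # smallest "yes" threshold lies in [lo, hi]
    while lo < hi:
        mid = (lo + hi) // 2
        if has_factor_below(n, mid):
            hi = mid
        else:
            lo = mid + 1
    return lo - 1                      # k = lo is the first "yes", so the factor is k - 1
```

Each oracle call halves the search range, so a factor of an n-bit number costs about n decision queries.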

We don't know whether factoring is in P. Factoring is known to be in BQP, though!

This is due to Shor's algorithm. A very brief 10 kilofoot overview: it uses Fourier transformation to detect periodicity in functions. In cyclic groups, raising a generator element to exponents 1, 2, 3, ... has a periodicity to it. In particular, the RSA group has a periodicity to it; and you can factor n if you know phi(n), the order of the RSA group. (I'm told you can do "basically the same" to solve discrete logarithms).
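The classical half of Shor's algorithm, turning a detected period into a factorization, is short; only the period-finding step needs the quantum Fourier machinery. In this sketch a brute-force `order` function stands in for that quantum step:

```python
from math import gcd

def order(a, n):
    """Multiplicative order (period) of a mod n, found by brute force here;
    this is the step Shor's quantum Fourier transform accelerates."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def factor_from_order(n, a, r):
    """Classical post-processing: try to split n given the period r of a^x mod n."""
    if r % 2:
        return None                    # odd period: pick a different a and retry
    x = pow(a, r // 2, n)
    if x == n - 1:
        return None                    # trivial square root: retry
    return gcd(x - 1, n), gcd(x + 1, n)

r = order(7, 15)                       # r = 4
factors = factor_from_order(15, 7, r)  # (3, 5)
```

A random choice of a gives a usable (even, non-trivial) period with constant probability, so a few retries suffice in expectation.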

I hope this helps someone.

Summarizing... (0)

Joce640k (829181) | more than 3 years ago | (#33497992)

Quantum computers are useful for the following class of problem:

      1. The only way to solve it is to guess answers repeatedly and check them,
      2. There are n possible answers to check,
      3. Every possible answer takes the same amount of time to check, and
      4. There are no clues about which answers might be better: generating possibilities randomly is just as good as checking them in some special order.

If your problem doesn't look like that then quantum computers won't help.

(source: wikipedia [wikipedia.org] )
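The problem class listed above is exactly what Grover's algorithm targets, and the payoff is quadratic: roughly (pi/4)*sqrt(N) oracle calls instead of about N/2 on average classically. A toy statevector simulation in plain Python (the instance searched is a made-up example):

```python
import math

def grover_search(n_bits, marked):
    """Simulate Grover's algorithm searching N = 2**n_bits items for `marked`."""
    N = 2 ** n_bits
    amp = [1 / math.sqrt(N)] * N              # uniform superposition
    iterations = int(math.pi / 4 * math.sqrt(N))
    for _ in range(iterations):
        amp[marked] = -amp[marked]            # oracle: flip the marked amplitude
        mean = sum(amp) / N
        amp = [2 * mean - a for a in amp]     # diffusion: inversion about the mean
    best = max(range(N), key=lambda i: amp[i] ** 2)
    return best, iterations

found, queries = grover_search(10, 123)       # 25 oracle calls for N = 1024
```

After the prescribed number of iterations almost all the probability has been rotated onto the marked item, so measuring returns it with near certainty.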

Re:Summarizing... (2, Informative)

Tacvek (948259) | more than 3 years ago | (#33498472)

That is obviously not the only thing it can do. In P time it can solve P problems (much like a classical computer, but potentially in $\sqrt{classical}$ time, if the problem meets the above requirements). You can use quantum computing to find (with any probability of your choice which is less than one) the solution to a BPP problem in P time, which is again just like classical computers. Something new here is the ability to solve BQP problems (with any chosen probability less than one) in P time.

That last one is the killer. That is because two of the "hard" problems we use in asymmetric cryptography, namely integer factorization and discrete logarithms, are in BQP.[1]

What we really want is asymmetric encryption based on an NP-complete problem where many instances can be shown to take no less time (asymptotically) than the hardest instances to solve (i.e. many instances are tied for the hardest), and an easy way to generate instances of this hardest problem, and corresponding solution. That is really tricky, as many FNP problems that are not optimization problems (not NPO) have many instances that can be solved in only P time.

Footnote:
[1] Actually that is not strictly true. The problems have more than a yes or no answer, making them FBQP problems. But FBQP-complete problems take no longer to solve than BQP-complete problems. So quantum computers can solve FBQP with any given probability of success in only P time.

Re:Summarizing... (0)

Anonymous Coward | more than 3 years ago | (#33498496)

If your problem doesn't look like that then quantum computers won't help.

Sweet! Quantum computers could find me a girlfriend!

Re:Who is going to watch this? (4, Insightful)

gandhi_2 (1108023) | more than 3 years ago | (#33498008)

I just want to know what exactly is added to this presentation by using an avatar on a virtual stage.

People want to bash powerpoint but someone takes up half the video area with superfluous (and bad) VR and no one minds?

Re:Who is going to watch this? (2, Informative)

pgptag (782470) | more than 3 years ago | (#33498406)

@gandhi re added value of avatar on virtual stage: this is an online talk with a participative audience in realtime telepresence. The second half of the video shows a very lively Q/A session and discussion, with a lot of people asking a lot of questions.

Re:Who is going to watch this? (0, Offtopic)

ceoyoyo (59147) | more than 3 years ago | (#33499608)

You're not on Twitter.

Re:Who is going to watch this? (1)

ceoyoyo (59147) | more than 3 years ago | (#33499590)

It makes it a video. And videos are cool.

Re:Who is going to watch this? (3, Insightful)

mcgrew (92797) | more than 3 years ago | (#33498028)

Indeed; is there a printed transcript anywhere? I can read a lot faster than I can listen, with a lot better comprehension.

Re:Who is going to watch this? (0)

Anonymous Coward | more than 3 years ago | (#33498260)

Actually, she said that Quantum computing is:
    bogus,
    all powerful,
    great for games,
    only for mainframes,
    easily done on any PC,
    expensive,
    cheap,

but that's until you open the video.

Re:Who is going to watch this? (0)

Anonymous Coward | more than 3 years ago | (#33499132)

Let's see.

A (arguably) cute geeky girl in her late 20's / early 30's lecturing for two hours about quantum computers and making attempts at jokes about sci-fi and technology.

Yeah... Who's gonna want to watch that?

Re:Who is going to watch this? (1)

noidentity (188756) | more than 3 years ago | (#33499174)

We can't even get people to read the articles referenced in submissions. That's wildly optimistic to expect us to watch a video that is over 2 hours long.

As long as nobody watches it, we can't really say for certain what's in it.

Re:Who is going to watch this? (1)

philipborlin (629841) | more than 3 years ago | (#33499624)

There are many myths about quantum computers. The most prevalent myths are that they will break all cryptographic protocols, be exponentially faster, do all calculations simultaneously, and solve NP-Complete problems in polynomial time. These are all untrue to various degrees.

A quantum computer is a computer that uses at least one quantum effect to solve problems. Currently quantum computers are leveraging either superposition or entanglement. A difficult hurdle to scaling quantum computers is decoherence which basically renders qubits inoperable. Decoherence happens when qubits are too close to each other. This is a major problem because currently we scale processors by cramming more and more transistors into a smaller and smaller space.

There is no one way to build quantum computers. Several models are Gate model, Adiabatic, Measurement Based, and Topological. Several implementations are Ion Trap, NMR schemes, Nitrogen Vacancy, and Superconducting electronics. Some of these, such as Nitrogen Vacancy, are theoretical at this point since nobody has figured out how to implement them yet.

Basically, quantum computers at this stage aren't envisioned to replace classical computers, but they will be really useful as specialized computers for solving certain types of problems. The problem with classical computers is that they work off of classical physics, and so they have a hard time modeling the way the universe really works (i.e., in a quantum manner). Quantum computers on the other hand behave in a quantum manner (duh) and so they are more ideally suited to solving simulation-type problems. Some problems they are ideally suited for are machine learning, pattern recognition, neural networks, and building synthetic brains.

IBM has a 7 qubit machine that can successfully factor the number 15. This is not very impressive computationally but it does serve as a working proof that quantum computers aren't just theory. She showed a picture of a chip that has 128 qubits on it and another picture of a quantum computer that takes up a full room. She predicts we will see a commercially viable quantum computer within the next few years.

Re:Who is going to watch this? (2, Insightful)

CarpetShark (865376) | more than 3 years ago | (#33500602)

Yeah. A cute, fresh-faced, geeky female doctor with glasses, summarising quantum computers in about an hour. Nah, no one here wants to watch that ;)

Re:Who is going to watch this? (1)

PCM2 (4486) | more than 3 years ago | (#33501242)

Not only that, but whatever crappy player they're using doesn't seem to want to let you seek. No matter where you move the marker, the whole presentation just starts over from the beginning -- complete with the audience jabbering right over the speaker.

Re:Who is going to watch this? (2, Informative)

pgptag (782470) | more than 3 years ago | (#33506104)

Not only that, but whatever crappy player they're using doesn't seem to want to let you seek. No matter where you move the marker, the whole presentation just starts over from the beginning -- complete with the audience jabbering right over the speaker.

Go to the source http://telexlr8.blip.tv/file/4083093/ [telexlr8.blip.tv] open the Files and Links box in the right column and download the original .mp4 video file.

It's either hope or hype (0)

Anonymous Coward | more than 3 years ago | (#33497876)

Or both at the same time.

Schrodinger's laptop (2, Funny)

Drakkenmensch (1255800) | more than 3 years ago | (#33497910)

The quantum computer is both a realistic ideal and vaporware hype, until a computer journalist examines the claims.

She's a cutie (1)

DurendalMac (736637) | more than 3 years ago | (#33497926)

I'd position her qubits any day of the week!

Re:She's a cutie (0)

Anonymous Coward | more than 3 years ago | (#33498026)

She's British. Can't you tell?
Why do you think they are all alcoholics on that island?

Re:She's a cutie (0)

Anonymous Coward | more than 3 years ago | (#33498426)

I'd position her qubits any day of the week!

I find I have a sudden strong interest in Quantum Computing.

Irony? (2, Informative)

gotfork (1395155) | more than 3 years ago | (#33498096)

Someone from D-wave is giving a talk called "separating hope from hype": http://arstechnica.com/hardware/news/2007/02/quantum.ars [arstechnica.com] http://www.technologyreview.com/computing/20587/ [technologyreview.com] http://en.wikipedia.org/wiki/D-Wave_Systems [wikipedia.org]

Re:Irony? (2, Funny)

Abstrackt (609015) | more than 3 years ago | (#33498192)

Looks more like spooky action at a distance to me.

The real question is (1)

Kjella (173770) | more than 3 years ago | (#33498112)

The real question is whether there's some significant use case not already covered by current methods, like RSA and AES for encryption. Sure, quantum encryption has some nice theoretical properties, but most things are not 110% secure. You can still bribe people, extort people, plant spies, record passwords and so on. I doubt that pure crypto is the weakest link in the chain for almost any system anymore. Maybe, just maybe, there's a quantum code-cracking computer deep in the halls of the NSA, but it won't be in the hands of any "regular" attacker, no matter how well funded and organized. Seriously, it's nice to be paranoid, but the idea of a quantum attack on your encryption seems as unlikely as an asteroid impact taking out your main office.

Re:The real question is (1, Funny)

Whatanut (203397) | more than 3 years ago | (#33498392)

At some point this becomes truth...

http://www.xkcd.org/538/ [xkcd.org]

Re:The real question is (1)

maxwell demon (590494) | more than 3 years ago | (#33501256)

The real question is whether there's some significant use case not already covered by current methods, like RSA and AES for encryption.

I'd expect that the main use for quantum computers will be to simulate quantum systems.

Second Life Presentations suck (2, Informative)

Danathar (267989) | more than 3 years ago | (#33498390)

I was going to listen, but the dude yakking in the background totally oblivious (well..not totally oblivious as he questioned himself as to why he can hear himself talking) to the fact that his mic is broadcasting right over the speaker. Dumb.

W/O RTFA (5, Informative)

mathimus1863 (1120437) | more than 3 years ago | (#33498398)

I took a class on quantum computing and studied many specific QC algorithms, so I know a little bit about them. If you don't want to RTFA, then read this: quantum computers are not super-computers. On a bit-for-bit (or qubit-for-qubit) scale, they're not necessarily faster than regular computers; they just process information differently. Since information is stored in a quantum "superposition" of states, as opposed to a deterministic state like regular computers, the qubits exhibit quantum interference around other qubits. Typically, your qubit starts in 50% '0' and 50% '1', and thus when you measure it, you get a 50% chance of it being one or the other (and it then assumes that state). But if you don't measure it, and instead push it through quantum circuits that let it interact with other qubits, you get the quantum phases to interfere and cancel out.

If you are damned smart (as I realized you have to be, to design QC algorithms), you can figure out creative ways to encode your problem into qubits, and use the interference to cancel out the information you don't want and leave the information you do want. For instance, some calculations will start with the 50/50 qubit above and end with 99% '0' and 1% '1', or vice versa, depending on the answer. Then you've got a 99% chance of measuring the right answer; run the calculation twice and you have a 99.99% chance. However, the details of the circuits that perform quantum algorithms are extremely non-intuitive to most people, even those who study them. I found it to require an amazing degree of creativity to figure out how to combine qubits so the interference works constructively.

But what does this get us? It turns out that quantum computers can run anything a classical computer can, and such algorithms can be written identically if you really want to, but doing so gets the same results as the classical computer (i.e., the same order of growth). The smart people who have been publishing papers on this for the past 20 years, though, have been finding new ways to combine qubits that exploit the structure of certain problems (usually deep, pure-math concepts) to achieve better orders of growth than are possible on a classical computer. For instance, factoring large numbers is difficult on classical computers, which is why RSA/PGP/GPG/PKI/SSL is secure: the best known classical order of growth is roughly e^(n^(1/3)). That's not quite exponential, but it's still prohibitive. Shor figured out how to get it down to about n^2 on a quantum computer (around the same order of growth as decrypting with the private key on a classical computer!). Strangely, brute-forcing someone's encryption key, normally O(n) on classical computers (where n is the number of possible keys), is only O(sqrt(n)) on QCs. Weird (but sqrt(n) is still usually too big).

There's a vast number of other problems for which efficient quantum algorithms have been found. Unfortunately, a lot of them aren't particularly useful in real life (except to the curious pure mathematician), and a lot are better but not phenomenal. For example, verifying that two sparse matrices were multiplied correctly has order of growth n^(7/3) on a classical computer and n^(5/3) on a quantum computer. You can find a pretty extensive list by googling "quantum algorithm zoo."

Unfortunately [for humanity], there is no evidence yet that quantum computers will solve NP-complete problems efficiently. Most likely, they won't. So don't get your hopes up about solving the traveling salesman problem any time soon. But there is still a lot of cool stuff we can do with them. In fact, the theory is so far ahead of the technology that we're anxiously waiting for breakthroughs like this, so we can start plugging problems through known algorithms.
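(The O(sqrt(n)) key-search speedup mentioned above is Grover's algorithm, and it's small enough to simulate classically for a toy case. Here's a minimal state-vector sketch in Python/NumPy; the 16-element search space and the marked index are made-up illustration values, not anything from the talk.)

```python
import numpy as np

N = 16       # size of the search space (4 qubits' worth of states)
marked = 3   # index the oracle "recognizes" (arbitrary choice)

# Start in a uniform superposition: every amplitude is 1/sqrt(N).
amp = np.full(N, 1 / np.sqrt(N))

# One Grover iteration = oracle phase flip + "inversion about the mean".
iterations = int(round((np.pi / 4) * np.sqrt(N)))  # = 3 for N = 16
for _ in range(iterations):
    amp[marked] *= -1            # oracle marks the answer with a sign flip
    amp = 2 * amp.mean() - amp   # diffusion operator amplifies that entry

prob = amp ** 2                  # measurement probabilities
print(prob[marked])              # ~0.96 after only 3 oracle queries
```

A classical search needs ~N/2 oracle queries on average; here ~(pi/4)*sqrt(N) = 3 queries concentrate almost all the measurement probability on the marked item, which is the sqrt speedup in miniature.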

Re:W/O RTFA (1)

kurokame (1764228) | more than 3 years ago | (#33498640)

They're hypothetically faster in the case of quantum-quantum operations, since they're effectively analog with hypothetically enormous data density (where a binary bit stores a 0 or a 1, a qubit stores a continuous superposition of both). But without improved ways to interface with this, it's of fairly limited use. Nature simulates itself with perfect fidelity, which is not really of help to us unless we can find a reliable way to reduce the answer to something consistent and human-understandable.

It's true but potentially misleading to say that quantum computers are not supercomputers...for the simple reason that they operate on a completely different principle. The effective power of an ideal quantum computer for applications to which it is well-suited would be several orders of magnitude better, but once this gets out of the lab we will probably be looking at a hybrid approach rather than a complete transition which leaves the old approach behind.

Quantum encryption is an any-time-now technology (I believe it has actually been implemented on a test scale in military/industrial/banking applications). Quantum computing is a "20 years" technology -- scientist-speak for "we can see it just over the horizon," which tends to evaluate to somewhere between 10 and 100-plus years. If you want a really hard look at the future of computing, try looking back twenty years. Man without time machine is doomed to make wild guesses about future.

Incidentally, I am posting this from a "quantum computer" - modern solid state physics relies heavily on quantum mechanics.

Re:W/O RTFA (0)

Anonymous Coward | more than 3 years ago | (#33498654)

Christ!

I wish you'd taken a writing class and learnt how to paragraph properly.

Re:W/O RTFA (1)

baka_toroi (1194359) | more than 3 years ago | (#33500086)

I agree, though Slashdot's posting system is not newbie friendly.

Actually, it sucks big time.

Re:W/O RTFA (1)

owlstead (636356) | more than 3 years ago | (#33500356)

It's not very friendly to any other user either if you ask me.

(But the content of posts can be pretty high - as the GGP illustrates - and the moderator system usually works - somewhat. So we take the awkwardness of the editing system together with the advantages. Slashdot maintainers, that does not mean that we don't want a better editor, thank you very much.)

Re:W/O RTFA (1)

noidentity (188756) | more than 3 years ago | (#33499196)

Interesting post. Please consider using paragraphs in the future. They help readability significantly.

Re:W/O RTFA (1)

astar (203020) | more than 3 years ago | (#33499860)

Yah, I did not RTFA either. But hey, you might know something. So here is where I am coming from: where do you use a kinematic causality model vs. a dynamic causality model? There was an odd Slashdot article recently about someone who built a quantum computer reliable enough to get some statistics on. I guess he did a thousand runs: he got the right answer 60% of the time and something apparently random 40% of the time. If I think kinematics, I think machine. And I wonder, was the quantum computer a machine? Oh, well. But the highly random results are interesting. Any comments?

Re:W/O RTFA (0)

Anonymous Coward | more than 3 years ago | (#33500326)

Thanks for that summary.

Whenever quantum computing comes up, the first thing people talk about is the algorithms. For my money, though, the real question is the physics of scaling entangled systems up arbitrarily. I guess I'm a CS guy, so I understand the algorithms better than the physics.

In particular, I get superposition of states, but is there a result, or how do we approach a result, that guarantees or qualifies nature's capacity not to accidentally collapse wavefunctions in macroscopically observable physical systems? Mumbo-jumbo about consciousness transcending multiple universes aside, is there a way to be sure we can achieve the kinds of enormous superpositions that represent quantum solutions to non-trivial instances of NP-complete problems? In real life, things get observed a lot by bumping into other objects, photons bouncing off them, etc.

It seems like people take the physical possibility of arbitrarily large, macroscopically observable, entangled systems as given. Is it really just an engineering problem, and not one where a fundamental physical limit might rear its head, possibly getting in the way of solving arbitrarily large problems nondeterministically?

Perhaps I'm being dense, but I haven't followed any discussions to where they talk about how you can grow entangled systems faster than the probability of their wave functions collapsing steals away your nondeterminism.

It's great to hear a quantum physicist say... (1)

bemymonkey (1244086) | more than 3 years ago | (#33498482)

"Why do I hear my voice?" during a video conference.

Makes me feel hella smart. :D

Re:It's great to hear a quantum physicist say... (1)

pgptag (782470) | more than 3 years ago | (#33498820)

"Why do I hear my voice?" during a video conference.

Makes me feel hella smart. :D

I was surprised to hear my voice during a video conference, because I was not speaking! Somebody in the audience had started playing a video clip he had recorded a few minutes before.

Separating Hype from Hope is easy.... (0)

Anonymous Coward | more than 3 years ago | (#33498636)

Hype = D-Wave
Hope = !D-Wave

Completely Solve Chess (0)

ecorona (953223) | more than 3 years ago | (#33498704)

Quantum computers will bring an end to the competitiveness of AI-vs-human chess games. They will be able to completely solve the game, meaning they can produce a series of moves that guarantee a win (or a tie, if the human player plays perfectly) from any position on the board. Meanwhile, we will leapfrog centuries of Moore's law in bioinformatics algorithms. Goodbye, protein folding problem!

Re:Completely Solve Chess (0)

Anonymous Coward | more than 3 years ago | (#33499482)

Good job eating up that hype.

Re:Completely Solve Chess (1)

ecorona (953223) | more than 3 years ago | (#33500362)

Well, whether the technology will arrive may or may not be hype. I don't know. However, once and if it arrives, they will be able to do these things. That part, at least, is not hype.

Re:Completely Solve Chess (1)

Bigjeff5 (1143585) | more than 3 years ago | (#33499850)

I think you are underestimating chess.

Yes, it is solvable, but assuming one variation can be calculated in a single floating-point operation (it probably can't, but who knows), with current tech (3 petaflop/s) it would take 10^97 years just to calculate the first move*. The next calculation is nearly as big as the first, with the calculations getting slightly smaller as the game goes on.

That means a computer that solves one chess move per year (for the first 10 moves or so; after that the calculations are a fraction of the size of the first) would need to be 10^97 times more powerful than the most powerful supercomputer we have. With that computer you could probably solve it in 5-10 years. If it's only 10^96 times as powerful, though, it would take well over 100 years to solve.

I've got bad news for you in that regard, because current supercomputers (3 petaflop/s or less) are only about 10^16 times more powerful than the Z3, the first Turing-complete computer, which was built in 1941 and could only do one multiplication every three seconds. Assuming the same exponential rate of growth, in another 70 years we'll still be 10^81 times too weak to solve chess in any reasonable amount of time.

Add to that the fact that a quantum computer will almost certainly have no advantage over a classical computer in solving each position (it's a simple accounting of each move; there aren't any computationally complex sub-problems, just an assload of computations), and there is no reason to believe quantum computing will solve chess any time soon.

I know we'll do it eventually, I just can't see it happening for a couple hundred years. Once solved, though, chess will be boring with regard to computers. It will become a simple "if this, then that" lookup. Humans still probably couldn't handle it, but any of today's computers would be unbeatable.

In other words, the game is solvable, but the amount of computational power necessary is absolutely staggering. It's so staggering that some have argued that computers (yes, even quantum computers) will run into the limits of physics long before they are ever powerful enough to truly solve chess. You run into the speed of light, quantum entanglement, the laws of thermodynamics and such at some point, and we are already nearing the physical limits of chipmaking for classical computers (the reason the GHz race stalled), just 70 years after the invention of the computer (over which we are only 10^16 times more powerful today).

*To get the first move, you need to calculate the entire tree, which has 10^120 variations for a 40-move (average) game, in order to know the best possible move. From there the variations start to shrink (you've eliminated all the branches that don't start with your first move), but at this point the number is still nearly as large. Not until many moves in would the calculation time shrink to a reasonable level for that powerful a computer. So far, only limited endgame scenarios of four pieces or fewer have been solved.
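(The footnote's numbers are easy to reproduce as a back-of-the-envelope calculation. The branching factor of 35 and the 80-ply game length below are the usual rough assumptions behind this kind of game-tree estimate, not exact figures.)

```python
import math

branching = 35   # rough average number of legal moves per chess position
plies = 80       # 40 moves per side, the "average game" from the footnote

# log10 of the game-tree size, 35^80: on the order of the 10^120 above.
digits = plies * math.log10(branching)
print(digits)    # ~123.5

# At 3 petaflop/s, assuming one operation per variation:
ops_per_year = 3e15 * 3600 * 24 * 365
years = 10 ** (digits - math.log10(ops_per_year))
print(math.log10(years))   # ~100.5, i.e. the same ballpark as "10^97 years"
```

The exact exponent depends entirely on the assumed branching factor and game length, which is why published estimates of the tree size vary by a few orders of magnitude; the conclusion (hopelessly intractable by brute force) doesn't.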

Re:Completely Solve Chess (1)

ecorona (953223) | more than 3 years ago | (#33500410)

I am no expert on the matter, but I was under the impression that chess is a problem that would benefit tremendously from quantum computation. I've read many sources online that claim this is the case, though none are academic papers. The staggering number of computations does not necessarily faze me, given how quantum computers work; such staggering computations are more of a problem for classical computers.

Re:Completely Solve Chess (0)

Anonymous Coward | more than 3 years ago | (#33501752)

10^120? Ever heard of pruning?

Re:Completely Solve Chess (0)

Anonymous Coward | more than 3 years ago | (#33502132)

Hell, even if you fed all 32 pieces to a standard brute force end-game solver you'd get the result after some 10^54 operations. And we'd be solving the problem from one end only. There's still a birthday problem to be exploited.
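(The "birthday problem" angle the parent mentions comes down to this: collisions between randomly sampled states show up after roughly sqrt(N) samples, not N. A quick empirical sanity check in Python; the state-space size below is a made-up toy value.)

```python
import random

random.seed(0)      # fixed seed so the run is repeatable
N = 10**6           # toy "state space" size

# Draw random states until one repeats, counting the draws needed.
seen = set()
draws = 0
while True:
    x = random.randrange(N)
    draws += 1
    if x in seen:
        break       # first collision found
    seen.add(x)

print(draws)        # typically on the order of sqrt(N) ~ 1000, not N/2
```

Searching forward and backward and waiting for the two frontiers to collide exploits exactly this effect, which is why a meet-in-the-middle attack roughly square-roots the work of a one-sided brute-force search.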

link to .flv (0)

Anonymous Coward | more than 3 years ago | (#33498706)

http://blip.tv/file/get/Telexlr8-vbSuzanneGildertOnQuantumComputingInTeleplaceSeptember4640.flv

wrong spelling of the name (0)

Anonymous Coward | more than 3 years ago | (#33498714)

Her name is Gildert not Gilbert in case you are trying to find her blog...

Oh not again (2, Informative)

iris-n (1276146) | more than 3 years ago | (#33501686)

These crooks from D-Wave just won't give up. A 128-qubit quantum computer!? Pics or it didn't happen.

For more info: http://en.wikipedia.org/wiki/D-Wave_Systems [wikipedia.org]
