
The Ultimate Limits Of Computers

timothy posted more than 13 years ago | from the density-and-entropy dept.

Technology

Qui-Gon writes: "Found an interesting article, 'The Ultimate Limits of Computers,' over at Ars Technica. This article is very heavy on the physics of computing; not for the faint-hearted." Somewhere between practical reality and sheer Gedankenexperiment, the exploration here keeps getting more relevant as shrinking die sizes and improving nanotech wear away at previously impassable barriers. The article is based on a paper discussing the absolute limits of computational power.

180 comments

Limits of Scientists... (1)

Anonymous Coward | more than 13 years ago | (#132851)

"When a scientist says something is possible, they're probably right;
when a scientist says something is impossible, they're probably wrong."
--Anonymous

Is the Enterprise's computer slow? (1)

Anonymous Coward | more than 13 years ago | (#132852)

Computers will never reach a plateau of operating capacity. Even if we were to design a computer so fast that it was limited by the speed of light, we could then design a warp core to generate a warp field around the computer and enable FTL (faster-than-light) processing. At least, this is how they do it on Star Trek TNG.

Re:THIS is the right link (1)

mvw (2916) | more than 13 years ago | (#132855)

Not exactly that kind of bottom.

Limit on Operations (2)

mvw (2916) | more than 13 years ago | (#132856)

Note that the given speed limit is on the rate of state change. Those states could represent much more complex operations than simple arithmetic. All in all the article is nice, but I enjoyed the one on Hans Moravec's page more, where Feynman pondered the ultimate library. Regards, Marc

Re:Limit on Operations (2)

mvw (2916) | more than 13 years ago | (#132857)

Sorry, I meant a link found from Ralph Merkle's page. Read here [zyvex.com].

Re:*Theoretical* limit (2)

mvw (2916) | more than 13 years ago | (#132858)

It all comes down to any physical system with bounded energy having a correspondingly bounded rate of state change.

But just as a given programming language lets you write fast or dog-slow programs, there may be very different approaches (in terms of computational power) to using the physics to build fast computers.

Thus there is a physical limit, as well as a logical limit (from the theories of computation and complexity), to consider.

Re:"massively parallel" machines will blow limit a (2)

mvw (2916) | more than 13 years ago | (#132859)

There are parallel machines out there. As far as I've heard, attempts to build parallelizing compilers -- compilers that take a conventional program, analyze its structure, and try to generate code that takes advantage of the parallel hardware -- have not been very successful.

And of course whether parallelism can be exploited depends entirely on the problem.
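
A toy illustration in Python (my own, not from any real compiler): an element-wise map has no cross-iteration dependencies and splits trivially across workers, while a naive prefix sum has a serial dependency chain that a parallelizing compiler would first have to recognize and restructure.

# Hypothetical sketch: parallel-friendly vs. serially-dependent loops.
from multiprocessing import Pool

def square(x):
    return x * x

if __name__ == "__main__":
    data = list(range(1000))

    # Element-wise map: iterations are independent,
    # so the work splits cleanly across processes.
    with Pool(4) as pool:
        squares = pool.map(square, data)

    # Prefix sum: each iteration depends on the previous one.
    # Parallel scan algorithms exist, but a compiler has to
    # recognize the pattern first -- which is the hard part.
    total, running = 0, []
    for x in data:
        total += x
        running.append(total)

    print(squares[-1], running[-1])  # 998001 499500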

Re:Reversibility and Thermodynamics (2)

mvw (2916) | more than 13 years ago | (#132860)

Now before you say "BS", think about it. In physics, if you know the initial state (starting position, velocity, acceleration) of an object in an isolated system, you can easily compute where it was at any given time earlier. This uses the same concept. For example, if you add 43 to a register, you can subtract 43 from that register and get your energy back.

Search for "reversible" in this article [slashdot.org] .

Re:"massively parallel" machines will blow limit a (2)

mvw (2916) | more than 13 years ago | (#132861)

Your example is one that can't take advantage of parallelism. I would rather give a divide-and-conquer algorithm (like sorting) as an example of recursion and parallelism.
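
For instance, a minimal merge sort sketch (my own illustration): the two recursive calls work on disjoint halves, so they could be handed to separate processors.

# Merge sort: the two recursive calls are independent subproblems
# and could run on separate processors in parallel.
def merge_sort(xs):
    if len(xs) <= 1:
        return xs
    mid = len(xs) // 2
    left = merge_sort(xs[:mid])    # independent half
    right = merge_sort(xs[mid:])   # independent half
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):   # merge step
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

print(merge_sort([5, 2, 8, 1, 9, 3]))   # [1, 2, 3, 5, 8, 9]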

Re:Limit on Operations (2)

mvw (2916) | more than 13 years ago | (#132862)

I noticed his paper addressed quantum state change limits, apparently (without saying it) limited to fermions.

What equation are you referring to?

Re:until the next computer revolution (2)

mvw (2916) | more than 13 years ago | (#132863)

The only reason for quantum mechanics in this article is the fact that quantum mechanics gives a lower bound for miniaturisation (i.e. you can only keep making computer parts smaller until you run into the Heisenberg Uncertainty Principle)

Actually, short times allow for high-energy fluctuations (dE * dt >= hbar); this is the basic reasoning behind the existence of short-lived (so-called virtual) particles in quantum field theory, with the complication that the number of particles is not constant in physical processes, and the fact that one cannot observe bare particles.

This could mean that fast processes have to deal with disturbances from such energy fluctuations, providing physical limits on fast computing as well.
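
A rough sense of scale, using standard constants (a back-of-envelope of my own, not from the article):

# Characteristic fluctuation energy at short switching times: dE ~ hbar/dt.
hbar = 1.0546e-34    # reduced Planck constant, J*s
eV = 1.602e-19       # joules per electron-volt
dt = 1e-15           # a femtosecond switching time, in seconds
dE = hbar / dt       # ~1.05e-19 J
print(dE / eV)       # ~0.66 eV -- already comparable to a semiconductor band gap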

Re:*Theoretical* limit (2)

mvw (2916) | more than 13 years ago | (#132864)

No, the limitations that technology can overcome are engineering limitations. The limitations talked about in the article are basic fundamental physics limitations that don't depend on any particular form of technology.

Considering the physical theory (here: non-relativistic quantum mechanics, which is still an approximation of nature) used to derive the limitations is sufficient.

Re:*Theoretical* limit (4)

mvw (2916) | more than 13 years ago | (#132869)

Different from what?

Who says that a complex problem needs a high number of state changes?

Each state change could be the result of a very high level operation, not something primitive like adding two numbers, but perhaps something like the outcome of the traveling salesman problem. Think of some clever physical setup here.

It will be up to the cleverness of the computer builders to make the most of these limits.

Regards, Marc

Re:Reversibility and Thermodynamics (1)

Rift (3915) | more than 13 years ago | (#132870)

You're thinking in the wrong discipline. Reversible computing already happens all the time.

Think of it this way: a photon hits an electron in its ground state, raising it to a higher state. Then it emits an identical photon and drops to its ground state again. Effectively, you added energy to store a bit (0 -> 1) of information in an electron. Then the operation reversed, and you gained the original energy back.

The problem with reversible computing is getting the information out in a reversible way. In my example above, how do we know if the electron is a 0 or a 1? We have to extract the photon to do it, thus destroying the reversibility. Now, there are some quantum properties that can be measured without destroying the reversibility, but they are difficult to control (the system easily loses coherence).

Reversible computers don't use the same 'modern electronics' that your current computer does, but they are theoretically possible (using only a small amount of energy to observe the state). Just not yet practical. Wait around a bit.

limits of AI and computer gaming (1)

Qeyser (6788) | more than 13 years ago | (#132874)

I recall reading a short article about the limits of computing technology in gaming -- specifically in terms of AI and "solving" classic games such as chess.

The argument went like this: if a computer system can map out every possible finite state (i.e. board position in chess) of a game, then from any point in the game it can be determined what the winning moves are (if there are any, that is). For a game like chess there are relatively few board positions possible -- thus, all board positions could be explored and the game could conceivably be "solved" at some point by a computer (in a brute-force sense only, of course, but solved nonetheless). Any human player would be hopeless against such an adversary.

However, in a game such as checkers, there are many, many more possible board positions (I think the estimate I recall was 10^69, though that could be incorrect...) so that compiling a complete library of all board positions would take considerably longer than the projected lifespan of the universe (as estimated by the half-life of a proton). This would be true even if you used a ridiculously large computing system -- say, 0.1% of the available mass in the universe. So, the argument goes, a computer will never be able to brute-force beat a human at checkers.
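
Tying that to the article's own bound (my arithmetic, using the 10^69 figure I recalled above):

# Even the article's 1 kg "ultimate laptop" can't enumerate 1e69 positions quickly.
positions = 1e69            # claimed position count (recollected above, may be off)
ops_per_sec = 5.4258e50     # article's bound for 1 kg of matter
years = positions / ops_per_sec / 3.156e7
print(years)                # ~5.8e10 years, a few times the age of the universe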

Re:Say what? (1)

whydna (9312) | more than 13 years ago | (#132880)

Well... I meant that there is a physical limit in the thermodynamics that govern computers, just as there is a physical limit to how small you can make a transistor.

In a sense, we're both right. You're right that my analogy is a poor one, but it still serves its purpose. ;-)

Thanks

Re:Reversibility and Thermodynamics (1)

whydna (9312) | more than 13 years ago | (#132881)

One problem to note with reversible computing is that your computer has to have enough memory to store every bit of information ever entered into it. I'm pretty sure that it can compress the info, but eventually you'll have to have an energy expensive and very hot 'information dumping' process. I say it's a problem, but of course normal computing requires the same thing, and doesn't let you choose when you do it.

Actually, no. It seems that way (that's what I thought at first too). But that's the cool part about it... you simply reverse the process by reversing the direction in which you increment the Program Counter. I'll give a short example in pseudo-assembly:

; assume that $1 and $2 are both 0
ADDI $1 32
ADDI $2 24
ADD $1 $2
; at this point $1 should hold 56,
; and at no point have we stored anything in memory.
; ok.. now let's do these instructions in reverse.
; normally, we wouldn't manually do it this way
; because the CPU would switch it, but this works
SUB $1 $2 ; ok.. $1 now contains 32
SUBI $2 24 ; now $2 contains 0
SUBI $1 32 ; now $1 contains 0

This is a fairly primitive example, but it proves the point. The other thing is that there cannot be a MOVE instruction. Memory can be exchanged with a register value, but not copied. So, all you have to do is run the instructions backwards... I promise!!!

Re:Reversibility and Thermodynamics (1)

whydna (9312) | more than 13 years ago | (#132882)

botched the assembly... sorry:
; assume that $1 and $2 are both 0
ADDI $1 32
ADDI $2 24
ADD $1 $2
; at this point $1 should hold 56.
; and at no point have we stored anything in memory
; ok.. now let's do these instructions in reverse.
; normally, we wouldn't manually do it this way
; because the CPU would switch it, but this works
SUB $1 $2 ; ok.. $1 now contains 32
SUBI $2 24 ; now $2 contains 0
SUBI $1 32; now $1 contains 0

Reversibility and Thermodynamics (5)

whydna (9312) | more than 13 years ago | (#132885)

I've just joined a research group at my University to study reversible computing [ufl.edu] . The professor in charge wrote his doctoral thesis on the subject at MIT.

The concept is that a "normal" CPU erases information on every cycle (clearing registers, overwriting data, shifting data to nowhere, etc). When a CPU erases information, it is dissipated as heat. There are thermodynamic limits to this (kinda like Moore's law). So, if a computer could be designed not to erase data, you could reverse the CPU and get most of your energy back.

Now before you say "BS", think about it. In physics, if you know the initial state (starting position, velocity, acceleration) of an object in an isolated system, you can easily compute where it was at any given time earlier. This uses the same concept. For example, if you add 43 to a register, you can subtract 43 from that register and get your energy back.

Of course, certain instructions don't lend themselves to reversibility. For example, bit shifting is inherently irreversible. One option is to maintain a stack of "garbage data", but that's a poor solution. On the other hand, a number of instructions are reversible by default... XOR is always reversible, etc. So, a reversible CPU will probably have a more restrictive instruction set, but it is still functional.
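
A tiny sketch of the distinction (a toy illustration of my own, not our group's actual work):

# XOR with a known operand is self-inverse: no information is lost.
a, key = 0b1011, 0b0110
b = a ^ key
assert b ^ key == a          # the same XOR undoes the step

# A shift on a fixed-width register is not injective: the top bit
# falls off, so two distinct states collapse into one. Once that
# happens the step can't be reversed, and the lost bit has to be
# dissipated as heat (Landauer's principle).
def shl8(x):
    return (x << 1) & 0xFF   # 8-bit register
assert shl8(0b10000001) == shl8(0b00000001)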

Reversibility is not anything new, but it does take a shift in thinking. Algorithms can be designed to run very efficiently on reversible computers, but it takes a bit more effort. Hopefully, we (the community of people studying reversible/adiabatic computers) will develop means of either converting irreversible algorithms or making them less inefficient (double negative).

-Andy

Ultimate Limits: 1981 (3)

ch-chuck (9622) | more than 13 years ago | (#132886)

"640K ought to be enough for anybody. "
- Bill Gates (1955-), in 1981

How many angels on a pinhead? (2)

peter303 (12292) | more than 13 years ago | (#132887)

These kinds of articles remind me of the futile medieval debates over how many angels can dance on the head of a pin. Same sub-arguments too: whether angels are material (atoms) or immaterial (photons, quantum states), and so on.

Re:Ok -- but Moore's law gets us there fast enough (2)

HiThere (15173) | more than 13 years ago | (#132889)

You'll just need to use an encoded link to your base station. Or you could carry it in a fanny pack.

That could add new meaning to: "I'm losing my mind!"

Caution: Now approaching the (technological) singularity.

Re:Knowledge is unlimited (2)

HiThere (15173) | more than 13 years ago | (#132890)

"Knowledge is unlimited"


That's an interesting conjecture. Any idea how you would go about proving it? Personally I don't believe it. It's like saying "there are an unlimited number of fish in the sea" or "there's an unlimited number of trees".

Well, there is a large number of fish in the sea, but we're taking them out faster than they grow back, so there won't continue to be a large number. There used to be a large number of trees. But we cut them down faster than they grew back, and now every tree is owned by some person or organization. And the number is still decreasing rapidly.

Personally, I don't think that anything in the universe is unlimited. Stupidity has been suggested, but even there I have my doubts. Still, there's probably a better argument to be made for an unlimited amount of stupidity than for an unlimited amount of unknown things to be learned. In fact, I doubt that the total number of things in the universe that can be known is as large as the cardinality of the power set of the non-virtual elementary particles in the universe. And probably not even as large as the power set of the electrons in the universe. Now that's a LARGE number, but it sure isn't unlimited.




Caution: Now approaching the (technological) singularity.

Re:Ok (2)

Ralph Wiggam (22354) | more than 13 years ago | (#132892)

Imagine a Beowulf cluster of those things.

It'll suck the paint off your house and give your family a permanent orange afro.

-B

Re:Reversibility and Thermodynamics (2)

Shotgun (30919) | more than 13 years ago | (#132895)

See: Perpetual motion machine.

If you know an object's initial state, you can compute where it's been. That is not the same as putting it back in its initial state.

If you add 43 to a register, you've had to drain the capacitance from some circuits and charge up others. The capacitance charge is drained to ground, and you will have to expend energy to move those electrons (or some others) back to Vcc. Sorry, no way around that with modern electronics.

Reversible algorithms may make some types of computing efficient, but not in the 'energy' efficient sense.

What to do with all that speed (1)

AlpineR (32307) | more than 13 years ago | (#132896)

Performing 5 * 10^50 operations per second with 1 kg of computer is so far beyond anything we have today. What would we use that speed for?

I was thinking about the same subject from the other direction recently. For my research [umich.edu] I am trying to simulate the growth of silicon germanium crystals. The usual way of approaching a physics problem like this is to make simplified approximations of what actually happens. For example, hard-to-model processes like diffusion are measured by experiment or found from standard relations. But these simplifications can yield a model which is subtly wrong and fails to predict unexpected behavior.

What I really want to do is model the entire crystal growth from the most fundamental physics (quantum mechanics or something like string theory). That way the simulation is not just an approximation of reality but is indistinguishable from reality! How fast of a computer do I need for this full-detail simulation?

Well, that depends on the size of the system I'm trying to simulate. How big of a computer do I need to model a 1 g chunk of matter? Logically, it seems that at least 1 g of computer would be required to model 1 g of arbitrary material in real time. (Otherwise, I could simply model a 1 g computer with a 0.5 g one, which in turn is actually simulated on a 0.25 g computer, and so on.) So, perfectly simulating a 10 g crystal would take 10 g of "ultimate laptop" processing. But if, as the article mentions, the laptop uses ordinary matter rather than plasma, then the computation rate is 10^40 ops/second rather than 10^51 ops/second. This implies that my 1 kg super-laptop would need 31.7 years to simulate 1 second of growth for a 10 g crystal! How much computation would be required to mimic The Matrix? To perfectly simulate life on Earth would require a computer at least as large as the Earth itself!
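
(A back-of-envelope for that 31.7-year figure, using the article's rounded rates:)

# Slowdown when a 1 kg ordinary-matter laptop simulates a 10 g crystal.
plasma_rate_per_kg = 1e51   # article's "ultimate" rate, ops/s per kg
matter_rate = 1e40          # article's ordinary-matter laptop rate, ops/s
ops_needed = 0.010 * plasma_rate_per_kg   # real-time, full-physics ops/s for 10 g
seconds_per_second = ops_needed / matter_rate
print(seconds_per_second / 3.156e7)       # ~31.7 years of computing per simulated second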

So, although 2 GHz processors sound fast and the ultimate laptop in this article seems unfathomable, we have many applications today that can take all the computation speed available. A near future application for ultracomputing is the modelling of protein folding for drug design. There, the amount of matter being simulated is small fractions of a gram and the leveraging of computer weight and time is worthwhile.

AlpineR

The NSA (2)

dmaxwell (43234) | more than 13 years ago | (#132901)

Now everyone knows that the NSA is after my barbecue sauce recipe stored in a pgp encrypted file. Of COURSE they'll create a black hole computer just to get it. After all, that barbecue sauce IS kinda red!

Re:This is not a real distinction (1)

lubricated (49106) | more than 13 years ago | (#132902)

>For all you know, there are states "below absolute zero"

Not possible, by definition.

That's like saying something can go slower than 0 mph. Not possible, since speed (not velocity) is always positive. Since temperature is just a measure of how fast "stuff" is moving around, and absolute zero is when nothing is moving, you can't get colder. Saying otherwise just shows your ignorance.

Re:Reversibility and Thermodynamics (1)

wurp (51446) | more than 13 years ago | (#132903)

I'm not an expert, but some rudimentary examination and common sense indicate to me...

you don't need to actually reverse the operations to 'recover' energy in reversible operations. Reversible operations don't destroy information, and thus don't incur the minimum entropy increase caused by destroying information. This means that it is theoretically possible for them to dissipate arbitrarily small amounts of heat, and thus consume arbitrarily small amounts of energy.

In other words, reversible operations don't allow recovering the energy, they make it possible to not have to pay the energy in the first place.

One problem to note with reversible computing is that your computer has to have enough memory to store every bit of information ever entered into it. I'm pretty sure that it can compress the info, but eventually you'll have to have an energy expensive and very hot 'information dumping' process. I say it's a problem, but of course normal computing requires the same thing, and doesn't let you choose when you do it.

Yaknow, maybe I should've blasted the guy and removed any indications of uncertainty in my reply. That's the only way I've ever gotten modded up...

Regards,
Bobby

Bobby Martin aka Wurp
Cosm Development Team

Limits of Key Searching (2)

jcr (53032) | more than 13 years ago | (#132905)

Landon Noll, who is probably best known to slashdotters for his work on LavaRand, has done some work calculating the limits of how many bits a cryptographic key has to have to be immune to brute-force searching in this universe. As I recall, he was going to publish it in Scientific American, but I can't seem to find it.

At any rate, taking into account such issues as your computer crushing itself into a black hole if it gets too massive, IIRC, Landon concluded that a key of about 530 bits is Really Safe.
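
To see why a number of that size falls out (my own rough arithmetic, not Landon's): grant the attacker the article's per-kilogram bound, the mass of the observable universe, and the age of the universe to work with.

# Brute-force budget of a universe-sized attacker (all figures rough).
import math
ops_per_kg_s = 5.4258e50   # article's bound, ops/s per kg of computer
universe_mass = 1e53       # mass of the observable universe, kg (assumption)
universe_age = 4.3e17      # age of the universe, s (assumption)
total_ops = ops_per_kg_s * universe_mass * universe_age
print(math.log2(total_ops))  # ~403 bits' worth of operations, total

Even that absurd attacker checks at most ~2^403 keys at one key per operation, so a 530-bit keyspace leaves a margin of about 2^127.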

- jcr

No it won't (1)

volpe (58112) | more than 13 years ago | (#132906)

The physics limitations apply to massively parallel machines as well. Read the article. Multiple CPUs still work by changing state. If you've got a million CPUs, and together they weigh less than a kilogram, they have a collective maximum number of state changes per second, as computed in the article for an arbitrary device of that mass.

Re:*Theoretical* limit (2)

volpe (58112) | more than 13 years ago | (#132907)

> there might be very different approaches
> (regarding computational power) to use the
> physics for building fast computers

Different from what? Like I said, the discussion relied on no particular form of technology. The only assumption is that a computer system must "change state" in order to perform any computation. And a computer is basically a "finite state machine". And if you think a computer can compute something without changing state, then how will you know when it's produced the intended result?

Re:Say what? (2)

volpe (58112) | more than 13 years ago | (#132908)

I think you may have missed my point, though. Moore's law is sort of an "anti-limit". It's exactly unlike the physical "laws" which place limits on things because Moore's law implies that there is no limit (doubling every 1.5 years, yada yada yada).

Say what? (3)

volpe (58112) | more than 13 years ago | (#132910)

> There are thermodynamic limits to this (kinda
> like Moore's law).

I think you meant to say, "almost, but not quite, entirely UNlike Moore's Law".

*Theoretical* limit (5)

volpe (58112) | more than 13 years ago | (#132911)

No, the limitations that technology can overcome are engineering limitations. The limitations talked about in the article are basic fundamental physics limitations that don't depend on any particular form of technology. Note that nowhere is it said that the problem is the size of the tracings on the microchip, or heat dissipation, or whatever. It all comes down to any physical system with bounded energy having a correspondingly bounded rate of state change. Saying that there will be another technological revolution that surpasses this is like saying we'll be able to cool things below absolute zero when we figure out how to build better condensing coils for our refrigerators.

Architecture Change Wanted: Apply Within (3)

tarsi210 (70325) | more than 13 years ago | (#132915)

From the: Damn-that's-neat-but-what's-the-point dept.

Wow! That's some neat physics (only a part of which I understand). But really, do you think we'll need to get anywhere near these sizes and amounts?

The time will come when the theory has advanced far enough that we'll drop the von Neumann style of doing computing and go with something, shall I say, better. The human brain certainly doesn't have anything near those figures of capacity, and it's about 1-2 kg and occupies about a liter of space.

And I don't know about you, but I LOVE the graphics. They are kicking some major ARSE. The refresh rate could be a bit higher, though, I still get blurry vision when stumbling home from the bar. :)

Re:Ok (2)

selectspec (74651) | more than 13 years ago | (#132918)

I'm trying to get some figures, but I believe that is considerably more operations in a single second than the aggregate lifetime computations of every microchip ever built and operated.

Ok (5)

selectspec (74651) | more than 13 years ago | (#132919)

5.4258 * 10^50 maximum operations/sec in a 1kg chunk of matter.

hmmm

That is the equivalent of 542,580,000,000,000,000,000,000,000,000,000,000,000,000 1 GHz CPUs.

I think we're covered for awhile.
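
(The division, for anyone checking:)

# 5.4258e50 ops/s expressed in 1 GHz CPUs.
max_ops = 5.4258e50   # article's bound for 1 kg of matter, ops/s
cpu_rate = 1e9        # one 1 GHz CPU, generously one op per cycle
print(max_ops / cpu_rate)   # ~5.4258e41 CPUs -- the 42-digit number above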

Bah (1)

diablovision (83618) | more than 13 years ago | (#132920)

Roads? Where we're going, we don't need "roads"!

The speed of light does not matter... (2)

Domini (103836) | more than 13 years ago | (#132922)

They are able to transfer information faster than the speed of light, apparently.

About 300 times as fast at the moment, through caesium vapour. (Apparently it has a negative refractive index...)

Also, what about non-clocked machines (massively parallel logic machines), such as the one NASA just bought?

The limits are a bit near-sighted, methinks.

D.

Checkers vs Chess (1)

dman123 (115218) | more than 13 years ago | (#132927)

I have often thought about this brute-force method for chess and checkers ever since I saw "WarGames." But you say that chess has fewer board positions than checkers???

1) Checkers uses fewer pieces, with only two possible types (regular or king).

2) Checkers uses only half the number of board spaces as chess.

3) Although it's true that your pawns aren't going to end up in the first row, moves and positions are pretty much unlimited in chess. Note: A lot of these positions are not practical if someone is playing to win, but they are still theoretically possible.

Conclusion) Deep Blue played chess, but it would have creamed Kasparov at checkers.

If I'm wrong, let me know. It won't be the first time.

--
dman123 forever!

Re:Knowledge is unlimited (1)

jejones (115979) | more than 13 years ago | (#132928)

Well, yes, but... relativity knocked the socks out from under Newton's notions of absolute time and space, but people still use Newtonian mechanics to design car engines. Having to be consistent with what we already know means that even if the theoretical underpinnings change, the results will look like refinement, not revolution. (There's a fine Asimov essay on this, titled "The Relativity of Wrong.")

Re:limits of AI and computer gaming (1)

jejones (115979) | more than 13 years ago | (#132929)

I think you're thinking of Go, not checkers. Look up the work of Arthur Samuel on checkers-playing software (which, BTW, was done so long ago that the data representation he used was given as an example in the IBM 360 assembly language text I used in college -- Struble, sitting to my left as I type... and thank goodness I don't have to write for those any more!).

Re:the assumptions seem wrong... (1)

bibos (116554) | more than 13 years ago | (#132930)

wouldn't the amount of energy available be based on the voltage supplied to the computer and the resistance of its circuits? so wouldn't the relevant equation be V=IR, not E=mc^2?

Don't think in terms of electricity here; we're talking about particle physics and quantum computers, so forget all your cables and contacts and resistors. He's talking about states of nuclei (the spin of the nucleus, for example). If you convert all the mass to energy (pure energy, that is) then you've got the maximum energy. You can't have a higher energy, because you don't have any mass left to convert.

Re:Moore's Law vs Physics (1)

bibos (116554) | more than 13 years ago | (#132931)

It seems to me that a lot of people who understand a lot about one (or more) of the natural sciences forget that it's all just observation and interpretation. And I guess it's not those people who find out about completely new things that break all the previously assumed 'laws'. If there weren't any sceptics, we wouldn't have come this far, and I guess if there aren't more sceptics to come, we won't advance any further in science.

Re:the assumptions seem wrong... (1)

bibos (116554) | more than 13 years ago | (#132932)

The problem is that the nuclei of the computer can't take any more energy when they are, in fact, already pure energy (E=mc^2).

Re:the assumptions seem wrong... (1)

bibos (116554) | more than 13 years ago | (#132933)

they are, in fact, already pure energy (E=mc^2)

At which point I'd ask: is a computer without any mass really a computer?

Re:the assumptions seem wrong... (1)

bibos (116554) | more than 13 years ago | (#132934)

wouldn't the ultimate computer be able to harness all of the energy in the universe?

Yes, it would. But we've limited it to a 1 kg computer with 1 L of volume.

biological computer? (1)

revengance (132255) | more than 13 years ago | (#132938)

Well, I do agree that the limits of present computers will be reached. However, that only means that the number of cycles of a computer is limited. We could have computers that do more complicated processing per cycle. I remember reading somewhere that there are "biological" computers that can solve complicated maths problems in just one processing cycle. This is very possible. An example is the human brain: it can process certain information effortlessly that normal computers cannot. Maybe after the limits of current computers are reached, more research will be devoted to biological computers to solve specialised problems. They might not be able to solve the problems that current computers solve effortlessly, but they might have a niche in solving problems that current or even future computers cannot.

Is it just me or did the author make a mistake? (3)

krogoth (134320) | more than 13 years ago | (#132940)

In his article he claims that "The maximum energy an ultimate laptop [1 kg] can contain is given by Einstein's famous formula relating mass and energy: E = mc^2. Plugging in the laptop's mass, the speed of light, and combining the result with Eq. 2 tells us that the maximum number of operations per second a 1 kg lump of matter can be made to perform is 5.4258e50."

I'm assuming this is at 9e16 J per second, which means that to run his "ultimate laptop" he would have to split the atoms of 1 kg of material per second... which means he would need to carry a large nuclear power plant around with him (and even then, I don't think they go through 1 kg/s).

What he fails to understand is that Einstein's formula is an equivalence, not a potential. Maybe that is the maximum energy a mass can have, but to get at that energy (in J/s) you would have to split enough atoms that that mass was lost (your 'laptop' would get 1 kg lighter every second). Unfortunately, his whole article is based on this principle, so you can't use anything he says unless you plan to sustain a nuclear reaction which loses 1 kg/s to fission to power this "ultimate laptop".
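
(For scale, my own rough numbers for the power draw this implies:)

# Converting 1 kg of mass to energy every second.
c = 3.0e8               # speed of light, m/s
power = 1.0 * c ** 2    # watts from 1 kg/s of mass-energy, ~9e16 W
plant = 1e9             # one large nuclear plant, ~1 GW (rough figure)
print(power / plant)    # ~9e7 -- about ninety million power plants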

He correctly used the values in the formula, but he didn't apply it correctly. Maybe he should have done a bit more research.

Re:Architecture Change Wanted: Apply Within (1)

wkw3 (140770) | more than 13 years ago | (#132941)

And I don't know about you, but I LOVE the graphics. They are kicking some major ARSE. The refresh rate could be a bit higher, though, I still get blurry vision when stumbling home from the bar. :)

That's motion blur. It's a feature.

Re:Is it just me or did the author make a mistake? (1)

pclminion (145572) | more than 13 years ago | (#132943)

The argument is intended to set a theoretical speed limit, not to describe a method of extracting energy from the mass in a laptop. The gist of it is that, provided a certain amount of energy, it is possible to compute only so fast. The argument is OK, even if he is abusing physics a little bit.

The real problem comes on the next page where he starts talking about entropy. Claude Shannon (considered to be the father of information theory) made a grave mistake in calling his measure of information density "entropy." He should have just called it "information density." You can learn about this definition of entropy in any good data compression or information theory textbook. It is unfortunate because many people seem to think that physical entropy and informational entropy are somehow equivalent, when in fact they are not.
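
For reference, Shannon's measure is just the expected information content of a source, in bits; a quick sketch of the standard definition:

import math

def shannon_entropy(probs):
    # H = -sum(p * log2(p)), in bits per symbol
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))    # 1.0 bit: a fair coin
print(shannon_entropy([0.99, 0.01]))  # ~0.08 bits: a highly predictable source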

There may, in truth, be a relationship between physical entropy and informational entropy, but the relationship has not yet been established, whatever this article would lead us to believe. If it ever is discovered, the form of the relationship will certainly not be as simple as is stated here.

This is all very dubious and actually somewhat amusing.

This is great news! (2)

djrogers (153854) | more than 13 years ago | (#132944)

Every time I read an article about a limit in some area of computing (network speed, storage, CPU speed, stupidity of A. Grove), it seems as if it's the last sign that a new method/paradigm is on the horizon, with a significant breakthrough coming.

Bring it on!

Re:until the next computer revolution (2)

sfstich (173141) | more than 13 years ago | (#132950)

Actually, the article doesn't talk about quantum computing at all.

The only reason for quantum mechanics in this article is the fact that quantum mechanics gives a lower bound for miniaturisation (i.e. you can only keep making computer parts smaller until you run into the Heisenberg Uncertainty Principle)

The article even specifically states that it doesn't refer to a special type of architecture.

--
This signature has been deprecated

We Already Built Such Things... (1)

rnbc (174939) | more than 13 years ago | (#132951)

Let's state things another way: such a computer is nothing more than a physical system whose (entire) evolution is used and monitored to accomplish a certain task...

Generally, heavy computation is used to predict outcomes and to analyse data, normally related to physical systems; and any system is physical.

Now, consider this: if you can build a system to such precision, you don't need to simulate anything any more. Just build the damn thing full-scale, and if it doesn't work, restart from zero.

What I'm saying is that a small model of a plane, for example, is a physical computer, built to model the plane.

The best simulation of a system is the system itself, if you can build arbitrary systems with little effort. And the technology to build this kind of machine requires us to be able to build almost arbitrary systems, so in the end it will be useless to use it to build computers, I think.

Actually, Knowledge is inherently limited (1)

indole (177514) | more than 13 years ago | (#132952)

Theories are theories, and our best theories are Our Best Theories. Don't blame science for shattering yesterday's Best Theories, and don't misattribute the popular misconception of Truth vs. Theory to the few who actually understand what Truth and Theory are.

(And in reference to the subject line, see: Gödel's Incompleteness Theorem [everything2.org].)

Just like the superparamagnetic barrier (5)

cvd6262 (180823) | more than 13 years ago | (#132953)

I interned at the Storage Systems Division of IBM in San Jose, CA. We had a brown-bag seminar where somebody big (his name escapes me) spoke on the future of magnetic storage.

He had a great graph of the last 30+ years of GB/square inch, which seemed to coincide with Moore's Law (which, just like this article, addressed processing issues, I know. Bear with me here.). There were red lines drawn every ten years or so representing what scientists had believed to be the superparamagnetic barrier -- the point at which it would be physically impossible to cram any more data onto a disk.

The guy had a great line every time one of these came up. "In 19XX Dr. XYZ at ABC University discovered the superparamagnetic barrier.... We broke it X years later." (X was usually a single digit.)

My point is that it will be interesting to watch whether these "scientific" findings require revision. True, this one may be based on sound scientific principles, but so were all the attempts to predict the superparamagnetic barrier.

still a lot of time to go.... (2)

leuk_he (194174) | more than 13 years ago | (#132957)

A 1 kg black-hole computer does ~10^51 operations per second.

1 MHz = 10^6 operations per second; 1 GHz = 10^9 ops per second. Moore's law seems to apply to speed now too: about 10 times faster every 5 years. So we still have about 200 years to go before we reach this speed.

I do not think I will live for another 200 years. No problem there.

Damn! The sun does not go black hole for some 10^9 years. No luck here......
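
(The extrapolation, spelled out:)

# 10x speedup every 5 years, from 1 GHz up to the black-hole rate.
import math
orders = math.log10(1e51 / 1e9)   # orders of magnitude to cover: 42
print(orders * 5)                 # 210 -- call it "about 200 years"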

Re:Reversibility and Thermodynamics (2)

d_lesage (199542) | more than 13 years ago | (#132960)

For example, If you add 43 to a register, you can subtract 43 from that register and get your energy back

Can you imagine:

"Oh man. I had a bug in my program, and accidentally substracted more from my registry that what I originally added. Now my computer is literally frozen. Does anyone have an icepick?"

This is not a real distinction (1)

Flat5 (207129) | more than 13 years ago | (#132963)

You are assuming that we know all the laws of physics. Now, we certainly do know the laws of physics in a more enduring way than we know how to engineer microchips, but to say that we know all the laws of physics is silly. It amuses me how physicists have claimed to be "almost there" forever.

For all you know, there are states "below absolute zero" that we just haven't created, measured, or conceptualized yet. So yes, you're right, it is like saying that -- and there's nothing wrong with such a statement at all.

I believe it was Einstein who said that "science" is nothing more than the rationalization of technology.

Bob

Re:Limit on Operations (1)

aburnsio.com (213397) | more than 13 years ago | (#132967)

I noticed his paper addressed quantum state change limits, apparently (without saying it) limited to fermions. It doesn't seem that his analysis would apply to bosons as well, particularly photons. His paper seems to say that a given type of computer, a fermion computer, has these limits, but what about a boson computer?

Also, I'm wondering if his article has been peer-reviewed. Intersecting physics and computation has had its problems in the past, especially interpreting the results. Even among experts there are disagreements about what exactly these results mean. I would also have liked to see more tables of comparative results of different sizes and speeds of theoretical computers and graphs of the speed and size tradeoffs. He also gives neutrinos as an example of particles having zero mass, even though they are now known to have a small but finite mass.

And as a caveat, which should have been expressed in the article, the limits given are dependent on our current understanding of physics within the system given (the ultimate laptop), and further discoveries could of course alter these results. Even given current physics, the interpretation of the equations is very speculative and a matter of much debate.

Re:Will our need ever push it this far? (1)

aburnsio.com (213397) | more than 13 years ago | (#132968)

Why do people always get stuck on this "it's powerful enough now" bent? You can always add full 3D physics simulation to your word processor; not that you need it, but then again, do you really need color TV or a Palm Pilot? I think you could have best summed up your reply:
"640K is all anyone will need for memory." -- Bill Gates, early 80s.

until the next computer revolution (3)

gergi (220700) | more than 13 years ago | (#132969)

The author indicates that computing is limited by quantum mechanics and that we have quite a while (many, many years) until we reach that limit. Well, I suspect that many, many years in the future, researchers will have found yet another way to perform 'computer processing', faster and more efficient than quantum processing.

Knowledge is unlimited (2)

alen (225700) | more than 13 years ago | (#132973)

Every year we seem to think we know everything there is to know about physics, biology, and every other science. We are convinced that our current theories are laws of nature. And every year some discovery shatters that belief in a given discipline.

Vogons! (1)

morie (227571) | more than 13 years ago | (#132976)

The ultimate limit on computing power was long ago determined to be related to the number of Vogon ships passing through the area building hyperspace bypasses.

Re:Moore's Law vs Physics (1)

SlippyToad (240532) | more than 13 years ago | (#132978)

I just didn't like the way he trash talked one observation while relying on the "inherent truths" of another.

If it were "Moore's Theory of Computational Progress" and some sort of cause were assigned to why microprocessor power must double every eighteen months, it would be like a law of physics. But without a cause, it remains just an observation of what has happened so far, not what must happen. A nuclear war, or a worldwide strike (or alien abduction) of microprocessor engineers, would interrupt the steady progress of Moore's Law.

Re:Knowledge is unlimited (1)

SlippyToad (240532) | more than 13 years ago | (#132979)

And every year some discovery shatters that belief in a given discipline.

When our beliefs were based on religious dogma or vaporous philosophies, this was true. Aristotelian and Platonic notions of the laws of physics were basically pulled from their collective rectums. Opposing beliefs were frowned upon in principle, and dismissed because they were not of the correct origin. When the tools of science were re-introduced, many of these fundamental myths and misconceptions were easily debunked. This process continues today. Because science is a self-correcting process, sometimes an erroneous conclusion will get introduced through the application of experiment and research, and persist until it is debunked. But more frequently what happens is that a previous model or theory is expanded rather than exploded. Newton's gravitational laws are still valid. Einstein's General Relativity is just a more detailed model and a lot more complex to use. The Moon shot was calculated using Newtonian physics, I seem to recall. There was no need for the precision of Einstein's equations.

Stephen Hawking speculates in A Brief History of Time that we are approaching a complete model of physics. He points out that this was also believed 100 years ago and was false. However, it is not valid to say that this will always be false. The laws of physics appear to be stable and have not been measurably changed or revoked. What has changed is the level of detail and precision with which we can measure them.

There are a huge number of people who subscribe, as you apparently do, to the idea that science doesn't really know anything because it keeps changing the rules. This often leads to the absurd conclusion that there are no real limits to what we can do, and that someday warp drive and time travel and all manner of other things will be possible. There is also a tendency to generalize "scientists" as an amorphous mass of people who all know each other and speak with a single voice. The real proof lies in your ability to absorb and critically analyze what you see in front of you. If you aren't capable or willing to do that, prepare to remain ignorant, and prepare for legions of people to tell you how wrong you are. Science is really just the application of logical discipline to the observation of facts. A non-scientist such as myself is just as capable of discovering fallacies in the analysis, if I'm willing to take the extra effort to do so. Having done that, and read this article, I can say with some certainty that I buy the conclusion. It's a well-presented conclusion. Despite the amount of math, the premises are simple:


A computer performs calculations by changing the state of bits which represent its data

The limiting factors of such a device are the maximum rate of change, the maximum amount of storage, and the maximum speed of data transmission between the components

Those things being equal for all computers, the maximum capabilities of a computer are limited by the speed of light and the density of matter.

A discovery that any one of the underlying physical laws was untrue would indeed be a revolution in physics. However, the last such revolution was remarkable only because it was the first time someone applied a consistent procedure of observation and experiment to said laws. That consistent procedure now only leads to expanded, more detailed laws of physics, not different ones.

Re:Moore's Law vs Physics (1)

SlippyToad (240532) | more than 13 years ago | (#132980)

Hence why we no longer believe newtonian physics to be accurate.

They're perfectly accurate. Just not as accurate, or as encompassing, as Einsteinian physics. It's a matter of what tool you want to use. A nail can be pounded into concrete with the head of another nail, if you have the time and patience. Or you can use a hammer. Or you can use your forehead. But you will probably get the best results with the hammer.

Re:How many angels on a pinhead? (2)

SlippyToad (240532) | more than 13 years ago | (#132981)

These kinds of articles remind me of the futile medieval debates over how many angels can dance on the head of a pin

Of course, the real futility of the argument is that there is no way to demonstrate the existence of angels, let alone count their numbers. Subatomic particles, however, leave visible tracks in the cloud chambers of particle accelerators. Though it's impractical to count them, it's not impossible; and their behavior, while inherently uncertain, is not entirely random, and can be predicted to a degree.

Doesn't this only apply to Digital Processing? (1)

Rager-vs-Machine (241119) | more than 13 years ago | (#132982)


I think this article only applies to Digital Processing. If future engineers can more effectively harness Analog Computation, then the upper limits of computational power increase substantially.

Who says the future will be digital? We sure as hell ain't digital processors....

That's great - We've defined the upper limits of digital processing and storage, but what about analog??

*BOOM* (5)

Bonker (243350) | more than 13 years ago | (#132984)

"What was that?"

"Ah, just another script kiddie trying to DOS the database."

"I don't understand. He just upped and exploded."

"Yeah, his quantum computer heated up to the temperature of a supernova and then collapsed in on itself like a black hole. Happens all the time."

"Really?"

"You should see it when they try to encode movies with DivX!"


the ultimate quantum computing (1)

kipple (244681) | more than 13 years ago | (#132985)

- have you seen my new quantum uber-laptop?
- no I cannot see it.
- of course. otherwise I wouldn't have it.

---

- Where is my quantum computing?
- It's here. It isn't. It's here. It isn't. It's here. It isn't. It's here. It isn't.

Re:Ok (1)

dswensen (252552) | more than 13 years ago | (#132986)

Yeah, but Quake XV]|['s framerate will still be too damn slow.

Will our need ever push it this far? (2)

baptiste (256004) | more than 13 years ago | (#132987)

While corporate computing needs are never satisfied by today's fastest machines, for the bulk of small businesses and homes, at what point have we reached overkill? I mean, look at Windows 3.1 compared to XP. Besides the web browser and related stuff (Media Player, etc), what has really changed? I mean -- what is it USED for? Most people use them for games, personal finance, letter writing, general record keeping, etc. Think about the programs most people use on a regular basis. Web browser, office suite, finance package, MP3 player, etc.

All Microsoft and other OS developers seem to be able to do is add lots of features that never REALLY get used, and a few that do make high-impact improvements. But do smart tags, the start menu, right-click context menus, etc. really require massive improvements in processor speed?

I can't help but think that Win2K on my Pentium III 700 laptop is using the bulk of the resources just to RUN, versus the load placed on it by any apps I'm using. That makes no sense.

So that begs the question: the whole idea of Linux from the start was a free Unix that ran well on OLDER (cheaper, widely available) PCs. Even today that is still true. So if Linux continues to be accepted and moves into the desktop mainstream someday -- will that affect the push on PC technology?

It's striking that for less than an Apple I cost in 1977, I built a 1 GHz Athlon server with the latest gadgets (SCSI RAID, LCD-monitored drive sleds, PC133 SDRAM, etc). A PC with this much power is staggering -- even compared to boxen from a year or two ago. But do I really NEED that much power? Not really, CPU-wise, but it didn't make sense to save $20 and get 200 fewer MHz when AMD at the time was selling the 1 GHz Athlon as its SLOWEST CPU.

We all know that no matter what Intel & AMD come up with, Micro$oft can overload it with a bloated OS upgrade that gains you squat. But in the world of real OSes that treat system resources as something to be used sparingly, when will enough PC power be enough for the bulk of users (corporate flunkies, personal PCs, and small businesses)? When will we see a split between what is used for servers and what is used in desktop PCs? Today, the latest CPUs are showing up in desktops almost at the same time they go into servers (Xeon excluded, but even there it's getting blurrier).

Just like always, it'll be amazing to see where we are 5 years from now, but I just can't imagine I'll be using a 3 GHz desktop PC running RedHat 12.x that probably cost me $1000 :) It boggles the mind much more than the limits physics places on signal transmission on the dies.... :)

Re:Will our need ever push it this far? (2)

baptiste (256004) | more than 13 years ago | (#132988)

No, because I never said Gates's statement was stupid because it was generalized. My point was that the majority of users at home and in small businesses hardly tax their computers with what they USE; it's the bloated OS using most of the CPU time. The average person using AOL to surf the web and pay bills is like a 90-year-old man driving a Porsche 30 MPH on the freeway. Since when did AIM require a massive CPU that could design a nuclear missile?

And I was shooting more from the point that if Micro$oft spent more time trying to streamline their OS versus bloating it with more Justice-Dept-attracting 'features', the user's experience might be better, and they might be able to get cheaper PCs suited to their NEEDS 5 years down the road.

Re:Ok (1)

Pogue Mahone (265053) | more than 13 years ago | (#132991)

That is the equivalent of 542,580,000,000,000,000,000,000,000,000,000,000,000,000 1 GHz CPUs.

Don't worry, Bill's already got it covered. Windows 2200 will use every one of those GHz computing what the brainsaver (the cranial-implant version of the screensaver) will show next.



--

Re:Knowledge is unlimited (1)

dNil (308743) | more than 13 years ago | (#132994)

I suppose I'm not the only one, but I always get a bit excited when someone claims a certain subject has reached its theoretical limit of knowledge. Reading Murphy and Kuhn, one almost has to agree that a change of paradigm is imminent...

Knowledge being unlimited is probably something that the guys who wrote the article at hand might argue against. The theoretical limit of knowledge could of course be estimated using the same approach as for the 1 kg laptop. You only need to find a rough estimate for the mass and volume of this and any closely-interacting parallel universes -- and voila!

Black hole laptop computing (2)

dNil (308743) | more than 13 years ago | (#132995)

The idea of black-hole computing is obviously heavy, but the requirements on a heat sink capable of handling the matter-energy conversion of one kg are staggering.

Overclocking might of course not be strictly necessary, considering the effects of general relativity.

Staggering might also describe the investment costs of setting up a new singularity for each calculation, given the obvious difficulty of interactivity once a Schwarzschild barrier is in place.

One must, though, admire the article's authors, not only for their interesting essay, but also for the courage involved in imagining the presence of a disappearing black hole in one's lap.

Wrong (2)

sharkticon (312992) | more than 13 years ago | (#132996)

His "calculation" is nothing more than a change of state in a quantum system. In real life, any calculation is likely to involve something more complex than this - the time taken for a single change of state is the theoretical minimum time for a single operation.

No matter how the machine works, it must involve state changes in order to have calculation of any kind. Barring completely new physics involving something other than normal matter, his calculation is correct.

*sigh* (2)

sharkticon (312992) | more than 13 years ago | (#132997)

He's talking about the theoretical maximum limit of processing power, not what is actually achievable. Even in the article he says that there are good reasons for using less than this, and practical concerns like architecture don't come into it at all.

It's not bad science at all, it's theoretical science.

Backwards reasoning (2)

sharkticon (312992) | more than 13 years ago | (#132998)

Physics, however, is man made. In your own counter argument you said Moore's observation applies to technology and knowledge, two inherent ingredients to physics.

To use a phrase: bollocks. Physics is inherent to the Universe, and is independent of what we know about the Universe and how we are able to manipulate it. Obviously, our knowledge of physics changes, but the underlying principles remain the same.

As our technology and knowledge grows so does our ability to penetrate to the "underlying truths of nature". Hence why we no longer believe newtonian physics to be accurate.

But they are still accurate; we just now know they are only accurate within a certain domain (speeds much less than the speed of light, low masses). What the author is talking about is how the fundamental physical laws of this Universe constrain processing power. Quantum mechanics (the basis of this article) is undoubtedly not the whole picture (which is why superstrings are the focus of such intense research), but in its domain it is correct, and so are the observations made in this article.

To exceed the limitations described here we will have to do our processing in some other domain - perhaps if we recreate conditions at the very start of the Universe when it was still 10/11-dimensional then we can harness additional computing power, but that wasn't what the article was talking about.

Not really (5)

sharkticon (312992) | more than 13 years ago | (#133000)

Every year we seem to think we know every thing there is to know about physics, biology and any other science.

You don't know many scientists do you? :)

If your assertion is true, then why would they bother doing it? If there was nothing left to know, then there would be no point in being a scientist, and no new research projects coming up.

We are convinced that our current theories are laws of nature.

The term "law of nature" is pretty loaded, and I doubt it would apply in many cases. And even then, such laws aren't universal. Consider Newton's "laws". Although they're called such, they're only applicable in certain domains (speeds much less than that of light, relatively low masses) and are only approximations to relativity. Similarly, our current physical theories (general relativity and quantum field theory) are only approximations to some higher theory which contains both. No scientist is convinced what we have now is the final "law of nature".

And every year some discovery shatters that belief in a given discipline.

I'll admit there have been, and probably always will be, some pretty amazing new discoveries that do come as a big surprise, but shattering belief? I think not. If anything, they often serve to spur on research in the various fields.

Whilst scientists can easily be as guilty of hubris as anyone else, you're portraying them in a far worse light than they deserve, IMHO.

Re:the assumptions seem wrong... (1)

the_pres (313362) | more than 13 years ago | (#133001)

Well, an atom of uranium weighs as much as 235-238 atoms of hydrogen, i.e. about 10^2 times more. That means 10^25 hydrogen nuclei have roughly the same mass as 10^23 uranium nuclei, so the article's figure is right to within a couple of orders of magnitude either way.
It's correct to speak of current and resistance, but the article is trying to find a theoretical limit, and (in a theoretical lab) one could build a superconductive computer (R more or less zero) and power it with millions of amps. It's safer to say that something cannot contain more energy than its rest-mass energy (by definition!).
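A back-of-the-envelope check of that count, in Python (a sketch assuming pure hydrogen-1 and pure uranium-238 as the two extremes; the article's 10^25 then falls between the two cases):

    # Count the nuclei in 1 kg of a pure element from its molar mass.
    AVOGADRO = 6.022e23  # nuclei per mole

    def nuclei_per_kg(molar_mass_g):
        """Nuclei in 1 kg of an element with the given molar mass (g/mol)."""
        return (1000.0 / molar_mass_g) * AVOGADRO

    n_hydrogen = nuclei_per_kg(1.008)  # ~6.0e26 nuclei
    n_uranium = nuclei_per_kg(238.0)   # ~2.5e24 nuclei

    print(f"H: {n_hydrogen:.1e}, U: {n_uranium:.1e}")
    print(f"ratio: {n_hydrogen / n_uranium:.0f}")  # ~236, i.e. about 10^2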

Anyhow, I hope that they will NOT install Win3K on a PC that can convert mass/energy. Just one BSOD and... whooosh! you're history (and your town too).

*SIGH* (1)

TheAwfulTruth (325623) | more than 13 years ago | (#133002)

That's still bad theory. Good theory would have taken the mass of the contents of the battery case and converted that to energy. The original comment is right. You cannot "theoretically" consume the processor when calculating the limit of power available to the processor!

A better version here: (1)

sakusha (441986) | more than 13 years ago | (#133004)

I read a much better summary of this topic, reviewing the same paper, on the NYTimes. Alas, it is now in the paid archives, so if anyone is interested in paying the $2.50 for the article, here it is: http://search.nytimes.com/plweb-cgi/fastweb?view=site&TemplateName=hitlist_MPoff.tmpl&dbname=unify&sorting=BYRELEVANCE&numresults=10&operator=AND&simplesearch.x=10&simplesearch.y=10&query1=thedbs%3Dpast365days%26section%3DALL%26fields%3DALL%26thequery%3Dultimate%2520laptop&query8=from%20the%20past%20year&query7=ultimate%20laptop&query=(ultimate%20laptop)%20AND%20(20000623=pdate)&query_rule=($query)

Not good science (2)

MajorBurrito (443772) | more than 13 years ago | (#133005)

I had a real problem with the science behind the article. It states:

The maximum energy an ultimate laptop can contain is given by Einstein's famous formula relating mass and energy: E = mc^2. Plugging in the laptop's mass, the speed of light, and combining the result with Eq. 2 tells us that the maximum number of operations per second a 1 kg lump of matter can be made to perform is 5.4258 * 10^50. This means that the speed of a computer is ultimately limited by the energy that is available to it.

What he's actually saying is that you are converting the mass of the computer to energy in order to power it. So what part do you convert first? The screen? The RAM? The case? Not to mention that you have to have some way to funnel the energy into the computer without loss - it reminds me of the "massless ropes" and "frictionless pulleys" of a first-semester physics class.

Sorry folks, this article is misleading. We're going to be stuck with batteries for some time to come.
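Incidentally, the quoted figure itself checks out arithmetically. A minimal sketch in Python, assuming the article's Eq. 2 is the Margolus-Levitin bound (at most 2E/(pi*hbar) operations per second for energy E), which is what the underlying paper appears to use:

    import math

    HBAR = 1.0545718e-34  # reduced Planck constant, J*s
    C = 2.99792458e8      # speed of light, m/s

    mass_kg = 1.0            # the "ultimate laptop"
    energy = mass_kg * C**2  # rest-mass energy, ~9.0e16 J

    # Margolus-Levitin: max operations per second = 2E / (pi * hbar)
    max_ops = 2 * energy / (math.pi * HBAR)
    print(f"{max_ops:.4e}")  # ~5.4256e+50, matching the article's 5.4258 * 10^50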

Moore's Law vs Physics (1)

sup4hleet (444456) | more than 13 years ago | (#133009)

I have a big problem with the first two paragraphs. In the first paragraph Geon states:

"It pays to keep in mind, however, that this isn't really a law, but only an observation, and does not reflect any underlying natural truths. "

In the second paragraph he states:

"What we know is that any future technologies must obey and be constrained by the laws of physics."

I'd just like to say that I don't know that tomorrow's computing will be constrained by today's physics. This guy attacks Moore's law as being merely an observation, but what the hell is physics? It's a bunch of theories that try to explain observations! If no one ever observed anything, then there wouldn't be any physics. The moore (pardon the pun) observations we make, ideally, the closer to "the underlying truth" we get. Who knows, maybe Moore's law points to an underlying truth in the drive of humankind that will actually constrain physics and change forever the way we progress and develop as a society. I'm not trying to troll here, I just didn't like the way he trash-talked one observation while relying on the "inherent truths" of another. Who made him god?

Re:Moore's Law vs Physics (1)

sup4hleet (444456) | more than 13 years ago | (#133010)

You are correct to a certain extent: technology and knowledge are man made, nature is not. Physics, however, is man made. In your own counter argument you said Moore's observation applies to technology and knowledge, two inherent ingredients to physics. As our technology and knowledge grows so does our ability to penetrate to the "underlying truths of nature". Hence why we no longer believe newtonian physics to be accurate. I'm just pointing out that it _MAY_ be the case that Moore's law will constrain physics, instead of physics constraining Moore's law (something the author doesn't even consider).

Technology does have the ability to alter how we interpret observations about nature. =)

SciFi/Science get closer (2)

pacman on prozac (448607) | more than 13 years ago | (#133014)

His comments about the computer operating in a black hole were interesting. Similar to the Minds in the Iain M. Banks Culture series of books, which are basically supercomputers with the outer shell in real space and the "cpu" in hyperspace...


It's also worth noting that our two main theories, relativity and quantum mechanics, don't work together, in that they cannot both be correct. Since a new theory to bring them together is being looked for, I personally don't believe quantum computers are the limit.

Re:the assumptions seem wrong... (1)

shawnseat (453587) | more than 13 years ago | (#133015)

quote: "One kilogram of ordinary matter contains approximately 1025 nuclei" maybe i'm wrong but wouldn't the mass of a given number of nuclei be a function of what type of nuclei they are? for instance a uranium nucleus would be much heavier than a hydrogen nucleus.

The approximation is the 25 in the exponent: 25 plus-or-minus one gives you a factor-of-100 range; normal matter spans just a little more than that (mass numbers 1 to 244, Pu-244 being the heaviest nucleus naturally occurring on earth).

Research in progress (1)

carambola5 (456983) | more than 13 years ago | (#133016)

UW-Madison is actually working on something similar [wisc.edu] . Sure, it's no black hole laptop, but it is quantum computing. Remember... It's not paranoia if they're really after you.

Umm.. sorry but I had to say it... (1)

tempestdata (457317) | more than 13 years ago | (#133017)

Imagine a Beowulf cluster of those!!! ... Wait, there you go! I solved the problem! When one quantum computer is no longer enough, we link them up into clusters!! Whoopee! *goes running to the patent office*

Re:Ok -- but Moore's law gets us there fast enough (1)

Zac Thompson (460069) | more than 13 years ago | (#133020)

Actually, for normal matter he estimates about 10^40 operations per second. Right now we're at approximately 10^9 (1 GHz), so that gives us a factor of 10^31 to increase. That's about 2^103, which Moore's law (doubling every 18 months) equates to about 155 years.

I don't know about you, but I'm planning on still being alive at that time! I don't like the idea that I'm not going to be able to upgrade my neural implants any further when I'm 182.
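The arithmetic is easy to check; a minimal sketch in Python, taking the parent's figures (10^9 ops/sec today, 10^40 as the ordinary-matter limit, one doubling every 18 months) at face value:

    import math

    current_ops = 1e9  # ~1 GHz today
    limit_ops = 1e40   # article's estimate for ordinary matter

    # Doublings needed to close the gap, at one doubling per 1.5 years.
    doublings = math.log2(limit_ops / current_ops)  # ~103
    years = doublings * 1.5                         # ~155

    print(f"{doublings:.0f} doublings -> {years:.0f} years")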

the assumptions seem wrong... (1)

progbuc (461388) | more than 13 years ago | (#133021)

quote: "One kilogram of ordinary matter contains approximately 1025 nuclei" maybe i'm wrong but wouldn't the mass of a given number of nuclei be a function of what type of nuclei they are? for instance a uranium nucleus would be much heavier than a hydrogen nucleus. so what extactly is "ordinary matter"? quote: "The maximum energy an ultimate laptop can contain is given by Einstein's famous formula relating mass and energy: E = m c2." wouldn't the amount of evergy available be based on the voltage supplied to the computer and the resistance of its circuits? so wouldn't the relevant equation be V=IR, not E=MC2?

Re:the assumptions seem wrong... (1)

progbuc (461388) | more than 13 years ago | (#133022)

Whatever the power source of choice may be in the future for quantum computers, I guarantee that it won't be the computer's own mass... quantum computers may not be powered by batteries, but they won't eat themselves... my point is that if the speed of the computer is based on the energy it has, then to find its fastest possible speed you would have to assume that it could use all the energy in the universe.

Re:the assumptions seem wrong... (1)

progbuc (461388) | more than 13 years ago | (#133023)

lol. Hopefully Win will be long gone by then.

I don't understand why, if you are calculating the ultimate computer, you would limit its energy... if its speed is based on the energy provided to it, wouldn't the ultimate computer be able to harness all of the energy in the universe?

Re:the assumptions seem wrong... (1)

progbuc (461388) | more than 13 years ago | (#133024)

I understand, but my point is that my computer has an outside energy source, so why can't the ultimate computer have one too? In fact, the truly ultimate computer would be able to use all the energy in the entire universe.

Scientific American (1)

ziggy_zero (462010) | more than 13 years ago | (#133026)


Hey, pick up this month's issue of SciAm. It has a cool article on the physical limits of supercomputers. A good read.

Re: Divide more (1)

TheBigDinK (462123) | more than 13 years ago | (#133027)

Well, to nitpick, when you refer to the maximum number of "operations" in this sense, I think it means a 0-1 transition rather than a multiply. =)

So in the worst case, where every "transistor" on your chip changes state, you'd have to divide that by a few more million/billion/whatever.

I think the article mentioned something about 10^35 operations on each of 10^16 bits. So maybe it meant the CPU state is defined by 10^16 internal bits (or transistor states) and you could run that at 10^35 Hz.

I realize those numbers might be wrong... I'm just using them off the top of my head. I read it before lunch.
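Those half-remembered figures do at least multiply out to the right order of magnitude; a trivial sanity check (the numbers are the poster's guesses, not the paper's):

    per_bit_rate = 1e35  # ops per second per bit, as remembered above
    bit_count = 1e16     # internal bits, as remembered above

    total_ops = per_bit_rate * bit_count
    print(f"{total_ops:.0e} ops/sec")  # 1e+51, same order as the ~5.4e50 total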

Re:This is not a real distinction (1)

pi_3 (462127) | more than 13 years ago | (#133030)

>"That's like saying something can go slower than 0 mph."
Things can go slower than 0 mph... we just haven't learned how to do this yet. If you can control time, then you can control space and energy. Therefore, if in theory we do learn to control time, there are no limits to computational power.