
IBM Scientists Measure the Heat Emitted From Erasing a Single Bit

timothy posted more than 2 years ago | from the part-of-a-healthy-weight-loss-program dept.

ananyo writes "In 1961, IBM physicist Rolf Landauer argued that resetting one bit of information — say, setting a binary digit to zero in a computer memory regardless of whether it is initially 1 or 0 — must release a certain minimum amount of heat, proportional to the ambient temperature. New work has now finally confirmed that Landauer was right. To test the principle, the researchers created a simple two-state bit: a single microscopic silica bead held in a 'light trap' by a laser beam. (Abstract) The trap contains two 'valleys' where the particle can rest, one representing a 1 and the other a 0. It could jump between the two if the energy 'hill' separating them is not too high. The researchers could control this height by changing the power of the laser, and could 'tilt' the two valleys to tip the bead into one of them by moving the physical cell containing the bead slightly out of the laser's focus. By monitoring the position and speed of the particle during a cycle of switching and resetting the bit, they could calculate how much energy was dissipated."
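Landauer's bound is simple to compute: erasing one bit must dissipate at least kT ln 2 of heat at ambient temperature T. A minimal sketch in Python (the helper name is ours, not from the paper):

```python
import math

# Boltzmann constant in joules per kelvin
K_B = 1.380649e-23

def landauer_limit(temperature_kelvin):
    """Minimum heat (in joules) dissipated by erasing one bit
    at the given ambient temperature, per Landauer's principle."""
    return K_B * temperature_kelvin * math.log(2)

# At room temperature (~300 K) the limit is on the order of 3e-21 J.
print(landauer_limit(300.0))
```

At 300 K this works out to roughly 2.9 zeptojoules per erased bit, which is the scale the experiment had to resolve.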

111 comments


one thing we know for sure (-1, Troll)

Anonymous Coward | more than 2 years ago | (#39316749)

You can bet none of these scientists are black. Blacks really dont contribute much to society. Collectively they are a liability really.

Re:one thing we know for sure (4, Funny)

Black Parrot (19622) | more than 2 years ago | (#39316775)

Wonder how much heat is dissipated when you mod a post down?

Re:one thing we know for sure (-1, Flamebait)

lostmongoose (1094523) | more than 2 years ago | (#39316813)

I dunno. I generally don't waste mod points on ACs.

Re:one thing we know for sure (0)

Anonymous Coward | more than 2 years ago | (#39316821)

Bitch, I'm flowin' straight from the survival scrolls!

Re:one thing we know for sure (1)

Anonymous Coward | more than 2 years ago | (#39316831)

None as it contains no information.

Re:one thing we know for sure (1)

BenJCarter (902199) | more than 2 years ago | (#39316917)

Millions of bits worth...

Re:one thing we know for sure (0)

Anonymous Coward | more than 2 years ago | (#39318847)

1.21 picowatts

Re:one thing we know for sure (2)

K. S. Kyosuke (729550) | more than 2 years ago | (#39318853)

Wonder how much heat is dissipated when you mod a post down?

Less than the heat that is saved by not displaying the down-modded post in millions of basements all over the world.

Re:one thing we know for sure (-1, Offtopic)

Anonymous Coward | more than 2 years ago | (#39316793)

Blacks add nothing to our culture
Subtract from the wealth of our nation
and Divide us against each-other.
They sure can Multiply though.

The Remainder of white people must refuse to Integrate!

Spirit (0)

AlienIntelligence (1184493) | more than 2 years ago | (#39316753)

Was that really the spirit of what Landauer was considering?
Why not measure the computer memory such as he envisioned?

-AI

Re:Spirit (4, Interesting)

Ihmhi (1206036) | more than 2 years ago | (#39316785)

Yeah, it's kind of like a piece of armor being considered arrow-proof, and then you fire an arrow out of a railgun.

I wonder how you would even measure it, though, and distinguish the heat from a bit changing from the ambient heat of drive operation.

Re:Spirit (5, Insightful)

MacTO (1161105) | more than 2 years ago | (#39316807)

It probably reflects the spirit of Landauer's claims. Claims like this depend upon an understanding of physics, which was much more common in computing back when developing new hardware required it. You also have to consider that a variety of different techniques were used to make computer memories back then, so his claims had to be based upon the underlying physics rather than on any particular memory technology. So it is fair game to apply different physical models to prove his claims.

Re:Spirit (1)

msobkow (48369) | more than 2 years ago | (#39318349)

I can appreciate that. But I question the actual relevance of the results, given that the "memory technology" used doesn't resemble anything I've ever heard of being used in a production computer in 30+ years.

The fact that energy would be needed to force a state change should have been intuitively obvious to anyone with even a Grade 12 physics education.

Re:Spirit (2)

HiThere (15173) | more than 2 years ago | (#39319283)

I've seen serious claims that "reversible computation" can be done with no energy input at all. What this doesn't cover, of course, is setting up the initial conditions, or extracting the results of the computation. One requirement is that at the end of the computation, the state of the system should be identical to the initial state.

I must admit that I don't understand either the utility, or the feasibility, of such a system. But there have been serious claims that computation does not, itself, require any energy at all.

OTOH, I don't see this experiment as any proof that all writing of a bit requires a minimal amount of energy. It shows that using THIS technology it requires THAT minimal amount of energy. This is a *far* different statement.

Re:Spirit (1)

Raenex (947668) | more than 2 years ago | (#39323739)

I must admit that I don't understand either the utility, or the feasibility, of such a system.

Wikipedia gives an answer [wikipedia.org] :

"Although in practice no nonstationary physical process can be exactly physically reversible or isentropic, there is no known limit to the closeness with which we can approach perfect reversibility, in systems that are sufficiently well-isolated from interactions with unknown external environments, when the laws of physics describing the system's evolution are precisely known.

Probably the largest motivation for the study of technologies aimed at actually implementing reversible computing is that they offer what is predicted to be the only potential way to improve the energy efficiency of computers beyond the fundamental von Neumann-Landauer limit [2] of kT ln(2) energy dissipated per irreversible bit operation.

[..]

Although achieving this goal presents a significant challenge for the design, manufacturing, and characterization of ultra-precise new physical mechanisms for computing, there is at present no fundamental reason to think that this goal cannot eventually be accomplished [..]"

Re:Spirit (1)

Shavano (2541114) | more than 2 years ago | (#39319341)

No kidding? You DO WORK and ENERGY IS RELEASED? Is anybody surprised to see that Landauer was right? Nobody?

What's surprising is that somebody bothered to verify a result that's obvious to everybody with a basic understanding of physics. If the claim weren't true, the machinery that they used to perform the experiment wouldn't have worked either.

Science publishing is not what it used to be.

Re:Spirit (1)

ultranova (717540) | more than 2 years ago | (#39326307)

What's surprising is that somebody bothered to verify a result that's obvious to everybody with a basic understanding of physics. If the claim weren't true, the machinery that they used to perform the experiment wouldn't have worked either.

Science publishing is not what it used to be.

You are absolutely right. And that's why we have modern technology and, in fact, physics itself: because people began verifying obvious "facts".

Re:Spirit (4, Insightful)

jpate (1356395) | more than 2 years ago | (#39317857)

Landauer's claim was about the relationship between entropy as used in information theory and entropy as used in thermodynamics: specifically, that entropy in information theory is identical to the entropy in thermodynamics. The scientists used this set-up so they could measure a change of exactly one bit (the information-theoretic conception of entropy) while controlling outside heat influences (the thermodynamics conception of entropy), and see if the change in information corresponded to the change in heat as predicted by thermodynamics and information theory.

Without precisely controlling the change in information and precisely measuring the change in heat, the result is much less clear. That's why they used this methodology and equipment. Moreover, as this is empirical evidence for a very general identity between heat and information, the result will hold for computer memory as well.

Re:Spirit (1)

neonKow (1239288) | more than 2 years ago | (#39317893)

In that case, I could have used a mechanical switch to represent 0 and 1 and told you that heat was dissipated. There needs to be a little more to draw a parallel between a random experiment and computer memory.

Re:Spirit (1)

jpate (1356395) | more than 2 years ago | (#39317949)

Computer memory is a bunch of mechanical switches. The point is that they have a lot of sources of heat aside from reductions in the information content of the physical system. The researchers built a switch that was as efficient as possible so the vast majority of heat dissipation could be attributed to changes in the information content of the switch. Real computer memory will have heat dissipation due to changes in information content along with heat dissipation from such things as moving read/write heads.

Re:Spirit (1)

jpate (1356395) | more than 2 years ago | (#39317955)

Additionally, the point isn't just that heat was dissipated, but rather that a specific quantity was dissipated, as predicted by thermodynamics and information theory.

Infophysics will be the new physics (1)

presidenteloco (659168) | more than 2 years ago | (#39319485)

I'm going out on a limb here, not having had the time to study this stuff enough,
but my intuition says that the unification of information theory and physics will yield a great breakthrough in physics.

I take the view that thermodynamics and Shannon information theory are literally about the same thing exactly, not just by weak analogy.

Related factoids:
1. All information is embodied mutual information.
a. It must be embodied in some local configuration of matter/energy.
b. It must be mutual in that the information in some clump of matter/energy is either about itself (the various parts/bits that comprise itself) or the information must be about some other configuration of matter/energy and spacetime somewhere else. Those things somewhere else also got information about the clump during the interaction.

2. clumps of matter/energy gain information about external parts of the universe only by interacting with them (during which bits of (mutual) information are transferred).

3. Light speed (and the Planck length) places a limit on the rate of mutual information transfer across a boundary (of a certain area) in spacetime. (Holographic cosmology stuff?) Q: What is that limit, in bits/second/m^2?

4. It is not just special things like human/slug brains and computers that have information about their surroundings. Every clump of matter/energy, i.e., every non-uniform local configuration of spacetime, has (embodies) such information, which is some function of the interactions that clump has had.

5. Complexity of sequence of interactions over time as clump evolves through spacetime means that the information a clump has about any particular past interaction (or past encountered other thing) is necessarily always decreasing/dissipated/radiated into a larger space over time.

6. Such information about specific past/far off things is also necessarily intermingled (within the clump's boundary) with more and more noise (information about other things). This may be saying the same thing as the "local information" dissipation statement.

7. The second law of thermodynamics is explained by 5. and 6.

8. Information (the amount of local embodied mutual information) is what fundamentally characterizes configurations of matter/energy, space, and time. Other laws of thermodynamics are implied by this. And 1st law, conservation of energy, is the same exactly as saying conservation of (the amount of embodied, mutual ) information in the universe.

Re:Spirit (0)

Anonymous Coward | more than 2 years ago | (#39319221)

What if your switch was already at 0? How would you still turn it off?
RTFA:

say, to set a binary digit to zero in a computer memory regardless of whether it is initially 1 or 0

Stop bullshitting a well-written reply.

Re:Spirit (1)

ShakaUVM (157947) | more than 2 years ago | (#39318081)

>>entropy in information theory is identical to the entropy in thermodynamics

Is there a name for this law?

Also, what does this say about the reality of information itself?

Re:Spirit (5, Insightful)

canajin56 (660655) | more than 2 years ago | (#39319073)

It says that information is disorder. And thermodynamic entropy is (for some definitions of order) order as well. If you have all of the air molecules in a room compressed into the corner, maybe that's ordered? But that's one small lump of air, and a whole lot of vacuum. Evenly distributed air is more ordered because it is uniform.

If you let a system starting in any arbitrary corner-gas configuration (and there are a lot, since each molecule can have any number of different values describing it) progress for X amount of time, you find that you have almost certainly ended up in an even-gas configuration. On the other hand, if you start in an even-gas configuration and progress for X amount of time, you will almost certainly still be in an even-gas configuration. This may seem at odds with the fact that the laws of motion are time-reversible (at least if you assume that molecules are like frictionless billiard balls, as physicists are wont to do). But it's not. If you take some specific corner-gas start A, and run it for X time, you will (probably) have an even-gas configuration B. If you take B, reverse the velocity of all molecules, and run it for X time again, you will be back at A (again, assuming molecules are frictionless billiard balls).

But, with discrete space and velocity, you can count the possible velocity and position vectors. There are a LOT more even-gas configurations than there are corner-gas configurations. So, with a tiny room and only a few molecules, you can establish the chance that after X time starting at even-gas, you end up at corner-gas. And even for very small systems it's basically 0. Entropy is the concept of changes to a system that are not reversible, not because of laws of PHYSICS but because of laws of STATISTICS. The second law is the observation that, by statistics, you will tend toward a uniform (ordered) system because there are a lot of ways to go in that direction, and very few ways to go the other direction.

Landauer's observation is that any computational device, at the end of the day, stores information mechanically (again, I refer you to the fact that for our purposes, subatomic particles are frictionless billiard balls, so even things like the atom-trap from TFA are mechanical devices). So if you have a 32-bit register, it has 2^32 configurations. If you consider how many possibilities there are for ordered sequences of X bit flips, it's 32^X. And if you start at 0, almost all of those flip sequences will take you to a pretty chaotic state. But if you start from a random state, almost none of those same flip sequences will get you to 0. So, treating the system as a completely mechanical one, thermodynamics applies and puts statistical limits on such changes.

What Landauer did is establish a maximum circuit temperature T for your memory/CPU, and observe that you won't want Brownian motion breaking your system, so 0 and 1 need a minimum separation for the system to be useful at temperature T. This puts a lower bound on the state counts, and lets traditional thermodynamics establish a minimum energy dissipation to go from a high-entropy state to a low one (like a zeroed-out register). What information entropy does is take the same thing and say that the disordered information therefore has intrinsic entropy, since regardless of system design it requires a certain minimum entropy to store that information. It's avoidable if your system is reversible, which is possible if you have more ways to represent a bit pattern the more ordered that bit pattern is: you would have fewer ways to store 10010101 than to store 00000000. It's also beatable if you find a way to store information non-physically. But good luck on that front.

Neat, huh? I took a course on Kolmogorov Complexity [wikipedia.org] , which is somewhat related, and pretty cool.

Re:Spirit (1)

subreality (157447) | more than 2 years ago | (#39320953)

Fantastic. Thanks for writing this up.

Re:Spirit (1)

ShakaUVM (157947) | more than 2 years ago | (#39321033)

Excellent response, thanks.

Pretty far afield followup question: every time Work is performed, Entropy increases. Using the Landauer Principle, it seems like you could consider information processing to be a sort of Work being done, leading to a similar increase in entropy. If our conscious minds are a form of information processing engine, could consciousness be a byproduct of the Work being conducted by the information processing, which manifests itself simply as extra heat being radiated by the system?

Re:Spirit (1)

Qzukk (229616) | more than 2 years ago | (#39322349)

It's also beatable if you find a way to store information non-physically.

I think this is what throws everyone when they think about the physics of knowledge. The vast majority of people don't realize that the physical embodiment of information must obey the laws of physics, and even many who do seem to believe knowledge ought to have some form of "soul" not shackled by physical constraints.

Re:Spirit (1)

blueg3 (192743) | more than 2 years ago | (#39320229)

It is in the spirit of what Landauer was considering. The larger question is if information entropy and thermodynamic entropy are related.

Quick and Easy (-1)

Anonymous Coward | more than 2 years ago | (#39316769)

I have one of those handheld laser temperature readers. Open up a running hard drive, point, and bingo!

Re:Quick and Easy (1)

chromas (1085949) | more than 2 years ago | (#39316919)

Yeah but how do you know which bit you measured?

Re:Quick and Easy (1)

jones_supa (887896) | more than 2 years ago | (#39317083)

Duh! You divide the result by the HDD capacity!

What a very very stupid test (-1)

Anonymous Coward | more than 2 years ago | (#39316795)

THERE IS ABSOLUTELY NOTHING YOU CAN DO IN THE WORLD THAT DOES NOT CREATE HEAT. http://en.wikipedia.org/wiki/Second_law_of_thermodynamics#Heat_death_of_the_universe [wikipedia.org]

Re:What a very very stupid test (4, Informative)

Spy Handler (822350) | more than 2 years ago | (#39316817)

except for endothermic reactions

Re:What a very very stupid test (2, Insightful)

Anonymous Coward | more than 2 years ago | (#39317167)

It's not necessarily a stupid test. In terms of science, we can estimate the amount of energy available from various sources, such as a nuclear plant, the total energy of the Earth, our solar system, or the galaxy. Using that estimate, we can put an upper bound on the maximum amount of computational power we have at our disposal: if a certain problem is shown to require X computational complexity, and X exceeds the amount of disposable energy in our solar system, then X is incalculable given current technology.

Now, let X be some sort of encryption complexity. Now do you see how it could be useful?
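That energy bound on computation can be sketched numerically. The solar figures below are rough order-of-magnitude assumptions of ours, not from the comment:

```python
import math

K_B = 1.380649e-23        # Boltzmann constant, J/K
T = 300.0                 # ambient temperature, K
bit_flips = 2 ** 256      # e.g. counting a counter through every 256-bit key

# Minimum energy just to flip a counter through all values,
# at the Landauer limit of kT ln 2 per irreversible bit operation.
min_energy = bit_flips * K_B * T * math.log(2)

# Rough total solar output: ~3.8e26 W times ~10 billion years in seconds.
sun_total_output = 3.8e26 * 3.0e17

print(min_energy > sun_total_output)  # True: brute force is off the table
```

Even ignoring every other cost of computation, merely enumerating 2^256 states at the thermodynamic floor dwarfs the Sun's lifetime energy output by many orders of magnitude.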

Re:What a very very stupid test (0)

camperdave (969942) | more than 2 years ago | (#39317563)

It's not necessarily a stupid test. In terms of science, we can estimate the amount of energy available from various sources, such as a nuclear plant, the total energy of the Earth, our solar system, or the galaxy. Using that estimate, we can put an upper bound on the maximum amount of computational power we have at our disposal: if a certain problem is shown to require X computational complexity, and X exceeds the amount of disposable energy in our solar system, then X is incalculable given current technology.

Now, let X be some sort of encryption complexity. Now do you see how it could be useful?

All the more reason to buy a five dollar wrench [xkcd.com] .

Re:What a very very stupid test (1)

St.Creed (853824) | more than 2 years ago | (#39317827)

Or, in reverse:

Cool down the disk to a point where you can measure the temperature changes really well. Now start the encryption. How much information does the change in temperature of the disk (or SSD, or RAM) give you? Could be interesting.

Re:What a very very stupid test (1)

Cyberax (705495) | more than 2 years ago | (#39317311)

Which require you to put in strictly more energy to prepare the reagents for the reaction than would be consumed by the reaction.

Re:What a very very stupid test (2, Insightful)

Avoiderman (82105) | more than 2 years ago | (#39316983)

Oh it has a law on Wikipedia, must be a waste of time to test or verify it then! Seriously, have a read about how science works before attempting to comment again. A "law" in science is not like a legal law - i.e. it is not a fact merely by self-assertion (a legal law is a law because law makers say so). Scientific "laws" require test and proof; they often require refinement in details. Scientific "laws" do not exist as abstract facts about the universe - they are human attempts to model the universe from the knowledge we currently have. Our limited knowledge means that the detail may be imperfect. A quick survey of the history of science demonstrates that we often get them wrong.

I'm not attempting to challenge the "laws" of thermodynamics - my guess would be that we have the broad picture right (we have a lot of evidence in favour), but again, given the history of science I would be surprised if every detail of taught theory in that area survives the next few hundred years without some modification.

Yes the scientists doing this probably expected some heat to be measured. They were more interested in precisely how much. This is science - an ongoing process.

Re:What a very very stupid test (1)

drinkypoo (153816) | more than 2 years ago | (#39317353)

I'm not attempting to challenge the "laws" of thermodynamics - my guess would be that we have the broad picture right (we have a lot of evidence in favour), but again, given the history of science I would be surprised if every detail of taught theory in that area survives the next few hundred years without some modification.

Having the broad picture right just means you have a working model, though. It doesn't mean you've actually discovered how the universe works, just that you can make accurate predictions. Maybe later it turns out that what happens, happens for a totally different reason than what you thought.

Re:What a very very stupid test (1)

sFurbo (1361249) | more than 2 years ago | (#39317775)

Having the broad picture right just means you have a working model, though. It doesn't mean you've actually discovered how the universe works, just that you can make accurate predictions. Maybe later it turns out that what happens, happens for a totally different reason than what you thought.

Science is all about making predictions, and not about discovering how anything works (formally, anyhow). Or as a physics professor put it: "There are no particles, only clicks in my Geiger counter".

Re:What a very very stupid test (1)

jpate (1356395) | more than 2 years ago | (#39317909)

Additionally, the prediction was a great deal more specific than "durrr it will get more hot." It was more: "the heat will change by this particular amount, relative to the ambient temperature, as predicted by these equations."

Re:What a very very stupid test (1)

jbengt (874751) | more than 2 years ago | (#39318165)

Scientific "laws" require test and proof . . .

You are thinking of scientific theories or hypotheses. Scientific laws are based on observations, but they are not proven. In fact, they are the assumptions and axioms upon which proofs are built.

Re:What a very very stupid test (0)

Anonymous Coward | more than 2 years ago | (#39317321)

Entropy does not mean what you think it does.... and at least in this example neither does the 2nd Law of Thermodynamics.

What am I missing? (1)

SmlFreshwaterBuffalo (608664) | more than 2 years ago | (#39316805)

To store information, you need the ability to set something into at least two possible states, one of which can be the intrinsic state. No matter what you use for storage, you'll always need energy to reach the non-intrinsic state(s), since the intrinsic state is, essentially by definition, the state achieved with no external energy applied.

If you must add energy to enter a non-intrinsic state, it makes perfect sense that the energy would need to be dissipated to return to the intrinsic state (which equates to erasing the bit). I expect something so obvious wouldn't warrant experiments and articles, so what am I missing that makes this more complicated than it seems to be?

Re:What am I missing? (3, Informative)

epte (949662) | more than 2 years ago | (#39316895)

Say you have two valleys named 0 and 1, and a mountain between. Setting our bit by rolling a ball from 0 to 1 would require energy expenditure, but once the ball is in the valley it is stable and won't roll out again without further input. 0 and 1 may be at different heights relative to each other, but need not be. They might even be at the same altitude. But if 1 were higher than 0, then yes, you would be storing energy in some sort of potential energy form, and may be able to recover that energy when coming back to zero. But you cannot expect to recover all the energy it took to push the ball up the mountain. Any energy required to raise the ball above its destination will have been wasted.
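The two-valley picture above can be sketched with a toy double-well potential; the coefficients here are arbitrary illustration values, not taken from the experiment:

```python
def double_well(x, a=1.0, b=2.0):
    """Toy double-well potential U(x) = a*x^4 - b*x^2, with valleys at
    x = -1 and x = +1 (for a=1, b=2) and a barrier at x = 0."""
    return a * x**4 - b * x**2

barrier = double_well(0.0)   # top of the mountain between the valleys
valley = double_well(1.0)    # floor of a valley

# Energy needed to push the ball from a valley floor over the barrier.
# Any energy above the destination valley's floor ends up dissipated as heat.
print(barrier - valley)  # 1.0
```

Lowering the laser power in the experiment corresponds to shrinking this barrier height, and tilting the trap corresponds to raising one valley relative to the other.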

Re:What am I missing? (4, Informative)

FrangoAssado (561740) | more than 2 years ago | (#39317453)

It's theoretically possible to change the state of a bit without spending energy. Here's a dumb example: think of a closed system (so no energy is being gained or lost) consisting of a box filled with oxygen and only one molecule of water. Divide the box in two halves and say a bit is "0" if the molecule of water is in the left half and "1" if it's in the right half. If you wait a while, eventually the bit will flip with absolutely no change in energy. That's a dumb example, but it shows that there's nothing that requires an "intrinsic state" and energy loss when you move away from it, like you described.

The only time energy dissipation is unavoidable (in theory) is when you erase information. That's a strange concept because, usually, we don't think about "conservation of information" in the same sense of conservation of energy, but there's a relation [wikipedia.org] . A little more discussion with more relevance to computing can be found here: http://en.wikipedia.org/wiki/Reversible_computing [wikipedia.org] .

Re:What am I missing? (1)

hellop2 (1271166) | more than 2 years ago | (#39317965)

Would this be true at absolute zero? No? Then probably the system is using some heat.

Re:What am I missing? (1)

jo_ham (604554) | more than 2 years ago | (#39318613)

Yes, because of the zero point energy, since we're using a molecule. The bond has a minimum vibrational energy of 1/2 h*nu when the vibrational quantum number is 0 (ground state), so even when the temperature is 0 K, the bond still has energy and the molecule will still move around.

Re:What am I missing? (0)

Anonymous Coward | more than 2 years ago | (#39318445)

You aren't resetting anything here; you're just letting the system evolve. In your example, you should consider the system in a state X and voluntarily want to set it to state Y. So, you want to pick up the water molecule and put it back where it was in the first place.

Re:What am I missing? (1)

Polo (30659) | more than 2 years ago | (#39319617)

I was going to post something about reversible computing. I found it an interesting concept when I read that Richard Feynman did some work in computation and was a proponent of it. As far as I can tell, the idea was largely ignored.

I think reversible computing would not only be more energy efficient, but from what I understand might make for some interesting debugging, because I think you could run the program counter backward to an error.

You are missing the energy of detection (0)

Anonymous Coward | more than 2 years ago | (#39320225)

Your mechanism would not work, because you wouldn't know when it reset. If you declare it reset at the wrong time, you would get the wrong result.

Re:What am I missing? (1)

SmlFreshwaterBuffalo (608664) | more than 2 years ago | (#39324045)

Changing the state of a bit is not necessarily the same as storing information. To be used for information storage, the system can only move between valid states through external stimuli. If it changes to a different state without external stimuli, then it either doesn't store information or the states are not defined correctly.

The whole point of storing something is to have it maintain its state. If an item is not maintaining a single state, then it's not storing information. And if the item is maintaining the state, then you must apply external stimuli (and therefore energy) to change its state, otherwise it's not maintaining the state, now is it?

Re:What am I missing? (1)

FrangoAssado (561740) | more than 2 years ago | (#39324329)

You're thinking in terms of storing information the way a normal (irreversible) computer does. Not all computation must be done that way; I was describing a specific way that's not like that. Imagine that in the system of my (dumb, as I said) example, the problem being calculated was, conveniently, the equivalent of "in which side of the box will the water molecule be after 3 days". In this case, I have to spend no energy at all to compute that, assuming the box is perfectly isolated from the environment.

Admittedly, that's a contrived (and dumb) example, I'm just showing that you don't have to spend energy to flip bits around as you would in a normal computer. In a real system, to solve real problems, you'd probably want to devise a way to implement a CNOT gate (described in one of the Wikipedia links in my original post) or a Toffoli gate [wikipedia.org] and use that to simulate a "usual" AND gate. Wikipedia shows a toy implementation of a gate like that using billiard balls. In a system like that, the only energy that must be spent (in principle) is the initial velocity of the billiard balls, regardless of how many gates and how long the computation took. That's strictly different from a normal (irreversible) computer, where you must spend energy [wikipedia.org] each time you change a bit.
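The Toffoli gate mentioned above is easy to simulate classically. A minimal sketch, with our own function name:

```python
def toffoli(a, b, c):
    """Reversible Toffoli (CCNOT) gate: flips c only when a and b are both 1.
    Applying it twice returns the original inputs, so no information is erased."""
    return a, b, c ^ (a & b)

# With c preset to 0, the third output computes a AND b, so the reversible
# gate can simulate an ordinary irreversible AND gate.
print(toffoli(1, 1, 0))  # (1, 1, 1)

# Reversibility: applying the gate to its own output recovers the inputs.
print(toffoli(*toffoli(1, 1, 0)))  # (1, 1, 0)
```

Because the mapping from inputs to outputs is a bijection, no bit pattern is ever merged with another, which is exactly the property that lets the Landauer cost be avoided in principle.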

This is just entropy, right? (4, Informative)

global_diffusion (540737) | more than 2 years ago | (#39316815)

I mean, this is demanded by Maxwell's demon, right? You need to expend energy to store information in order to not violate the 2nd law of thermodynamics. Awesome that they measured it, for sure.

Re:This is just entropy, right? (1)

Anonymous Coward | more than 2 years ago | (#39316847)

It is confirmation that scientists and anyone working to promote information are enemies of the universe. Processing information brings forward the end of the universe.

Re:This is just entropy, right? (5, Funny)

MaskedSlacker (911878) | more than 2 years ago | (#39316873)

Quick, you'd better stop thinking to slow the process down.

Re:This is just entropy, right? (2)

Avoiderman (82105) | more than 2 years ago | (#39316933)

Way ahead of you. Stopped this some time ago. Also took out a couple of scientists - and burnt them, so that will help.

(note contents of message may contain unexamined irony)

Re:This is just entropy, right? (0)

Anonymous Coward | more than 2 years ago | (#39316879)

It is confirmation that scientists and anyone working to promote information are enemies of the God. Processing information brings forward the Apocalypse.

Fixed that for you, with a sad undertone because too many people of this planet think the sentences are true.

Re:This is just entropy, right? (1)

Dunbal (464142) | more than 2 years ago | (#39316959)

too many people of this planet believe the sentences are true.

FTFY. I can't see how thinking has anything to do with it at all. Too many people simply refuse to do it.

Re:This is just entropy, right? (1)

Avoiderman (82105) | more than 2 years ago | (#39317135)

too many people of this planet believe the sentences are true.

FTFY. I can't see how thinking has anything to do with it at all. Too many people simply refuse to do it.

And hence they are saving the universe ... Must try to remember this when next debating with the unreasoning - they are doing it for all of us...

Re:This is just entropy, right? (0)

Anonymous Coward | more than 2 years ago | (#39317073)

Asimov's The Last Question [multivax.com] , bitches!

Re:Yes, it's the entropy (5, Interesting)

Anonymous Coward | more than 2 years ago | (#39316923)

Specifically, in the calculation of the Landauer limit, E = kT ln 2, the minimum energy needed to erase a single bit. The interesting thing is that 10^20 bit operations per second dissipate only about a watt. This means that the efficiency of today's computers is just 0.00001%. More details at http://tikalon.com/blog/blog.php?article=2011/Landauer.
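Those numbers are easy to check with a quick back-of-the-envelope script; room temperature (T = 300 K) is an assumption here:

```python
import math

k = 1.380649e-23          # Boltzmann constant, J/K
T = 300.0                 # assumed ambient temperature, K

# Landauer limit: minimum heat dissipated when one bit is erased.
E_bit = k * T * math.log(2)
print(f"kT ln 2 = {E_bit:.3e} J")        # roughly 2.9e-21 J

# At that limit, one watt (1 J/s) pays for a few times 10^20 erasures/s.
print(f"{1.0 / E_bit:.2e} bit erasures per second per watt")
```

So "10^20 bit operations per second for a watt" is the right order of magnitude at room temperature.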

Re:Yes, it's the entropy (3, Insightful)

Kjella (173770) | more than 2 years ago | (#39317649)

Not really that surprising; a silicon atom is about 0.11 nm across and the lattice constant of a silicon crystal is 0.54 nm, which is still way smaller than the 32 nm processes he's talking about. I don't know how many electrons flow down each 32 nm path, but they're between 0.1 nm and 0.000006 nm in diameter depending on what model you use (quantum mechanics makes a mess of this anyway), so it's way more than one. If you want single-electron calculations you'll have single-electron signals: one quantum event and your signal is lost. So the limit is likely to remain a very theoretical one.

The other thing is that this only includes the operation itself: no clock cycle, no instruction pointer, no caching, prefetching, or branching. This is the ideal you could get out of a fixed-function ASIC that only does one thing, not even as programmable as a GPU shader. We already know that there's a significant gain to be had there, but even supercomputers aren't built that specifically to the task. Formulas must be tweaked, models adjusted, and parts must be usable in many computers. We've already seen that a GPGPU can beat a CPU by far on some tasks, but even they aren't close to such an ideal.

If you think about this in encryption terms it's not that much: it says you can at most improve by 23-24 bits. In encryption, many have used the Landauer limit to "prove" there's not enough energy to break a 256-bit cipher by brute force. In some settings I don't think it's that relevant either; in mobile, for example, I think the energy spent on bandwidth will matter more. Want to stream an HD movie? It's not the decoding that kills the battery, it's the 3G/4G data connection. Just like cameras get better but good optics still isn't small, light, or cheap.
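The brute-force argument is easy to reproduce. A sketch, assuming T = 300 K and a solar output of about 3.8e26 W (both assumed typical values, not figures from the paper):

```python
import math

k = 1.380649e-23                     # Boltzmann constant, J/K
T = 300.0                            # assumed ambient temperature, K
E_bit = k * T * math.log(2)          # Landauer minimum per bit operation

# Energy just to cycle a counter through 2^256 states at the limit:
E_total = 2**256 * E_bit
print(f"{E_total:.2e} J to count to 2^256")

# Compare with the Sun's total power output (~3.8e26 W, assumed):
sun_watts = 3.8e26
years = E_total / sun_watts / 3.15576e7
print(f"{years:.2e} years of the Sun's entire output")
```

Even at the theoretical floor, merely enumerating the keyspace takes on the order of 10^22 years of the Sun's full power, which is the usual "proof" the comment refers to.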

Re:This is just entropy, right? (1)

Jimbookis (517778) | more than 2 years ago | (#39316975)

Sure, Leonard Susskind talks about computer memory and entropy in a recent YouTube video, part of a lecture on the holographic principle and the total amount of information in a system. OK, this is sort of a duh moment, but I suppose it's good science to test it anyway.

It isn't working (0)

Compaqt (1758360) | more than 2 years ago | (#39316833)

I'm trying to post some whitespace to decrease the temperature in here, but the lameness filter keeps getting in the way!

Re:It isn't working (1)

Anonymous Coward | more than 2 years ago | (#39316875)

Are you sure you weren't trying to post some lameness, but the whitespace filter kept getting in your way?

Global warming solution? (0)

ehiris (214677) | more than 2 years ago | (#39316949)

Does this mean that we can stop global warming by no longer censoring movies on TV?

Re:Global warming solution? (1)

SeaFox (739806) | more than 2 years ago | (#39317013)

What? How will we decrease global temperatures by making prime-time television more steamy?

Re:Global warming solution? (0)

Anonymous Coward | more than 2 years ago | (#39317383)

Heat of vaporization? [wikipedia.org]

Seems rather convoluted (0)

Anonymous Coward | more than 2 years ago | (#39316981)

... or they could have measured how hot a hard disk got when setting all the bits on the platters to 0? Seems like an easier solution using off-the-shelf products.

What if I store bits as heat? (1, Interesting)

Anonymous Coward | more than 2 years ago | (#39317027)

Let 0s be room temperature and let 1s be somewhat below room temperature. Then to erase the memory I expose it to the room. As it erases the memory will absorb some heat from the room instead of releasing heat.

Not really a practical form of computer memory, but it seems sufficient to disprove Landauer.

Re:What if I store bits as heat? (1)

Avoiderman (82105) | more than 2 years ago | (#39317123)

Interesting.

That the mechanism you select absorbs heat does not mean that no heat was released. Possibly a small amount of heat was released by the state change while the mechanism also absorbed heat. It is not the net effect (overall heat absorbed) that is the critical point here.

I'm not convinced that the thought experiment disproves Landauer (but it's very interesting, thank you AC). I'm no expert in this area, though, so I'd be very interested if someone with deeper knowledge could enlighten us.

Re:What if I store bits as heat? (2)

hankwang (413283) | more than 2 years ago | (#39317151)

Let 0s be room temperature and let 1s be somewhat below room temperature.

Yes, the memory will absorb heat, but that heat comes from the warmer room. You have to consider the total energy of a closed system, and in your naive approach the best you can get is a net-neutral energy balance. The argument is primarily about the fundamental increase of entropy associated with erasing a bit, and thermal equilibration (between a hot and a cold object) definitely represents an increase in entropy.

Re:What if I store bits as heat? (4, Insightful)

Mr Thinly Sliced (73041) | more than 2 years ago | (#39317249)

Your example only erases in one direction.

To be correct, your experiment must complete erasure in both directions (0 to 1, 1 to 0).

As such, I think you'll find going the other direction is radically more difficult to get energy neutral since you'll be trying to keep thermal bleed from happening whilst flipping your bit.

Re:What if I store bits as heat? (2)

jbengt (874751) | more than 2 years ago | (#39318219)

By your own example, it took energy to erase the bit, just that the energy came from the pre-erasure difference in temperature between the bits and the environment. And the end result of the erasure in your example is an increase in entropy for the (assumedly closed) system of the room plus the bits. So, no, your example does not come close to disproving Landauer.

Undoing of the Universe (0)

Anonymous Coward | more than 2 years ago | (#39317049)

That theory that information processing undoes the universe is not new; it exists in occult circles. On the journey from Satan to Seraph we go through changes. When this universe ends it will become something else. Relax; you're in eternity.

Ut oh's (1)

AgNO3 (878843) | more than 2 years ago | (#39317075)

So can we start blaming Google for more global warming yet, switching all those bits?

This gets to the core of the subject (1)

ve3id (601924) | more than 2 years ago | (#39317143)

In 1961, resetting a bit involved passing a huge current through the wires surrounding a toroidal core which represented one memory bit. So to say that it releases heat is ridiculous; it actually consumes orders of magnitude more heat than could possibly be considered in theory or measured in practice.

Re:This gets to the core of the subject (1)

drinkypoo (153816) | more than 2 years ago | (#39317349)

So to say that it releases heat is ridiculous, it actually consumes orders of magnitude more heat than could possibly be considered in theory or measured in practice.

You're misunderstanding the statement. Actually flipping the bit releases heat. Doing the work required to flip the bit also involves the generation of heat, but that heat isn't flipping the bit, and therefore it's not CONSUMED in the process of flipping the bit, just WASTED.

Re:This gets to the core of the subject (1)

Avoiderman (82105) | more than 2 years ago | (#39317661)

it actually consumes orders of magnitude more heat than could possibly be considered in theory or measured in practice.

What? Scientific models of how our Sun works exist, as do measurements of the heat from it. Those numbers are big. I strongly doubt any old, inefficient man-made computing system has yet got anywhere near our Sun.

Could you be writing from some amazingly dangerous alternative universe with a massive energy cost on information? Or am I the victim of a sophisticated troll employing meaningless hyperbole?

Re:This gets to the core of the subject (0)

Anonymous Coward | more than 2 years ago | (#39317783)

1) The *core* surrounded the wires.

2) You really don't understand the article.

answer (1)

Anonymous Coward | more than 2 years ago | (#39317413)

A little bit of heat

Slow erasure? (2)

mattr (78516) | more than 2 years ago | (#39317445)

Does this suggest that energy could be saved, and the Landauer limit approached, by saving up erasures to be done more slowly, perhaps by flipping bits to 0 near the time when they would be flipped to 1 anyway? Also, are there architectures in which flipping a bit in one direction uses less power, or where blocks of bits can be deselected by some pointer instead of actually erased, trading memory hardware space for power usage?

Re:Slow erasure? (3, Informative)

sFurbo (1361249) | more than 2 years ago | (#39317831)

A more practical way of improving efficiency would be to move to reversible computing [wikipedia.org] . However, we are far, far away from the Landauer limit in any practical computers, so this is not what is limiting efficiency.

Unpossible! (2)

Prof.Phreak (584152) | more than 2 years ago | (#39317935)

By monitoring the position [AND] speed of the particle...

Unpossible! Measure one or the other, but not both...

Re:Unpossible! (1)

ShakaUVM (157947) | more than 2 years ago | (#39318121)

>>Unpossible! Measure one or the other, but not both...

Well, kinda.

Re:Unpossible! (0)

Anonymous Coward | more than 2 years ago | (#39319101)

Thanks. I came here looking for the Uncertainty Principle reference; left satisfied.

Re:Unpossible! (0)

Anonymous Coward | more than 2 years ago | (#39319245)

You fail at physics. Position and momentum can be measured simultaneously, with the limiting uncertainty being a fraction of Planck's constant.
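For a sense of scale: plugging a micron-sized silica bead into the uncertainty relation shows why the classical treatment is fine here. The bead diameter, density, and position resolution below are assumed typical values, not taken from the paper:

```python
import math

hbar = 1.054571817e-34        # reduced Planck constant, J*s
radius = 0.5e-6               # m: assumed ~1 um diameter silica bead
density = 2200.0              # kg/m^3: typical fused silica (assumption)
mass = density * (4 / 3) * math.pi * radius**3

dx = 1e-9                     # assumed 1 nm position resolution
dv_min = hbar / (2 * mass * dx)   # dx*dp >= hbar/2  =>  dv >= hbar/(2*m*dx)
print(f"mass = {mass:.2e} kg, minimum dv = {dv_min:.2e} m/s")
```

The bound comes out around 10^-11 m/s, wildly below anything the experiment needs to resolve, so measuring both position and speed is unproblematic for an object this massive.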

Re:Unpossible! (0)

Anonymous Coward | more than 2 years ago | (#39323343)

Good luck convincing the judge that the presence of a speed camera means there's no way of knowing whether your car was on that road.

I'm cold (-1)

Anonymous Coward | more than 2 years ago | (#39318099)

It's a little cold in my house... does this mean it's time to erase my hard disk?

Hot smoke? (1)

Anonymous Coward | more than 2 years ago | (#39318479)

"IBM Scientists Measure the Heat Emitted From Erasing a Single Bit"

All of this seems like a bunch of hot smoke to me. Can't these scientists find something better to do with their pay?

Re:Hot smoke? (1)

dargaud (518470) | more than 2 years ago | (#39318743)

And we all know that if the magic smoke [catb.org] comes out of the electronics, the data is truly lost...

1/2*cp*V^2 (0)

Anonymous Coward | more than 2 years ago | (#39321467)

A memory element is typically a capacitor. It's charged to the memory VDD. The amount of energy released in discharging to zero is 1/2 * c * v^2....

I wonder how much these idiots spent proving that.

Re:1/2*cp*V^2 (0)

Anonymous Coward | more than 2 years ago | (#39322455)

The amount of energy released in discharging to zero is 1/2 * c * v^2

That's nice. Now, read the paper and tell us the equation you use for finding the minimum heat released in erasing a bit from any medium, whether it's an inefficient, noisy electrical capacitor or a single silica bead held in place by a laser.
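For comparison, here's a rough sketch of how far an ordinary storage capacitor sits above the Landauer bound; the cell capacitance and voltage are assumed typical DRAM-ish values, not from the paper:

```python
import math

# Energy dumped when a charged storage capacitor is discharged to zero:
C = 25e-15                    # farads: assumed ~25 fF DRAM cell
V = 1.0                       # volts: assumed cell voltage
E_cap = 0.5 * C * V**2

# Landauer bound at an assumed T = 300 K:
E_landauer = 1.380649e-23 * 300.0 * math.log(2)

print(f"capacitor: {E_cap:.2e} J, Landauer: {E_landauer:.2e} J")
print(f"ratio: {E_cap / E_landauer:.1e}")
```

The ratio is in the millions, which is why measuring the bound itself took a bead in a light trap rather than anything resembling a real memory cell.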

SHOCK: Thermodynamics proven by CS grads (0)

Anonymous Coward | more than 2 years ago | (#39322061)

Suddenly that useless course in physics that was part of my CS degree makes HEAPS more sense...

Finally! Some evidence that Dark Energy = entropy (1)

ArcSecond (534786) | more than 2 years ago | (#39324239)

So in an expanding universe there is a loss of information -- and by Landauer's principle this loss of information should release dissipated energy -- and Gough claims that this dissipated energy accounts for the dark energy component of the current standard model of universe.

There are rational objections to this proposal. Landauer's principle is really an expression of entropy in information systems -- which can be mathematically modeled as though they were thermodynamic systems. It's a bold claim to say this has a physical reality and a loss of information actually does release energy -- and since Landauer's principle expresses this as heat energy, wouldn't it then be detectable (i.e. not dark)?

Well, so much for *that* objection. :)

http://www.universetoday.com/85855/astronomy-without-a-telescope-holographic-dark-information-energy/ [universetoday.com]

designing circuits around this theory? (1)

Pflipp (130638) | more than 2 years ago | (#39324593)

Back in the uni library, I once had an old ('60s?) book in my hands which stated that in a logical AND circuit, combining two '1' bits would also result in heat. The author suggested designing AND circuits so that they would have two outputs: the logical outcome, and an overflow 'exhaust', both connected to the rest of the circuitry. This would keep the processor from generating heat, but might also have more practical, logical uses. (He probably said similar things about other kinds of circuits.)

I thought it was a wonderful book at the time, and wondered if anyone ever tried to work out this man's arguments.

Now I wonder if anyone here is familiar with this? I can't remember the author or anything.

Get ready for the inevitable (0)

Anonymous Coward | more than 2 years ago | (#39325443)

What the article doesn't mention is that IBM has registered this process as a patent. Good luck trying to avoid the licensing fee on that one.
