
Next Generation CPU Refrigerators

samzenpus posted more than 6 years ago | from the just-a-little-cold dept.

Technology 154

Iddo Genuth writes "Researchers at Purdue University are developing a miniature refrigeration system, small enough to fit inside laptop computers. According to the researchers, the implementation of miniature refrigeration systems in computers can dramatically increase the amount of heat removed from the microchips, therefore boosting performance while simultaneously shrinking the size of computers."

eat my shorts slashdot !! (-1, Troll)

Anonymous Coward | more than 6 years ago | (#24313281)

Eat my shorts slashdot !!

Re:eat my shorts slashdot !! (-1, Offtopic)

Anonymous Coward | more than 6 years ago | (#24314229)

Bart! Go to your room!

Excellent (5, Funny)

Anonymous Coward | more than 6 years ago | (#24313283)

The implementation of miniature refrigeration systems in computers can dramatically increase the amount of heat removed from the microchips.

Of course, the next step will be to dramatically increase the heat output of high-end CPUs. Aren't arms races fun?

Re:Excellent (2, Funny)

RuBLed (995686) | more than 6 years ago | (#24313577)

This will pave the way for the Year of Linux on the Desktop! (or Laptop)

with apologies to our aquatic, flightless and mostly cold-loving friend

Re:Excellent (3, Insightful)

Anonymous Coward | more than 6 years ago | (#24313975)

Don't you mean the Year of Vista on the Desktop? A more unattainable goal, and more related to the issue of insane heat generation...

Re:Excellent (-1, Troll)

Anonymous Coward | more than 6 years ago | (#24315481)

No, you suck. And you're not funny.

Re:Excellent (5, Funny)

plover (150551) | more than 6 years ago | (#24313817)

I thought the next step would be to dramatically decrease the size of beer cans to fit in these miniature refrigerators.

Re:Excellent (5, Funny)

Anonymous Coward | more than 6 years ago | (#24314513)

Or dramatically increase the size of beer cans to fit these inside. I'm not interested in "Fun Size" beers.

(Fun for who? Beer companies?)

Re:Excellent (4, Insightful)

The Dancing Panda (1321121) | more than 6 years ago | (#24314663)

Why would you EVER want to DECREASE the size of a beer can?!?!

Blasphemer...

Re:Excellent (2, Funny)

Tubal-Cain (1289912) | more than 6 years ago | (#24314941)

You removed the water.

Re:Excellent (1, Insightful)

Anonymous Coward | more than 6 years ago | (#24315477)

"Of course, the next step will be to dramatically increase the heat output of high-end CPUs. Aren't arms races fun?"

I know you were being cheeky, but the "heat problem" is only a problem when trying to use cheap materials to make CPUs; the "heat problem" can be pushed farther away from you, or left closer, depending on how much you are willing to spend on materials, research, etc.

Personally, I've been thinking a lot about CPU specialization lately. Given the success of 3D add-in cards over the general-purpose CPU, I'm wondering if it wouldn't be in our best interest to simply look at the most processor-intensive functions that can be either 1) parallelized or 2) highly specialized, and offload them to specialized hardware.

I don't think multi-core is going to cut it; it seems to me each processor needs its own memory and bandwidth to do massive calculations, and then send the results to where they are needed.

I'm really wondering if anyone has done any research into the geometry of information-processing functions, i.e. what can specifically be offloaded and what should not be.

Re:Excellent (4, Interesting)

somersault (912633) | more than 6 years ago | (#24316315)

I don't think multi-core is going to cut it; it seems to me each processor needs its own memory and bandwidth to do massive calculations, and then send the results to where they are needed.

While multi-core isn't amazingly effective for doing 'massive calculations' of the variety that scientists usually do (compared to a supercomputer with thousands of nodes, anyway), it is great for general-purpose computing. It definitely helps for everyday use - whenever I use a single-core computer (even with a high clock speed), I notice the difference in responsiveness, especially when booting into Windows and all the system tray apps are loading, or when running lots of applications at the same time. You have to remember that even if you're just running a single application on your desktop, there are plenty of background processes too.

Not that I want to dissuade you from researching more efficient processor methodologies, even if it's only for specific tasks - go ahead :) But when you get down to it, most tasks your average computer user does during the day are neither suitable for parallelisation, nor are they considered highly specialised. I'm just thinking of web browsing, chatting, checking email. Modern games do involve lots of operations that 20 years ago would be considered 'specialised', like 3D sound, graphics and physics processing, but we already have specialised processors for all of these things.

I'm really wondering if anyone has done any research into the geometry of information-processing functions, i.e. what can specifically be offloaded and what should not be.

I don't think you're giving the guys at places like Intel and AMD much credit... if they hadn't thought about stuff like that, then where did the ideas for 'hyperthreading' and different CPU 'pipelines' come from? To me it seems that the main change over the last couple of decades is that we've gone from computers that were mainly designed for integer arithmetic, as far as hardware was concerned, to computers with add-ons for floating-point calculation, and now to units capable of massively parallel floating-point calculations with amazing amounts of memory bandwidth (graphics cards and supercomputers), plus APIs like CUDA that let us do more supercomputer-like things with those graphics cards. I'm not a CPU design engineer though, so the true progression is probably a bit more complex ;)
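
To make the parallelisation point above concrete, here is a minimal Python sketch (not from the thread; the workload and chunk sizes are invented) of splitting a CPU-bound task across cores with the standard library. Once the work is big enough to amortise process startup, the pooled run should finish well ahead of the serial one - whereas a task like web browsing has no such obvious split.

    # Toy "offload the parallelisable part" demo; numbers are arbitrary.
    import time
    from concurrent.futures import ProcessPoolExecutor

    def partial_sum(bounds):
        """CPU-bound kernel: sum of squares over a half-open range."""
        lo, hi = bounds
        return sum(i * i for i in range(lo, hi))

    if __name__ == "__main__":
        n, step = 20_000_000, 2_500_000
        chunks = [(i, min(i + step, n)) for i in range(0, n, step)]

        t0 = time.perf_counter()
        serial = sum(partial_sum(c) for c in chunks)
        t1 = time.perf_counter()

        with ProcessPoolExecutor() as pool:   # one worker per core by default
            parallel = sum(pool.map(partial_sum, chunks))
        t2 = time.perf_counter()

        assert serial == parallel
        print(f"serial:   {t1 - t0:.2f} s")
        print(f"parallel: {t2 - t1:.2f} s")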

Hotter? (1)

MasterThis (903688) | more than 6 years ago | (#24313317)

Won't they dramatically increase the amount of heat they radiate as well?

Re:Hotter? (3, Informative)

treeves (963993) | more than 6 years ago | (#24313325)

Yes, but heat flow != temperature.

Re:Hotter? (2, Insightful)

aliquis (678370) | more than 6 years ago | (#24313537)

Yeah, I don't get this. The heat needs to leave the laptop somehow, and since the refrigerator will have to be within the laptop, the heat remains inside it. Also, since the refrigerator won't be 100% efficient, this will both generate MORE heat and drain more energy from the battery.

Sure the CPU may get colder, but your lap will get warmer. Bad trade I'd say.

Re:Hotter? (5, Insightful)

Bloodoflethe (1058166) | more than 6 years ago | (#24313785)

Yeah, I don't get this. The heat needs to leave the laptop somehow, and since the refrigerator will have to be within the laptop, the heat remains inside it.

The refrigerator's exterior heat-exchanging pipes don't have to be inside the refrigerator itself. They didn't give any technical specs, so what are you worried about? Surely, if they are working on this project, they'll have thought of or run into this problem if they were putting everything in the same location.

Also, consider that, to a point, the ambient heat inside a laptop can be higher, as long as the PUs are kept cool. Of course if this were the only consideration, eventually the ambient heat would screw all the components except for the processors, but, as I said, they've considered this already. I'm sure of it.

Re:Hotter? (0, Insightful)

Anonymous Coward | more than 6 years ago | (#24314075)

You're saying this to a guy who obviously doesn't understand why his eggs aren't frying inside of his kitchen refrigerator.

Re:Hotter? (2, Insightful)

aliquis (678370) | more than 6 years ago | (#24314431)

Well, personally I don't care if my CPU is 45 degrees or 75 degrees as long as my lap isn't 70 degrees.

And the sad fact with my MBP is that it probably is :D (No, it's not, but it's too hot.)
I'd so take a 5 mm fatter computer for better cooling :/

Re:Hotter? (4, Funny)

KGIII (973947) | more than 6 years ago | (#24315625)

they've considered this already. I'm sure of it.

More famous last words have ne'er been spoken.

Re:Hotter? (0, Offtopic)

el_coyotexdk (1045108) | more than 6 years ago | (#24315909)

+1 Funny. Too bad I don't have any mod points ;(

Re:Hotter? (1, Insightful)

Enderandrew (866215) | more than 6 years ago | (#24313801)

Refrigeration systems use compressors, which are big power drags. The battery drain here can not be overlooked.

Refrigeration systems from desktops exist, and they are called water-coolers.

Re:Hotter? (3, Informative)

0111 1110 (518466) | more than 6 years ago | (#24314557)

No they are called phase change systems. Much more expensive than water cooling.

Re:Hotter? (3, Insightful)

txoof (553270) | more than 6 years ago | (#24313897)

Of course the cooling system will use power and generate heat, but that heat won't necessarily be as much as a fan's. A fan uses power to dissipate heat and in the process generates heat. I don't know the proper thermodynamic way to state this, but it's possible to make a more or less efficient cooling system. For example, it would be exceedingly inefficient to use a V8 engine to cool a laptop. It would do a hell of a job of cooling the thing, but it would generate a whole lot more heat and suck down a whole lot more energy than a small electric fan.

This thing might use less power and do a better job of moving heat than a fan. I have no idea if it actually works better. If this device is more efficient than a fan (uses less energy and releases less heat), then it would be superior and would not make a lappy hotter. Otherwise, it's really only good for server applications where the heat can be pumped outside the box that holds the server.

Re:Hotter? (1)

Yetihehe (971185) | more than 6 years ago | (#24315483)

No, your lap won't get hotter, as more heat will be sucked out of the laptop. This means you will be able to use your laptop as a hair dryer. (More heat taken from the CPU means the radiator will be hotter.)

Re:Hotter? (0)

Anonymous Coward | more than 6 years ago | (#24315867)

Why don't you get your laptop covered in styrofoam then? If your lap is getting warm, that heat is no longer in the case of your laptop. Also, you suck.

Re:Hotter? (2, Insightful)

lorenzo.boccaccia (1263310) | more than 6 years ago | (#24316273)

Agreed, the problem remains exchanging that heat with the ambient air, but a refrigerator has two nice properties. The first is that it's very efficient at moving heat from one point to another (imagine using the back of the LCD as the heat exchanger).
The second is the larger temperature difference at the exchanger. With a heatsink you have the CPU at 60 C, the sink at 60 C and the ambient at 20 C, so the exchange happens across a 40-degree difference. With a refrigerator you can have the CPU at 40 C and the dissipation surface at 80 C, so the exchange happens across a 60-degree difference, which moves heat faster; you effectively end up at around 30 C at the CPU and 70 C at the refrigerator's hot end. Because of this increase in heat-exchange efficiency, the refrigerator dissipates more heat per second, leaving the overall system colder. So you could build a bigger heatsink in a more convenient place and exploit the better heat transfer (in the end, there is no such thing as a "refrigerator", only heat pumps).
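
The parent's numbers can be sanity-checked with the usual convection relation Q = h * A * dT: the heat a surface sheds scales with the temperature difference between it and the ambient air. A rough Python illustration (the h and A values below are invented; only the 60 K vs 40 K ratio matters):

    # Same exchanger, two driving temperature differences.
    h = 25.0    # convective coefficient, W/(m^2 K) -- assumed
    A = 0.02    # exchanger area, m^2 -- assumed
    ambient = 20.0

    for label, surface_c in [("plain heatsink", 60.0), ("refrigerator hot side", 80.0)]:
        dT = surface_c - ambient
        q_watts = h * A * dT
        print(f"{label:>22}: dT = {dT:.0f} K, Q = {q_watts:.1f} W")

    # 60 K of driving difference rejects ~50% more heat than 40 K through the
    # same surface, which is the whole trick of pumping the heat "uphill" first.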

How much juice? (4, Insightful)

fyoder (857358) | more than 6 years ago | (#24313337)

And how much electricity will this consume? It may not be that appealing to laptop users if it eats significantly into their battery life. And for servers many colo companies are finding themselves less constrained by space than by available electricity.

Re:How much juice? (5, Interesting)

megaditto (982598) | more than 6 years ago | (#24313593)

Could be pretty damn efficient if it's a heat pump.

A good AC unit usually consumes less than a tenth of the energy it moves (a 1 kW window unit rated for 40,000 BTU/h, for example), but that depends on how much colder the inside needs to be compared to the outside air.

In the case of CPU coolers (cooling things hotter than the ambient air), one could even GENERATE electricity if the size and cost of the "cooler" were not a concern (a thick diamond heatpipe conducting heat away to distant thermocouples is how I would do it).
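
Taking the parent's example figures at face value, a quick unit conversion shows the implied coefficient of performance (real window units are usually quoted closer to COP 3-4, and the COP drops as the required temperature lift grows):

    # 1 BTU/h is about 0.293 W.
    BTU_PER_HOUR_TO_WATT = 0.293071

    heat_moved_w = 40_000 * BTU_PER_HOUR_TO_WATT   # ~11.7 kW of heat moved
    power_in_w = 1_000.0                           # 1 kW of electricity consumed

    cop = heat_moved_w / power_in_w
    print(f"heat moved: {heat_moved_w / 1000:.1f} kW, implied COP: {cop:.1f}")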

Re:How much juice? (1)

NoobixCube (1133473) | more than 6 years ago | (#24313637)

I've always thought a small solar panel on the back of the screen would be a good idea; solar panel technology has been a little limiting for that, though. While it wouldn't be able to power the whole computer, maybe, if the technology is good enough and cheap enough, it could be used for a little refrigeration. I wouldn't want it to go below about 28 Celsius here, anyway, since I live in the tropics. The humidity would condense if the computer were kept much colder.

Re:How much juice? Not as much as a Peltier I hope (1)

danwat1234 (942579) | more than 6 years ago | (#24313851)

Good thing they aren't doing the Peltier route! They'd be wasting their time.

Revolutionary (4, Funny)

Rui del-Negro (531098) | more than 6 years ago | (#24313369)

the implementation of miniature refrigeration systems in computers can dramatically increase the amount of heat removed from the microchips, therefore boosting performance

Really? So my CPU will perform faster if I put it in a refrigerator?

Re:Revolutionary (2, Informative)

Anonymous Coward | more than 6 years ago | (#24313383)

Yes, if you take advantage of the extra heat absorption by overclocking the CPU to run faster.

Re:Revolutionary (3, Informative)

Anonymous Coward | more than 6 years ago | (#24313509)

So it's the overclocking (i.e., increasing the clock frequency) that makes your CPU run faster, not the fact that it's cooler, as the article implies. And some CPUs generate more heat than other CPUs with lower clock speeds, so that relationship isn't a linear one, either.

Also, most modern high-end CPUs can't be overclocked by much, regardless of how cold you make them. The problem isn't heat; the problem is how fast the transistors can switch while remaining in sync. Sure, if you buy a low-end CPU from a high-end "family", you can usually overclock it a lot, because it's basically identical to the high-end models (just set to a lower speed at the factory). But, again, that has nothing to do with temperature, and temperature itself does not have any influence on a CPU's performance.

Re:Revolutionary (0)

Anonymous Coward | more than 6 years ago | (#24313913)

So it's the overclocking (i.e., increasing the clock frequency) that makes your CPU run faster, not the fact that it's cooler, as the article implies. And some CPUs generate more heat than other CPUs with lower clock speeds, so that relationship isn't a linear one, either.

Wow, you're pretty dense.

If a chip can be kept cool, all things being equal, it can run faster at the same temperature (which really is the first limiting factor in modern processor designs). Keeping "transistors in sync" might be a problem, but it is solved up to about 3.8GHz, if not faster (POWER7 apparently breaks 4GHz). My Core2Duo is "just" 2.4 GHz -- it would be over 50% faster at 3.8GHz.
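
For what it's worth, the clock arithmetic checks out: 3.8 GHz / 2.4 GHz is roughly 1.58, i.e. about 58% more cycles per second, so "over 50% faster" holds under the optimistic assumption that performance scales linearly with clock and nothing else (memory, for instance) becomes the bottleneck.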

Re:Revolutionary (1, Interesting)

Anonymous Coward | more than 6 years ago | (#24313993)

You haven't kept up on Core architecture chips then. The cheap ones will make it up to >3 ghz from sub 2ghz, and the high end ones can hit 4ghz+.

So transistor speed/pathing issues are, at least for the current generation, a non-issue.

Re:Revolutionary (2, Funny)

billcopc (196330) | more than 6 years ago | (#24314103)

most modern high-end CPUs can't be overclocked by much, regardless of how cold you make them

The half-dozen Core-2 Q6600s I've taken from 2.4ghz to 3.6ghz would argue otherwise, as would the QX9650 that I pushed to 4.7ghz. But hey, what do I know, right ?

Actually yea... (2, Informative)

Kaeles (971982) | more than 6 years ago | (#24313989)

You need to remember that 90% of laptop CPUs will automatically downclock themselves if they are overheating (or over a certain temperature threshold). They also do this when the CPU is mostly idle.
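
Throttling like this is easy to watch on a Linux laptop by sampling the temperature and clock from sysfs while the machine is under load. A minimal sketch, not from the article; the exact sysfs paths vary by machine and driver, so treat these as typical examples:

    # Print CPU temperature and current clock once a second for ten seconds.
    import time
    from pathlib import Path

    TEMP = Path("/sys/class/thermal/thermal_zone0/temp")                    # millidegrees C
    FREQ = Path("/sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq")    # kHz

    for _ in range(10):
        temp_c = int(TEMP.read_text()) / 1000
        freq_mhz = int(FREQ.read_text()) / 1000
        print(f"{temp_c:5.1f} C   {freq_mhz:7.0f} MHz")
        time.sleep(1)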

But... but... (1)

dauthur (828910) | more than 6 years ago | (#24313371)

I've been doing this for years! I bought a minifridge for the very specific purpose of cannibalizing it for parts. Yeah, my case is 10lb heavier, but the internal temperature is 60 at any given time. It's indispensable nowadays with the exceedingly hot-running GPUs out there (I'm looking at you, GeForce 8800!) I think that a commercial product is a good idea, but I can see electric bills all over the country screaming in pain.

Condensation? (4, Insightful)

SoapBox17 (1020345) | more than 6 years ago | (#24313393)

Don't air conditioning units tend to produce a bit of water condensation during cooling? I guess we'll have to start emptying the water out of our PCs....

Re:Condensation? (4, Informative)

Smidge204 (605297) | more than 6 years ago | (#24313517)

Only because they cool below the dew point - which, in turn, is dependent on the humidity levels.

People who build active cooling into their computers (for overclocking) typically insulate the chip(s) and cooling block to keep air-exposed surfaces at or above ambient temperatures for just that reason.

Also, even if it does produce condensation I'd say there's little reason to worry... just recycle the condensate to provide evaporative cooling on the (much hotter) heat sink side of the system.
=Smidge=
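
The dew-point threshold Smidge mentions is easy to estimate with the Magnus approximation; anything chilled below it will sweat. A small Python sketch (the coefficients are the standard Magnus constants; the room conditions are just an example):

    import math

    def dew_point_c(temp_c, rel_humidity_pct):
        """Magnus approximation, valid roughly -45..60 C."""
        a, b = 17.62, 243.12
        gamma = math.log(rel_humidity_pct / 100.0) + a * temp_c / (b + temp_c)
        return b * gamma / (a - gamma)

    room_temp_c, humidity_pct = 25.0, 50.0
    print(f"dew point at {room_temp_c} C / {humidity_pct}% RH: "
          f"{dew_point_c(room_temp_c, humidity_pct):.1f} C")
    # ~13.9 C here: a cold plate below that temperature collects condensation,
    # which is why overclockers insulate anything chilled under ambient.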

An alternative... (3, Interesting)

jd (1658) | more than 6 years ago | (#24313729)

...is to position the computer upside-down. Condensation does not form on the hot surfaces, only the cold surfaces. If the cold surfaces cause the water to drip away, there is no way for the water to interfere. Another option is to refrigerate the entire computer (which is done by overclockers), as the coldest point will then be far away, and you've the added bonus that the air will be very dry within a short timeframe.

A third option would be to run copper from each chip surface to the refrigerator. The heat gradient will prevent any chip running hot, you only need one refrigerator, and you can handle the case of the heavy workloads shifting from one part of the system to another.

Re:Condensation? (3, Interesting)

Enderandrew (866215) | more than 6 years ago | (#24313835)

Not to mention, the reason you get condensation in a fridge is often that a single compressor operates both the fridge and the freezer. Systems with different compressors for the two compartments are more segregated, and have fewer condensation problems. Each compartment stays at a controlled humidity level.

Re:Condensation? (1)

Myrcutio (1006333) | more than 6 years ago | (#24314795)

It wouldn't be a bad idea for laptop manufacturers to start waterproofing their systems. It would have a nice side benefit of helping against spilled liquids as well.

Nothin' better (0, Redundant)

Audent (35893) | more than 6 years ago | (#24313415)

than a game and a beer. Preferably a cold one.

and now... the Dell Smeg Fridgo'matic Dual Core.

(yes, I did just write smeg. tee hee. tee hee hee).

Side Question??? (3, Interesting)

TheCastro (1329551) | more than 6 years ago | (#24313421)

Whenever I hear about new cooling solutions, I remember that a few years back someone had developed a liquid (or gel) that you could submerge computers and TVs into without frying them. Everyone was talking about using this non-bonding liquid to cool computers, and to put out fires in places with paintings, since it didn't ruin the paint. Does anyone know or remember what I'm talking about, or do I just sound like a crazy man? HAHAHAHAHAH! P.S. Bill Gates probably bought it to throw away.

It costs something like $300/gallon (2, Informative)

localroger (258128) | more than 6 years ago | (#24313467)

I remember a piece linked here where a couple of morons immersed a computer in the stuff and cooled it with liquid nitrogen, oblivious to the fact that liquid nitrogen was cold enough to freeze the stuff. I was thinking "one small room air conditioner..." Apparently the miniaturized and practical version of that is what TFA is, although I say that as conjecture since I haven't read TFA.

Nitrogen costs less than beer (1)

mangu (126918) | more than 6 years ago | (#24314019)

I think you need to get some price information. Liquid Nitrogen does NOT cost $300/gallon [uark.edu]

Re:Nitrogen costs less than beer (4, Informative)

billcopc (196330) | more than 6 years ago | (#24314179)

You're right, liquid nitrogen does not cost anywhere near $300/gallon, but the GP wasn't talking about nitrogen, they were talking about 3M Fluorinert, which does indeed cost an arm and a leg.

The problem with these fluids is they can't keep up with today's processors. Immersing a PC in a vat of mineral oil won't magically cool the damned thing. You still need to extract the heat from that big pool of sludge; natural convection just doesn't cut it anymore. In fact, the fluid acts kind of like an insulator, because it moves so slowly that heat builds up right on your processor. You'd need propellers to move the flooz around, probably pump it through some sort of radiator.

On the plus side, I could use my overclocked PCs to cook me some french fries for my poutine :)

Re:Nitrogen costs less than beer (1)

moosesocks (264553) | more than 6 years ago | (#24315253)

Although those costs seem somewhat realistic for small quantities, the cost of Liquid Nitrogen can go down to pennies per liter if you're producing it in large enough quantities. (If you've got your own Cryo plant on-site, this can be as low as dollars-per-ton, ignoring capital costs associated with building said plant)

Transportation and storage are the two key expenses, and though still not particularly expensive, they are nevertheless an order of magnitude more than what it takes to produce the stuff. Really, as far as commodity liquids go, LN2 is among the cheapest you'll ever find, apart from tap water.

Re:Side Question??? (5, Informative)

jaxtherat (1165473) | more than 6 years ago | (#24313583)

You mean mineral oil immersion?

linkage: http://www.engadget.com/2007/05/12/puget-custom-computers-mineral-oil-cooled-pc/ [engadget.com]

Re:Side Question??? (1)

TheCastro (1329551) | more than 6 years ago | (#24313775)

No, it was able to have direct contact, unless I misread about the mineral water. The other comment about the high cost sparked a memory about it being expensive.

Re:Side Question??? (1)

jaxtherat (1165473) | more than 6 years ago | (#24313895)

mineral oil mate, not water. You're probably thinking of some sort of liquid silicate polymer, but they are hugely expensive.

Re:Side Question??? (4, Informative)

SQL Error (16383) | more than 6 years ago | (#24313803)

I think he's probably thinking of Fluorinert [wikipedia.org] , which was used to cool the Cray 2.

Re:Side Question??? (1)

ScrewMaster (602015) | more than 6 years ago | (#24313951)

Or maybe Stabilant 22, although the stuff is kinda expensive to be used for immersing entire computers.

Re:Side Question??? (0)

Anonymous Coward | more than 6 years ago | (#24314379)

or... maybe water cooling. or oil.

Didn't intel already develop and debut this?

Re:Side Question??? (1)

Artuir (1226648) | more than 6 years ago | (#24314307)

Sorry, it doesn't matter. Patent has it covered!!

http://www.freepatentsonline.com/7403392.html [freepatentsonline.com] - Hey, check it out! They patented something in 2007/2008 that Cray did with their supercomputers at least as early as 1985, possibly sooner! As if this isn't proof that patenting, as it is in this country, is bullshit.

Here's information on the Cray-2, which used liquid cooling: http://en.wikipedia.org/wiki/Cray-2 [wikipedia.org] but I don't know if it was submerged cooling. I know Cray WAS looking at substances they could submerge boards and whatnot in sometime around then, as well.

As far as recently, Tomshardware did a bit where you can use standard cooking oil as a submersion setup:

http://www.tomshardware.com/reviews/strip-fans,1203.html [tomshardware.com]

...boosting performance while shrinking the size.. (0)

Anonymous Coward | more than 6 years ago | (#24313453)

So we're going to put more stuff into the computer and make the computer smaller at the same time... is this a computer or the Tardis?

Re:...boosting performance while shrinking the siz (1)

txoof (553270) | more than 6 years ago | (#24313799)

The Tardis' outside dimensions and volume remain the same (it's stuck as a Police Call Box); its interior volume, however, is limitless.

I would trade my lappy for a Tardis any day of the week though.

It still drains the battery (2, Insightful)

Anonymous Coward | more than 6 years ago | (#24313459)

Regardless of the cooling ability, it will put the same load on the laptop's battery, likely a little bit more because it has to run the compressors.

And that heat still needs to be dumped somewhere...

I guess this would be great for certain difficult hot-spots on the board, but a well-designed heat sink can usually handle it. The trade-off is that it adds more weight.

Re:It still drains the battery (1, Funny)

Anonymous Coward | more than 6 years ago | (#24313545)

"And that heat still needs to be dumped somewhere."

Well this is Next Generation - they will send the heat out the main warp exhaust ports. Either that or Geordi will reconfigure the main deflector to emit a tachyon pulse...

Re:It still drains the battery (1)

txoof (553270) | more than 6 years ago | (#24313759)

It still drains the battery, but not necessarily the same way that a fan drains a battery. Think of the difference between a compact fluorescent and an incandescent bulb. They can both be tuned to release a certain amount of light, but the CF will release a whole lot less heat and use less energy to do the same job as the incandescent light. In this case, the CF bulb is vastly more efficient at doing the work of emitting light.

The article doesn't say anything about the amount of energy the new pump uses, or its efficiency. It's possible that the device can use less power and dissipate more heat than a fan. It's also possible that the thing is wildly inefficient. Typically, liquid-based cooling is more efficient at moving heat than gas-based systems. This is why we cool nuclear power plants with liquid instead of air.
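
A rough number behind "liquid moves heat better than air": per unit volume and per degree, water stores vastly more heat. A quick Python check with round textbook property values (roughly 25 C, sea level):

    water_density, water_cp = 997.0, 4180.0   # kg/m^3, J/(kg K)
    air_density, air_cp = 1.2, 1005.0         # kg/m^3, J/(kg K)

    water_vol_heat = water_density * water_cp  # J/(m^3 K)
    air_vol_heat = air_density * air_cp        # J/(m^3 K)

    print(f"water: {water_vol_heat:.2e} J/(m^3 K)")
    print(f"air:   {air_vol_heat:.2e} J/(m^3 K)")
    print(f"ratio: ~{water_vol_heat / air_vol_heat:,.0f}x")
    # ~3,500x -- a trickle of coolant can carry the heat that would need a gale of air.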

Dump it near the Wifi module (4, Funny)

EmbeddedJanitor (597831) | more than 6 years ago | (#24314011)

And make your own hotspot.

Finally (1)

krkhan (1071096) | more than 6 years ago | (#24313463)

My issues are resolved. If I accidentally spill coffee on my laptop, I'll have iced coffee as the byproduct and that ain't too bad, is it?

Re:Finally (1)

Crazyswedishguy (1020008) | more than 6 years ago | (#24313783)

Better yet, your kegerator and computer can be one and the same!

Hey dicksuckers! (-1, Troll)

Anonymous Coward | more than 6 years ago | (#24313473)

still wasting your lives on linfux?

Hype (5, Insightful)

MojoRilla (591502) | more than 6 years ago | (#24313487)

The article says:

The researchers developed an analytical model for designing tiny compressors that pump refrigerants using penny-sized diaphragms. This model has been validated with experimental data.

Translation:
This is completely impractical hype so far. We are looking for grant and startup money.

Mini-Aircon DIY (0, Flamebait)

hardburn (141468) | more than 6 years ago | (#24313519)

Thought of this idea recently while trying to think of ideas for cooling my computer while making less noise:

Take a small engine from an RC car or airplane (two stroke will probably be best). Jam the throttle wide open and unscrew the high speed needle all the way. Put one pipe on the carb (with a good seal), and then another one on the exhaust with a smaller internal diameter than the first. Couple the two pipes together and have a fan on each side. Then have a small electric motor spin the engine. The side coming off the exhaust is hot (since the engine is compressing it), and the other end of pipe is cold (since the coupler expands the gas).

I figured this would be a great way to make a small air conditioner that could be used to cool incoming ambient air, and it should be more efficient than a pelt. However, even a small reciprocating engine being turned by a motor is going to be noisy, so I don't think it'll work for what I want. Might be nice in a hardcore gaming rig for someone who doesn't care about noise. Might also work with a small wankel rotary, but I doubt you could source one this small.

Re:Mini-Aircon DIY (1)

Vectronic (1221470) | more than 6 years ago | (#24313751)

I don't think so. Other than making a lot of noise and using more electricity, I don't think it would do much of anything. You could use it to create a vortex heat exchanger, but your engine would have to be running at RPMs far beyond what a normal combustion engine can run at (like in the 100,000+ RPM range).

Besides, if you are doing that, you may as well go for full condensing cooling, which a combustion engine wouldn't be able to handle because they don't seal well enough to create the pressure needed; that's why diaphragms are used instead of cylinders.

Re:Mini-Aircon DIY (0)

Anonymous Coward | more than 6 years ago | (#24314945)

Well, the IC engine alone won't quite do the job, but a Stirling engine will in fact operate in this way quite effectively. Normally such an engine has a hot end and a cold end, and generates mechanical power utilising the temperature difference. Turn it with an external source of power and instead one end will get hot and the other end will get cold. It still requires power to run it of course. Probably less than the Peltier devices, but of course the Stirling engine has moving parts and so will have wear and tear. The Stirling engine will be quite quiet, much more so than an IC engine.

Great... (1)

sabithewanderer (1331759) | more than 6 years ago | (#24313571)

I already spend most of my time in front of either my PC or my refrigerator as it is.

Re:Great... (1)

negRo_slim (636783) | more than 6 years ago | (#24313811)

Why not just cut out the middleman [patentstorm.us] ?

Re:Great... (1)

sabithewanderer (1331759) | more than 6 years ago | (#24314003)

Thanks, but no thanks. The walk back-and-forth is about the only exercise I get.

Now if they could just make a little alteration (1)

LM741N (258038) | more than 6 years ago | (#24313635)

and bring that cooling around to that cupholder thing on the side of my laptop, I could keep my beer cool as well while I'm downloading pr0n.

Power Requirements? (1)

txoof (553270) | more than 6 years ago | (#24313693)

I wonder what kind of power these little diaphragms suck down. I imagine that a liquid based cooling system is more efficient than one based on circulating air. It's great that this technology can move heat away faster, but I wonder if it can do it at a lower power cost.

Is it small enough... (1)

spankymm (1327643) | more than 6 years ago | (#24313773)

to fit *inside* a beer can?

What's the problem? (1)

Aphoxema (1088507) | more than 6 years ago | (#24313857)

Miniature? I personally don't see the problem with lugging around something like this...

http://teeksaphoto.org/Archive/DigitalTimeline/NewTimelineImages/osborne1.jpg [teeksaphoto.org]

Hell, I might just get one of those for checking my email on the go.

Re:What's the problem? (0)

Anonymous Coward | more than 6 years ago | (#24316239)

Is that you, Strong Bad?

Refrigerator CPUs is (1)

davidsyes (765062) | more than 6 years ago | (#24313929)

what I initially thought, hehehe....

But, I suppose there are some hi-tech fridges out there...

Cool! (0)

Anonymous Coward | more than 6 years ago | (#24313941)

Finally the built-in beer cooler in my computer :9

why does this work? (-1, Offtopic)

Anonymous Coward | more than 6 years ago | (#24313945)

Say pieces on a board, make each a pair with another piece.

like...

|55|33|66|
|44|66|55|
|33|44|22|
|22|11|11|

so figure out how a piece can move.

pick any piece, try to move it somewhere.
when you move a piece you have to move its pair at the same time.
when you move to a piece its pair has to move at the same time too.
a piece always becomes a pair with the piece it moves to.
no matter how many pairs, there's only one answer to how a piece can move.

A common problem type, I forget what it's called.

There's only one answer for how any piece can move.

A piece always goes where a piece leaves.

You can't move a piece that moves where the piece came from.

No such thing as a free space, a piece always moves to another piece.

A pair never moves to a pair.

A piece works out to move where another piece can get back to where a piece moves from.

so try this...

draw starting at each piece a line that shows the piece it moves to, and each piece to move for how a piece moves back
where it starts.

see this as a machine diagram.

move a piece then figure the machine diagram again, it's the same machine though...

see how every other piece moves another way now?

what happened for how the machine moved?

bad idea (2, Interesting)

ILuvRamen (1026668) | more than 6 years ago | (#24313955)

First of all I've been saying for years, just screw the motherboard into the back of your mini-fridge and keep installing from there. You just open the door to put in a CD lol. But also, not all components can handle active cooling. My old laptop got really hot playing games. So I used ice packs under it to cool it. It got the temp way down but the hard drive died after about a month from the extreme hot-cold difference. I assume some external parts contracted while internal ones remained hot and expanded and some parts rubbed against other parts and it got damaged. I was able to get the data off after like 10 blue screens. So the moral of the story is, active cooling that can cool it lower than the surrounding air temperature is REALLY, REALLY BAD for some internal parts.

Law of Unintended Consequences (2, Funny)

Loopy (41728) | more than 6 years ago | (#24313999)

"In today's news, a new CPU refrigerant system causes massive data loss for users as hard drives overheat and fail prematurely from abnormally high case temperatures. Film at 11."

Active cooling means more heat output (2, Insightful)

giorgist (1208992) | more than 6 years ago | (#24314017)

Check the back of your fridge, it's hot.

So by cooling with this method you may cool the CPU surface, but you will produce a lot more heat out of the laptop.

Fried laps?

G

Next Generation CPU Refrigerators? (1)

Veggiesama (1203068) | more than 6 years ago | (#24314041)

Don't they call those "replicators"?

Lap Burners (1)

Mishotaki (957104) | more than 6 years ago | (#24314211)

Welcome to the future! What they are presenting here is a way to cool the components by transferring heat... so when it's transferring the heat away from the processor, it will take it away to dissipate it, along with the heat of the compressor and everything else in the system... therefore there is much more heat to dissipate than in a normal portable computer, meaning it will probably give you second-degree burns whenever you try to use it somewhere you don't have a flat surface and need to use your lap instead...

Unless (1)

RevWaldo (1186281) | more than 6 years ago | (#24314327)

Unless they're able to do some clever-bastard engineering to route the heat elsewhere - the top edge of the laptop screen, perhaps? If they could somehow route the tubing through the hinge, the coils could sit inside a metal "handle".

If there's no patent on something like this, I got dibs. :-)

(No, seriously, I got dibs.)

The best way to get the watts out of your PC (1)

symbolset (646467) | more than 6 years ago | (#24314349)

I say it every year and this is as good a place to say it as any.

The easiest way to get the watts off your processor and out of your PC is...

not to put them in. Duh. Fortunately, somebody is listening. [intel.com] Finally.

Followup (1)

symbolset (646467) | more than 6 years ago | (#24315329)

The next billion users just don't have the watts to put in. See this firehose article [slashdot.org] referencing this news report [a2to.info] : IT capital Bangalore to face power cuts.

Intermittent power outages are going to play hob with your VoIP-based tech support unless they've got their redundant power bases covered. Normal users? If you aren't solar and wireless, you're offline for up to two hours each day. You kilowatt gamers? You'll have to goldfarm later, I guess.

Re:The best way to get the watts out of your PC (1)

ejecta (1167015) | more than 6 years ago | (#24315773)

I agree completely! Right here I run two PCs: the first is based on the VIA PC2500E motherboard, like the GPC which was on /. a while ago, and the second is the VIA MM/PC3500, which is essentially the same plus a PCI-Express x16 slot and HDTV.

They're not bad; the motherboard and onboard CPU cost less than most other plain motherboards for Intel/AMD.

Re:The best way to get the watts out of your PC (1)

symbolset (646467) | more than 6 years ago | (#24315979)

That's a good choice. If you need another Mythbuntu client though you might try the Intel Atom motherboard [newegg.com] . They've made great strides in power efficiency and it seems they'll make more.

I recommend the Pico PSU [mini-box.com] power kit to go with both yours and this new one. DC is the wave of the future.

Re:The best way to get the watts out of your PC (1)

ejecta (1167015) | more than 6 years ago | (#24316047)

I would like to invest in 2x PicoPSUs but simply can't afford it at this stage... am just using a bodgy 40-watt PSU out of a circa-2004 IBM Netvista.

It uses a lot more power due to inefficiency, but I figure cutting down from an Athlon 2000XP to this will save a fair bit of juice considering it's on 24x7, 365 days a year.

I've had a look at the Intel offerings, but they're hard to lay your hands on over here (Australia), and from what I've read, whilst the CPU is low power the northbridge can burn quite a bit of juice (although I could be wrong).

Obligatory (0)

Anonymous Coward | more than 6 years ago | (#24314733)

Cool!

Smaller already exist (0)

Anonymous Coward | more than 6 years ago | (#24314753)

I am surprised that no one seems to have mentioned Peltier coolers (TECs). These can be much smaller yet push a lot of heat. This is a good intro to them:

http://www.heatsink-guide.com/peltier.htm

I have a pair that I ordered online which came to $15 apiece including shipping. They are 50x50x3.6 mm and can pump up to 133 W from the cold side with a max temperature difference of 68 degrees C. At least one overclocking vendor sells a 437 W module (62x62x3.5 mm) for $50: http://www.frozencpu.com/products/2411/exp-04/437W_Qmax_Peltier.html?tl=g30c105s187

The disadvantage is that they are not quite as efficient as phase-change systems. But this seems much more promising for research. They are solid state, don't require greenhouse gases, and are extremely small. They are also extremely cheap. A small phase-change system that can output 1/10 HP (~72 W) will cost several hundred dollars. Why not throw research money into making more efficient Peltiers instead? Eventually we could all have AC units on our houses that never break down.

Re:Smaller already exist (0)

Anonymous Coward | more than 6 years ago | (#24314867)

... they are not quite as efficient as phase change systems

One of the real battles is against the Carnot efficiency (the maximal theoretical efficiency of any heat pump), rather than the devices' comparative efficiency. You probably wouldn't want either Peltiers or phase changers in your laptop; both will drain a battery fast.

Small is what the article touts, not efficiency, and small is something Peltiers already achieve.

... AC units on our houses that never break down

Might I add that these enduring units would also be a lot smaller.
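
The Carnot limit mentioned above is easy to put numbers on: no heat pump moving heat from a cold side at T_c to a hot side at T_h (in kelvin) can beat COP = T_c / (T_h - T_c). A short Python illustration with CPU-ish temperatures (the operating points are just examples):

    def carnot_cop(cold_c, hot_c):
        """Ideal coefficient of performance for a heat pump, temperatures in Celsius."""
        t_cold = cold_c + 273.15
        t_hot = hot_c + 273.15
        return t_cold / (t_hot - t_cold)

    print(f"40 C die -> 50 C hot side: ideal COP {carnot_cop(40, 50):.1f}")
    print(f"40 C die -> 80 C hot side: ideal COP {carnot_cop(40, 80):.1f}")
    # Real Peltiers and tiny compressors fall well short of these ideals, but the
    # limit shows why small temperature lifts are far cheaper than large ones.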

More power efficient CPU (0)

Anonymous Coward | more than 6 years ago | (#24314903)

Why not put some research into making CPUs run a bit cooler? (I'm sure someone is doing that, though.)

My gut is telling me that I don't want tiny refrigerators in my laptop. Seems to me the problem-solving is starting from the wrong end in this case.

Putting household appliances into computers (1)

ciaran.mchale (1018214) | more than 6 years ago | (#24315303)

I was going to make a smart ass comment: "A few years ago manufacturers put computers (packaged as Internet browsers) into refrigerators; now they are doing it the other way around. I wonder what other household appliances will follow the trend?"

But then I thought about it and realized that there are already several other household appliances and gadgets in computers. CD players, DVD players, Televisions (TV tuner cards), radios (Internet radio), telephones (Skype and the Telecrapper 2000), cameras (webcams). Heck, you can even buy USB-powered lava lamps, pencil sharpeners, and a small hot plate to keep a cup of coffee warm.

I know some people will say the refrigeration technology is used to cool the CPU rather than being a general-purpose refrigerator in which you can store food and drink. However, it wouldn't surprise me if a combined computer-and-general-purpose refrigerator is manufactured at some point in the future and marketed at geeks. "When I've been coding for 4 hours straight and I need a drink, I just reach down and pull a can of Jolt from my computer's built-in refrigerator."

Green energy? (1)

Lord Lode (1290856) | more than 6 years ago | (#24315463)

A laptop with a freezing system inside? But what about battery life? I'd prefer to be able to work 8 hours at normal speed rather than 4 hours at "boosted" speed! In regular desktop PCs, I'd also like to see the research go into reducing power consumption in more powerful chips instead of combatting more power consumption by adding cooling that itself consumes power...

Sad, they haven't made (1)

viosz (258498) | more than 6 years ago | (#24315653)

any comment about the expected power consumption of the system.
At least I haven't seen one.

Would be nice to compare it against an actual water cooling system, which would be at the same level of cooling.

Why heat the bottom anyway? (1)

VanessaE (970834) | more than 6 years ago | (#24316283)

A lot of people have mentioned how this would warm your lap unacceptably. So why not put the warm side of the heat pump somewhere in the display section? I don't know of too many reasons why one needs to keep their hands on the display, let alone hold the laptop by it.

Better yet, skip the fridge idea altogether, use something passive, and turn the entire backside of the display section into a giant heat sink. I'm sure someone can figure out how to make a flexible heat pipe to pass across the hinges.

You could even get fancy and have everything covered by exhaust vents/louvers that automatically close when you shut down or put the machine to sleep, and a small turbine-style fan on each side, to move the air without wasting a lot of space.

Just make sure you deal with any excess heat from the display panel.
