Asetek LCLC Takes Liquid Cooling Mainstream

kdawson posted more than 6 years ago | from the formerly-hot-hardware dept.

Hardware Hacking 118

bigwophh writes "Liquid cooling a PC has traditionally been considered an extreme solution, pursued by enthusiasts trying to squeeze every last bit of performance from their systems. In recent years, however, liquid cooling has moved toward the mainstream, as evidenced by the number of manufacturers producing entry-level, all-in-one kits. These kits are usually easy to install and operate, but at the expense of performance. Asetek's aptly named LCLC (Low Cost Liquid Cooling) may resemble other liquid cooling setups, but it offers a number of features that set it apart. For one, the LCLC is a totally sealed system that comes pre-assembled. Secondly, plastic tubing and a non-toxic, non-flammable liquid are used to overcome evaporation issues, eliminating the need to refill the system. And to further simplify the LCLC, its pump and water block are integrated into a single unit. Considering its relative simplicity, silence, and low cost, the Asetek LCLC performs quite well, besting traditional air coolers by a large margin in some tests."


118 comments


Liquid Cooling already mainstream (2, Insightful)

Anonymous Coward | more than 6 years ago | (#23049758)

Heck, I'm typing this on an out-of-the-box ~4 year old liquid-cooled Power Mac G5....

Re:Liquid Cooling already mainstream (0)

Anonymous Coward | more than 6 years ago | (#23049940)

Yep, power macs sure are mainstream

Re:Liquid Cooling already mainstream (2, Insightful)

MightyYar (622222) | more than 6 years ago | (#23049986)

Let me know when Asustek sells as many kits as Apple sells computers.

Re:Liquid Cooling already mainstream (1)

DAldredge (2353) | more than 6 years ago | (#23050044)

Asustek makes some computers for Apple.

Re:Liquid Cooling already mainstream (1)

MightyYar (622222) | more than 6 years ago | (#23050084)

Asustek makes some computers for Apple.
Indeed they do, but I made a typo. The manufacturer of this kit is Asetek.

Re:Liquid Cooling already mainstream (2, Informative)

Anonymous Coward | more than 6 years ago | (#23050094)

Asetek != Asus

Asetek makes vapor phase change coolers, Asus makes motherboards and graphics cards. Neither Asus nor Apple makes commercial phase cooling or liquid cooling gear.

You managed to troll the wrong industry entirely!

Re:Liquid Cooling already mainstream (1)

MightyYar (622222) | more than 6 years ago | (#23050302)

I mis-typed. Sue me.

Also, I wasn't the original troll. I was responding to a guy that seemed to think that a niche hardware maker was more mainstream than Apple.

Re:Liquid Cooling already mainstream (1)

infosix (1271586) | more than 6 years ago | (#23055264)

If the "quality test" is the number of units sold, then you should swap your Mac for a PC!

Re:Liquid Cooling already mainstream (1)

bill_mcgonigle (4333) | more than 6 years ago | (#23051006)

What's strange is that TFS extols the virtues of the new part, which sounds just like the AC Delco part that Apple used.

FLCL (1)

StCredZero (169093) | more than 6 years ago | (#23054952)

They should've found a way to put F on the front of the acronym, so that it could be FLCL-Cooling.

For those of you who don't like stop & go traffic (5, Informative)

bobdotorg (598873) | more than 6 years ago | (#23049776)

Re:For those of you who don't like stop & go traffic (2, Interesting)

Clay Pigeon -TPF-VS- (624050) | more than 6 years ago | (#23049848)

Too bad they didn't compare it to a good air cooling solution like the Thermalright IFX-14 or Ultra-120.

Ummmmm (2, Insightful)

Have Brain Will Rent (1031664) | more than 6 years ago | (#23050930)

Wouldn't "is a totally sealed system" take care of "evaporation issues, eliminating the need to refill the system" without requiring "plastic tubing and a non-toxic, non-flammable liquid"???? I'm just saying....

Re:Ummmmm (4, Informative)

ncc74656 (45571) | more than 6 years ago | (#23051744)

If you had RTFA, you would've found that making a sealed system apparently isn't enough by itself. The silicone tubing used in most liquid-cooling rigs apparently is somewhat permeable, so water can seep through it and evaporate. Replacing silicone with vinyl fixes that, at the expense of slightly increased rigidity.

Re:Ummmmm (3, Interesting)

kd4zqe (587495) | more than 6 years ago | (#23052570)

This is very true. I just recently disassembled my system in favor of a Core2 Duo machine. I built the rig because my 1st-gen P4 3.6GHz was a pain to air-cool efficiently. I noticed about a year after assembling the system that the temps climbed rapidly moments after power-up. I found that almost all my fluid had gone from the system.

What I thought was fluid was actually UV dye that had permeated the silicone tubing from the cooling solution. Additionally, when I stripped the system, all the tubing ends had swelled dramatically, presumably from the liquid getting into the non-heat-fused cut ends of the tubing.

Also, in rebuttal to the statements in the article, my system was a WaterChill system from Asetek, and it included a CPU block and VGA block in addition to the pump and 120mm heat exchanger, and I found the cost to be quite reasonable at only about US$250. It was very easy to install and made a nice evening project. Because I transport my system to LAN parties, I decided to reverse the radiator and route the fittings to the inside of the chassis. This took a little common sense, drilling, and planning, but I was very specific in my wants.

Without the cosmetic changes, this still was a very sensible kit to own. I'd recommend that ANYONE try to build a water rig at least once. If Asetek is trying to move liquid cooling into a more mainstream arena, more power to them.

Liquid cooling for datacentres? (5, Interesting)

mrogers (85392) | more than 6 years ago | (#23049782)

I'm surprised liquid cooling is still seen as a fringe/hobbyist technique. With water (or oil) having a much higher heat capacity than air, I would have thought liquid cooling would make sense for datacentres: instead of huge electricity bills for A/C, you could just plumb each rack into the building's water system (via a heat exchanger, of course; I don't really want to drink anything that's passed through a server rack). Does anyone know if this has been tried, and if so, why it didn't work?

Re:Liquid cooling for datacentres? (3, Interesting)

jfim (1167051) | more than 6 years ago | (#23049840)

As far as I know, that's what project Blackbox uses for cooling. Note the blurb where it specifies the water connectivity requirements [sun.com] .

Re:Liquid cooling for datacentres? (1)

algae (2196) | more than 6 years ago | (#23051812)

I believe that Sun's Blackbox uses water cooling for refrigeration between racks, not as a method of cooling the server hardware directly. Like the sibling poster says, too much risk of leakage near the electrical bits. With however many gallons/sec Blackbox requires though, you can turn a lot of hot air back into cold air and just move it around in a circle.

Re:Liquid cooling for datacentres? (0)

Anonymous Coward | more than 6 years ago | (#23054876)

Actually, the Blackbox has chillers between each rack, and the airflow moves in a circular direction around the box in an enclosed system. Basically, the racks are inside the air duct. You slide the racks out into the hallway in the middle of the box to work on a system. The server CPUs have regular fans on them. It's really a great, well-thought-out idea. I'm trying to get my boss to buy one of these, plus a SAT dish and a windmill for power, for an off-site data center on a mountain top in Costa Rica.

Re:Liquid cooling for datacentres? (2, Funny)

notgm (1069012) | more than 6 years ago | (#23049860)

do you really want plumbers called in when your site is down?

Re:Liquid cooling for datacentres? (1, Funny)

Anonymous Coward | more than 6 years ago | (#23051010)

Not sure what you mean, isn't that what sys admins are anyway?

Re:Liquid cooling for datacentres? (5, Insightful)

ZeroExistenZ (721849) | more than 6 years ago | (#23049910)

I would have thought liquid cooling would make sense for datacentres: instead of huge electricity bills for A/C, you could just plumb each rack into the building's water system

There are a few things that come to mind:

  • A datacenter might have different clients renting cages and owning their own servers; you can't enforce the use of watercooling, so AC will have to be present and running in any case.
  • Water + electricity is a risk. With tight SLAs, you don't want to fry your server, along with your extra investment in its redundant failover hardware.
  • Available server hardware isn't typically watercooled. Who's going to convince the client that hacking a watercooled system onto your most critical hardware is a good decision? For defects, a support contract with the hardware vendor is typical; if you mod it and soak it, you're out of warranty and can't fall back on your external SLA.
  • Electricity "bills" aren't an issue: each cage you rent comes with a fixed amount of amps, and you either keep under it or rent another cage (notice an advantage for the datacenter here?). It's always part of the calculated cost, so it's really a non-issue for datacenters, or for you when you rent a part of one.

Re:Liquid cooling for datacentres? (2, Insightful)

pavera (320634) | more than 6 years ago | (#23051238)

I don't know where you are hosting where "electricity bills" don't matter.

I have systems hosted in 3 different DCs, with 3 different companies. All of them raised their rates in the last year by 20-30% in one way or another. One DC includes the electricity in your flat monthly bill; the only incremental charge in that DC is bandwidth (i.e. you get 100GB of transfer, and if you go over it's some dollars per GB). They raised their flat rate 20%, citing higher electricity costs.

The other 2 DCs provide metered electricity to the cage; some amount is included in the cage rental, and overages are billed incrementally. These 2 data centers have both increased their incremental charges by 100% in the last year and increased their cage rental rates by 10-15%, citing increased electricity costs. Now you can say "they're just increasing their margins," but I live within 25 miles of 2 of the facilities, and I know my electric costs at home have more than doubled in the last year, up almost 250% in the last 5, so no, they aren't just marking things up unnecessarily; it's all the same electric co.

All in all, this means an additional $500-600/mo in cost for our hosting: from $2000/mo to $2500-2600/mo, depending on electricity and bandwidth usage (and a hint: we've only gone over on our bandwidth once, for a total charge of $12). I can only imagine when we're grown out: we plan in 3-5 years to have multiple racks in these three DCs and have budgeted ~$75k/mo for hosting costs (based on the prices from a year ago). Well, a 20-30% increase on that turns into real money, a $15-25k/mo increase. Being able to save that money would mean being able to hire 3-5 full-time engineers at $60k/yr each. I'd much rather have the engineers than give that money to the electric company.
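A quick back-of-the-envelope check of that last bit of arithmetic (the budget and salary figures are the comment's own; the rest is plain multiplication):

    budget_per_month = 75_000        # planned hosting spend, $/mo (figure from the comment)
    increase_low, increase_high = 0.20, 0.30
    engineer_salary = 60_000         # $/yr (figure from the comment)

    extra_low = budget_per_month * increase_low * 12     # $180,000/yr
    extra_high = budget_per_month * increase_high * 12   # $270,000/yr

    print(extra_low / engineer_salary, extra_high / engineer_salary)
    # -> 3.0 4.5, i.e. roughly the "3-5 full-time engineers" mentioned above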

Re:Liquid cooling for datacentres? (0)

Anonymous Coward | more than 6 years ago | (#23053066)

and don't forget:

- Legacy servers. You will find a big proportion of servers in a datacenter that are up to 15 years old...
- Legacy datacenters. Big datacenters were built in the 90s; they can only be upgraded as much as the building structure allows.
- Liquids. The reason most datacenters are moving away from liquid-based fire protection and liquid-based air conditioning (yes, most aircon in datacenters uses water in the aircon system itself) is that it leaks! It always leaks. With thousands of volts running through the datacenter on uninsulated busbars, you don't want leaks.
- $$$$. No one wants to pay for this. I have seen datacenter customers who are willing to pay huge electricity bills but not willing to invest in new aircon systems that would pay for themselves in a year or two.
- Downtime. Nearly any rebuild of a working datacenter would mean downtime, and this is unacceptable in our modern economy.

Re:Liquid cooling for datacentres? (2, Insightful)

Paul server guy (1128251) | more than 6 years ago | (#23055280)

Y'all are basically idiots.

I just came from NASA Ames Research Center (talk about heavy supercomputing!), and they are heavily water-cooled. Right now they have coolers on each of the processor blocks and radiators on the backs of the cabinets, but they are quickly moving to directly chilling the water.
They use quality hoses and fittings, no leakage.
The efficiency is so much higher than air, and it makes the operating environment much nicer. (They have people in there regularly swapping out drives, tapes, whatever.)

Of COURSE water cooling is what you want to use for any high-performance computing. It's purely a matter of efficiency. (And you can use the hot water elsewhere.)

Re:Liquid cooling for datacentres? (5, Insightful)

greyhueofdoubt (1159527) | more than 6 years ago | (#23049954)

Because air has some undeniable advantages over water:

-Free (both source and disposal)
-Non-conductive
-Non-corrosive
-Lightweight
-Will not undergo phase change under typical or emergency server conditions (think water>steam)
-Cooling air does not need to be kept separate from breathing air, unlike water, which must be kept completely separate from potable water

Imagine the worst-case scenario concerning a coolant failure WRT water vs air:
-Water: flood server room/short-circuit moboard or power backplane/cooling block must be replaced (labor)
-Air: Cause processor to scale down clock speed

I don't think water/oil cooling is ready for mainstream data farm applications quite yet. I also think that future processors will use technology that isn't nearly as hot and wasteful as what we use now, making water cooling a moot point.

-b

Re:Liquid cooling for datacentres? (5, Informative)

KillerBob (217953) | more than 6 years ago | (#23050238)

-Non-corrosive


Air is one of the most corrosive substances there is. Specifically, the oxygen in the air is. It just takes time. Normally, a server won't be in operation long enough for this kind of corrosion to happen, especially if it uses gold-plated contacts, but it will happen.

Air is less corrosive. But depending on the liquid that's in use in a liquid cooling rig, it usually isn't corrosive or dangerous to a computer anyway. Liquid cooling rigs usually use an oil such as mineral oil or an alcohol like propanol, neither of which is particularly harmful to electronics.

Also... while it's a technicality, air *is* conductive. It just has a very high impedance. It *will* conduct electricity, and I'm pretty near certain you've seen it happen: it's called lightening.

Finally... if your server is running hot enough that mineral oil is boiling off, you've got more serious things to worry about than that. (Its boiling point varies, based on the grade, between 260-330C -- http://www.jtbaker.com/msds/englishhtml/M7700.htm [jtbaker.com] )

Re:Liquid cooling for datacentres? (3, Insightful)

evanbd (210358) | more than 6 years ago | (#23051328)

Also... while it's a technicality, air *is* conductive. It just has a very high impedance. It *will* conduct electricity, and I'm pretty near certain you've seen it happen: it's called lightening.

If you want to get all technical about it, you're basically wrong. The resistivity of air is exceedingly high. However, like all insulators, it has a breakdown strength, and at electric field strengths beyond that, the conduction mode changes. It's not simply a very high value resistor -- nonconducting air and conducting air are two very different states, which is the reason lightning happens. The air doesn't conduct, allowing the charge to build higher and higher, until the field is strong enough that breakdown begins.

For materials with resistivity as high as air in its normal state, it's not reasonable to call them conducting except under the most extreme conditions. Typical resistance values for air paths found in computers would be on the order of petaohms. While there is some sense in which a petaohm resistor conducts, the cases where that is relevant are so vanishingly rare that it is far more productive to the discussion to simply say it doesn't conduct.

This is one of those cases. Claiming that air is conductive is detrimental to the discussion at best.
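For a rough sense of scale, here is a minimal back-of-the-envelope sketch of that kind of number; the resistivity value and the gap geometry are assumed, textbook-ish figures, not measurements of any real board:

    # Order-of-magnitude resistance of an air path inside a PC case.
    rho_air = 1e16     # assumed resistivity of dry air, ohm*m (varies hugely with humidity)
    gap = 1e-3         # 1 mm between conductors, m
    area = 1e-4        # 1 cm^2 of facing area, m^2

    R = rho_air * gap / area          # R = rho * L / A
    print(f"{R:.1e} ohm")             # ~1.0e+17 ohm, i.e. on the order of 100 petaohms

    print(f"{12 / R:.1e} A at 12 V")  # ~1.2e-16 A of leakage, utterly negligible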

Re:Liquid cooling for datacentres? (0)

Anonymous Coward | more than 6 years ago | (#23052286)

Also... while it's a technicality, air *is* conductive. It just has a very high impedance. It *will* conduct electricity, and I'm pretty near certain you've seen it happen: it's called lightening.
Actually, it's called lightning [merriam-webster.com] (no 'e').

Re:Liquid cooling for datacentres? (1)

Artuir (1226648) | more than 6 years ago | (#23050366)

I'm sure there's some engineering to be done to solve that problem for servers. You could run copper piping through the entire solution (this IS a server; no expense is spared, no need for flimsy tubing) that would be good for 70+ years. You don't typically keep servers operating for 70 years. Well, at least *I* don't. Some of you guys might, just to brag to your great-grandchildren about how much uptime your Linux box has.

Linux bastards.

Re:Liquid cooling for datacentres? (1)

Cancel-Or-Allow (1073192) | more than 6 years ago | (#23052522)

If you live in a desert climate, air cooling sucks, and if the dust is not dealt with at regular intervals, things fail quickly. First, dust starts to accumulate on the fan blades (unevenly), putting them out of balance and placing greater strain on their bearings. Meanwhile, Intel's ingenious design of their retail cooling fan and heatsink ends up clogged with dust. The ambient temperature inside the chassis begins to increase as the chassis fan and PSU fans have now stopped, leaving only the higher-power CPU fan spinning faster in vain, packing the dust more tightly into an increasingly overheating heatsink. With no airflow coming into the chassis, the first thing that usually goes is the HDD. And that's usually the first time a service call is generated by a user.

In my server closets I buy my own chassis with an easy-to-remove air filter on the front of each one; about every month I take it to the sink, rinse it out, shake off the excess water, and put it back. The closets have their own independent AC and a room monitor, with alerting sophisticated enough that if for some reason the AC quit, servers would begin treating it like a power failure and start shutting down.

But pity the poor PCs sitting on the floors of 100s of users, all working as digital air filters. Unless regular maintenance is done (the dust-storm blow job with the air compressor), they crash within 6 months to a year.

I really like the idea of liquid cooling. I'm just leaving a rig that I'm setting up for someone running 24/7, with 2 8800GTXs, a QX9770, 2 Seagate 1TB drives, 3 WD 150 Raptors, an EVGA 680i mobo, a Tagan 1100W PSU, and 2 19" Dell 2007WFPs in portrait mode sandwiching their 30" model in the middle.
Compared to last year, my electricity bill increased by 40% for this month. My Kill-A-Watt reader registered an average of 600W, but the heat would activate the main HVAC in my house much more often. All in all, running this rig costs close to $100 per month in energy.
If I could easily get that heat outside of the home, then I would only be expending 600W for the computer and saving 3kW each time the HVAC kicks on.
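A rough sketch of where that ~$100/month goes (the 600W average is the Kill-A-Watt reading above; the electricity rate and the HVAC attribution are assumptions for illustration only):

    rig_watts = 600                               # Kill-A-Watt average from the comment
    hours_per_month = 24 * 30
    rate = 0.12                                   # assumed $/kWh

    rig_kwh = rig_watts * hours_per_month / 1000  # 432 kWh/month
    rig_cost = rig_kwh * rate                     # ~$52/month for the rig alone
    print(round(rig_kwh), round(rig_cost, 2))

    # Whatever remains of the ~$100 bill is, by this estimate, roughly the
    # HVAC working to pump that same heat back out of the house.
    print(round(100 - rig_cost, 2))               # ~$48/month of extra cooling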

Another thing: the Cooler Master CM Stacker is a POS. At least in the desert, it is nothing more than a dust storm in a box. So far I am using a great quality filtered rackmount chassis, but he wants the CM with the pretty lights. I'll show him where the canned air is in Costco and how to make small dust storms on his front porch every month.

Sorry for not having much of a point; my Ambien has begun to kick in and I will be typing in my sleep soon.

think big - the heat is still there in the room (1)

gelfling (6534) | more than 6 years ago | (#23049972)

A DC might have 20,000 servers. That heat has to go SOMEWHERE. If it's pumped into the ambient air just like with an air-cooled machine, you're still going to need large AC units to move that hot air out of the DC.

Re:think big - the heat is still there in the room (0)

Anonymous Coward | more than 6 years ago | (#23050140)

Ever see those big boxy things on the roofs of buildings, with mist coming out of them when it's humid? Those are cooling towers. Their purpose is to take hot water and remove the heat outside the building. It's simple, well-known technology. You can do the same thing with the cooling water from a data center.

Re:think big - the heat is still there in the room (1)

gelfling (6534) | more than 6 years ago | (#23050534)

Yes I know what those are. But carrying the heat away from the servers and venting it to the room isn't going to help the overall need to cool everything. It may make each server slightly cooler but it's not going to alleviate the need for power or cooling in any data center overall.

Re:think big - the heat is still there in the room (1)

Dare nMc (468959) | more than 6 years ago | (#23051482)

Considering that most CPUs can run at 90C, but most air-cooled setups recommend something below 23C ambient, a cooling loop that only has to maintain 60C would be much more efficient; with water's much higher conductivity, that should be fine, hence the cost savings of liquid cooling.

Heat Pump? (1)

bill_mcgonigle (4333) | more than 6 years ago | (#23050976)

With the caveat that thermodynamics scares and confuses me, if you have a bunch of heat coming out of the servers' water-coolers, couldn't you pipe that into a heat pump and recover some cooling energy or even electricity? I'm familiar with a local facility which operates its air conditioning systems on steam, though I forget the name of the technology at the moment.

Re:Heat Pump? (1)

flappinbooger (574405) | more than 6 years ago | (#23051282)

With the caveat that thermodynamics scares and confuses me, if you have a bunch of heat coming out of the servers' water-coolers, couldn't you pipe that into a heat pump and recover some cooling energy or even electricity?

Yes. Now, THAT would be smart. Eliminate the cost of water heaters, augment winter HVAC bills, etc. Steam power plants use "waste" energy, the heat left over in the water after it runs the main turbines, to preheat the water going into the boiler. There's usually heat left over after THAT, and it is at a good temp for use in the power plant building itself. Any heat sent back out to the environment is wasted, and wasted energy = wasted $$.

Now, if it's enough wasted energy to warrant the cost of capturing it... Depends. That's what they pay engineers for, to figure that out.

This is, at least, what I remember from my thermo 1, thermo 2, heat transfer, and advanced thermo courses in college. MMmmmmm.... Thermo.

Re:Heat Pump? (1)

bill_mcgonigle (4333) | more than 6 years ago | (#23051696)

Now, if it's enough wasted energy to warrant the cost of capturing it... Depends. That's what they pay engineers for, to figure that out.

Which means Google already probably knows the answer. ;) Those guys are ruthlessly (and brilliantly) cheap.

Re:Liquid cooling for datacentres? (0)

Anonymous Coward | more than 6 years ago | (#23049984)

Probably because the liquid cooling systems leak too much.

Re:Liquid cooling for datacentres? (1)

Artuir (1226648) | more than 6 years ago | (#23050428)

Yes, because they are consumer grade solutions, not enterprise grade. You can engineer a solution that uses a 100% solid copper piping/waterblock framework with absolutely no joints or gaps anywhere in the actual box. Design the thing so the weak points, or any point where water could possibly leak is on the outside of the system (obviously not above or below, might need a custom server cabinet for this) and you can have cooling solutions that will not harm server hardware due to leaking.

I'm not saying my idea is the best (or even decent) as obviously it makes components impossible to upgrade.. but it's one possible solution of many more, I'm sure.

Re:Liquid cooling for datacentres? (0)

evilviper (135110) | more than 6 years ago | (#23050000)

instead of huge electricity bills for A/C you could just plumb each rack into the building's water system

Or you could use magical pixie dust...

The ONLY THING water cooling does is (potentially) provide a larger surface area to disperse the heat. It does not magically "cool" anything. Unless ambient temperatures are always much lower than you want your datacenter to be, you'll still be running the water through an A/C. And if you're lucky enough to be someplace where ambient temperatures are always that low, you'll need some huge radiators... much larger than an equivalent A/C condenser, as the temperature differences involved are much, much lower.

If you want to use a ground-based heat pump, you can... It's every bit as easy to do with air as it is with liquid. And with air, you don't have to spend ludicrous amounts of money and man-hours retrofitting every single server for liquid.

Re:Liquid cooling for datacentres? (0)

Anonymous Coward | more than 6 years ago | (#23050042)

Well, the tap water I am drinking feels cooler than my CPU (I'm not going to lick my CPU to find out). So: liquid cooling system, to heat exchanger, to building cold water pipes, to water heater. Ooh, look at what I just did.

Re:Liquid cooling for datacentres? (4, Informative)

eagl (86459) | more than 6 years ago | (#23050096)

The ONLY THING water cooling does is (potentially) provide a larger surface area to disperse the heat.
So totally wrong/ignorant... Is this a troll? Water cooling does a lot more than that.

1. Can be a LOT quieter than normal air cooling.
2. Allows for heat removal with a much smaller heat exchange unit on the heat source.
3. Allows for heat transfer to a location less affected by the excess heat being dumped (such as outside a case), instead of just dumping the heat in the immediate vicinity of either the item being cooled or other components affected by heat.

There are other reasons, but these alone are more than enough. Did you not know these, or were you just trolling?

Re:Liquid cooling for datacentres? (1)

MightyYar (622222) | more than 6 years ago | (#23050110)

It's also cheaper to pump a small amount of water than a huge volume of air.

Re:Liquid cooling for datacentres? (2, Insightful)

evilviper (135110) | more than 6 years ago | (#23050698)

1. Is a result of the larger heat exchange area. And makes no difference in a data center.
2. No benefit for any practical application. Definitely makes no difference in a data center.
3. Does not affect the cooling costs of a data center in the slightest.

Nothing about water cooling will reduce the cooling and energy costs of a data center IN THE SLIGHTEST. You're doing a lot of magical thinking, with NO experience in the subject.

Re:Liquid cooling for datacentres? (1)

eagl (86459) | more than 6 years ago | (#23051182)

My experience is with datacenters that are apparently not as generic as the ones you seem to be claiming experience with. Just because you do things one way, and maybe always will, does not mean that every customer will have requirements your cookie-cutter approach can satisfy.

Put a datacenter 300 ft underground, and see how far simple air cooling gets you. In that case, there MUST be a way to dump the heat that doesn't involve simply blowing air around. If it works for you, that's fine. But attempting to imply that whatever datacenter you are responsible for cooling or designing would fit every customer's needs is pretty narrow minded. Maybe you really believe what you're posting, but your level of experience outside a cookie-cutter place to stack computing power seems to be somewhat lacking.

Re:Liquid cooling for datacentres? (1)

evilviper (135110) | more than 6 years ago | (#23052350)

Put a datacenter 300 ft underground, and see how far simple air cooling gets you.

Do you have ANY experience with ANY air conditioning? You don't seem to have the foggiest idea how they fundamentally work.

An A/C will work superbly underground, with less work on the building itself (ie. never mind the per-machine hook-ups).

Re:Liquid cooling for datacentres? (1)

eagl (86459) | more than 6 years ago | (#23054150)

You keep going on about experience, when it's obvious you're the one with the limited experience since you have not seen any actual applications that require any more thought than "stick some more fans on it and it'll be ok" or "well, just put another AC exchanger on the roof and it'll probably work fine". You still have to get the heat out. And that's the whole point about water cooling. Getting the heat out without relying on whooshing air around (whether it's air blowing on heatsinks or air in an climate control system).

But again, you've obviously never worked on any projects that can't be solved by sticking more fans on it and you don't have the imagination to think about any applications that might not be solvable by whooshing more air around, so we may as well end the discussion.

Re:Liquid cooling for datacentres? (2, Interesting)

cheier (790875) | more than 6 years ago | (#23052146)

Liquid cooling can affect the energy costs in a big way, depending on how well integrated the system is. As an example, CoolIT Systems had developed a server rack with an integrated liquid cooling system that they had shown off at CES this year. The rack essentially used hydraulic fittings to allow you to hot-swap systems from the chassis while still keeping the cooling centralized.

They had essentially used the radiator from a Honda Accord, which they found to be able to dissipate between 25 and 35 kW of heat. With a system like this centralizing the area where heat is dumped, fluids can be piped out to a radiator sitting outside, so essentially, a large portion of the heat produced from a rack of computers, can be relocated outside of the data center.

Even without moving the heat outside, you can still save on cooling costs. Because you have the capacity to dissipate so much heat, AC costs drop simply because you can use a forced-air system to move the gobs of hot air out and outside air in. This could potentially save up to 30% in cooling costs alone, let alone if you were to relocate the exchanger to the exterior of the building.

Re:Liquid cooling for datacentres? (1)

evilviper (135110) | more than 6 years ago | (#23052422)

Liquid cooling can affect the energy costs in a big way

No, it really can't.

With a system like this centralizing the area where heat is dumped, fluids can be piped out to a radiator sitting outside, so essentially, a large portion of the heat produced from a rack of computers, can be relocated outside of the data center.

You could similarly open up a data center, with just large fans blowing ambient air in and out.

With either method, it just wouldn't work. A $50,000 server rack is not your home PC. It's not okay to let it operate at 40C+ ambient temperatures. It's not specifically out of spec, but it's a very, very high temperature to start off with, which means you don't have a very large margin for error, and you have to be careful of the extra heat output by the server when loaded.

I already also mentioned that, with such a method, you need a MUCH larger radiator, and much higher throughput forced convection, since you're trying to cool it with warm air, rather than chilled datacenter air... The temperature difference is just too low. It quickly gets to the point that your non-optimal cooling solution is almost as expensive as just running an A/C, and allowing less airflow, smaller radiators, etc. Heat-pumps are continually increasing in efficiency, after all.

Re:Liquid cooling for datacentres? (1)

BLKMGK (34057) | more than 6 years ago | (#23054878)

The biggest issue with running a datacenter on 40C ambient air with big fans blowing it in and out the doors is that air cooling is so inefficient that the cooled components would overheat as they pick up so little temp drop from AIR. 40C WATER cooling on the other hand would bring those CPU and HDD temps down a good bit.

You're failing to understand just how much better water transfers heat vs air.

Re:Liquid cooling for datacentres? (1)

BLKMGK (34057) | more than 6 years ago | (#23054854)

This is actually pretty amusing, as when I set up my home office I designed much the same thing! Sadly I didn't put it into place, but it would have worked quite well, I'm sure. Radiator in the crawlspace, temperature-sensing electric cooling fan mounted on the radiator (a Ford Taurus fan, most likely). Copper or PVC piping up through the floor into the office in a loop, with a shutoff valve in the middle to regulate bypass flow. An agro pump to move the liquid, or perhaps a small pool pump. Fittings on the pipe mounted to the wall to allow lossless connection and disconnection of cooling lines for computers. Water blocks for computers are cheap enough, and no pump or radiator would be needed inside the case. Concerns were mostly around filling it, purging it, and maintaining the radiator in the crawlspace, which is tight. Google a bit and you'll find a guy who DID build a system like I had intended, only his "radiator" is his swimming pool - seriously!

In the end the SO vetoed my system, and the work/mess would've been a hassle, but it was QUITE doable. I've built systems smaller than this, and was doing it 10 years ago with Peltiers to boot, in order to overclock at near-freezing temps. The primary advantage I found to this was not just that it cooled so well, but that it allowed me to move the heat exchange *away* from my computer. When my home office was WAY small, temps climbed enough that the exiting air would trigger my thermostat - in winter the house was cold, in summer it was freezing - except in the office.

When I water-cooled my hottest computer, the radiator was placed outside my window with a small fan - temps IN my office dropped dramatically, as did the dB level, and the CPU ran COLDER than ever. The ambient temp outside is often low enough compared to a blazing hot CPU that the temp drop is awesome. 90F outside is no biggie when your CPU inside is pushing 50+C - as my quad next to me is right now. Lots of industrial tech exists to chill water far cooler than the ambient temp outside too, and some datacenters likely already use it for their A/C today.

These days I water-cool only a little, having lost hardware to water leaks over the years - twice, to be exact. I do have a system on my desk waiting to be resuscitated that ran for 2 years nearly trouble-free - yes, normal tubing allows evap, BTW, but this one was retired because it's older, not from coolant loss. I am a firm believer in using water and I will use it again, but it IS more trouble, and each new damned video card requires a different block than the last, at notable cost. The temp drops you achieve are amazing - especially on today's hot video cards. New systems built like the one in this story and the new Swiftech stuff make water easier for laypeople who don't want maintenance.

Anyway, what I have learned with my own systems I believe can be used to advantage in a datacenter. The primary thing I've learned is that the heat exchange can be made efficiently "elsewhere," and not someplace where the exchange from one machine affects another. I'd also imagine it might be nice to be able to walk into a datacenter and not sweat hearing protection as much!

Re:Liquid cooling for datacentres? (2, Informative)

jack8609 (1217124) | more than 6 years ago | (#23054328)

As someone who makes their living figuring out how to move heat from A to B (in avionics, not datacenters), this comment makes my head hurt for a number of reasons... First off, as others have pointed out, liquid cooling in data centers is a reality, and folks like IBM have worked on liquid cooling for decades. Due to many of the reasons already mentioned, everyone avoids liquid cooling as long as they can, and a number of technologies have helped with this. For example, the transition from bipolar to CMOS around the time I finished grad school put a lot of thermal engineers out of work for a while. However, liquid cooling is used in plenty of places: Cray has done it for a long time on their supercomputers (not on the latest one - at least not for local cooling), the F-22 and F-35 have liquid cooling for their avionics (for weight reduction), nuclear reactors (using liquid metal), etc. Every thermal conference held in the last 5 years seems to have had at least one session on data center cooling, and most of the work is on implementing some aspect of liquid cooling. The electricity required for data center cooling is now on the order of 30-40% of the total power (don't quote me on that; I'm actually thinking it is higher than that, but as I said, I don't work in that market).

Air cooling is great for simplicity, but it has limits that we are fast approaching. The simple methods for air cooling involve just dumping hot air into the room, and once you do that and are running the A/C on twice as much warm air as the hot air that you would otherwise be cooling without the mixing, your cooling power requirements shoot up considerably.

Liquid cooling has two potential benefits (as well as the numerous challenges already described). As people have pointed out, the thermal conductivity of water is much higher than air's (k = 0.6 vs. 0.027 W/m·K). This is important because it enables higher local cooling (the convection coefficient is ~20x higher). More importantly, the volumetric heat capacity of water is massive compared to air. The temperature rise (in C) of cooling air is ~(8 * W dissipated)/(lb/hr of cooling air), so when you are dissipating MW of power and trying to keep the electronics no more than 20-40C above ambient, you need (literally) tons of air. Water is about 4x better in specific heat and about 1000x more dense, so you reduce the volumetric flow rates by ~4000x and get more effective heat transfer at both the electronics and heat exchanger ends (so they can be smaller). Pumps can generally be much more efficient than fans, and the pumping power for a 100%-efficient device scales with density * volumetric flow rate^(1.5) [assuming that I am doing the math in my head correctly...].

If either the price of electricity or the heat dissipation levels in data centers continue to go up (fairly safe bets...), you will see increasing use of liquid cooling in that application. Keeping things leak-free, dealing with related maintenance issues, and dealing with legacy architectures seem to be the biggest hurdles.
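To put the volumetric-flow comparison in concrete terms, here is a minimal sketch using round-number fluid properties; the 1 MW load and the 20C allowable rise are illustrative assumptions, not figures from the comment above:

    # Air vs. water volumetric flow needed to absorb the same heat load with the
    # same allowed coolant temperature rise (Q = m_dot * cp * dT).
    Q = 1_000_000.0      # heat load, W (assumed 1 MW for illustration)
    dT = 20.0            # allowed coolant temperature rise, C (assumed)

    cp_air, rho_air = 1005.0, 1.2        # J/(kg*K), kg/m^3
    cp_h2o, rho_h2o = 4186.0, 1000.0     # J/(kg*K), kg/m^3

    def vol_flow(Q, cp, rho, dT):
        """Volumetric flow (m^3/s) so the coolant warms by only dT."""
        return Q / (cp * dT) / rho

    v_air = vol_flow(Q, cp_air, rho_air, dT)
    v_h2o = vol_flow(Q, cp_h2o, rho_h2o, dT)

    print(f"air:   {v_air:6.1f} m^3/s")        # ~41.5 m^3/s
    print(f"water: {v_h2o * 1000:6.1f} L/s")   # ~11.9 L/s
    print(f"ratio: ~{v_air / v_h2o:.0f}x")     # ~3500x, in line with the ~4000x above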

Re:Liquid cooling for datacentres? (1)

Have Brain Will Rent (1031664) | more than 6 years ago | (#23050954)

As any scuba diver could have told him, water conducts heat far more efficiently than air. IIRC it is a factor of about 25.

Re:Liquid cooling for datacentres? (1)

adolf (21054) | more than 6 years ago | (#23051852)

Ok, you've got a brain, so you should also know this:

Given a water cooled rig and an air cooled rig which operate at the same efficiency (in terms of Watts dissipated per Watt of cooling power), water cooling and air cooling perform just about identically as long as things remain inside of the case.

Move the water cooling system's radiator outside of the case, and things start to slant toward water cooling.

Observations:

1. They're equal in cooling capacity, but the air-cooled system is simpler and has fewer single points of failure (and is not wet).
2. Nobody ever seems to put the radiator outside of the box in mass-produced PC-based systems.

Conclusion:

Water cooling is a sham. Bigger/more efficient heatsinks, cooled by air, would bring a more substantial improvement in the quest for quiet, reliable, or rugged computers, than does the current crop of water cooling arrangements.

Re:Liquid cooling for datacentres? (1)

BLKMGK (34057) | more than 6 years ago | (#23054888)

You haven't actually ever built and run a water cooled rig have you?

Re:Liquid cooling for datacentres? (1)

phantomcircuit (938963) | more than 6 years ago | (#23050226)

You could run the hot water for the building through a heat exchanger before you heat it up with a boiler: Cold Water -> Heat Exchanger -> Warm Water -> Boiler -> Hot Water. Overall, the energy used to go from Cold Water -> Warm Water is saved.

Re:Liquid cooling for datacentres? (1)

evilviper (135110) | more than 6 years ago | (#23050710)

The same is equally true for an A/C unit, as it is for a liquid system.

Re:Liquid cooling for datacentres? (2, Funny)

Have Brain Will Rent (1031664) | more than 6 years ago | (#23050980)

Not only that but if you attach a Maxwell's Demon to the output you can get cooled water separated out from the hot and then feed that back in to the cooler while sending the separated hot water to the boiler!!!

Re:Liquid cooling for datacentres? (1)

Murphy Murph (833008) | more than 6 years ago | (#23051064)

You could run the hot water for the building through a heat exchanger before you heat it up with a boiler: Cold Water -> Heat Exchanger -> Warm Water -> Boiler -> Hot Water. Overall, the energy used to go from Cold Water -> Warm Water is saved.

Unless your datacenter is collocating with a (large) laundromat, there just isn't that much demand for hot water at a datacenter. No laundry, no showers, little to no cooking.

Someone check my numbers.
Tap water = 7 degrees C.
Water heater hot water = 50 C?
CPU = 65 watt thermal output = 55889 calories / hour.
56,000 calories can raise one liter of water 56 degrees C in an hour, or raise 1.3 liters of 7 degree water to 50 degree water in an hour.
So every CPU needs 1.3 liters of water an hour, minimum (assuming we can super-concentrate all the heat and not just slightly heat larger volumes of water.)
The Google "datacenter in a trailer" rumors were saying 5000 CPUs per trailer.
That's an output of 6,500 liters of hot water per hour.
LARGE laundromat.
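The arithmetic above checks out; here is a short sketch reproducing it under the same idealized assumptions (65 W per CPU, every watt captured by the water, 7 C tap water heated to 50 C):

    P_cpu = 65.0                           # W of heat per CPU (figure from the comment)
    cal_per_hour = P_cpu * 3600 / 4.186    # ~55,900 cal/h (comment says 55,889)

    dT = 50.0 - 7.0                        # degrees C of heating
    liters_per_hour = cal_per_hour / (1000 * dT)   # 1000 cal warms 1 L of water by 1 C
    print(round(liters_per_hour, 2))       # ~1.3 L/h per CPU

    cpus = 5000                            # the rumored "datacenter in a trailer"
    print(round(liters_per_hour * cpus))   # ~6500 L/h of hot water to get rid of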

Re:Liquid cooling for datacentres? (1)

ibeleo (319444) | more than 6 years ago | (#23050086)

IBM recently released the p575, whose spec states "Cooling requirements: Chilled inlet water supply/return required for all systems." They also have a kit to turn the rear door of a rack cabinet into a heat exchanger, so there is a move in that direction.

p575 spec http://www-03.ibm.com/systems/power/hardware/575/specs.html [ibm.com]
Rear door exchanger http://www-132.ibm.com/webapp/wcs/stores/servlet/ProductDisplay?catalogId=-840&storeId=1&langId=-1&dualCurrId=73&categoryId=4611686018425028106&productId=4611686018425023461 [ibm.com]

Re:Liquid cooling for datacentres? (2, Interesting)

diablovision (83618) | more than 6 years ago | (#23050214)

The Black Box [sun.com] is a complete watercooled data center in a shipping container.

Re:Liquid cooling for datacentres? (0)

Anonymous Coward | more than 6 years ago | (#23050232)

Water-cooled mainframes have been around for at least the 20 years I've been paying attention.

Re:Liquid cooling for datacentres? (1)

RiotingPacifist (1228016) | more than 6 years ago | (#23050408)

Given that they use AC because they can't be bothered to organise a proper air cooling system (pumping the hot air out of the back of the server instead of cooling all the air in the room, etc.), it's simply because it's cheaper to use AC than to actually organise anything.

You do have a good point, though: use of a non-conductive oil, cooled against water pipes, would mean the servers are just as safe as they are at the moment.

Well, maybe. (1)

jd (1658) | more than 6 years ago | (#23050510)

If a server came ready-built with fail-safe plumbing and cooling mechanisms, the answer would be yes. Water, oil, Fluorinert - these would all be excellent. Total immersive cooling would be more logical than piped cooling, as there are fewer parts that could fail and less possible damage from a failure. You could have a completely sealed compute unit that contained everything and was ready to go, eliminating any need for special skills on the part of the data centre or any special plumbing requirements for that centre. It would, however, require that things like the PCI bus be external to the main unit, or the system would be unmaintainable. That increases fragility. For pre-specced servers that aren't going to require significant upgrades, this would be an obvious solution.

Re:Liquid cooling for datacentres? (0)

Anonymous Coward | more than 6 years ago | (#23051218)

Off the top of my head, APC and SGI have products with water-cooled racks.

Re:Liquid cooling for datacentres? (0)

Anonymous Coward | more than 6 years ago | (#23051338)

Well, I don't often see people complaining about air leaks killing their entire electrical installation. Also, handling liquids is much more expensive since you can't just leave a gap, you need to have pipes all over the place. Liquid cooling is more complex and expensive, while being much less adaptable. So it might be great in a few cases, but mostly, it's not worth the effort.

Re:Liquid cooling for datacentres? (1)

TheVoice900 (467327) | more than 6 years ago | (#23051408)

It's being done though not on the system level but on the rack level. SGI's ICE platform has water-chilled doors: http://www.sgi.com/products/servers/altix/ice/features.html [sgi.com]

This is a great bonus for high density HPC applications. Typically in a datacenter you are blowing air up from the raised floor in front of the servers. However, a good deal of it is taken up by the servers in the lower part of the rack, leaving the top servers running warmer than the lower servers. Supposedly the water chilled doors help a lot in this scenario.

Re:Liquid cooling for datacentres? (1)

nospam007 (722110) | more than 6 years ago | (#23053186)

It did work. IBM used it in the past. Check Google for "water-cooled ibm"

Also from the article, it doesn't work (1, Insightful)

gelfling (6534) | more than 6 years ago | (#23049978)

Even though the article tries hard to tout its benefits, their own stats show it's not worth it. Either it's a crappy implementation or it's simply not relevant.

Re:Also from the article, it doesn't work (2, Insightful)

eagl (86459) | more than 6 years ago | (#23050050)

Even though the article tries hard to tout its benefits, their own stats show it's not worth it. Either it's a crappy implementation or it's simply not relevant.


How so? They show that it's quieter and more effective than stock cooling, and significantly quieter than an aftermarket air cooling solution. What exactly are you looking for then? You gotta be more specific than just a completely unsupported criticism that doesn't even reflect the test results, let alone explain your personal criteria.

Here, try something like this next time:

It looks like a good/bad item because the performance was/was not what I'd expect from a water cooling system costing [insert price here]. You can get similar/better performance from [insert alternative product here] for less. Tradeoffs with the alternative are that it's quieter/cheaper/louder/more expensive, but based on my own criteria of [insert your own priorities here], I think this product is great/teh suck.

Give it a shot, you might like it.

Re:Also from the article, it doesn't work (1)

gelfling (6534) | more than 6 years ago | (#23050560)

You don't have to be an entirely patronizing asshole. But I'm guessing you don't work in sales.

Ok so it's marginally quieter. As for its absolute cooling power it's on par with whatever air cooled unit you can get today with a lot less complexity. All in all that's pretty weak justification. If that's your definition of "it works well" then you clearly care about noise above all other criteria. There are probably better ways to make your PC quieter than this.

Re:Also from the article, it doesn't work (1)

eagl (86459) | more than 6 years ago | (#23050764)

You don't have to be an entirely patronizing asshole.
It's the internet, so actually I do (heh).

But you are still arguing from a position lacking in factual information. Water cooling can be almost completely silent, and can remain so even when cooling hardware that would otherwise require very loud fans for conventional air cooling.

This does not even address the additional cooling requirements seen in overclocking, small form factor, or otherwise special-use equipment. An HTPC, for example, typically has to trade off performance for noise, as high-performance video cards and CPUs get far too hot for silent air cooling even if you don't use a small enclosure that looks like it belongs next to a TV or stereo. Water cooling, especially a simple system like the review subject, would let you use a top-of-the-line CPU and fast video card in an HTPC without having to worry about how to get the heat out of the case in a quiet manner. That means the HTPC could be used not only for home theater applications, but also for PC gaming on your widescreen plasma, LCD, or projector.

And that's just one application. There are many others, including high density rackmount or server applications that cannot sound like a jet engine or must be able to dump the heat outside of the environment containing the computers. And as for costs, and complexity, a properly designed system should not be any more trouble than the air cooling requirement of replacing fans and cleaning dust out of the filters (or system if filters are not used).

I don't water cool because I don't need to. I don't run computer hardware that generates a heat load requiring loud air cooling, and I have no problem with the heat from the cpu getting mixed with the air inside my case before it gets sucked out by the exhaust fan. I also do not have the spare time or inclination to "get into watercooling" and a half-assed approach is a good way to dump a gallon of coolant into your case and onto the floor. But an easy to use system like the review system, at a nice price, is an option I will consider for the next time I build a system from scratch because it appears to be no more difficult to install than a conventional heatsink, is quieter, and cools at least as well as a high-end (and noisy) air cooling system. And it's sealed up so it is unlikely just slapping it in and forgetting about it will result in destruction of my computer and carpet.

Re:Also from the article, it doesn't work (1)

gelfling (6534) | more than 6 years ago | (#23051716)

So it's just quieter. Again, if that's your main concern, then fine. There are probably less problematic ways to address that. I for one would not want to ever have to worry about hundreds of little water cooling systems in a data center each with the potential to break down and cause a catastrophic failure among many machines.

Back in the old TCM days you might have a dozen CEM complexes each with a TCM ganged into a single chiller pump system. That's a kind of failure rate that's manageable. But with 5000 server blades in a data center today the probability of something going horribly wrong from 5000 water coolers is unacceptable.

Re:Also from the article, it doesn't work (1)

eagl (86459) | more than 6 years ago | (#23051794)

In a datacenter using blade servers, I'd expect some sort of hybrid heat exchange system to be more useful than pure water cooling. I strongly disagree that the only benefit is lower noise, but also remember that we're not just talking about datacenter applications either. All sorts of applications (such as the HTPC setup I described) could get not only lower noise but also higher performance, due to the better-managed thermal load.

And that is all water cooling does - allow a better and more manageable way to move heat from one area to another. Saying that the only benefit of managing heat transfer is lower noise is a bit like saying the only benefit of owning a car is that you can choose the color.

Overclocking, here we come! (1)

billy901 (1158761) | more than 6 years ago | (#23050106)

With something like this I can overclock my "Hackintosh" to 4 GHz without worrying about setting my apartment on fire. Once I get my dual quad-core machine, I'm gonna be running it so fast with a cooler like this. I really like the cheap price! I'm considering getting rid of my MacBook because it gets too hot, but if they can manage to make one for the MacBook, then hell ya! I'm gonna get one. I wanna run nothing but vim on an 8-core machine. ;) No, not really.

Re:Overclocking, here we come! (2, Informative)

BLKMGK (34057) | more than 6 years ago | (#23055012)

Swiftech makes a system you might be interested in that's also self contained. The pump sits right on top of the CPU and the heat exchanger fits where your 120mm exhaust fan is normally mounted. I'm not using it and would only consider it if I were cooling my vid card too but a friend is using it and REALLY likes it - trouble free install on his box.

To nitpick the summary (no, did not rtfa) (0)

Tmack (593755) | more than 6 years ago | (#23050108)

plastic tubing and a non-toxic, non-flammable liquid are used to overcome evaporation issues,

If it is truly sealed, there should never be any "evaporation issues," as there is nowhere for it to evaporate to. Being non-toxic and non-flammable has nothing to do with it; I can think of another very common non-flammable, non-toxic (in most of its forms and uses) compound that's readily available but is NOT used, specifically because it tends to boil at relatively low temps and low pressures: dihydrogen monoxide. As for plastic tubing, what else are you going to make it from? Metal? You could, but most systems I have seen use clear PVC tubing, with fluorescent coolant and blacklights to add the "bling" effect. Copper piping would actually be more efficient (by allowing much higher pressure in the loop) and less likely to leak if done correctly, but would cost a bunch more due to the metal's current pricing.

Tm

Re:To nitpick the summary (no, did not rtfa) (1)

dwater (72834) | more than 6 years ago | (#23050496)

I noticed this too. I wonder what they were trying to say.

...or is it just marketing crap?

Re:To nitpick the summary (no, did not rtfa) (1)

ChrisMaple (607946) | more than 6 years ago | (#23050558)

The article says that most water cooling systems use silicone tubing, which the author seems to think is not a plastic. I'm not an expert on plastics, but PVC seems like a poor choice to me. It's too likely to degrade over a decade or so and become brittle or fragile.

Re:To nitpick the summary (no, did not rtfa) (1)

Murphy Murph (833008) | more than 6 years ago | (#23051132)

I'm not an expert on plastics, but PVC seems like a poor choice to me. It's too likely to degrade over a decade or so and become brittle or fragile.

Like the PVC drainpipes in modern houses?
Like the insulation on your home's wiring?

Re:To nitpick the summary (no, did not rtfa) (0)

Anonymous Coward | more than 6 years ago | (#23051936)

Who keeps a system for a decade anyway?

Re:To nitpick the summary (no, did not rtfa) (1)

glitch23 (557124) | more than 6 years ago | (#23050934)

As for plastic tubing, what else are you going to make it from? Metal?

Rubber?

To nitpick the metal (no, did not rtfa) (0)

Anonymous Coward | more than 6 years ago | (#23053046)

You can use stainless steel as well.

Re:To nitpick the summary (no, did not rtfa) (1)

BLKMGK (34057) | more than 6 years ago | (#23054968)

The clear plastic "bling" tubing is often medical grade, and guess what? Over time, liquid EVAPS from such systems. How? It actually manages to be absorbed by the tubing and slowly dissipate into the air, which is why this system uses a different kind of tubing and why they highlighted its lack of evaporation issues. You haven't run a liquid-cooled rig, have you?

Oh, and plain old water is a BAD idea in a liquid-cooled PC. For one, it tends to oxidize things like copper heatsinks over time, and for another, you get biological growth that QUICKLY kills cooling performance as the heatsinks get covered in slime. Figure two weeks on plain water before it all goes to hell. Those "bling" dyes you see are generally antibio agents too. Chlorine bleach is also effective but has other side effects; I for one prefer Water Wetter for a multitude of reasons.

mainstream my ass (0)

Anonymous Coward | more than 6 years ago | (#23050196)

Not shipping on Dells = not mainstream

"mainstream" (1)

iroll (717924) | more than 6 years ago | (#23050286)

I thought the G5 Power Mac took liquid cooling mainstream in 2004.

I guess this is one of those phrases, like "the Year of Linux on Desktop," that we'll hear ad infinitum.

Re:"mainstream" (1)

dwater (72834) | more than 6 years ago | (#23050538)

Is that only one machine?

If so, I don't see how that could be considered mainstream. Perhaps I misunderstand the term, but to me it means it is used on many different computers, not just one.

Perhaps 'mainstream' is valid in this case because the one model sold a lot? I don't think that fits with my understanding of the word, but it's at least debatable, I suppose.

What's new (1)

theeddie55 (982783) | more than 6 years ago | (#23050310)

It doesn't seem much different from the Gigabyte kit I put in my computer 2 years ago http://www.cluboc.net/reviews/super_cooling/gigabyte/galaxy/index.htm [cluboc.net] , the only difference being the pre-built bit, which could cause great difficulty if you want to do something sensible like mount the radiator on the outside. (Note: soon after I got mine they released a second version with a different pump and reservoir, and I can tell why; after 13 months, just out of warranty, my reservoir cracked.)

Mainstream (1)

Have Blue (616) | more than 6 years ago | (#23050574)

Didn't liquid cooling go mainstream when Apple used it in the last generation of Power Mac G5s?

Necessity is the mother of invention (1)

damburger (981828) | more than 6 years ago | (#23050716)

This is kind of inevitable, and IMHO overdue. Monolithic heat sinks and fans the size of jet engine intakes have been a pain in the arse for top of the range gaming machines for years. Also, I don't know about anyone else, but the air cooling of my computer is a depressingly efficient mechanism for sucking dust and fluff into the computer and keeping it there.

Shuttle PCs have had this for years (1)

Animats (122034) | more than 6 years ago | (#23050726)

Shuttle PCs have had a heat-pipe and heat exchanger liquid cooling system for years. This made possible their little "breadbox" systems.

Re:Shuttle PCs have had this for years (1)

BLKMGK (34057) | more than 6 years ago | (#23054988)

Intel and AMD systems are also using heat pipes, just like the Shuttle XPCs, and have been for a year or three now. All of the best "air" heatsinks I am aware of use some liquid in them in the same fashion. Shuttle just managed to build it such that the radiator was a little further divorced from the heat source, is all.

Uh...Johnny-Come-Lately (1)

FredThompson (183335) | more than 6 years ago | (#23051008)

Welcome to 2008, OP. Sealed systems have been on the market for months. You can even find Cooler Master Aquagatte (on the market since 2007) in some of the larger retail stores.

Re:Uh...Johnny-Come-Lately (2, Informative)

Paul server guy (1128251) | more than 6 years ago | (#23055340)

Hell, I've been using a "Consumer grade" easy to use water cooling system in my desktop for over a year in the form of the Titan Robela (http://www.titan-cd.com/eng/watercase/robela.htm or http://www.inland-products.com/singleproduct.asp?search=accessories&partnum=03011 [inland-products.com] )

I have the black Al faced one for longer PSs. It was extremely easy to set the water cooling up, and has kept my machine cool even with two extra blocks for the SLI cards and a chipset cooler. Yes it's not sealed, but then again, is that really a big deal? If it WAS sealed I couldn't have added my extra blocks, and this went together so simply, I doubt I would really have noticed the difference.

This is almost old news by now...

CFCs? (1)

bcmm (768152) | more than 6 years ago | (#23052940)

What is this non-toxic, non-flammable liquid, given that it probably isn't allowed to be a CFC?

Patch the symptoms or wait for the cure? (1)

turing_m (1030530) | more than 6 years ago | (#23053314)

I can't help but think that this is a stop-gap measure. I used to read up on all the various methods of silencing a computer (with the intention of implementing them myself), but for consumer-grade applications I'd prefer to wait for a variant of Moore's Law to do its work: the propensity for performance per watt to keep increasing until it nears whatever limits are predicted by information theory.

At that stage there will be an option to cool with no moving parts for typical desktop/laptop applications, and it will be a superior solution in all aspects compared with any combination of cooling and sound minimization.

Another welcome change will be for the idle power consumption to drop, which it certainly can.

Efficiency is inherently hard - it's all about approaching zero loss, and losses seem to almost never present a target suitable for a single silver bullet approach. That means work - killing all the losses on your loss budget one by one, and figuring how to integrate all the changes so it still works. The good thing is that once you come up with a design that lops off as many heads of the efficiency hydra as possible, it's just a matter of mass production.

Tagged: Slashvertisement (0)

Anonymous Coward | more than 6 years ago | (#23053574)

Are these just to make up for lost revenue due to people blocking the ads, or is it just to piss off subscribers? Surely this many can't get through by accident.
