
Cooler Servers or Cooler Rooms?

CmdrTaco posted about 9 years ago | from the six-of-one-half-a-dozen-of-another dept.

IT 409

mstansberry writes "Analysts, experts, and engineers rumble over which is more important in curbing server heat issues: cooler rooms or cooler servers. And who will be the first vendor to bring water back into the data center?"

409 comments

Why not both? (3, Insightful)

tquinlan (868483) | about 9 years ago | (#12153735)

Unless you make things so cold as to prevent things from working properly, why not just do both?

Re:Why not both? (5, Insightful)

Anonymous Coward | about 9 years ago | (#12153805)

Cost.

The question isn't whether it's good to keep both cool. The question is, which makes more financial sense? Cooling the whole room? Spending the money to purchase servers with a low heat-to-computation ratio?

Probably a combination. But to say that people should equip every computer with VIA ultra-low-power chips _and_ freeze the server room is silly.
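To put the parent's tradeoff in rough numbers, here is a back-of-the-envelope sketch; every figure in it (load, COP, electricity price, hardware premium) is an illustrative assumption, not data from the article.

```python
# Back-of-the-envelope comparison: keep ordinary servers and pay to cool the room,
# or pay a premium for lower-power servers and cool less. All numbers are assumptions.

HOURS_PER_YEAR = 24 * 365

def annual_power_cost(it_load_kw, cop=3.0, price_per_kwh=0.10):
    """Yearly cost of running the servers plus the air conditioning that removes
    their heat (COP = heat removed per unit of electricity the AC consumes)."""
    total_kw = it_load_kw * (1 + 1 / cop)
    return total_kw * HOURS_PER_YEAR * price_per_kwh

ordinary = annual_power_cost(it_load_kw=20.0)    # 20 kW of ordinary servers
low_power = annual_power_cost(it_load_kw=14.0)   # same work at 30% less power (assumed)
hardware_premium = 15_000.0                      # assumed one-time extra cost

savings_per_year = ordinary - low_power
print(f"Ordinary servers + cooling:  ${ordinary:,.0f}/yr")
print(f"Low-power servers + cooling: ${low_power:,.0f}/yr")
print(f"Premium pays back in {hardware_premium / savings_per_year:.1f} years")
```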

Re:Why not both? (2, Insightful)

FidelCatsro (861135) | about 9 years ago | (#12153890)

This is exactly the point: you need a damned good balance.
If I have to work on the server locally in some fashion, I would rather not be boiling or freezing. In the average working environment we can't put it in a refrigerated room, as this would bring up a lot of working issues, not to mention the design of the room itself. On the other hand, we don't want a drastically complex cooling system that would add another possible avenue for failure.
The best is probably a nicely air-conditioned room with a nice simple cooling system on the server, good airflow, and a comfortable working environment.

#GNAA irc.uk.gnaa.us (-1, Offtopic)

Anonymous Coward | about 9 years ago | (#12153740)

FO LIFE

Aquafina... (5, Funny)

vmcto (833771) | about 9 years ago | (#12153742)

Will probably be the first vendor to bring water into the datacenter... I believe I've seen evidence in some datacenters already.

Re:Aquafina... (0)

Anonymous Coward | about 9 years ago | (#12153825)

Yes, and since liquid (molten) sodium chloride is used in some nuclear reactor designs as a heat transfer mechanism, the obvious conclusion would be to use water + sodium chloride to cool the server room! Yup, salt water...just spray it everywhere!
Ooops... gotta go, my Toyota truck is in the shop... for some reason, it gets really rusty after the winter each year....

Re:Aquafina... (-1, Offtopic)

ari_j (90255) | about 9 years ago | (#12153969)

Finally, we can have a flamewar over something outside the computer realm. I say Aquafina is the best bottled water that can possibly ever exist. Fiji sucks donkey balls, and Arrowhead just tastes like dirt. And don't even get me started on Evian - that stuff will turn you gay.

Or should I have just posted: "I, for one, welcome our new aquatic overlords." ?

uh, "rumble" (2, Funny)

koreaman (835838) | about 9 years ago | (#12153745)

it sounds like they're having some kind of gang warfare over the topic...what the hell?

Re:uh, "rumble" (4, Funny)

Master_T (836808) | about 9 years ago | (#12153828)

Yeah man.... I got stuck in one of them cool room or cooling system "rumbles". They are hardcore. I lost a buddy in one of those... pen wound. I still have a scar from when a Cool Roomer stabbed me with an ISA modem.

uh (-1, Offtopic)

Anonymous Coward | about 9 years ago | (#12153753)

first p0st! cheese ...but by the time my computer decides to send the p0st it won't be first... so...

fourth/fifth/sixth p0st!

first post (-1, Offtopic)

Anonymous Coward | about 9 years ago | (#12153755)

water in server housings O_o much too dangerous ...

well I've always wondered this (5, Insightful)

Saven Marek (739395) | about 9 years ago | (#12153757)

I've always wondered this. Why have duplication of a function across every single server box when it could all be done in the environment? For example, all servers get electricity from the server room and all servers get their network from the server room, so why not have all servers get cooling from a 10F server room?

It makes sense!

Re:well I've always wondered this (2, Interesting)

green pizza (159161) | about 9 years ago | (#12153817)

Blade servers are a noble start. Less duplication of power supplies and network gear. I imagine the situation will continue to get better over time.

Duplication is nice in some respects, more redundancy is a big plus. That and you actually have several useful machines when you finally tear it all down. Who's going to buy 3 blades off ebay when they can't afford the backplane to plug 'em into?

Re:well I've always wondered this (0, Flamebait)

El_Servas (672868) | about 9 years ago | (#12153858)

Duplication is nice in some respects, more redundancy is a big plus.

definitely. More redundancy is more redundant.

Re:well I've always wondered this (0, Offtopic)

Bastian (66383) | about 9 years ago | (#12153862)

What makes sense is what solution is cheap and reliable.

On one hand, it should take much less juice to cool just a few boxen directly rather than keep an entire room cold enough to keep all those CPU cores cool as well.

On the other hand, high-performance cooling systems inside every box means a lot more points of failure.

Then again, if everything in your server room requires a working HVAC to function, you're in trouble if the HVAC goes out - while if the cooling system in one server goes out, you can just swap it with a backup server while you're waiting for repairs.

I'm sure there's a whole lot more pro/con that I could parrot if I had bothered to RTFA. . .

Re:well I've always wondered this (5, Insightful)

T-Ranger (10520) | about 9 years ago | (#12153926)

What I have never understood is why servers virtually always have AC power supplies. Yes, you can get NEBS(?)-compliant servers that take DC, but this isn't really a general option; it's a completely distinct model line.

UPSs take AC, turn it into DC, and charge their batteries. A separate system takes DC from the batteries, inverts it, and sends out AC. (Good UPSs, anyway. Otherwise they are "battery backups", not uninterruptible.) Computer power supplies then take AC and distribute DC inside the case. WTF?

Why doesn't APC start selling ATX power supplies? Directly swap out the AC power supplies and have them plug into a DC-providing UPS and/or per-rack (or even per-room) power supplies.

Electrical codes are a BS excuse. Even if you needed vendor-specific racks, a DC-providing rack is, as far as the fire marshal should be concerned, just a very large blade enclosure, which is clearly acceptable.

I can't believe I'm the first one to ever come up with this idea, so there must be some problem with it.... Some EE want to explain why this wouldn't work?
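For a rough sense of what the double conversion the parent describes costs, here is a minimal sketch; the stage efficiencies are illustrative assumptions, not measured or vendor figures.

```python
# Rough efficiency of the AC -> DC -> AC -> DC chain the parent describes,
# versus distributing DC from the UPS battery bus to a DC-DC converter in the server.
# Every stage efficiency below is an assumed, illustrative number.

def chain_efficiency(stages):
    """Overall efficiency of conversion stages applied in series."""
    eff = 1.0
    for stage_eff in stages:
        eff *= stage_eff
    return eff

# Online UPS feeding an ordinary ATX supply:
# rectifier/charger -> inverter -> server AC power supply
ac_path = chain_efficiency([0.95, 0.92, 0.80])

# Hypothetical DC distribution: rectifier/charger -> DC-DC converter in the server
dc_path = chain_efficiency([0.95, 0.90])

print(f"AC double-conversion path: {ac_path:.0%} of input power reaches the load")
print(f"Direct DC path:            {dc_path:.0%} of input power reaches the load")
print(f"Waste heat per delivered watt: {1/ac_path - 1:.2f} W vs {1/dc_path - 1:.2f} W")
```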

Re:well I've always wondered this (0)

Anonymous Coward | about 9 years ago | (#12154027)

Shut up, troll.

Err, well, both? (3, Insightful)

Pants75 (708191) | about 9 years ago | (#12153763)

The servers are the heat source and the cool room air is the cooling mechanism? Yes?

So take your pick. To make the servers cooler, either buy new more efficient servers or buy a whacking great air con unit.

Since the servers are the things that actually do the work, I'd just get feck off servers and a feck off air-con unit to keep it all happy!

Everyone's a winner!

Cray still has water cooling! (4, Informative)

green pizza (159161) | about 9 years ago | (#12153764)

Unlike most companies that are considering going back to water cooling, Cray has always used water cooling for their big iron. In fact, the only air cooled Crays are the lower end or smaller configured systems.

All hail the Cray X1E !

Re:Cray still has water cooling! (4, Informative)

1zenerdiode (777004) | about 9 years ago | (#12153882)

I thought Crays originally used Fluorinert(tm), which is definitely *not* water... *spark* *spark* *fizzle*

All hail non-conductive fluorochemicals!

Re:Cray still has water cooling! (1)

green pizza (159161) | about 9 years ago | (#12153946)

You're right. I guess I should have said "liquid cooling".

Like the original Cray 1, the current Cray X1E uses Fluorinert to cool the system itself (check out the swanky video of the Fluorinert vapor jets on cray.com!). There's also a loop of some sort of refrigerant between the heat exchanger indoors and the cooling tower outdoors.

Re:Cray still has water cooling! (1)

timeOday (582209) | about 9 years ago | (#12154017)

How about distilled water?

Re:Cray still has water cooling! (2, Informative)

Anonymous Coward | about 9 years ago | (#12154054)

The problem is it ends up getting contaminated and starts to become a rather better conductor.

Not really (0)

Anonymous Coward | about 9 years ago | (#12153950)

Cray has always used water cooling for their big iron

Actually, not technically true. They don't use "water"; the older ones, anyway, used a liquid called Fluorinert, made by 3M, to do their cooling. In this way the components could be submerged in the fluid (which you obviously can't do with water, since it's conductive). You could also get T3E configs that were air-cooled (though you could get more processors in the liquid-cooled versions).

Re:Not really (1)

javamann (410973) | about 9 years ago | (#12154066)

Actually, pure water is not conductive. Measuring the resistance is one way to tell how pure your water is. (Wafer fab background, sorry.)

Both (3, Interesting)

turtled (845180) | about 9 years ago | (#12153769)

I agree, both solutions would help. Our room is a nice cool 62.5°F. Best conditions to work in!

Cooler rooms also keep others out... we get a lot of "it's so cold," and they leave. That's golden =)

Re:Both (1)

green pizza (159161) | about 9 years ago | (#12153893)

I agree, both solutions would help. Our room is a nice cool 62.5°F. Best conditions to work in!

We keep ours at 73 degrees, about 2 degrees warmer than the rest of the building. We did the 60 degree thing for awhile, but it required quite a bit more electricity to maintain that temp. The servers work fine at 80 degrees, but 73 is more comfortable and provides a little more cushion.

Maybe a mix? (1)

burgeswe (873550) | about 9 years ago | (#12153774)

Right now I'm just starting up the IT department where I work, and I've noticed a substantial difference in merely moving the servers from one room to another. Of course, my new IT room is the old boiler overflow room, and I've heard it rumored that it used to fill up with steam... :S

Re:Maybe a mix? (1)

TykeClone (668449) | about 9 years ago | (#12153962)

I just moved my servers from a room near the boilers to a new server room with a dedicated air conditioner.

In the old room, I couldn't keep the servers cool in winter when the boilers were on. Summer wasn't so bad because I could leech off of the main air conditioners to get some extra cooling. I did have a small window unit air conditioner dumping heat from the small server room into a larger room, but it just couldn't keep up.

I guess that what I'm saying is "Good luck with that" :)

What about cold countries (1)

rescendent (870007) | about 9 years ago | (#12153779)

Does this mean Alaska, northern Canada, Greenland and Antarctica are soon going to be popular server centres? Saves on air con...

Re:What about cold countries (1)

Ironsides (739422) | about 9 years ago | (#12153877)

Does this mean Alaska, northern Canada, Greenland and Antarctica are soon going to be popular server centres? Saves on air con...

[sarcasm]But that would promote Global warming, the melting of the Polar Icecaps, the Greenland Glacier and other glaciers! [/sarcasm]

Seriously, I can honestly see someone arguing that point.

Re:What about cold countries (1)

rescendent (870007) | about 9 years ago | (#12153953)

I suppose you could have them water-cooled as well and heat your installation's water... Then you're saving on your heating bills.

Save the globe: don't slashdot ! (3, Funny)

AtariAmarok (451306) | about 9 years ago | (#12153999)

"[sarcasm]But that would promote Global warming, the melting of the Polar Icecaps, the Greenland Glacier and other glaciers! [/sarcasm] Seriously, I can honestly see someone arguing that point."

"Hey! Did you know that when you slashdotted that server near the Ross Ice Shelf, you caused 2 icebergs to calve? You insensitve clod!!!!"

got them all beat (0)

Anonymous Coward | about 9 years ago | (#12153780)

And who will be the first vendor to bring water back into the data center?

Hah! We've got them all beat. We keep a water cooler in the server room.

If you have cooler servers... (5, Insightful)

havaloc (50551) | about 9 years ago | (#12153788)

...you won't need as much cooling in the room. Easy enough. This will save a ton of money in the long run, not to mention the environment and all that.

Outside air? (1)

jargoone (166102) | about 9 years ago | (#12153789)

Maybe my ignorance is showing here, but does any installation use outside air for cooling? It seems that it would make sense in places that have cold winters (like here in the midwest).

Re:Outside air? (3, Insightful)

green pizza (159161) | about 9 years ago | (#12153854)

Maybe my ignorance is showing here, but does any installation use outside air for cooling? It seems that it would make sense in places that have cold winters (like here in the midwest).

You'd need a lot of filtering and/or humidity control to make that a realistic option. Better yet, make use of the outside air temperature indirectly, which is exactly what your heat pump loop or your AC cooling tower is for.

Re:Outside air? (2, Interesting)

Anonymous Coward | about 9 years ago | (#12153932)

Especially in the colder areas of the world it'd be criminal to waste the heat by pumping it outside (up to a certain point, anyway)

You could be heating buildings or a greenhouse with it, after all. Or making steam to pipe heat. Maybe even turning generators? Not sure what the step-down of the efficiency of it all is.

Apparently A/C is only 1/3rd efficient... but as you're going to be losing that anyway, might just look at the output heat.

Re:Outside air? (1)

iso (87585) | about 9 years ago | (#12153872)

I'm no expert, but I would assume that it's probably too unpredictable, and too humid

Re:Outside air? (3, Interesting)

Anonymous Coward | about 9 years ago | (#12153875)

We used to do exactly this in an ops building belonging to the company I worked for in 1997. The server room was cut off from the rest of the heating system in the building, with piped cold air from outside.

It took 8 months until the first servers started dying from the intense corrosion & pitting on equipment closest to the air outlets. We were bringing in air that, while ice cold, was unfiltered and brought in pollution from 2 storeys above street level, and I dare say more moisture than the air-conditioned, recycled, worker-breathable air.

Filters fixed the problem.

Re:Outside air? (1)

milgr (726027) | about 9 years ago | (#12154024)

Last year, the AC died for one of my company's smaller labs. We connected some portable AC units, set up fans, removed the glass from a large window on this 10F day, and shut down about 1/2 the servers. This kept the lab temperature at about 95F. It was only a short-term solution, though.

Liquid Oxygen Anyone? (2, Funny)

dink353 (747249) | about 9 years ago | (#12153798)

That might keep the odd CPU or two cool for a while...

Re:Liquid Oxygen Anyone? (1)

P-Nuts (592605) | about 9 years ago | (#12153887)

I realise this was a joke, as obviously liquid oxygen is going to be a fire hazard. But you can't go pouring liquid nitrogen on it either, as you'll have problems with frost forming from moisture in the atmosphere.

inevitability breeds contempt (5, Interesting)

icebrrrg (123867) | about 9 years ago | (#12153800)

"Roger Schmidt, chief thermodynamics engineer at IBM, [recently] admitted that, while everyone knows servers are one day going to be water-cooled, no one wants to be first, believing that if their competitors still claim they are fine with air cooling, the guy who goes to water cooling will rapidly drop back in sales until others admit it is necessary."

You know, sometimes the market actually rewards innovation. Tough to believe, I know, and this isn't innovation, it's common sense, but manufacturers are afraid of this? Come on, people, the technocenti have been doing this for their home servers for a long, long time; let's bring it into the corporate world.

Re:inevitability breeds contempt (1)

trentblase (717954) | about 9 years ago | (#12153930)

Manufacturers are worried about raising the price point. Water cooling is actually pretty pricy, so you have to measure the (potential) overclocking gain against the cost of throwing another blade in there. If you really need raw serial performance, as another poster pointed out, Cray has always used liquid cooling.

Re:inevitability breeds contempt (0)

Anonymous Coward | about 9 years ago | (#12153947)

You know, sometimes the market actually rewards innovation. Tough to believe, I know, and this isn't innovation, it's common sense, but manufacturers are afraid of this? Come on, people, the technocenti have been doing this for their home servers for a long, long time; let's bring it into the corporate world.

IBM's mainframes have been water-cooled since the 70s. After all, they did invent the technique!

Re:inevitability breeds contempt (1)

scsirob (246572) | about 9 years ago | (#12153971)

.. Care to explain how going back to a cooling system from 30 years ago qualifies as innovation?!?

Seriously, with the greenhouse effect and everything, I think preventing heat from being generated pays off twice: cut down on the energy heating the servers up, and you can also cut down on the cooling needed for the remaining heat.

So everyone is playing chicken? (1)

jcuffe (873322) | about 9 years ago | (#12153802)

while everyone knows servers are one day going to be water-cooled, no one wants to be first, believing that if their competitors still claim they are fine with air cooling, the guy who goes to water cooling will rapidly drop back in sales until others admit it is necessary

So everybody knows that it's necessary, but they're just waiting for the other guy to do it first so that they don't have to take any risks? Sounds familiar to me.

No brainer for me... (2, Insightful)

Eil (82413) | about 9 years ago | (#12153806)


Ideally, you should have a cool server and a cool room. The two work in combination. If you have a hot room, then the server isn't going to be able to cool itself very well even with the best heatsinks and well-placed fans. Yes, you could use water cooling, but there are other important bits inside of a machine besides whatever the water touches. But a cool room does no good if your servers aren't set up with proper cooling themselves.

Re:No brainer for me... (0)

Anonymous Coward | about 9 years ago | (#12153965)

"If you have a hot room, then the server isn't going to be able to cool itself very well even with the best heatsinks and well-placed fans."

Tell me about it. We have a 64-node cluster here at the university and after moving into the new building, the physical plant decided that 95 degrees F was alright because the computers could cool themselves....

Fast-forward... we now have a 13-node cluster, a bunch of melted motherboards, and a physical plant that says it wasn't their fault...

Already done. (1)

nickco3 (220146) | about 9 years ago | (#12153814)

There is already water in the datacentre where I work. The site is a converted leisure centre, and has a water-sprinkler fire system. The first whiff of smoke in that place and the entire server room is toast. A Halon system is regarded as too expensive. Seriously.

Re:Already done. (1)

IckySplat (218140) | about 9 years ago | (#12153910)

Actually, most new data centres are using water.
Halon and its replacements are actually bad for the hardware.
These days data centres use de-mineralised water.
Safer for the gear (once you've dried it out).

First thing.. (3, Insightful)

Anti Frozt (655515) | about 9 years ago | (#12153816)

That comes to mind is that it will probably be vastly cheaper to cool a rackmount specifically than to lower the ambient temperature of an entire room to the point that it has the same effect. However, I'm not entirely sure how well this scales to large server farms and multiple rackmounts.

I think the best option would be to look at having the hardware produce less heat in the first place. This would definitely simplify the rumbling these engineers are engaged in.

Cooler servers, definitely (1)

AtariAmarok (451306) | about 9 years ago | (#12153835)

Cooler servers, definitely. If you have cooler rooms, people from all over cubicledom will gravitate to the room during the dog days of August, followed by the inevitable "what does this knob do?" and secretive "what happens if I start switching around these funny phone cables in the back of the black box?".

The fewer rubes lounging by the server towers, the better.

Water cooling, pah! (5, Funny)

hazee (728152) | about 9 years ago | (#12153840)

Water cooling? Pah! Why not take a leaf out of Seymour Cray's book - build a sodding great swimming pool, fill it with non-conductive freon, then just lob the whole computer in.

Also has the added benefit that you can see at a glance which processors are working the hardest by looking to see which are producing the most bubbles.

Wonder if you could introduce fish into the tank and make a feature of it? If you could find any freon-breathing fish, that is...

Re:Water cooling, pah! (1)

LiENUS (207736) | about 9 years ago | (#12153984)

Isn't freon liquid at room temperature? Your swimming pool wouldn't last too long; you'd either have to pressurize the system (not good for equipment) or refill it every couple of minutes.

Re:Water cooling, pah! (1)

hazee (728152) | about 9 years ago | (#12154072)

Isn't freon liquid at room temperature?

Err... that's the point. Wouldn't be much of a swimming pool otherwise, would it?

Besides, without liquid in the tank, you'd have to get hold of levitating fish, which I suspect would be even harder than finding simple freon-breathing ones.

seems to me... (2, Interesting)

justkarl (775856) | about 9 years ago | (#12153846)

that you should stop the problem where it starts. Cool the servers, then the room won't get hot (duh).

Re:seems to me... (1)

bill_mcgonigle (4333) | about 9 years ago | (#12153992)

that you should stop the problem where it starts. Cool the servers, then the room won't get hot (duh).

Easy to say, but are you willing to give up fast servers for cool servers? We don't have the technology to make fast and cool microchips.

Swiftech (1)

eander315 (448340) | about 9 years ago | (#12153863)

Swiftech is my guess as the first who will offer widespread, professional 1U rack-mount water-cooling solutions, using the Laing DDC pump rebadged as the MCP350 [swiftnets.com]. I don't think any of the other big players in that industry currently have the products or expertise to pull it off in the near future.

Re:Swiftech (1)

GKevK (519962) | about 9 years ago | (#12153961)

It would be interesting if there were just a standard developed for a water cooling connection... just like power and network. The machine room would get more comfortable for humans both from a temperature and sound perspective.

Re:Swiftech (1)

green pizza (159161) | about 9 years ago | (#12153993)

Swiftech is my guess as the first who will offer widespread, professional 1U rack-mount water-cooling solutions, using the Laing DDC pump rebadged as the MCP350. I don't think any of the other big players in that industry currently have the products or expertise to pull it off in the near future.

Yeah, I bet Dell or HP can't figure out how to do that.

*rolls eyes*

Actually, given some of the recent decisions made by HP, they probably couldn't!

No way, not in my shop (4, Funny)

doc_traig (453913) | about 9 years ago | (#12153871)


The sign on the door clearly states, "No Food or Drink". Of course, shirts are still optional.

Re:No way, not in my shop (0)

Anonymous Coward | about 9 years ago | (#12153986)

The sign on the door clearly states, "No Food or Drink". Of course, shirts are still optional.

See, we don't need cooler servers or cooler rooms, we just need cooler admins.

Many processors for cooling (1)

kabbor (856635) | about 9 years ago | (#12153880)

The first thing I thought was: yes, half the heat per processor x 4 chips = double the heat, right?
Then I recalled that the reason for those blindingly fast chips is so that they can get through their present job and react in a timely manner to the next task.
If you've got another chip or two waiting, then you can take your time, can't you!
(I'm also sure that power usage could be shown to grow much faster than linearly with clock speed, so we would make gains there.)
Hey, and who needs water? There are a few of those non-wetting liquids that we could use. Flow them around the die itself for maximum heat transfer!
Lots of fun.
(Until $RANDOM decides a glycol antifreeze would make a better choice! Then EVEN MORE fun!)
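The gain the parent is gesturing at can be sketched with the usual dynamic-power rule of thumb, P ≈ C·V²·f, with supply voltage scaled roughly in proportion to clock frequency; the numbers below are purely illustrative assumptions, not measurements of any real chip.

```python
# Illustrative dynamic-power scaling: P ~ C * V^2 * f, assuming supply voltage
# is scaled roughly in proportion to clock frequency. Constants are made up;
# only the relative comparison between configurations matters.

def relative_power(freq_ratio):
    """Power relative to a baseline chip, with voltage scaling alongside frequency."""
    voltage_ratio = freq_ratio
    return (voltage_ratio ** 2) * freq_ratio    # roughly cubic in frequency

one_fast_chip = relative_power(1.0)
two_slow_chips = 2 * relative_power(0.6)   # 2 x 60% clock = 20% more total throughput

print(f"One full-speed chip:    {one_fast_chip:.2f} units of power")
print(f"Two chips at 60% clock: {two_slow_chips:.2f} units of power")
# ~0.43 units vs 1.00: more aggregate throughput for well under half the heat,
# provided the workload actually spreads across the extra chips.
```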

No one... (1)

AndyCap (97274) | about 9 years ago | (#12153885)

Since Rittal is already there [industrysearch.com.au].
PDF flyer: here [enclosureinfo.com]

Does look like a neat way to keep your Beowulf cool. :-P

Cooler servers... (5, Insightful)

Aphrika (756248) | about 9 years ago | (#12153904)

From experience of aircon failing/breaking.

At least if a server fails, it's one unit that'll either get hot or shut down, which means a high degree of business continuity.

When your aircon goes down, you're in a whole world of hurt. Ours went in a power cut, yet the servers stayed on because of the UPSes - hence the room temperature rose and the alarms went off. Nothing damaged, but it made us realise that it's important to have both, otherwise your redundancy and failover plans expand into the world of facilities and operations, rather than staying within the IT realm.

More to the point... (0)

Anonymous Coward | about 9 years ago | (#12153917)

..when will vendors STOP taking the water ?

My anecdotal data using HD temps: (1)

ender- (42944) | about 9 years ago | (#12153938)

Let's see here. I have a server [Duron 1200] in a datacenter that's kept fairly cool. [Not as cool as it should be, it's a shitty hosting company, but I can't complain because I'm not paying for it]. The HD temperatures in that server run around 54C.

I have another server [P4 1.8GHz] sitting in a spare bedroom of my house. The temperature in that room is usually about 80F (27C) even with the AC going. It gets even warmer if I'm in there gaming on my desktop machine during the summer. The HD temps in that server are around 33C.

The difference between the two cases is the fans. In my home server there are 4 fans blowing directly over the HDs using air coming in from the front, as well as an exhaust fan directly over the CPU, an exhaust fan in the back, plus the PS fan. In the datacenter server, there's just the PS fan [and MAYBE an exhaust fan in the back, I don't remember].

So in my experience, the cooling in the server itself is more important than the ambient temp. All the cool ambient air in the world isn't going to help if the server case becomes a small insulated pocket of hot air. With that said, it's important to give both as much cooling capability as possible.

Ender-
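A minimal way to see why the parent's in-case airflow dominates is to treat each stage as a thermal resistance (degrees of temperature rise per watt dissipated); the resistance and power figures below are invented purely for illustration.

```python
# Toy thermal model: component temperature = ambient + power * total thermal resistance.
# All resistance and power values are invented for illustration; better case airflow
# shows up as a lower case-to-ambient resistance.

def component_temp_c(ambient_c, power_w, r_case_c_per_w, r_sink_c_per_w=0.5):
    """Steady-state component temperature for a given quality of case airflow."""
    return ambient_c + power_w * (r_case_c_per_w + r_sink_c_per_w)

DRIVE_POWER_W = 15.0  # assumed hard-drive dissipation

# Cool datacenter room, but a poorly ventilated case (high case resistance):
print(component_temp_c(ambient_c=18, power_w=DRIVE_POWER_W, r_case_c_per_w=2.0))  # ~55.5 C

# Warm spare bedroom, but lots of fans pushing air through the case:
print(component_temp_c(ambient_c=27, power_w=DRIVE_POWER_W, r_case_c_per_w=0.2))  # ~37.5 C
```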

A/C and commercial solution worked well (1)

matth1jd (823437) | about 9 years ago | (#12153964)

At the university I worked at during my undergrad, our servers were cooled not only by 2 large A/C units in the room, but also by large fans on the rear of the racks whose exhaust went directly out of the room.

IIRC this was a solution from APC. Altogether it effectively kept the room at right around 55 - 60 degrees.

People used to wonder why I'd go to work in jeans and long sleeved shirt in the summer.

It's not a colder room, it's air circulation (2, Interesting)

pcguru19 (33878) | about 9 years ago | (#12153976)

The costs of improving data centers to provide more or colder air go beyond just building out more square feet of data center space.

Just because HP sells a 42U rack doesn't mean you have to cram blades into all 42U. It's cheaper to spread the heat load across a larger area than to figure out how to put 1500 CFM out of a floor tile so the top of a blade rack gets air.

There are studies by the Uptime Institute that say 50% of hardware failures happen in the top 25% of rack space, because the top of the rack doesn't get any air from the floor tiles and instead recycles exhaust from the rack or ambient air for cooling.

We just put in the latest blade rack from HP. Four 50-amp circuits (2 for redundancy) for a 4-square-foot space is beyond silly. That's more electrical service and consumption than a 1500-square-foot home, even after you eliminate the two circuits for redundancy.
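The parent's comparison is easy to put in rough numbers. A quick sketch, assuming 208 V single-phase branch circuits and the usual 80% continuous-load derating (both assumptions; the post does not give voltages):

```python
# Rough check of the blade-rack power claim: four 50 A circuits, two of them redundant.
# Circuit voltage and derating are assumptions, not figures from the post.

VOLTS = 208      # assumed single-phase branch-circuit voltage
AMPS = 50
DERATE = 0.8     # breakers are typically loaded to at most 80% continuously

per_circuit_kw = VOLTS * AMPS * DERATE / 1000
rack_kw = 2 * per_circuit_kw   # two circuits carry the load, two are for redundancy

print(f"Usable power per circuit: {per_circuit_kw:.1f} kW")
print(f"Rack power budget:        {rack_kw:.1f} kW in roughly 4 square feet")
# ~16-17 kW of draw, all of which comes back out as heat the room has to remove.
```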

direct connect (0)

Anonymous Coward | about 9 years ago | (#12153982)

I wonder if the industry will come out with a standard direct AC-to-case connection anytime soon - seems to me to be more efficient than cooling the whole freakin' room.

experiments (1)

ArnIIe (814978) | about 9 years ago | (#12153983)

It would be interesting to see some stats (if available) on what happens to the servers' heat when the room temperature drops in incremental stages. Also, I have seen some server farms that have fans inside the cabinets for extra cooling.

Cooler servers! (1)

Brian Stretch (5304) | about 9 years ago | (#12153985)

See the power consumption chart on this page [gamepc.com] . Buy the right CPUs and heat is much less of a problem. (Yes, I know, PowerPC is better in this regard, but if you want to run x86...)

I believe... (1)

pkx (446643) | about 9 years ago | (#12153990)

...the Cisco CRS-1 (known internally as the HFR - Huge F'in Router) uses water cooling.

A Wash (1)

mattmentecky (799199) | about 9 years ago | (#12154000)

I think it is safe to say it is a wash.
If you kept servers cooler, then you wouldn't necessarily need the room to be as cool, so it is just a shift in energy consumption one way.
If you kept the room cooler, the ambient temperature would need to be much cooler to have the same effect as cooling directly on the server, so it is a shift in power consumption the other way.
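The "wash" intuition is really an energy balance: in steady state every watt the servers draw becomes heat, and the cooling plant has to remove it no matter where along the path it is captured; what changes is how efficiently that removal happens. A minimal sketch, with assumed coefficients of performance (COP, heat removed per watt of cooling electricity):

```python
# Steady-state energy balance: all IT power ends up as heat the cooling must remove.
# The COP values are assumed, illustrative figures.

def facility_power_kw(it_kw, cop):
    """Total electrical draw: the servers plus the cooling needed to remove their heat."""
    return it_kw + it_kw / cop

IT_LOAD_KW = 20.0

# Room-only cooling: air must be chilled well below the target, so effective COP is lower.
print(f"Room-only cooling:     {facility_power_kw(IT_LOAD_KW, cop=2.5):.1f} kW total")

# Cooling applied close to the heat source: less over-chilled air, higher effective COP.
print(f"At-the-server cooling: {facility_power_kw(IT_LOAD_KW, cop=4.0):.1f} kW total")

# The heat to be removed is identical either way; only the efficiency of removal shifts.
```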

Energy efficiency and Hosting- Host NORTH ! (2, Interesting)

cbelt3 (741637) | about 9 years ago | (#12154005)

OK, here's a concept.
If data center location isn't such a problem as long as we have high speed data lines, locate the data center someplace nice and cold.
Like Manitoba, or Minnesota, or Alaska, or Siberia. Heat the work area with the flow from the data center.
Hire your local population to maintain the data center.
Profit !

Heat the great white north (1)

AtariAmarok (451306) | about 9 years ago | (#12154036)

"Like Manitoba, or Minnesota, or Alaska, or Siberia. Heat the work area with the flow from the data center. Hire your local population to maintain the data center" P? You really like Innuit women in bikini's, don't you?

Our server room: (1)

raynet11 (844558) | about 9 years ago | (#12154029)

Believe it or not, we use an air conditioner for the room just as you have a central air unit on the outside of your house. I have maybe 40 PCs and servers in the room, and the air conditioning unit sits just outside the room. It's been running with no problems for 7 years now. The few times it did go down (blown breakers), it's amazing how quickly the room heats up and how hot it really gets without the air.

The setup was cheap and it's a low TCO if you figure that the air conditioner is very easy to maintain and run.

Cooling is not an option (2, Informative)

onyxruby (118189) | about 9 years ago | (#12154033)

Water-cooled servers have been available for a little while from some vendors [directron.com]. You can find rack-mount water-cooled [kooltronic.com] gear pretty easily. Too much damage is done too quickly when you don't have cooling. I have worked in environments where if a server room was allowed to get up to 74 F / 23.3 C and an HVAC contractor wasn't already on the way, there would be hell to pay.

There really isn't a question of whether it will become widespread. Overclocking sites have had more than a few visits from Intel and AMD over the years. It's an inevitable problem with an inevitable solution. The only question is how long until water cooling becomes more popular. Heat needs have had people clamoring for Pentium M processors in rack-mount gear for a while as well. It's a reasonably speedy CPU that handles heat fairly well. It would work very nicely in rack-mount gear, but motherboards that will take one are fairly rare.

As for server rooms, they will continue to be air-conditioned for as long as all of your server room equipment is in there. Even if you found a magical solution for servers, you still have RAID arrays, switches, routers and the like all in the same room. Server rooms are well known by HVAC people as requiring cooling. Most HVAC vendors will prioritize a failed server room HVAC over anything but medical. They know damn well that anybody who has an air conditioner designed to work in the middle of January in Minnesota or North Dakota isn't using the cooling for comfort.

water already there (0)

Anonymous Coward | about 9 years ago | (#12154035)

Last I checked, Crays were water-cooled out of the box.

So what is the best temp for the room (1)

arctuniol (592174) | about 9 years ago | (#12154040)

Currently my server room is sitting around 67-70 degrees. I am curious what the best temp for a server room should be.

The first vendor (1)

nigham (792777) | about 9 years ago | (#12154043)

... will probably be supplying Intel chips. Strangely, Intel's had far more heat-dissipation issues than AMD in the recent past; probably because they concentrated a bit too much on clock speed?

freee cooooling (1)

dingDaShan (818817) | about 9 years ago | (#12154045)

Other than some high pings, there is no reason not to just put every data center in the Arctic Circle. The cooling would be free, and then we wouldn't have to argue about what costs less.

More efficient (ie less heat output) servers (1)

GAATTC (870216) | about 9 years ago | (#12154060)

Isn't cooling the room vs. cooling the server a bit like trying to solve our energy problems by drilling for oil in Alaska vs. spending $400 billion a year securing our oil interests in the Middle East? Neither one is a good long-term solution, while reducing our consumption helps the whole problem go away. Perhaps the same argument applies to servers. If they are more efficient and throw out less heat, then it is much easier to deal with any heat that is produced.

cooler servers... (0)

Anonymous Coward | about 9 years ago | (#12154061)

Cooler servers are better. You reduce the power usage from having to cool the room and the servers themselves; this saves your company money. It is also better for the environment, since you need less electrical generation (less coal burning, etc.) as well as less need for whatever nasties may be involved in modern HVAC (freon, etc.).