
Hot Aisle Or Cold Aisle For Containment?

kdawson posted more than 4 years ago | from the contain-this dept.


1sockchuck writes "Separating the hot and cold air in a data center is one of the keys to improving energy efficiency. But containment systems don't have to be fancy or expensive, as Google showed in a presentation Thursday, which discussed the use of clear vinyl curtains in isolating hot and cold aisles. Containment systems have been in use at least since 2004, but there's an ongoing debate about whether it is best to contain the hot aisle or cold aisle. Leading vendors are split as well, as APC advances hot aisle containment while Emerson/Liebert champions a cold aisle approach. What say Slashdot readers? Do you use containment in your data center? If so, do you contain the hot aisle or cold aisle?"


FIRST (-1, Offtopic)

Anonymous Coward | more than 4 years ago | (#32059146)

FIRST

Re:FIRST (5, Funny)

Anonymous Coward | more than 4 years ago | (#32059444)

Why is this modded offtopic? OP was clearly expressing his support for hot aisles.

The fridge is in another room (0)

Anonymous Coward | more than 4 years ago | (#32059154)

Does that count as cold aisle containment?

Supermarkets (5, Funny)

shogun (657) | more than 4 years ago | (#32059156)

I thought this article was about supermarkets, where it might be a good idea too...

Re:Supermarkets (1)

arielCo (995647) | more than 4 years ago | (#32059316)

I knew Google was keen to expand into new markets, but Oracle?

Re:Supermarkets (2, Informative)

burne (686114) | more than 4 years ago | (#32059422)

Dutch supermarkets are doing that. Test your Dutch: original [amsterdam.nl] or try Google Translate: translated [google.com].

Re:Supermarkets (1)

Swave An deBwoner (907414) | more than 4 years ago | (#32059954)

Well the Dutch put prostitutes behind glass doors first, so I guess you could say that they have chosen now to do both hot and cold aisle containment. Why choose one or the other when you can have both!

Definitely COLD (0)

Anonymous Coward | more than 4 years ago | (#32060110)

it makes for hard nipples on your staff

Re:Supermarkets (2, Funny)

jtownatpunk.net (245670) | more than 4 years ago | (#32060936)

I'm disappointed that I don't see a single Tesla vs. Edison reference. :(

Amana (2, Funny)

ushering05401 (1086795) | more than 4 years ago | (#32059206)

Re:Amana (1)

X0563511 (793323) | more than 4 years ago | (#32059282)

Then you don't have a datacenter, which is what this is in regard to (not server "rooms").

Re:Amana (1)

ushering05401 (1086795) | more than 4 years ago | (#32059392)

I assure you the first post was a joke (even though that is roughly the model AC I use). If serious I would have chimed in about my containment strategy like so:

My rack room vents straight up through an insulated hard lid ceiling. Back-pressure is prevented with two fans that blow the warm air from above the lid, through a full height wall, and into a non-climate controlled warehouse...

Can you try both methods? (2, Interesting)

e9th (652576) | more than 4 years ago | (#32059210)

Depending on how your facility is ducted, it might not cost much to try both options and measure the results. Even if you have to spend a few thousand doing so, the long term savings from choosing the best method for your site would probably be well worth the cost of testing.
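
One way such a trial could be scored is by comparing PUE (total facility energy divided by IT energy) for each configuration. A minimal Python sketch; the kWh figures are made-up placeholders standing in for the metered numbers:

# Hypothetical comparison of two containment trials using PUE
# (Power Usage Effectiveness = total facility energy / IT equipment energy).
# The kWh figures below are placeholders, not measurements.
trials = {
    "hot_aisle_containment":  {"facility_kwh": 182_000, "it_kwh": 140_000},
    "cold_aisle_containment": {"facility_kwh": 176_000, "it_kwh": 140_000},
}

for name, t in trials.items():
    pue = t["facility_kwh"] / t["it_kwh"]
    overhead_kwh = t["facility_kwh"] - t["it_kwh"]
    print(f"{name}: PUE = {pue:.2f}, cooling/overhead = {overhead_kwh} kWh")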

Re:Can you try both methods? (5, Interesting)

twisteddk (201366) | more than 4 years ago | (#32059420)

Well, as most companies that have to build a new datacenter will tell you, it's cheaper to generate heat than cold. So I'd go for cold containment. Generally speaking, most companies do aim to put their new datacenters as close to the North Pole as possible, simply because it's cheaper to use outside air that's naturally cold. That puts countries like Canada, Greenland, Denmark, Norway, Sweden and Finland in high demand for datacenters (end technicians to staff them). If the US didn't have ridiculous data laws, Alaska might also be ideal.

In our new datacenter we're even using the excess heat from the servers to heat the offices on top of the giant basements below. This sort of setup is ideal when outside temperatures generally stay below the normal cooling needs of a server (or several). But in any event there's still a huge bill to pay for moving the air back and forth, so containment is definitely still an issue, as is the size of the pipes when you have say... 10MW of electricity going into your servers and quite a lot of that energy coming back out as heat ;)

Re:Can you try both methods? (1)

e9th (652576) | more than 4 years ago | (#32059476)

I agree with you. But it's not clear whether the submitter is talking about constructing a new data center or adding containment to an old one.

Re:Can you try both methods? (0)

Anonymous Coward | more than 4 years ago | (#32059876)

That puts countries like Canada, Greenland, Denmark, Norway, Sweden and Finland

You forgot Iceland, where there are already interesting datacenter installations. As a bonus, the geothermal energy provides a capability for those fancy "zero carbon" datacenters. The northern parts of Russia would also be very interesting if Russia could open up some areas around its strategic military assets (the answer obviously being no) and provide some infrastructure for the more distant areas. Of course, the distances are enormous, so even the speed of light is an obstacle for some applications, unless Beijing becomes the new banking mecca after the formation of the New Democratic Union of The Combined China, for example. Though Iceland is closer to the City of London, the network delays have made some applications impossible for now.
Although the discussion here is mostly about air, liquids are likely to be the more critical resource for the datacenter cooling of tomorrow (no wonder Google converted a former pulp factory by a lake in Scandinavia some time ago). Nuclear power facilities could point the way here.

Re:Can you try both methods? (5, Insightful)

countertrolling (1585477) | more than 4 years ago | (#32060192)

You forgot Iceland...

Yeah well, they're having a little trouble with containment right now themselves. And it appears geothermal isn't as clean as it was made out to be.

Re:Can you try both methods? (3, Informative)

Thundersnatch (671481) | more than 4 years ago | (#32059900)

That puts countries like Canada, Greenland, Denmark, Norway, Sweden and Finland in high demand for datacenters (end technicians to staff them). If the US didn't have ridiculous data laws, Alaska might also be ideal.

Some datacenters perhaps, ones that don't need good Internet connectivity. But the latency between major populations and the far North makes those locations less desirable. We struggle with latency between Chicago and Dallas for some applications; Chicago-to-Fairbanks would be quite a bit more painful.
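
For scale, a back-of-the-envelope latency sketch in Python; the coordinates are approximate, and the 1.4x fiber route factor and the ~200,000 km/s propagation speed in fiber are assumptions:

import math

def great_circle_km(lat1, lon1, lat2, lon2):
    """Haversine great-circle distance in kilometres."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

chicago = (41.88, -87.63)
fairbanks = (64.84, -147.72)

km = great_circle_km(*chicago, *fairbanks) * 1.4   # assumed route is ~1.4x great circle
rtt_ms = 2 * km / 200_000 * 1000                   # light in fiber ~200,000 km/s
print(f"Fiber path ~{km:.0f} km, best-case RTT ~{rtt_ms:.0f} ms before any routing delay")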

Re:Can you try both methods? (3, Interesting)

starfishsystems (834319) | more than 4 years ago | (#32060558)

You can build a switched network to connect the remote data center to the point of presence where you want it to join the backbone.

Though this does nothing to mitigate time-of-flight latency, it nicely eliminates the latency and jitter issues due to routing. It's what we did at Westgrid to connect our computing clusters to storage facilities many hundreds of kilometers away.

Re:Can you try both methods? (3, Funny)

ZorinLynx (31751) | more than 4 years ago | (#32060016)

>10MW of electricity going into your servers and quite a lot of that energy coming back out as heat ;)

All of it. The laws of thermodynamics are clear.

Sorry, I can't help being a smartass sometimes. ;)

Re:Can you try both methods? (1)

Interoperable (1651953) | more than 4 years ago | (#32060082)

Well if you want to be strictly correct, and it seems that you do, some of it will be converted to acoustic noise and escape through the walls, or be transmitted out through the wires or end up changing the magnetic potential energy in hard disk platters.

Re:Can you try both methods? (3, Funny)

loshwomp (468955) | more than 4 years ago | (#32060164)

Sweden and Finland in high demand for datacenters (end technicians to staff them).

Why would you want to staff your datacenters with proctologists?

Re:Can you try both methods? (0)

Anonymous Coward | more than 4 years ago | (#32060950)

HAHAHA GOOD ONE

wait, nevermind. Shut the fuck up.

Re:Can you try both methods? (1)

uvajed_ekil (914487) | more than 4 years ago | (#32060396)

Well, as most companies that have to build a new datacenter will tell you, it's cheaper to generate heat than cold. So I'd go for cold containment.

I agree - contain the cold to keep it cold (keeping heat out of that area, really). Cooling works not by "adding cold" but by removing heat, and the retired HVAC engineer I know (heating and cooling, not hack/virus/anarchy/carding) insists that cooling always requires more effort and care than heating. He did a lot of work for big clients that include Anheuser-Busch, Frito Lay, the Detroit Silverdome, etc., so I trust him. Let the hot side disperse heat out wherever else it pleases, assuming you are dealing more with a problem of excess heat than heating human-occupied space.

In a modern datacenter, it seems like a no-brainer to have some sort of a system to recover the considerable waste heat that is generated, especially when the location necessitates heating in the colder months. There is certainly the initial cost to consider, but reduced utility bills can make up for this before long, plus there's the green, carbon, blah blah blah corporate talking points you'll have at your disposal. I love the idea of locating datacenters in places like Finland, Scandinavia, and Canada. Efficiency should be considered in all aspects of good datacenter design, whether you are building from the ground up or looking to update things. Separating hot/cold is a good, cheap, very easy start, and some companies have taken that and really begun to expand upon it.

Re:Can you try both methods? (1)

T-Bone-T (1048702) | more than 4 years ago | (#32060496)

My apartment is backwards. The AC runs almost constantly in the summer and the heater runs about 50% of the time in the winter. In the summer, my bill is consistently around 1350 kWh, but it's almost 1500 kWh in the winter. It seems my heater uses over twice as much electricity per hour it runs.
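
A rough sanity check of that arithmetic, with the duty cycles taken from the post and a guessed baseline for the non-HVAC load:

# Sanity check of the "heater uses over twice as much" claim.
# Assumptions: ~720 hours in a month, AC running ~100% of the time in summer,
# heater ~50% of the time in winter, and a guessed 400 kWh/month of
# non-HVAC household load in both seasons.
hours = 720
other_kwh = 400                      # assumption: lights, fridge, computers, etc.

ac_kwh = 1350 - other_kwh            # ~950 kWh over ~720 run-hours
heat_kwh = 1500 - other_kwh          # ~1100 kWh over ~360 run-hours

ac_kw = ac_kwh / (hours * 1.0)       # average draw while running
heat_kw = heat_kwh / (hours * 0.5)

print(f"AC draw while running:     {ac_kw:.2f} kW")
print(f"Heater draw while running: {heat_kw:.2f} kW")
print(f"Ratio: {heat_kw / ac_kw:.1f}x")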

Both might actually be best (3, Interesting)

Roger W Moore (538166) | more than 4 years ago | (#32060744)

It's cheaper to generate heat than cold. So I'd go for cold containment.

Actually containing both might be best, since then you will have a "room temp" air gap between the two, and air is a fantastic insulator. If you do not contain the hot side, the heat will diffuse and the air on the other side of the vinyl curtain will be warmer than room temp. This will warm your incoming cool air. The effect may not be particularly noticeable, but it would be an interesting test to see whether there is a measurable improvement from doing both.
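
A tiny sketch of the insulation argument, treating two curtains plus a trapped air layer as thermal resistances in series. The thicknesses and conductivities are illustrative guesses, and the gap is assumed to be perfectly still air (optimistic, since convection and surface films dominate in practice):

# Series thermal resistance of two vinyl curtains with a still-air gap,
# compared with a single curtain. All values are rough, illustrative numbers.
k_vinyl = 0.16      # W/(m·K), flexible PVC, approximate
k_air = 0.026       # W/(m·K), still air
t_vinyl = 0.001     # 1 mm curtain
t_gap = 0.05        # 50 mm air gap, assumed non-convecting

r_single = t_vinyl / k_vinyl
r_double = 2 * (t_vinyl / k_vinyl) + t_gap / k_air

print(f"R, single curtain:  {r_single:.4f} m^2·K/W")
print(f"R, double plus gap: {r_double:.4f} m^2·K/W (~{r_double / r_single:.0f}x)")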

Re:Can you try both methods? (2, Insightful)

pla (258480) | more than 4 years ago | (#32060854)

Depending on how your facility is ducted, it might not cost much to try both options and measure the results.

Call me naive, but... Why not do both at once?

Cold air goes in from the bottom (or one side), through the rack, and hot air goes out the top (or the other side). I realize that companies don't really care about such minutiae, but that would allow the mere humans that occasionally need to service all those expensive racks to experience a temperature other than 40F or 120F.

Or, hey, how about just cooling the damned things with intelligently ducted outside air and cutting the electric bill by a third?

Cold (5, Funny)

Waffle Iron (339739) | more than 4 years ago | (#32059216)

What say Slashdot readers? Do you use containment in your data center? If so, do you contain the hot aisle or cold aisle?

I think that I speak for most readers here when I say that it's pretty much all cold aisle down here in my mom's dank basement. Not much containment either, other than some pegboard partitions.

Re:Cold (3, Funny)

Anonymous Coward | more than 4 years ago | (#32059562)

Yeah, it's definitely cold down there. But your mom was hot! The lack of decent containment did spoil the fun somewhat!

I suggest hot aisle containment (3, Interesting)

mysidia (191772) | more than 4 years ago | (#32059222)

Contain and exhaust your heated air, vent it up outside

That way it doesn't mix with the cold air much.

If you just contain your cold air, then the hot air stays in the room, and that heat will be absorbed over a larger surface area by all the things in your server room (including the air handling units).

Re:I suggest hot aisle containment (2, Interesting)

nacturation (646836) | more than 4 years ago | (#32059404)

You can still vent the hot air elsewhere, but the problem with hot-air-only containment is that the entire room is then effectively one large cold aisle, contained within the walls, and the limiting factor is how well insulated the walls are. If that logic holds, it's better to limit the size of the cold aisle, since you can add a lot more really good insulation where appropriate to limit unwanted heat absorption.

Re:I suggest hot aisle containment (2, Insightful)

SuperQ (431) | more than 4 years ago | (#32059426)

I think containing the hot aisle is probably the best way to go as well.

* When I'm working in a datacenter I'd rather be walking around in the cold aisle (~70-80F in a modern datacenter) than the hot aisle (100-120F if properly contained)
* Containing the hot aisle to a small space and using the rest of the air and space around the rack (up to the ceiling, walking aisles, etc.) allows a larger volume of cool air to act as a buffer in case of low/failed cooling capacity.

Re:I suggest hot aisle containment (1)

Glendale2x (210533) | more than 4 years ago | (#32059798)

I think containing the hot aisle is probably the best way to go as well.

* When I'm working in a datacenter I'd rather be walking around in the cold aisle (~70-80F in a modern datacenter) than the hot aisle (100-120F if properly contained)

This is probably diverging a bit on the original question, but seeing your 70-80 "modern datacenter" range reminded me of something I've wondered about lately: Has anyone researched the tradeoff point between when the server cooling fans start spooling up and turning the temperature up to run a "hot" floor? Running fans at a higher RPM certainly translates into more current draw than if they're running at their lowest speed. Sure, the equipment can stand running hotter and you're being "green" by not running the A/C as much, but are you just trading that for extra power wasted on spinning a whole lot of fans faster?
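
One way to frame that tradeoff is with the fan affinity laws, under which fan power scales roughly with the cube of speed. A sketch with invented server counts and fan wattages:

# Fan affinity laws: flow ~ speed, pressure ~ speed^2, power ~ speed^3.
# Extra server-fan power when inlet temps rise and fans spool up.
# Counts and wattages below are assumptions, not measurements.
servers = 1000
fans_per_server = 6
fan_watts_at_full = 12.0             # per fan at 100% speed (assumed)

def fan_power(speed_fraction):
    """Total fan power in watts at a given fraction of full speed."""
    return servers * fans_per_server * fan_watts_at_full * speed_fraction ** 3

low, high = 0.4, 0.7                 # e.g. fans at 40% on a cool floor vs 70% on a hot one
extra_kw = (fan_power(high) - fan_power(low)) / 1000.0
print(f"Extra fan power running hot: {extra_kw:.1f} kW")
# Compare that figure against the chiller energy saved by raising the setpoint.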

Re:I suggest hot aisle containment (1)

RobertM1968 (951074) | more than 4 years ago | (#32060178)

I think containing the hot aisle is probably the best way to go as well.

* When I'm working in a datacenter I'd rather be walking around in the cold aisle (~70-80F in a modern datacenter) than the hot aisle (100-120F if properly contained)

This is probably diverging a bit on the original question, but seeing your 70-80 "modern datacenter" range reminded me of something I've wondered about lately: Has anyone researched the tradeoff point between when the server cooling fans start spooling up and turning the temperature up to run a "hot" floor? Running fans at a higher RPM certainly translates into more current draw than if they're running at their lowest speed. Sure, the equipment can stand running hotter and you're being "green" by not running the A/C as much, but are you just trading that for extra power wasted on spinning a whole lot of fans faster?

That really depends on datacenter design. There are lots of factors that affect it. While my place is not a datacenter, we do run a pretty decent sized stack of servers. The cooling (provided from the wrong side of the room, sadly) needs to be set in the low 60s range to keep the temperature in the server area at about 80 degrees.

We will soon be installing a hot air containment and venting system to help with our cooling. It should help considerably. And during the winter, the hot air will be blown to the other side of the server/office room. It currently keeps the room at about 65 degrees on very cold winter days and the heat disperses decently, but we'd rather have it blowing through a vent section we designed into the front of the space where the entrance door is. That space is also ideally away from the server stack.

Anyway, back to your question, rack containment (by rack design or by datacenter design), stacking order (hotter things on top? or on bottom? all hot things in one rack and the cooler ones in another?), ceiling height, natural air flow characteristics and ability, artificial air flow created by the ceiling(?) or floor (?) cooling ducts, space behind the racks where they vent... well, I could go on and on... but you get the idea... all of those affect what overall room temperature will trigger the units to increase fan speed to cool themselves.

I've been in some datacenters that were massive rooms, where, due to ceiling height, and space behind racks, room temperature could be considerably higher. In other places (that were more like enclosed corridors), when the "room area" was in the 70's, the servers were running considerably hotter and the "hot aisle" area was blazingly hot.

Ours trigger medium speed at about 80 degrees (non-server side) or 85 degrees (server side) and full speed at about 10 degrees hotter. But our servers also are designed to run pretty hot (per IBM's specs).

Re:I suggest hot aisle containment (2, Informative)

Daengbo (523424) | more than 4 years ago | (#32060546)

A recent article on Google's data centers said that they run as close to maximum temperature as possible: if the servers are rated to 90, they only cool to 88. Google is extremely efficient. The article said that the energy overhead for their data centers is only about 20%, while most data centers run 100%. Because of that, I'm sure Google has studied the server fan issue and determined that it's not a significant factor.
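
Read as PUE, "20% overhead" is roughly 1.2 versus 2.0 for "100% overhead". A quick sketch of what that gap means for a hypothetical 1 MW IT load:

# "20% overhead" ~ PUE 1.2; "100% overhead" ~ PUE 2.0.
# The 1 MW IT load is a made-up example.
it_load_kw = 1000.0

for label, pue in [("efficient (Google-like)", 1.2), ("typical", 2.0)]:
    total = it_load_kw * pue
    overhead = total - it_load_kw
    print(f"{label}: total {total:.0f} kW, cooling/power overhead {overhead:.0f} kW")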

Re:I suggest hot aisle containment (1)

Glendale2x (210533) | more than 4 years ago | (#32060858)

The pictures I've seen with Google servers are caseless and don't have many fans [arstechnica.com] other than CPU and probably one in the PSU. I don't think that Google can be compared to the typical rackmount server the rest of us would use.

Re:I suggest hot aisle containment (2, Interesting)

Bigjeff5 (1143585) | more than 4 years ago | (#32059504)

I suggest containing the cold air.

If you contain the hot air you must cool a much larger area, which is very inefficient and makes anybody who must work in the server room less comfortable when compared to allowing waste heat to warm the main areas. More comfortable, less energy wasted cooling the cold aisle, and less energy wasted venting the hot aisle.

A vinyl partition is plenty of separation, and if you want to upgrade, use two vinyl partitions separated by an air gap. That's the same basic setup that the ski resort in Dubai uses, except they use two roofs instead of two vinyl partitions, of course. Air is a fantastic insulator when it is not allowed to mix.

Temperature lost through seepage from solid objects is going to be minimal, at best, unless they are made of large sections of aluminum or copper. Frankly, I've never seen a server room with large panels of aluminum or copper, so I don't see that being an issue.

See why there is a debate?

Re:I suggest hot aisle containment (1)

pla (258480) | more than 4 years ago | (#32060944)

I think you meant that as humor, right? But in case not...


and makes anybody who must work in the server room less comfortable when compared to allowing waste heat to warm the main areas

You work in Siberia, perhaps? The hot side of even a small server room, nevermind a data center, stays well over 100F. Not exactly comfy for most humans.


Temperature lost through seepage from solid objects is going to be minimal, at best, unless they are made of large sections of aluminum or copper.

Or, say, row after row of heavy-gauge steel boxes?


See why there is a debate?

Because most people can't accept that not all geographies have the same exterior conditions, and just want to mindlessly obey The Standard dictated from On High by the gods of HVAC? If you consider the "waste" heat valuable, you may want to contain it for use in the nearby area. If you need to run the AC even in normal office space 24/7/365, you probably don't really care where the heat goes, but wasted cold air amounts to a sin.

Re:I suggest hot aisle containment (1)

Dun Malg (230075) | more than 4 years ago | (#32059828)

Contain and exhaust your heated air, vent it up outside

As others have noted, you are then cooling a much larger space. I don't know that one is significantly better than the other. The cynic in me says that companies that like their employees will have the isolated space be the one that most resembles the predominant environmental extreme (i.e. the warm side isolated in hot climates), and companies that like their employees to suffer do the opposite (i.e. pipe cold air to the backs of the racks, making the room 85°F in July).

This is a problem ... (1)

DragonDru (984185) | more than 4 years ago | (#32059232)

that I would like to have.

Once your server room is to the point where you have hot and cold aisle, just contain one and go for a beer.

Datacenter containment (5, Funny)

thewiz (24994) | more than 4 years ago | (#32059244)

We only resort to using containment when the servers have been very, very naughty. We've found that chains, steel cable and duct tape are the best ways to keep servers in their racks.

Re:Datacenter containment (1)

ushering05401 (1086795) | more than 4 years ago | (#32059296)

Have you tried promising them cake?

Re:Datacenter containment (1)

Bigjeff5 (1143585) | more than 4 years ago | (#32059526)

The cake is a lieeee!!

Re:Datacenter containment (4, Funny)

tepples (727027) | more than 4 years ago | (#32059372)

We've found that chains, steel cable and duct tape are the best ways to keep servers in their racks.

Don't steel cable. It's illegel.

Cold (4, Insightful)

KingDaveRa (620784) | more than 4 years ago | (#32059264)

If we were to retro-fit it at work, I'd say cold aisle. To do so would mean curtains at the end of the aisles, as the under-floor vent grids are in front of the racks. The CRACs are at the end of the room sucking in air through the top, so it'd be cool air pumped up through the floor, into a cold-only zone, sucked through the racks, and blown out the back into the rest of the room where it just swirls about until it's pulled into the CRACs again. I reckon it could be done cheaply and quickly. To do it with the hot aisles would require more containment to get the air back to the CRACs. I think it'd be a case of which air flow it fits best.

Re:Cold (1)

Glendale2x (210533) | more than 4 years ago | (#32059396)

Yeah, what you do is going to depend on how your room was designed. Mine is 100% ducted, so it doesn't matter. Just drop curtains down from the ceiling to the top of the racks and bam, done. Each cold aisle has supply vents and each hot aisle has its own return. I don't know if you'd call that hot or cold aisle containment; I suppose it's containing both.

What you're describing with underfloor cold supply already has hot/cold containment: the raised floor. Simply extending that to enclose the front of the racks is the logical choice.

Re:Cold (1)

KingDaveRa (620784) | more than 4 years ago | (#32059466)

That was pretty much my thought. Our other data centre is a horrible mess of cages, walls, and wall-mounted AC units. It's being slowly closed down, luckily!

Re:Cold (1)

Glendale2x (210533) | more than 4 years ago | (#32059748)

Wall mounted A/C? Icky. I've only seen that in one other place that my local competitor calls their "business class colocation facility". They have the things randomly all over the place. I can't imagine any way to tame that kind of airflow.

Re:Cold (1)

KingDaveRa (620784) | more than 4 years ago | (#32059770)

It started as underfloor, with a great big CRAC. However, it died, the parts were no longer available, and getting a new CRAC in was apparently impossible, so the wall-mounted units went up. We've got 12 of them, and they barely keep up. Many are blowing into the backs of racks. It's a right mess; there are massive hot and cold spots. As I say though, the room is being wound down, so there won't be much left in there soon.

What not to do. (1)

attemptedgoalie (634133) | more than 4 years ago | (#32059270)

What did I find when I joined the sysadmin team at my place?

Putting cold air vents behind the racks doesn't help. Pull cold air through the front to the back? Nope. Chill the exhausted air because it sucks to walk behind the servers. Nice.

Re:What not to do. (0)

Anonymous Coward | more than 4 years ago | (#32059324)

Chill the exhausted air because it sucks to walk behind the servers.

Really? How odd: after a few hours in the data centre I find it's quite nice to go stand behind the SAN rack for a minute or two and warm up...

Re:What not to do. (1)

SuperQ (431) | more than 4 years ago | (#32059438)

That's because you have an improperly designed datacenter. A good modern datacenter should have 70-80F inlet temps for most equipment. The problem is that if you don't do hot aisle containment you have to use super cold air from the chilling equipment to keep that inlet temp requirement near the top of the racks.

Re:What not to do. (1)

UnderCoverPenguin (1001627) | more than 4 years ago | (#32059712)

Your desk is in the server room? At those clients of mine where I have seen their data centers, no one is in the actual server rooms unless hands-on access is needed. Otherwise the rooms are kept closed and dark, with >99% of the sysadmin work done from the cubicle farm. And even when console access is needed, the person is only in the server room long enough to plug in; then she/he works from one of the small desks just outside the room. (Though, I am told, the most important servers are permanently connected to a large matrix KVM switch.)

Re:What not to do. (1)

eosp (885380) | more than 4 years ago | (#32060798)

(Though, I am told, the most important servers are permanently connected to a large matrix KVM switch)

KVM over IP is your friend.

Hot or Cold? (5, Insightful)

siglercm (6059) | more than 4 years ago | (#32059278)

Maybe I'm missing something obvious, but the answer shouldn't be complex. Base the decision to contain either hot or cold aisles on the differences to ambient temperature. If (HotT - AmbientT) > (AmbientT - ColdT), then contain the hot aisles. If it's the other way around, contain the cold aisles. This minimizes the entropy loss due to temperature mixing in the data center, I believe. Just my 2 cents.
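
That rule reduces to a one-line comparison. A minimal sketch with placeholder temperatures:

# siglercm's rule of thumb: contain whichever aisle is farther from ambient.
# Temperatures below are placeholders.
hot_t, ambient_t, cold_t = 38.0, 24.0, 18.0   # degrees C, assumed

if (hot_t - ambient_t) > (ambient_t - cold_t):
    print("Contain the hot aisles")
else:
    print("Contain the cold aisles")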

Re:Hot or Cold? (1)

jo_ham (604554) | more than 4 years ago | (#32059454)

I think it's going too far though, if you start trying to work out Gibbs Free Energy change for your server room.

Re:Hot or Cold? (1)

jbengt (874751) | more than 4 years ago | (#32060002)

I'd like to know more about how he gets that entropy loss, though. He may have hit on the secret to making a perpetual serving machine.

Re:Hot or Cold? (2, Funny)

jo_ham (604554) | more than 4 years ago | (#32060198)

More threads!

More threads means more entropy!

Get a big enough T x deltaS and the server will cool itself!

Re:Hot or Cold? (4, Informative)

SuperQ (431) | more than 4 years ago | (#32059518)

This sorta doesn't work because what you care about in datacenter cooling is maintaining a constant equipment inlet temp. For all practical uses this means your AmbientT and ColdT are the same. What you did get right is that you want the largest delta T in your cooling equipment to provide efficient cooling. No matter what you do with hot or cold "containment" the end goal is to keep the HotT as high as possible when it hits your cooling system.
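
The "keep the delta-T big" point follows from the sensible-heat relation Q = m_dot * cp * dT: for a fixed heat load, the airflow you have to move (and the fan energy behind it) falls as the return air gets hotter. A sketch with assumed numbers:

# Airflow needed to remove a given heat load at different hot/cold deltas.
# Q = m_dot * cp * dT  ->  m_dot = Q / (cp * dT)
# Heat load and temperature deltas are illustrative.
heat_load_kw = 500.0
cp_air = 1.006          # kJ/(kg·K)
rho_air = 1.2           # kg/m^3, near sea level

for delta_t in (8.0, 15.0, 25.0):                   # K: poor vs. good containment
    m_dot = heat_load_kw / (cp_air * delta_t)       # kg/s
    flow_m3s = m_dot / rho_air
    print(f"dT = {delta_t:>4.0f} K -> {flow_m3s:6.1f} m^3/s of air")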

Re:Hot or Cold? (1)

Bigjeff5 (1143585) | more than 4 years ago | (#32059570)

You don't care about raw temperature transfer; you care about the energy used to either cool or heat.

Is it cheaper to cool? Or is it cheaper to heat?

Can you save money and energy by containing the cool area and allowing the hot aisle to heat the rest of the room? Or is too much heat your problem, and you'd be cooling the whole space anyway, so cool the whole datacenter and contain the heat?

Honestly, I think the best solution for almost all situations would be to contain both hot and cold aisles. Chances are you're always going to need at least a little AC no matter where your datacenter is (I may be one of the few exceptions), so it doesn't make sense to let the hot aisle run rampant and only contain the cold aisle. As far as the cold aisle goes, that's always going to need cooling, so you'll definitely want to contain it.

If you're putting in a new containment system, why not duct it twice? Then if you need to warm the common areas you can just vent a little from your hot aisle to regulate the temperature, without letting it run wild. It would be more expensive to set up, but not by a lot if you plan for it from the beginning, and it will give you maximum efficiency.

Re:Hot or Cold? (3, Funny)

networkBoy (774728) | more than 4 years ago | (#32059740)

So far everyone's got it wrong.
Since you don't want unauthorized people in the DC, you seal it up (with only exhaust ports and a door) and pump LN2 into evaporators in the room. Authorized techs are issued Scott Air Packs. Unauthorized people expire before they can do damage, and as a bonus the room has built-in fire suppression ;-)

-nB

Re:Hot or Cold? (0)

Anonymous Coward | more than 4 years ago | (#32060154)

You stole my post.... but as we advance, the hot side will only become hotter (denser CPUs), whereas the cold side is generally fixed by the environment. So we may as well start containing the hot sides.

Re:Hot or Cold? (1)

PPH (736903) | more than 4 years ago | (#32060408)

Hot, Ambient, and Cold temperatures. We're starting to get close to identifying the important parameters. Another one is outside ambient: you've got to figure the heat transfer from indoor ambient to outside ambient, and you've got to do that for (in some cases) a wide range of outdoor ambient temps. Not only do you want to minimize the unwanted transfers between your cold-side or hot-side air and the indoor ambient, you need to consider the transfer for the overall building envelope. Each of these heat flows is affected by the temperature difference across the ducting, cabinet sides, building envelope, etc., as well as the insulation separating each of these zones. Also, the air handling system's pressure differential and resulting leakage must be considered in calculating heat losses.
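
Each of those flows is roughly Q = U * A * dT across a boundary; a sketch that just sums them for a few zones (every U-value, area, and delta below is a placeholder, not a design value):

# Sum of U*A*dT heat flows between zones, to illustrate the bookkeeping.
zones = [
    # (name,                      U in W/(m^2·K), area in m^2, dT in K)
    ("cold aisle -> room",        3.0,            60.0,        8.0),
    ("hot aisle -> room",         3.0,            60.0,       14.0),
    ("room -> outdoors (walls)",  0.5,           400.0,       10.0),
]

total_w = 0.0
for name, u, a, dt in zones:
    q = u * a * dt
    total_w += q
    print(f"{name:28s}: {q:8.0f} W")
print(f"{'total parasitic transfer':28s}: {total_w:8.0f} W")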

Where is your datacenter? (5, Insightful)

Jave1in (1071792) | more than 4 years ago | (#32059308)

The best solution is going to be based on the average ambient temperature of your location. If you're in a hot environment, why contain the cold if you need additional A/C in the datacenter for employees? Reduce costs by using the same equipment to cool both. If you're in a cold region, then let the heat also warm the datacenter. If you're in an ideal temperature environment, then you don't have much to worry about besides good air flow.

Re:Where is your datacenter? (1)

Inda (580031) | more than 4 years ago | (#32059458)

So, expanding the solution further, build one side of the data-centre in a cold environment, the other side in a hot environment. Air flow could be solved by elevating one end, causing natural ventilation (the "chimney effect").

Or have I missed something?

Re:Where is your datacenter? (0)

Anonymous Coward | more than 4 years ago | (#32059488)

So, expanding the solution further, build one side of the data-centre in a cold environment, the other side in a hot environment.

Build it on Iceland: One side facing the active volcano, the other side buried in the glacier ice cap covering the volcano.

You get the gorgeous view as a bonus.

Nothing new here (-1, Flamebait)

Anonymous Coward | more than 4 years ago | (#32059312)

Is this another bullshit "innovation" because Google is using it? Using vinyl curtains for air containment is nothing new or newsworthy, except that everyone apparently wants to suck Google's cock for every simple thing they do.

why not both? (-1, Redundant)

Anonymous Coward | more than 4 years ago | (#32059354)

Why not set up alternating cold and hot aisles. Suck air into the front of the boxes from the cold aisle and vent it to the hot aisle.

An alternating finger-like system would even let you have continuous walkways to service all the machines via hot and cold side.

Hot.........Cold
HHHHHHHHHHHHCC
HH ^ rack ^ CC
HHCCCCCCCCCCCC
HH V rack V CC
HHHHHHHHHHHHCC
HH ^ rack ^ CC
HHCCCCCCCCCCCC
HH V rack V CC
HHHHHHHHHHHHCC

Re:why not both? (0)

Anonymous Coward | more than 4 years ago | (#32060990)

The -1 mod for above is a prime example of why I don't bother reading /. anymore most of the time. No, it's not redundant. Check the timestamps and check others who got insightful for later posts.

Position of the HVAC system? (1)

mlts (1038732) | more than 4 years ago | (#32059362)

One critical thing is where the HVAC return ducts and the air supply vents are. Does the datacenter use raised flooring, or does the place have discrete ducts for its ventilation? I've seen data centers with 12-24" of clearance used as plenum space. Others may have 2-4 inches because the space is used just for wiring and not HVAC.

This needs to be factored in when separating the aisles; otherwise, spending time on meat-locker curtains and endcaps like Google has done may bring few to no returns.

Re:Position of the HVAC system? (1)

jbengt (874751) | more than 4 years ago | (#32060628)

If you're using the raised floor plenum for supply, the efficient way to do it would be to supply directly into the rack enclosures and use the entire room as the warm side. That'd be very efficient, but might take some thought about how you build and ventilate the rack enclosures.
If the air is supplied directly to the space, make the ceiling a return air plenum and provide ducted exhaust from the racks into the ceiling return, making the whole space the cold side (as illustrated in TFA). That's almost as efficient, but might still require a little thought about how to build and ventilate the enclosure.
Curtains or partitions to separate hot and cold aisles will require coordination for lighting, fire protection, egress, etc., which could pose problems, especially in an existing facility.

What does Apple do? (0)

Anonymous Coward | more than 4 years ago | (#32059442)

Whatever one Apple uses, that's obviously the evil one.

Datacenter cooling (1, Informative)

JWSmythe (446288) | more than 4 years ago | (#32059474)

    I've been in quite a few large datacenters. Some have strict rules on properly utilizing their hot and cold aisles. Some could care less.

    The ones with the best ventilation have the cold air coming through the raised floor, and the hot air being pulled from the ceiling. Brilliant. Actually understanding that hot air rises. :)

    Most

    Some I've been in had the hot air being blown from the ceiling, and the return somewhere on a vertical wall.

    At one place, they worked with us on it. We had two rows in a cage. We established the center to be the cold, and the edges to be hot. It wasn't really for temperature, it was for access. We spread the rows a little extra so we could have a couple carts and two people working at the same time. In that, we couldn't load some of the longer machines in from the "hot" side if we had wanted to. So on the aisle that we worked in, they gave us more cold air outlets, and sealed off the ones on the hot side. It worked very well. The site manager was really into making everything work as well as possible. He would walk around with a non-contact IR thermometer and spot check equipment.

    Separating the hot and cold aisles with plastic would probably never work. Most racks that I've seen either are open frame, or they have a vent fan in the top. The open frame ones are obviously worthless for a plastic barrier.

Re:Datacenter cooling (1, Informative)

Anonymous Coward | more than 4 years ago | (#32059814)

> Some could not care less.

There, fixed that for you.

Re:Datacenter cooling (1)

Xaositecte (897197) | more than 4 years ago | (#32060202)

Both phrasings are correct.

English is a fucked up language.

Re:Datacenter cooling (1)

Dragoniz3r (992309) | more than 4 years ago | (#32060878)

I dunno about correct, but they're both in common usage.

Re:Datacenter cooling (1)

nacturation (646836) | more than 4 years ago | (#32061326)

Both phrasings are correct.

No, both statements are not correct. The correct statement is the answer to the question: "Could you care less?" If you care very deeply about something, then you would say "Yes, I could care less" because, after all, it IS possible. If you don't care at all, then you would say "No, I could not care less" because you can't care less than zero.

The bastardizing of "couldn't care less" into "could care less" is the same bastardization that makes people write "could of" and "should of" instead of "could have" and "should have". Or "the dog wagged it's tail" instead of "the dog wagged its tail". Just because it might sound similar when you say it quickly and just because lots of people make the same mistake, it doesn't make it correct.

As a former employee of one of those companies... (4, Informative)

Mhrmnhrm (263196) | more than 4 years ago | (#32059490)

I can honestly say you win either way. The electricity/cost savings of containment will pay for themselves regardless of where you put the doors. That said, whether you choose to go HAC or CAC is really a choice between different trade-offs.

HAC (The APC method): Seemed to be cheaper and easier to install. Since the hot aisle is being contained, if something happens to your coolers, you have a longer ride-through time as there's a much larger volume of cold air to draw from. However, at least when I got out of the business, HAC *required* the use of in-row cooling, and with APC, that meant water in your rows. Europeans don't seem to mind that, but Americans do (which provided an opening for Emerson's XD phase-change systems, dunno if APC has an equivalent or not yet). I personally wouldn't be too keen on having to spend more than a few minutes inside that hot aisle, either.

CAC (The Emerson method): Seemed to be more expensive, especially in refit scenarios (they appeared to be more focused on winning the big "green-field" jobs than on upgrading old sites), but it can usually leverage existing CRAC units, so you could potentially save enough there to make it competitive, as well as avoid vendor lock-in. The whole room becomes the equivalent of a hot aisle, but convection and the building's HVAC can somewhat mitigate that, so while it'll still be uncomfortable working behind a rack, it doesn't feel quite like the sauna that an HAC system does. Depending on whose CRAC equipment you buy (or already have), EC plug fans and VSD-driven blowers can save even more money if properly configured.

Other: I've seen the "Tower of Cool" or "chimney" style system, and flat out hate it. They look like a great idea on the face of it: much cheaper, faster installation, able to use building HVAC, etc. But let's be honest. Your servers are designed for front-to-rear airflow. So are the SANs, NASs, TBUs, rack UPSs, and practically everything else you've put in your datacenter, apart from those screwball Cisco routers that have a side-to-side pattern (Seriously... what WERE they thinking on that one???). Why would you then try to establish an upwards-pointed airflow that's got a giant suction hose at the center of the rack's roof, where it can just as easily pull cold air from the front (starving your systems) as it does hot air from the back?

Personally, I like cold aisle better. If I'm going to be spending two hours sitting behind a server because I can't do something via remote (forced into untangling the network cable rat's nest, perhaps), I like the idea of being merely uncomfortable and a bit sweaty rather than dripping buckets while cursing the bean-counters who forced me to lay off the PFY two months ago. There are also some neat controllers that work with CRAC units to establish just the right amount of airflow to fully feed the row and manage their output, so if running five CRACs at 50% is more power efficient than running three at 100%, that's what they do. I know folks who like hot aisle better. It's more fun for them to show off their prize datacenter, since all the areas you'd want to see (unless you're the one responsible for power strips or cable management) are cool.

Re:As a former employee of one of those companies. (0)

Anonymous Coward | more than 4 years ago | (#32059692)

I've seen the chimney method too, with a large fan at the top. Nice in theory, but I have seen almost no rack units made for bottom to top cooling. Almost everything goes front to back. Ironically, one place I worked at received a ton of racks with the stupid fans on top. Of course, the fans were yanked, and a piece of metal screwed on top to ensure the airflow went the proper way.

I wonder what the future of this stuff will be. It would be most efficient to replace CRACs with water chillers and have a liquid cooling system that doesn't just cool the rack, but has heat exchangers and systems to cool individual components (down to the RAM, CPUs, hard disks, and other items). However, more sophistication is needed with valves and the ability to shut off or shunt subsections; all it takes is one bad hose, and the data center suffers both water damage and heat failure. Until someone invents a valve assembly that doesn't leak after hundreds to tens of thousands of insertion/removal cycles, as well as one that can detect water pressure loss and close off/shunt, liquid cooling will only be usable for systems that are essentially static and where devices are rarely connected/disconnected, such as HVAC systems, APS racks, or PC systems.

Re:As a former employee of one of those companies. (0)

Anonymous Coward | more than 4 years ago | (#32060222)

First, what power density are you planning to run? You'll find the debate ends at about 15 to 20 kW. Then it's all hot containment, in-row, or back-of-cabinet cooling.

From an expert in the field (0)

Anonymous Coward | more than 4 years ago | (#32059604)

It is pretty basic. You are paying for the cooling, and you don't care if the heat bleeds away, but you care if the cold warms before it does its job, so you partition and direct the cold air and let the hot go wherever. The bottom line here is the hand waver types have never done a cost analysis based on required cooling loads, which is what the data center is paying for.

Reality check ... (1)

BitZtream (692029) | more than 4 years ago | (#32059606)

It really doesn't matter which side you contain as the end result is the same.

The important part is preventing the cold incoming air from being tainted with hot exhaust air. Which side you do it on is largely irrelevant unless you have a large amount of foot traffic in the server room which may bring in outside air to upset the balance, but even that would require what I would most certainly consider an unacceptable amount of foot traffic to occur in a datacenter.

Just hanging vinyl like Google does takes you to the point that you're wasting time and money by putting any further engineering resources on it.

Depends on SLAs perhaps (0)

Anonymous Coward | more than 4 years ago | (#32059646)

SLAs are usually based on the temperature of the cold aisle. Containing the cold aisle along with proper blanking of cabinets and equipment venting the right way would help to keep within the contract.

contain the heat (2, Interesting)

Tumbleweed (3706) | more than 4 years ago | (#32059652)

When you contain the heat, you then have the ability to move it around and use it for cogeneration, thus vastly increasing your overall efficiency.

Both (3, Interesting)

funkboy (71672) | more than 4 years ago | (#32059654)

The answer will be specific to each implementation.

But in general, it should be pretty obvious to anyone that understands basic thermodynamics: get the "cold" into the servers without mixing it with the ambient or letting it touch any hot metal, and get the heat out of the servers without mixing it with the ambient or letting it heat up any other metal.

It should be pretty obvious that air is not really the best way to do this; air goes all over the place, and is not a very good thermal conductor (relatively speaking).

There are entire 10k+ machine datacenters in France that use only liquid cooling circuits, right up to the servers. Energy costs for running the external condensers are a small fraction of what it would cost to do the same thing with air. Of course, it helps if you only have your own machines in such an environment, but if APC, Emerson, etc were serious about efficient cooling then they'd partner with HP, Dell, etc. to make standardized systems that would allow this...

Datacentre efficiency (0)

Anonymous Coward | more than 4 years ago | (#32059656)

Hi There,

We did some testing on two rows of 27 racks which we had fully populated with rack mount servers. We measured each setup over a two month period during a European summer. We found that by containing the cold rows we saved around 20% on the power bills overall. For the hot aisle containment we found that we saved only 12%. Unfortunately for us, the testing that we did was not as comprehensive or detailed as it should have been. I believe that there are many other factors which affect the overall power saving. For example, some vendors recommended having the air con in the racks, while others say that it's best to have the air con outside of the solution with pressure forcing the air through the floor. We didn't have the opportunity to test the air con in the rack because it was a retrofit in an existing facility. Overall we found that the isolation was a good thing and paid for the cost of installation within a few months. Right now we have 18 rows of 27 racks with cold air containment working flawlessly. Some other points which need to be noted:

- A squared-off ceiling over the cold rows does not provide good enough air flow; having a curved ceiling was good.
- Ensure all empty rack slots are filled with blanking panels.
- Make sure that the gap between racks is filled.
- Make sure the exhaust fans on the top of racks are only on the back of the rack, not at the front like in some installs.
- Make sure that the cabling is done in a tidy way; any air obstruction makes a difference to the efficiency. We got special power cables made up so that there was no slack hanging around.

Over the 18 rows, we have now had the install running for two years, and over that time we have saved an average of 28% on the power in that facility. When you're talking about 30+ servers per rack, with 27 racks in a row and 18 rows, that makes a big difference.
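
For scale, a back-of-the-envelope sketch of what a 28% saving on that footprint could amount to; the per-server draw and the pre-isolation PUE are guesses, not figures from the post:

# Rough scale of a 28% saving on 18 rows x 27 racks x ~30 servers per rack.
# Per-server draw and the assumed PUE are placeholders.
rows, racks_per_row, servers_per_rack = 18, 27, 30
server_watts = 300.0                 # assumed average draw per server
hours_per_year = 8760
saving_fraction = 0.28               # reported saving on the facility power bill

it_load_kw = rows * racks_per_row * servers_per_rack * server_watts / 1000.0
facility_kwh = it_load_kw * 1.8 * hours_per_year    # assumed pre-isolation PUE ~1.8
saved_kwh = facility_kwh * saving_fraction

print(f"IT load: {it_load_kw:.0f} kW")
print(f"Approximate energy saved per year: {saved_kwh / 1e6:.1f} GWh")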

hot level (0)

Anonymous Coward | more than 4 years ago | (#32059680)

How about really high ceilings in a funnel shape and vents that you can open to the outside (with filters)? Most data centre rooms I've seen are normal office height. After all, let the hot air rise.

While we're at it, let's turn the racks and the pizza boxes around 90 degrees: less distance for air to travel over hot components, which would likely require lower fan revolutions as well. Not to mention much hardware doesn't need all that depth, and with more 'front' area we could fit more swappable hard drives in per U for the virtual machines, either with single servers or SANs.

Anyway, just a thought.

the answer is "yes" (1)

markhahn (122033) | more than 4 years ago | (#32059728)

All the air in a machine room is either hot or cold. Anything else means you're mixing - that is, your containment leaks. There is basically no heat transfer through building conduction, for instance. 'Cold' merely means that it's between the chiller outflow and the front of the servers; hot means the ass-side of the servers and the chiller intake. The primary goal is to keep them from mixing.

A nontrivial machine room will have multiple chillers and a non-uniform heat-load distribution. That doesn't change this principle, but it does mean that the airflow design may have trouble getting enough air to the right spots. Ideally, the chiller outflow would all go into a single large plenum (such as a deep pressurized underfloor), with hot ducting with controllable air intakes whisking the hot air back to the chillers.

Heat moves toward cold... (1, Funny)

Anonymous Coward | more than 4 years ago | (#32059742)

so contain the heat so that it doesn't encroach on the cold. My limited understanding of thermodynamics is that the more active molecules in a heated environment tend to move toward cold, which is a lower energy state; thus entropy plays an ever-present role. Contain the heat, prevent the transmission of infrared, and use any other barrier you can to prevent heat from warming the cooler side.

Contain for the humans (1)

Culture20 (968837) | more than 4 years ago | (#32059784)

There are a lot of reasons why someone will be sitting in a server room for an hour or more. Please don't make it an unbearable hour with heat baking the poor humans.

Depends on the climate/ambient (1)

spazmonkey (920425) | more than 4 years ago | (#32059916)

Depends on the climate entirely. Here the summers are brutal and the winters severe; which one is contained is pretty much up to which one we want to be exposed to (i.e. in winter it is nice to have the heat NOT just dumped outside). Curtains, and some thought to the existing doors/partitions, help with that seasonal flexibility a lot.

Use Styrofoam(tm) (2, Insightful)

sydbarrett74 (74307) | more than 4 years ago | (#32059920)

I mean it worked for the McDonald's McDLT back in the 80's...

CYA (1)

martin-boundary (547041) | more than 4 years ago | (#32060030)

CYA. Flip a coin. That way, if you get complaints about the wrong choice, you can always blame the coin.

APC / Schneider vs Emerson / Liebert (1)

littleab (1002027) | more than 4 years ago | (#32060294)

This basically comes down to a question of APC vs Liebert equipment. APC pushes hot aisle containment where Liebert pushes cold aisle containment. It seems like the benefits are almost split 50/50, but there are a few other things to consider. The APC method uses in-row cooling, which means you are moving the air a shorter distance than the Liebert method of pushing air from large CRACs at the ends of the aisles. There are efficiencies gained in other methods, and as a poster earlier mentioned the APC method is a little cheaper up front. This does introduce the problem of water in the data center, but we resolved this by putting the water under the floor so that it would be very difficult to have any issues related to it. Overall, I think this comes down to the situation, but for us it made more sense to do hot aisle containment. It seems to be better in smaller-scale situations, but I would have to do more research to see how it scales to large 50,000+ sqft data centers.

Does it matter? (1)

darkonc (47285) | more than 4 years ago | (#32060450)

The whole point is to separate hot air flows from cold air flows.

After that, precisely how you go about the process is more a matter of the side effect of other choices.

This feels, to me, rather like arguing if it's better to put the women's washroom to the right or left of the men's.

Well, if your goal is energy efficiency... (1)

Hurricane78 (562437) | more than 4 years ago | (#32060556)

Then it’s easy: do the math. Seriously. It’s a question of energy transfer, the simplest thermodynamics, and isolation.
I’m not an expert in those matters, but I’d say the goal would be to minimize the energy (heat) differences between areas right next to each other?

I’m sure there is an expert (please no wannabes) here who can quickly give a nice answer. :)

trade offs (0)

Anonymous Coward | more than 4 years ago | (#32060644)

Hot aisle containment is more efficient, as you are extracting heat from a smaller volume of air. However, remember that in a hot aisle containment system, if you ever have to work on the back of the rack, you are walking into a dangerously hot environment. The trade off is between efficiency and safety, IMO.

Thermal pollution (1)

starfishsystems (834319) | more than 4 years ago | (#32060706)

We have to choose between the glass being half full or half empty. But it's not the symmetrical choice which it might seem to be. Specifically, it's not a matter of providing cold air. That's just a means to an end. Fundamentally, it's a matter of removing thermal pollution.

The ideal environment for the equipment is one which is uniformly, ambiently cold. Not only are there fewer thermal stresses, the entire design problem is simplified if you can assume uniformity. Departures from this ideal are therefore to be minimized. You don't want to contain the cold, you want it to prevail. Instead you want to contain the heat, remove it, and minimize points of contamination along the way. This has the additional benefit that you're minimizing leakage from the relatively small heat regions rather than trying to protect the entire environment.

An analogous situation arises in a marine engine room. You want to supply clean air and also exhaust the combustion gases. Supposing you only had the following two choices, should you (a) exhaust directly into the engine room and pipe fresh air from outside to the engine, or (b) pipe the exhaust outside and let air be drawn directly into the engine? Generally (b) is regarded as better, because exhaust gas is hot, toxic, and corrosive. It's much more trouble to design every surface in the environment to tolerate it than it would be to just contain it and get rid of it.