
Making Data Centers More People-Friendly

samzenpus posted more than 3 years ago | from the won't-somebody-please-think-of-the-sys-admins dept.


1sockchuck writes "Data centers are designed to house servers, not people. This has often meant trade-offs for data center staffers, who brave 100-degree hot aisles and perform their work at laptop carts. But some data center developers are rethinking this approach and designing people-friendly data centers with Class-A offices and amenities for staff and visitors. Is this the future of data center design?"


First troll! (1)

sltd (1182933) | more than 3 years ago | (#35363086)

Maybe it's just me, but doesn't it seem like 100 degree aisles wouldn't be particularly server-friendly either? Just my $.02

Re:First troll! (1)

Eternauta3k (680157) | more than 3 years ago | (#35363106)

It's where the cooling exhaust goes, that's why it's hot.

Re:First troll! (1)

Anonymous Coward | more than 3 years ago | (#35363226)

In the data centers I work in, the exhaust is sucked out the top of the racks, and every few tiles in the aisles there's a tile with a bunch of holes in it acting as an air-conditioning vent.

If anything, I freeze at times if I'm in shorts and flip-flops.

Re:First troll! (1)

Anonymous Coward | more than 3 years ago | (#35363348)

Seriously. I've been in and out of datacenters, rack farms, etc. over the past decade, and every time I've seen the racks vented through the ceiling. Hot air rises; it's more efficient to just suck it up through there. I've never been in a warm data center, only freezing-ass cold ones.

Re:First troll! (3, Interesting)

EdIII (1114411) | more than 3 years ago | (#35363926)

The data center I visit most right now has hot/cold aisles. It looks more like a meat-processing plant with all the heavy plastic drapes. They go from floor to ceiling every other aisle. On the front of the racks they even put in plastic placeholders for gaps where we don't have equipment installed yet, to maximize air flow through the equipment. They did it, too; we never even had to ask.

Most of the time we work from the cold aisle with our laptop carts, and it is *cold*. The article is confusing because I can't see why you'd need to sit with a cart in the hot aisle to work. You can install your equipment and cabling in such a way that you don't need access to the hot aisle for anything other than full server swap-outs and cabling swap-outs, and that's pretty much it. You can replace the hard drives from the front of the units, and maintain the server just by pulling it out the front after disconnecting the cables if you need to. Most big 4U servers come with cable management arms that let you keep "service loops" so that you don't need to disconnect anything to pull the server out on the rails.

Heck, if you need to, just get a 15 ft networking cable and thread it through into the cold aisle. You don't have to sit in the heat if you don't want to. Although I'm a big guy and I like the cold, it's funny as hell to see the skinny bastards walking over to the hot aisle to warm up.

Re:First troll! (1)

afidel (530433) | more than 3 years ago | (#35365748)

Cable management arms are the work of the devil. They do nothing but stress cables beyond the minimum bend radius and block airflow out of the back of the server. If you're messing with a server often enough that the time spent installing the cable management arm is less than the time you'd spend disconnecting and reattaching cables, you're doing something wrong. The only time I thought they were warranted was back when some higher-end x86 servers came with hot-replace PCI slots, so a dead add-on card didn't mean a system reboot, but I haven't seen that feature in any PCIe system. Besides, it's easier to just build in redundancy and replace the failed hardware during a maintenance window (or, even better, do everything in VMs and just evacuate the host, fix the problem, and migrate them back =)

Re:First troll! (2)

EdIII (1114411) | more than 3 years ago | (#35366028)

But Momma said Cable management arms are the work of the devil

There, fixed that for you :)

*could not help myself*

Re:First troll! (1)

Redlazer (786403) | more than 3 years ago | (#35364208)

Peer1, in downtown Vancouver, uses hot/cold aisles.

Re:First troll! (3, Informative)

SuperQ (431) | more than 3 years ago | (#35363194)

100-degree hot aisles are too cold. Hot aisles should be at a temperature near the maximum component tolerance of the parts in the server. If a part has a maximum temperature of 150 degrees and runs happily at 120 degrees, the hot aisle should be 120 degrees. This way the cooling efficiency is the highest.

See Google and SGI (Rackable) container designs.

http://arstechnica.com/hardware/news/2009/04/the-beast-unveiled-inside-a-google-server.ars [arstechnica.com]

As you can see from the photo there, all the cables are in the front. No need to get behind it, where the hot aisle is.

Re:First troll! (1)

ko7 (1990064) | more than 3 years ago | (#35363406)

If a part has a maximum temperature of 150 degrees and runs happily at 120 degrees, the hot aisle should be 120 degrees. This way the cooling efficiency is the highest.

If the component is at the same temperature as the cooling air, then NO energy (heat) is being transferred from the component to the air. There must be a temperature difference (gradient) for heat to flow. Additionally, just because a part may operate at 150 F does not mean it will live long at that temperature. The life span of most things electronic gets exponentially shorter with increasing temperature. This is why most data centers are kept so cool.
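To make the gradient argument concrete, here is a minimal sketch based on Newton's law of cooling; the heat-transfer coefficient and surface area are assumed illustrative values, not figures from either comment:

    # Heat moved from a component to the surrounding air is roughly
    # proportional to the temperature difference between them.
    def heat_flow_watts(component_temp_f, air_temp_f, h_w_per_m2k=25.0, area_m2=0.01):
        # h and area are assumed values for a small heatsink (illustration only)
        delta_k = (component_temp_f - air_temp_f) * 5.0 / 9.0  # F difference -> K difference
        return h_w_per_m2k * area_m2 * delta_k

    print(heat_flow_watts(150, 120))  # ~4.2 W removed with a 30 F gradient
    print(heat_flow_watts(150, 150))  # 0.0 W -- no gradient, no cooling

As the air temperature approaches the component temperature, the heat carried away falls toward zero, which is the trade-off both posters are circling.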

Re:First troll! (1)

BlackSnake112 (912158) | more than 3 years ago | (#35363472)

Not to mention that people do not operate too well at 120 degrees.

Re:First troll! (3, Funny)

Culture20 (968837) | more than 3 years ago | (#35364108)

Air intake is from the cool aisle, not the hot aisle. Essentially, GP is saying that if the hot aisle is anything lower than 120F, cool air is leaking in where it shouldn't (a waste of cooling). It's more of a health gauge at that end of the computer, kinda like digestion: you make sure you eat well (habitually check the air temperature of the cool side to make sure it's cool enough), and you occasionally look at your stool for corn/blood (see if the hot aisle is warm enough) to make sure everything's working as it should.

Re:First troll! (1)

burne (686114) | more than 3 years ago | (#35364518)

This is why most data centers are kept so cool.

Keeping your servers hot (but not too hot) will enable you to reuse the heat from the datacenter for other purposes. Keeping your servers cool will waste energy cooling them, and waste energy when re-using the energy you extracted from the datacenter.

The sticky stuff is in the temperature differential between 'inlet' and 'outlet'. If you pump a cubic yard of air through your server every second, the differential will be low, and you'll have generated low-quality heat. Slow down the flow and the differential will rise to ten (or 31) degrees, which in terms of 'recycling heat' is much more valuable. You can't do shit with a single-degree differential, but a ten-degree differential can be put to other uses.

Today (a cold day) the inlet temperature was 4 degrees (Celsius). The outlet temperature was, on average, 35 degrees (again Celsius). That is an amount of energy you could monetize. Currently we use the energy thus generated to heat our offices and those of the neighbours. Our DC is small, 10,000 square feet, but produces enough heat to sell.

Selling your waste. Think about it. It's on the order of 5 percent of our operating costs. The way to go? (Open your garbage bin. Look. Add up the value of everything in there. Imagine getting back one-twentieth of what you've spent at the mall. Would you go for it? We do.)
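As a back-of-the-envelope check on the numbers above, here is a minimal sketch; only the 4 °C inlet and 35 °C outlet temperatures come from the comment, and the airflow figure is an assumption for illustration:

    # Recoverable thermal power carried by the exhaust air:
    #   P = density * volumetric_flow * specific_heat * delta_T
    AIR_DENSITY = 1.2         # kg/m^3, near room temperature
    AIR_SPECIFIC_HEAT = 1005  # J/(kg*K)

    def recoverable_heat_kw(flow_m3_per_s, inlet_c, outlet_c):
        delta_t = outlet_c - inlet_c
        return AIR_DENSITY * flow_m3_per_s * AIR_SPECIFIC_HEAT * delta_t / 1000.0

    # Assumed 10 m^3/s of total airflow for a small facility; 4 C in, 35 C out as above.
    print(round(recoverable_heat_kw(10, 4, 35)))  # ~374 kW of low-grade heat available for reuse

The same formula shows why a big differential matters: at a one-degree differential the same airflow yields only about 12 kW of much lower-grade heat.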

Re:First troll! (1)

rtb61 (674572) | more than 3 years ago | (#35364100)

Data centres need to pay more attention to warehousing practices for objects of the size they are handling. All the racks should be open, and they should be accessed using automatic picking and packing systems.

Once the system is built, only the automated picking system should access the racks, pulling defective units, returning them to an air-locked workstation, and inserting the new data units. This allows much higher racks, and you can treat the whole data processing and storage area as an industrial process: bring in conditioned air at the required volume and temperature and then immediately exhaust it, unless the high-temperature air can be used (heating offices or a fitness centre being the only realistic uses for that volume and temperature of exhaust air).

This would be dependent upon the life cycle of units, the size of the data centre, and improved air-handling efficiencies reducing running costs.

Re:First troll! (1)

SuperQ (431) | more than 3 years ago | (#35364222)

Hell yes. Robot machine replacement would rock. Just like tape swapping. Just bring the busted machines to techs for repair. Or if it's just drive swaps the robot could do that too.

In theory you could use waste heat as a warming plant for other housing/office use. Since most hardware is happy with 80 degree input air you would have near 80 degree output water. That's more than sufficient to pipe to warm up homes in cold areas of the world.

Re:First troll! (1)

zero0ne (1309517) | more than 3 years ago | (#35365714)

To add:

With all the advanced redundant power, networking, etc, you could even have the servers made or adapted to allow for seamless picking...

So you go to your data center and ask for Server X1B7, the picker grabs it and brings it to the room/office you are in... all seamless and without losing network or power. The room could even be designed to look and feel like an old-school datacenter.

Re:First troll! (0)

Anonymous Coward | more than 3 years ago | (#35366350)

Server jukebox!

Happy days heeeeeeeey

Fecking Fahrenheit (1)

mjwx (966435) | more than 3 years ago | (#35366640)

I was thinking, like most rational people would, that 100 degrees Celsius (the boiling point of water) _would_ be way too hot, and wondered how the servers could keep operating under such extreme conditions. Now, in high-temperature environments like mine sites, we use self-contained racks with their own AC unit. We occasionally use empty ones to chill beer.

But then I read this and realised they were using the backwards Fahrenheit measurement.

100-degree hot aisles are too cold.

Now here I agree: 38 degrees is not too hot so long as you aren't working there 8 hours a day. I'm sure datacentre staff have a nice climate-controlled cubicle to go back to, so that is a non-issue.

BTW, the strangest rack I've ever come across was one we deployed to a mine in Mongolia. Yes, you can get self-contained racks with their own heater.
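For anyone tripping over the unit confusion above, the conversion is a one-liner (a minimal sketch):

    def f_to_c(f):
        return (f - 32) * 5.0 / 9.0

    print(round(f_to_c(100), 1))  # 37.8 -- the "100-degree" hot aisle in Celsius
    print(round(f_to_c(150), 1))  # 65.6 -- the component limit mentioned earlier in the thread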

Re:Fecking Fahrenheit (1)

Shadow of Eternity (795165) | more than 3 years ago | (#35366806)

Should've just ordered a Prescott and saved yourself the heater.

Sure... (0)

Anonymous Coward | more than 3 years ago | (#35363110)

For those who want to pay for those amenities.

Who's going to pay for this? (2)

SimonTS (1984074) | more than 3 years ago | (#35363120)

Seriously! What company is going to pay an extra 10% (guessed figure) on top of the cost so they can have a nice comfy room for their data-rats?

Re:Who's going to pay for this? (1)

Anonymous Coward | more than 3 years ago | (#35363180)

<smug>
Any company that cares about the comfort and morale of their employees!
</smug>

Re:Who's going to pay for this? (0)

DogDude (805747) | more than 3 years ago | (#35363850)

A company that is interested in keeping good employees will.

Re:Who's going to pay for this? (0)

Anonymous Coward | more than 3 years ago | (#35364054)

And which company is that?

100-degree hot aisles? (3, Insightful)

Anonymous Coward | more than 3 years ago | (#35363136)

I've never had a temp problem in a data center. Noise? Yes. Heat? No.

Re:100-degree hot aisles? (2)

billcopc (196330) | more than 3 years ago | (#35363164)

Agreed. I'll take sound deadening over temperature adjustment. I'm admittedly very sensitive, but for each hour I spend at the DC, my ears ring for three.

Re:100-degree hot aisles? (2)

HockeyPuck (141947) | more than 3 years ago | (#35363232)

Go to Home Depot and buy a pair of earplugs. Not the foam disposable ones, but the rubber in-ear type. They're just like some earbuds, and they're connected by a rubber cord.

Just one thing: don't chew gum or clear your throat. It sounds just awful...

Re:100-degree hot aisles? (1)

0123456 (636235) | more than 3 years ago | (#35363264)

Go to Home Depot and buy a pair of earplugs.

Nah, get a proper set of ear protectors; mine are probably the best $25 I ever spent.

Re:100-degree hot aisles? (1)

Anonymous Coward | more than 3 years ago | (#35364004)

I got a pair of electronic headphones at Harbor Freight Tools for like 15 bucks; they're meant for the shooting range. Basically they block out loud noises and, through a microphone and speaker, re-create low-decibel sounds such as talking. So you can be firing your loud-ass gun and hear people talking at a normal volume without having to remove the headphones. This works wonders in any environment with loud noise.

Re:100-degree hot aisles? (2)

SuperQ (431) | more than 3 years ago | (#35363290)

Better than Home Depot, Etymotic full-frequency plugs are great:

http://www.etymotic.com/ephp/er20.html [etymotic.com]

I've also had some friends use things like these in datacenters:

http://www.amazon.com/dp/B00009363P [amazon.com]

They let you hear people talk (bandpass filter) without letting the low/high noise in.

Re:100-degree hot aisles? (1)

X0563511 (793323) | more than 3 years ago | (#35363400)

I've got a pair of Peltor Tactical Sports. I use them while at the shooting range, but they're a godsend in the DC as well!

Re:100-degree hot aisles? (1)

h4rr4r (612664) | more than 3 years ago | (#35363442)

Are they worth it?
I have a .300 Win Mag with a gun loudener (muzzle brake), and it seems no muffs I own are up to that challenge. I end up wearing plugs and muffs at the same time.

Re:100-degree hot aisles? (2)

sexconker (1179573) | more than 3 years ago | (#35364098)

Are they worth it?
I have a .300 Win Mag with a gun loudener (muzzle brake), and it seems no muffs I own are up to that challenge. I end up wearing plugs and muffs at the same time.

"Holster. Bandoleer. Silencer. Loudener. Speed-cocker. And this thing's for shooting down police helicopters."

Re:100-degree hot aisles? (1)

X0563511 (793323) | more than 3 years ago | (#35366136)

You'd still want those. The plus side is you can turn them on and crank the low-level sounds up, so you'll be able to hear more than your own breathing when you're not firing.

Re:100-degree hot aisles? (1)

energizer-bunny2 (1308043) | more than 3 years ago | (#35365190)

Better than Home Depot, Etymotic full-frequency plugs are great:

http://www.etymotic.com/ephp/er20.html [etymotic.com]

I agree with the Etymotic earplug recommendation. I have had a pair since the late 90's and they are great.

Granted, I haven't used them in years and they are probably a bit gross now. meh....

Re:100-degree hot aisles? (1)

fritzenheimer (886192) | more than 3 years ago | (#35364130)

Everything's remotely managed these days anyway, so who cares how hot/cold/loud the cabinets are? Scurry in to hot-replace whatever and scurry back out to the cube.

Exactly. (1)

Travoltus (110240) | more than 3 years ago | (#35366938)

I managed a data center. Temperatures like that radiating from servers are bad, bad news. That is an obvious airflow problem.

Plus, there is emerging technology to use sound waves for refrigeration. I wonder when they'll deploy it for data centers?

Is this the future of data center design? (5, Insightful)

Anonymous Coward | more than 3 years ago | (#35363138)

No.

Not becoming the standard (3, Insightful)

confused one (671304) | more than 3 years ago | (#35363140)

This is a marketing ploy to attract customers to a new data center. Ultimately cost will determine the layout. If a cube is cheaper, then cubes it will be. If 100-degree hot aisles save money versus 85-degree hot aisles, then they'll run them hotter.

Re:Not becoming the standard (2)

Damek (515688) | more than 3 years ago | (#35364914)

Human costs are important. We're forgetting that, hence idiocy like Wisconsin. If you want to be "old-school economics" about it, all costs should be accounted for, including those stakeholder humans bring up that you may not have realized (employees are stakeholders, even if not stockholders). If you want to be currently capitalist about it, don't bother accounting for any costs that aren't affecting today's golf game. Have fun watching the planet burn then.

The true "tragedy of the commons" is epitomized in contemporary capitalism.

Re:Not becoming the standard (1)

confused one (671304) | more than 3 years ago | (#35365410)

Nice sentiments and all. I'm living the "old school economics" tragedy where I'm stuck in an 8' x 5.5' cubicle reporting to a PHB who thinks the software I create doesn't add any value to the company. My only stake is the one that keeps getting driven into my back.

Obviously (0)

Anonymous Coward | more than 3 years ago | (#35363150)

The future of data center design is determined by the lowest common denominator among hardware, humans, and cost. Anyone think "people friendly" has a snowball's chance in a 100-degree hot aisle?

Wimps (4, Funny)

RobotRunAmok (595286) | more than 3 years ago | (#35363166)

In my day Data Centers were at the top of snow mountains which you had to climb barefooted or be turned away. We built them to keep the machinery happy, not the people, whom we preferred behaved like machinery.

We liked our Data Centers the way we liked our women: Bright, White, Antiseptic, and Bitterly Cold.

Wait (1)

stms (1132653) | more than 3 years ago | (#35363556)

You forgot to tell them to get off your lawn.

Re:Wimps (4, Funny)

PPH (736903) | more than 3 years ago | (#35364554)

We liked our Data Centers the way we liked our women:

Hot. And always going down.

Re:Wimps (0)

Anonymous Coward | more than 3 years ago | (#35364964)

Tis in the nature of a robot to like Susan Calvin

Re:Wimps (2)

syousef (465911) | more than 3 years ago | (#35366534)

We liked our Data Centers the way we liked our women:

Hot. And always going down.

Charlie Sheen, is that you?

Hand Scanners... (0)

HockeyPuck (141947) | more than 3 years ago | (#35363202)

How about getting rid of hand scanners?

I worked in colo facilities for years, and the one thing that always concerned me was that some person who had gone through the mantrap before me might have had some awful bacteria/virus on their hands.

If the handle on the toilet in the airport can claim that it has an anti-bacterial coating, do you think the hand scanner manufacturers could do the same?

Re:Hand Scanners... (3, Insightful)

X0563511 (793323) | more than 3 years ago | (#35363414)

Unless you've lived in a bubble your whole life, you're probably going to be OK...

Re:Hand Scanners... (1)

HockeyPuck (141947) | more than 3 years ago | (#35363438)

I hope you don't assume that everyone washes their hands after coming out of the bathroom.

While I don't know about you, I personally would not want to shake the hands of some guy that just dropped a bomb and didn't wash his hands.

Re:Hand Scanners... (0)

Anonymous Coward | more than 3 years ago | (#35363726)

Do you routinely crap on your hands? If not, then why assume everyone else does? If so, perhaps you need re-training in the restroom.

Re:Hand Scanners... (2)

cowboy76Spain (815442) | more than 3 years ago | (#35364672)

You'd be OK; none of my coworkers have died of dysentery yet...

Now, more seriously, relax. Look at how people have historically lived. How long have we had drinkable, treated water in our homes? Or been almost assured of a john (is that what it's called?) nearby when we need it? And you are probably better fed, have had more vaccines, and have access to more and better physicians than 99.9% of the rest of mankind that has ever lived. Even with those disadvantages, most of those people did not die from illness (and many of the worst epidemics have been linked to periods of crisis, war, and the like that caused famine and weakened the population). And you are mostly surrounded by healthy people, the sick staying at home or going to a hospital.

The fact that there are spots on TV telling you to buy something every time you sneeze does not mean you are in any way weaker than those people. (This is not medical advice. If you sneeze tomorrow and die, I deny all responsibility ;-p)

Of course, some safety measures may be sensible, but isn't it true that you touch door handles, phones, and papers that other people have touched too? Have you heard of someone getting sick that way? Why is a hand scanner different? Also, I don't understand why "a bomb" is so dangerous... I'd assume a sneeze would be way, way worse.

OTOH, if you want to worry about those things, remember that a single bout of influenza can kill you (remember which was the most lethal pandemic of the 20th century?).

*1: This is not medical advice. If you sneeze today and die of it, I deny all responsibility.

FYI (1)

Anonymous Coward | more than 3 years ago | (#35365584)

I not only don't wash my hands I wipe my butt with my fingers then wipe my fingers off with TP.

I like to sneak into the bathrooms early in the morning and rub one out over the sink handles in both bathrooms. Those aren't 'water' spots.

I'm a hand shaking motherfucker and I work with you.

Re:Hand Scanners... (1)

Peeteriz (821290) | more than 3 years ago | (#35366168)

Is there any reasonable difference between hand scanners and doorknobs that would warrant different treatment?
You get the same risks just by using the same door as others without wearing surgical gloves and discarding them afterwards.

Confused here (1)

generikz (413613) | more than 3 years ago | (#35363206)

All the data centers I have worked at (in the USA and Singapore) had some kind of lounge/relaxation room with games, food vending machines, coffee, meeting rooms you can rent, showers, etc. Maybe they just forgot to mention it to their existing customers? Or maybe Equinix (not my company) is doing a better job of taking care of their customers? And they're still aiming at 30% yearly ROE; I can't see how a few dedicated rooms would hurt the bottom line so much.

Yeah, we also did that 10 years ago (1)

billstewart (78916) | more than 3 years ago | (#35363492)

Our data centers also had customer-friendly space. I think it was mostly inside the "show ID to a guard" area, but it was as important a part of the design as the racks and cages.

Just about time (-1, Troll)

ubuntufan3 (2007430) | more than 3 years ago | (#35363266)

In the company [ype.cc] I work for, there is not even a place where one can sit in the datacenter. Not to mention the heavy lifting (~30 kg/unit) I have to go through when one of the SANs breaks down.

Re:Just about time (1)

cinderellamanson (1850702) | more than 3 years ago | (#35363360)

Parent is Goatse!

Re:Just about time (-1)

Anonymous Coward | more than 3 years ago | (#35363398)

Mod parent troll - link is goatse.

BTDT (1)

thesameguy (1047504) | more than 3 years ago | (#35363284)

Haha! We tried this back in 2000 and it didn't work out. Company tanked, got sold for pennies on the dollar. Herakles (new name) is, however, still a really nice facility.

Remote Management (3, Insightful)

eln (21727) | more than 3 years ago | (#35363336)

If you've got remote management set up properly, the only reason you ever even need to go to the data center is due to some kind of hardware failure. There's no sense paying the extra money a place like this will have to charge (to recoup the cost of all those extra amenities) for colo space if you only need to physically visit your servers maybe once or twice a year.
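As a concrete example of the remote management being described, here is a minimal sketch that drives a server's BMC out-of-band with ipmitool from Python; the hostname and credentials are placeholders, and it assumes ipmitool is installed and the BMC is reachable:

    import subprocess

    # Placeholder BMC address and credentials -- substitute your own.
    BMC = ["ipmitool", "-I", "lanplus", "-H", "bmc.example.com", "-U", "admin", "-P", "secret"]

    def ipmi(*args):
        """Run an out-of-band command against the server's BMC."""
        return subprocess.run(BMC + list(args), capture_output=True, text=True).stdout

    print(ipmi("chassis", "power", "status"))   # is the box powered on?
    print(ipmi("sensor", "list"))               # inlet/exhaust temps, fan speeds, voltages
    # ipmi("chassis", "power", "cycle")         # hard power-cycle a wedged machine remotely

With this much in place, the only trips left are for genuinely physical work: swapping failed parts, racking, and cabling.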

Re:Remote Management (2)

starfishsystems (834319) | more than 3 years ago | (#35364112)

Not just hardware failures but any sort of scheduled physical change as well. Among other things: device upgrades; server, switch and router installation and removal; cabling changes; backup media changes; UPS maintenance; rack moves.

The last data center maintenance I did, we had to move "only" seven racks' worth of gear from one floor to another. It took place in four carefully planned phases spanning two months. We had eight people working at it for the first week, then the two senior guys pretty much every other day for the remaining time. This, I must emphasize, was to accomplish no functional change whatsoever. It was just the groundwork.

Once that was done, then we could finally start in on the long backlog of upgrade and redesign tasks that had previously been impractical due to lack of space. During the course of that redesign, lots of things broke, VLANs for example. Every device of significance was on a remote power switch, but they kind of don't work too well if you can't find a switch to talk to them through.

Re:Remote Management (0)

Anonymous Coward | more than 3 years ago | (#35365460)

To Eln's point, you need proper remote (OOB) management for your "remote power switches". APC has good stuff, though pricey.

The type of work you describe seldom happens. This should not be routine.

S

data center comfort (3, Insightful)

NikeHerc (694644) | more than 3 years ago | (#35363386)

The folks in India won't care how hot or cold it is in the data centers over here.

R&D labs: yes. Classic DC? no (1)

Anonymous Coward | more than 3 years ago | (#35363450)

I manage an R&D lab with a few hundred racks. We constantly swap out hardware for testing. This makes sitting in the lab the most efficient way of getting things done. Unfortunately, the work space where I'm currently sitting reads 90F and I'm wearing noise-canceling headphones, which squeeze my head into oblivion after an hour or two. We're working on redesigning a storage room into office space to improve our quality of life. Our goal is to work within a few yards of the gear. As long as an office space is nearby we'll be happier, be able to hear each other talk, and not feel like we're driving down an Arizona highway in a convertible at 60 mph.

Designing the entire lab around a few working spaces? That seems far-fetched. Most data centers choose location (cheap land, power, and cooling) before the word "workspace" is even thought of. If an efficient hot/cold aisle scenario works out and a room is nearby, then great, convert a closet.

Where I worked (2, Interesting)

Anonymous Coward | more than 3 years ago | (#35363452)

I've worked in several data centers. An IBM one had air-cooled servers (cold air pushed into the floor, and every rack had a massive fan pulling cold air up from the floor). It was about 20C day and night. The floor tiles would occasionally break, which caused problems when pulling skids of bills over them (it was a phone-bill processing and print facility).

We would also go through 30 tons of paper per month (2x 1000 lb skids of paper per day). There was a lot of paper dust in the place, and the paper was perforated, but on the long side (not on the short side like PC printer paper) because it would tear apart if it was run through on the short side. There were tractor holes too, but they weren't perforated; rotary cutters would cut off the tractor holes. The paper went through some of the equipment at about 60 miles per hour. The printers were, in general, slower (IBM 3900 laser printers), as they could only print 229 pages per minute. A 2200-sheet, 35-pound box of paper would go from full to empty in about 9 1/2 minutes. Fire prevention was Halon. We were told that if the Halon went off, you probably wouldn't die from the Halon snuffing you out, but rather from the floor tiles flying up and severing body parts (they were about 2 1/2 feet square, made of aluminum about 1 inch thick, but only about 10 pounds each).

I worked in another data center that had no windows. If the power went off (and it did once, but not when I was on shift), everything went black. No emergency backup lights. The room was about 80 feet wide and at least 150 feet long, with racks and servers galore (2 operators, more than 300 machines), including DEC Alpha boxen, HP HP-UX boxen, PCs, network archive servers, etc. Good luck feeling your way out of that one. While the company was very picky about losing data and running jobs at night, their main interest was making money, and if that involved cutting a power line (tech cable) to put in a road to move product temporarily, they'd do it.

In general, data centers are built to house computers. Operators are a second thought. If there is a problem, bosses yell at operators: Is it up yet? How about now? When? And if bosses come in with guests for a dog and pony show, operators are chattel (it would be good if you went away somewhere). If there is a problem... what's the problem, what did you do?

Re:Where I worked (3, Insightful)

Mr. Freeman (933986) | more than 3 years ago | (#35365024)

"No emergency backup lights."
You're aware this is illegal, yes? "My boss is cheap and doesn't care" isn't an excuse. Call the fire marshal and tell them about it. They'll come down and write the owner up a ticket and force him to install the safety equipment.

It always surprises me how many idiots have the motivation and intelligence to bitch about unsafe working conditions on the internet, but not to the fire marshal or OSHA.

Re:Where I worked (1)

Anonymous Coward | more than 3 years ago | (#35365986)

True, however, most people don't want to get fired and blacklisted, perhaps arrested when some "evidence" is found in a mailbox.

It is the same reason why, even though forklift operators have the authority to decommission a forklift because it has no brakes, they never do so -- because they know they will be tossed out and someone who doesn't complain will be put in.

Oh, try to sue for retaliation? Good luck. Not in this economy, nor in this political climate, which is extremely worker-hostile.

Nope (2)

rsilvergun (571051) | more than 3 years ago | (#35363530)

Not the future. Didn't you get the memo? Capitalism says the cheapest is best, and amenities cost money. You might see some for the visitors (gotta keep the client happy), but you think they'll pay to keep the place cool for the admins? Not a chance.

Re:Nope (1)

Damek (515688) | more than 3 years ago | (#35364928)

Not until admins unionize, anyway. Which, luckily, they won't, because they're all libertarians and don't value anything more than money.

Not using rent-a-cop security is good! (1)

Joe The Dragon (967727) | more than 3 years ago | (#35363570)

Not using rent-a-cop security is good!

Re:Not using rent-a-cop security is good! (1)

PTBarnum (233319) | more than 3 years ago | (#35363794)

What kind of security do you prefer?

These things (1)

Hognoxious (631665) | more than 3 years ago | (#35363604)

If you have some things that need certain conditions, and other things that need different conditions, then you have a problem.

Fortunately, I have a solution. It's called putting a wall between them, you fucktards.

Of course not (2)

Dagum (26380) | more than 3 years ago | (#35363628)

Only in those situations where it makes for additional income.

I wonder where the author of this post is from... (0)

Anonymous Coward | more than 3 years ago | (#35363724)

This concept has been a growing trend over the last 10 years; it's nothing new, which is why I question the author's sudden desire to post about this. It costs you in the bottom line: you pay more for your colo because of the wasted "office" space (which never gets used). I'm friends with some sysadmins who work at a large colo in Southern California with these facilities, and they say these premium offices almost never get used, and often end up being where they have their company parties, where they play big movies on the projection screen, all at the expense of their customers.

All I can say is, you idiots are paying for this crap. I personally have my servers in a colo with NO office overhead or anything silly like that, and I have happily had my servers there for over 6 years now. If this facility started this trend, I would move my servers immediately.

Way too darn cold (1)

MikeB0Lton (962403) | more than 3 years ago | (#35363772)

In my experience people keep the datacenter way too cold. If the equipment runs fine at 70F, then set the CRAC to that temperature. If the hot aisle is working right, everything will be cooled within specifications.

Re:Way too darn cold (1)

afidel (530433) | more than 3 years ago | (#35364242)

We have our setpoint at 72 and, even without hot-aisle containment (but with a proper hot/cold design), everything is fine. We only lose about 1.5% per year on HDDs and basically no other components in statistically significant numbers (maybe 3 PSUs in 5 years).

Re:Way too darn cold (1)

MikeB0Lton (962403) | more than 3 years ago | (#35364716)

I doubt most have hot aisle containment. Typically just spacing it out right, properly sizing the air conditioner, and installing rack blanking panels is sufficient. Hot spots (usually caused by packing in blades) can be cooled with rack level gear.

amenities (3, Funny)

fahrbot-bot (874524) | more than 3 years ago | (#35364018)

... amenities for staff and visitors ...

I want a pony.

Re:amenities (1)

syousef (465911) | more than 3 years ago | (#35366538)

... amenities for staff and visitors ...

I want a pony.

I want amenities for the staff. I hear machines don't work so well when you urinate on them.

Centers have had tech-friendly amenities for years (3, Informative)

OgGreeb (35588) | more than 3 years ago | (#35364044)

I've been renting facility space in a number of data centers over the last fifteen years, including Exodus (remember them?), IBM, and Equinix. In particular, Equinix facilities have provided meeting rooms, work areas, (seriously locked-down) access terminals, great bathrooms, and showers for visiting techs for at least the last 5-7 years. OK, the actual cage areas are pretty cold, but that's the nature of the beast -- I wouldn't want my equipment to overheat. Equinix also has tools you can check out if you forgot yours or were missing something critical, and racks of screws, bolts, optical wipes, common cable adapters, blue Cisco terminal cables... just in case. (Other than paying them for service, I'm not affiliated with Equinix and own no stock in them. But perhaps I should have bought some.)

I would always look forward to the free machine hot-chocolate when visiting for work assignments.

admiring the skillfulness of slashdot articles (2, Funny)

Anonymous Coward | more than 3 years ago | (#35364190)

This troll was good, though my favorites are more like "My boss asked me to spend $5 million upgrading the machine room but I've never done this before, so do you have any advice? Should I include comfy chairs?" or "I'm considering upgrading my skills, do you think it would be worth it to learn Javascript or should I just go to grad school at MIT?" Or sometimes, "I'm having a big fight with my boss, can you give me some evidence that Erlang is really the programming language of the future?" I love slashdot.

Heat and Noise (2)

sexconker (1179573) | more than 3 years ago | (#35364288)

Heat: Data centers should be cool. Everyone wants to do things as cheaply as possible, so they spot cool the racks instead of circulating the air and cooling the entire room. Nothing short of abandoning this practice will remove the "it sucks to be in here" factor. The problem isn't so much that it's 100 degrees, but that it's 100 degrees on one side of the rack and 40 degrees on the other. Spend a bit more on cooling costs and get that to 80/60 or even 90/50 and workers will be much less miserable (and equipment will fail less often!). It doesn't have to be a flat "RUN THE HVAC HARDER" solution either.

Run shit efficiently (spot cooling) 24/7.
Something breaks? Whatever alert triggered the SMS telling the on-call employee to come in and fix it can also trigger the HVAC to do a bit more work to make things more comfortable (see the sketch after this comment). Scheduled maintenance is even easier. When work is done, go back to miser mode.

As for noise, the answer is so fucking easy. Larger fans. It boggles my mind to see servers using 80mm, 60mm, and smaller fans for the various components.
Cut that shit out. Your 1U and 1/2U and Blade style shit might not have much leeway, but bigger, slower fans are certainly an option on everything 2U and up.

But as long as cost is king, nothing will change.
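A minimal sketch of the alert-driven comfort idea above, assuming hypothetical integration points into the monitoring system and the building HVAC (set_hot_aisle_setpoint and page_oncall are stand-ins for illustration, not real APIs):

    NORMAL_SETPOINT_F = 100   # run hot and efficient while the room is empty
    COMFORT_SETPOINT_F = 85   # cooler while a human is working inside

    def set_hot_aisle_setpoint(temp_f):
        print(f"BMS: hot-aisle setpoint -> {temp_f}F")   # stand-in for a real BMS call

    def page_oncall(message):
        print(f"SMS: {message}")                         # stand-in for a real pager call

    def handle_hardware_alert(alert):
        page_oncall(f"Come fix: {alert}")
        set_hot_aisle_setpoint(COMFORT_SETPOINT_F)       # pre-cool before the tech arrives

    def work_finished():
        set_hot_aisle_setpoint(NORMAL_SETPOINT_F)        # back to miser mode

    handle_hardware_alert("PSU failure, rack B7")
    work_finished()

The same hook works for scheduled maintenance: relax the setpoint when the calendar event starts and restore it when the ticket closes.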

Missing the point much? (1)

fuzzyfuzzyfungus (1223518) | more than 3 years ago | (#35364382)

If you have a datacenter large and serious enough that you've got a full hot/cold aisle setup, deafening fans, etc. rather than just a rack frame in a closet somewhere, people being in there is supposed to be unusual. Unless a piece of hardware is being swapped out, as fast as your screwdrivers will carry you, or somebody fucked up in a network-unrecoverable way, why are there humans inside at all?

Re:Missing the point much? (0)

PPH (736903) | more than 3 years ago | (#35364516)

why are there humans inside at all?

Alt-Ctrl-Del?

Re:Missing the point much? (1)

dzr0001 (1053034) | more than 3 years ago | (#35365092)

Again, what the parent said. If you have a datacenter that serious you do not need people to three-finger-salute your boxes. Remote management was invented for a reason. Granted, people still have to go in there to rack, cable, and kick boxes with misconfigured IPMI or the like. That doesn't mean you design the DC to be comfortable enough to office out of.

Nothing new (0)

Anonymous Coward | more than 3 years ago | (#35364422)

I work in a data center built in 1999 and it has rooms for visitors and conference rooms, as well as "cube land" for staff.

Hot! (1)

bloobamator (939353) | more than 3 years ago | (#35364466)

The only thing hot in the many data centers I've toured was the saleswomen. Every single one of them: young, female, and HOT! No exceptions.

Re:Hot! (1)

Gofyerself (1709970) | more than 3 years ago | (#35365530)

Amen

Does cost matter? (1)

sdguero (1112795) | more than 3 years ago | (#35364806)

If it does, then the answer is a resounding no. NO. NO!!! NoooOOOOOOooooOOOOOOooooOOOOOOooooo!!!!!!!!!!!!!!!!!!!!!!!

You are missing the whole point!!! (1)

oktokie (459163) | more than 3 years ago | (#35364824)

With everything moving to the universe of virtualization and heavy encryption technology, the location and security of the physical machines become non-issues. There is no need to own the hardware in the future unless you have specific needs, and if you are the type that has special needs, then you can't risk not having an in-house data center, where you probably have the $$$ and the thing is high-profile anyway. Um... Google and Apple, don't they have their own data centers? Enough said... move along, as there is nothing to be discussed.

the bandwidth is just not there for that right now (1)

Joe The Dragon (967727) | more than 3 years ago | (#35365578)

The bandwidth is just not there for that right now. Do you want to pay the costs to get a FiOS-like network into the areas that are still just on ADSL?

I'm moving toward human-free data centers..... (3, Insightful)

Desmoden (221564) | more than 3 years ago | (#35365072)

It's not that I don't like humans; hell, I married one. However, humans are unpredictable. Applications want and need predictable hardware to live on. Even in a "CLOUD" with floating VMs that fly around like unicorns, you want stable, predictable hardware underneath.

Humans trip on things, excrete fluids and gases, need oxygen and light, are temperature-sensitive, and, depending on whose stats you believe, cause up to 70% of outages.

I see convergence, virtualization, etc. as a chance to finally get humans OUT of the data center. Build it out, cable everything, then seal it. Interaction does NOT require physical access. And a team of dedicated obsessive-compulsive robots or humans can replace memory, drives, etc.

Data Centers need to be human FREE zones. Not the common room in a dorm.

Peer1 is nice (0)

Anonymous Coward | more than 3 years ago | (#35365210)

My gear is at Peer1's new data center in Toronto. They have nice work areas outside of the pod to work on your gear, where it is quiet and you have plenty of space, etc. Plus they have a nice couch with a huge TV if you need a break. The staff have nice offices too. Although there are never any paper towels in the washroom.

http://www.peer1.com/hosting/toronto_data_center.php

No. (2)

lcllam (714572) | more than 3 years ago | (#35365374)

Data centers are utility rooms and serve a utility purpose. Aside from the showoff trips for the clients, they are probably factored as such and will be closer to a boiler room than an office. Ever see a nicely decked out boiler room?

Container-based data centers - zero amenities (1)

extcontact (2002990) | more than 3 years ago | (#35365624)

Gen 4 data centers: ITPACs - no HVAC, no aisles, you never touch a server. Automated provisioning and data transfer mean that when a server starts throwing too many errors, the VM gets shut down and moved and the server is shut down; when n servers are shut down, the container is shut down and replaced. No one looks at an error log to see what the problem is - no one cares - just send it back to the vendor for refurbishing.

http://blogs.technet.com/b/msdatacenters/archive/2011/02/01/datacenter-efficiency.aspx

http://www.microsoft.com/showcase/en/us/details/84f44749-1343-4467-8012-9c70ef77981c
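A minimal sketch of the hands-off replacement policy described in the comment above; the error threshold, the value of n, and the migrate/power-off hooks are assumed placeholders for illustration, not the actual implementation behind those links:

    ERROR_THRESHOLD = 50      # errors before a server is considered bad (assumed)
    DEAD_SERVER_LIMIT = 100   # dead servers before the container is swapped (the "n" above, assumed)

    def migrate_vms_off(server_id):
        print(f"live-migrating VMs off {server_id}")     # stand-in for the real orchestration call

    def power_off(server_id):
        print(f"powering off {server_id}")               # nobody reads the logs; just park it

    def schedule_container_replacement(container):
        print("container flagged for replacement and vendor refurb")

    class Container:
        def __init__(self, server_ids):
            self.errors = {s: 0 for s in server_ids}
            self.dead = set()

        def record_error(self, server_id):
            self.errors[server_id] += 1
            if self.errors[server_id] >= ERROR_THRESHOLD and server_id not in self.dead:
                self.retire_server(server_id)

        def retire_server(self, server_id):
            migrate_vms_off(server_id)
            power_off(server_id)
            self.dead.add(server_id)
            if len(self.dead) >= DEAD_SERVER_LIMIT:
                schedule_container_replacement(self)

    c = Container(["srv-01", "srv-02"])
    for _ in range(ERROR_THRESHOLD):
        c.record_error("srv-01")    # srv-01 gets evacuated and powered off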

Re:Container-based data centers - zero amenities (0)

Anonymous Coward | more than 3 years ago | (#35366064)

A large number of PCs may be MS's dream, but in reality, not all tasks can be done on scads of crap x86 hardware in a can.

One can take a zSeries mainframe, which takes up 1-2 mainframe racks, put it up against that container, and show that even though the IBM mainframe does consume HVAC energy, it actually uses far less in total due to virtualization, as opposed to a ton of x86 hardware. Duplication of machine images? A DS8xxx has built-in deduplication, so having a stack of VMs is easily doable. To boot, if a VM requires more combined CPU power than one of the x86 racks/blades can offer, it can easily get it. Yes, there are large I/O tasks that will make x86 machines cry for their mommies, because the I/O buses just aren't up to snuff.

x86 hardware is great, but you can only put so much lipstick on a pig -- it still is not up to snuff if you want to go past 2-3 9s in reliability, even with clusters.

You can keep the Microsoft clusters... I'll stick with the Sun and IBM solutions, which are designed from the ground up for reliability, not cheapness above all else.

Re:Container-based data centers - zero amenities (1)

extcontact (2002990) | more than 3 years ago | (#35366444)

Would love to see a head-to-head, apples-to-apples, VM-based set of benchmarks for a z10/DS8800 rig with full HVAC versus a container of HVAC-less x86 boxes on a transactions/$ and transactions/kWh basis. I doubt such a fair comparison exists from either side; however, I'd take the bet that the container would win in both dollars and kWh.
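For clarity, the two metrics proposed above are simple ratios; a minimal sketch of how each rig would be scored (PUE is used here to fold cooling overhead into the energy figure, and all inputs would have to come from real measurements, none of which are supplied):

    # Transactions per kWh and per dollar, with cooling overhead folded in via PUE
    # (power usage effectiveness); an HVAC-less container would have a PUE near 1.0.
    def transactions_per_kwh(transactions, it_energy_kwh, pue):
        return transactions / (it_energy_kwh * pue)

    def transactions_per_dollar(transactions, total_cost_dollars):
        # total cost should include hardware, licenses, power, and facility share
        return transactions / total_cost_dollars

Whichever rig posts the higher ratios on measured workloads wins the bet; without those measurements the comparison stays hypothetical, as the poster says.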

Windows Friendly (1)

lexcyber (133454) | more than 3 years ago | (#35366670)

Shouldn't it be Windows-friendly, since they have to be there the first Tuesday of the month?
