
Facebook Suffers Actual Cloud In Oregon Datacenter

timothy posted about a year ago | from the but-the-cloud-is-the-computer dept.

Cloud

An anonymous reader writes "The Register carries the funniest, most topical IT story of the year: 'Facebook's first data center ran into problems of a distinctly ironic nature when a literal cloud formed in the IT room and started to rain on servers. Though Facebook has previously hinted at this via references to a 'humidity event' within its first data center in Prineville, Oregon, the social network's infrastructure king Jay Parikh told The Reg on Thursday that, for a few minutes in Summer, 2011, Facebook's data center contained two clouds: one powered the social network, the other poured water on it.'"


83 comments

first post (-1, Offtopic)

Anonymous Coward | about a year ago | (#43950547)

Bing it on!

Re:first post (1)

bkcallahan (2515468) | about a year ago | (#43962841)

Yup, that's Oregon.

Obligatory (5, Insightful)

Anonymous Coward | about a year ago | (#43950553)

And nothing of value was lost.

Re:Obligatory (1)

K. S. Kyosuke (729550) | about a year ago | (#43951309)

I'm sure that won't prevent Facebook from putting the people responsible behind a grid.

Re:Obligatory (5, Funny)

Anonymous Coward | about a year ago | (#43951569)

And nothing of value was lost.

Various US intelligence agencies and their chums in other countries beg to differ.

Re:Obligatory (-1)

Anonymous Coward | about a year ago | (#43952327)

Various intelligence agencies are wrong, then. They're doing the equivalent of looking for murder plots on the Jerry Springer show. They're gonna find a lot of people spouting "I'll kill you, you cheatin' whore" and not a lot of actual, honest-to-god murderous intent. Will they solve some petty/personal crimes? Sure. But they're going to be entirely ineffective at doing their fricking job.

De-fund them all and let God sort it out.

Re:Obligatory (0)

geogob (569250) | about a year ago | (#43951729)

I'm pretty sure those servers and power supplies were valuable, regardless of the data they host (which is also of value to someone).

Re:Obligatory (1)

Coren22 (1625475) | about a year ago | (#43956983)

Some insurance company will pay for them I am sure...and some AC tech might lose his job...

Re:Obligatory (1)

Dr Floppy (898439) | about a year ago | (#43965723)

And nothing of value was lost.

Oh how absolutely true

Where are the Pics? (5, Insightful)

anthony_greer (2623521) | about a year ago | (#43950555)

I don't see any pics in the linked article. Someone has to have pictures of this if it happened...

Re:Where are the Pics? (3, Informative)

Wild Wizard (309461) | about a year ago | (#43950609)

This is one of those RTFA-to-get-to-the-RA type stories.

The link next to the quote is the one you want:
http://www.opencompute.org/2011/11/17/learning-lessons-at-the-prineville-data-center/ [opencompute.org]

Re:Where are the Pics? (5, Insightful)

anthony_greer (2623521) | about a year ago | (#43950625)

Saw that, and I think the issue is that the sudden humidity change caused condensation; not terribly uncommon if prompt action isn't taken upon AC failure in a humid climate. I don't see a "cloud in the room".

Hype to sell newspapers... and link bait...

Re:Where are the Pics? (0)

Anonymous Coward | about a year ago | (#43950689)

666 failure? Are you sure you're looking in the right direction?

Re:Where are the Pics? (1)

Anonymous Coward | about a year ago | (#43950873)

Prineville is high desert and is very dry.

Re:Where are the Pics? (1)

sjames (1099) | about a year ago | (#43954325)

Given that it was a condensing environment to the point that even the power supplies (typically hotter than the incoming air) got wet enough to short out, yes a cloud formed in the data center. There would have to have been actual droplets of water in the air.

Re:Where are the Pics? (0)

Anonymous Coward | about a year ago | (#43954555)

Meanwhile back in reality (and the source article if you read it) you'll find no reference to a physical cloud forming. What happened was the same thing as a cold glass on a hot humid day. The water vapor (not droplets) condensed against cooler surfaces and PSUs are pretty cool components except where the various power transistors are located. In the photos you can clearly see condensation forming on the chassis, capacitors, and inductors, components which don't produce meaningful, if any, heat.

Re:Where are the Pics? (1)

sjames (1099) | about a year ago | (#43956105)

Actually, those pics look a LOT like what things look like when you're actually inside a cloud. I'm guessing you haven't ever been.

Re:Where are the Pics? (2)

aXis100 (690904) | about a year ago | (#43956747)

That is one of the funniest things I have read for ages!

Power supplies are dealing with 700+ watts at maybe 90% efficiency, and you describe them as "pretty cool"? Bollocks! Capacitors and inductors have current flowing through them and equivalent series resistance, which certainly generates heat. They *should* be hotter than ambient, which is the most important factor in condensation.

If I saw condensation on powered electronics components I would be looking for the ladder to climb out of the pool.
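
For a rough sense of the numbers above, a back-of-the-envelope Python sketch (the 700 W / 90% figures are the parent's; everything else is illustrative):

    # Waste heat dissipated inside a PSU delivering p_out_w watts at a given efficiency.
    # Input power = p_out_w / efficiency, and the difference ends up as heat in the supply.
    def psu_waste_heat_w(p_out_w, efficiency):
        return p_out_w * (1.0 / efficiency - 1.0)

    print(psu_waste_heat_w(700, 0.90))  # roughly 78 W of heat inside the PSU itself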

Re:Where are the Pics? (0)

Anonymous Coward | about a year ago | (#43957377)

Someone call the doctor! Gramps woke up and is talking! It's a miracle!

Re:Where are the Pics? (0, Insightful)

Anonymous Coward | about a year ago | (#43950685)

Pics or GTFO

Re:Where are the Pics? (1)

Anonymous Coward | about a year ago | (#43951487)

there are - those photos are just private

Re:Where are the Pics? (1)

antdude (79039) | about a year ago | (#43956967)

Or better, videos. :)

Obligatory (5, Funny)

identity0 (77976) | about a year ago | (#43950585)

Welcome to Oregon, it rains a lot.

Re:Obligatory (4, Informative)

Ol Biscuitbarrel (1859702) | about a year ago | (#43950619)

Sure, if you think a 10.4 inch [usclimatedata.com] yearly average is a lot. The east side of the state is actually quite arid; the west side is quite soggy in the Coast Range and along the coast, but the Willamette Valley, where most of the population lives, isn't exceptionally rainy; it's just subject to never-ending spells of overcast weather. Other parts of the country actually have higher annual precipitation.

Re:Obligatory (0)

Anonymous Coward | about a year ago | (#43950649)

I'll still take Oregon over DC any day of the week...and yes, I am a native of DC.

Re:Obligatory (1)

Runaway1956 (1322357) | about a year ago | (#43953043)

"and yes, I am a native of DC."

I've never met anyone who was willing to admit that! Oh, wait - Anonymous Coward? I still haven't met anyone who is willing to admit that he is a native of the District of Columbia.

Re:Obligatory (0)

Anonymous Coward | about a year ago | (#43953325)

But that's PERFECT!!!!!! An AC in DC!!!! He's clearly either highly charged up about this, or he's on the Highway to Hell. Pick one.

Re:Obligatory (2, Interesting)

donaldm (919619) | about a year ago | (#43950819)

Welcome to Oregon, it rains a lot.

From the Article

This resulted in cold aisle supply temperature exceeding 80F and relative humidity exceeding 95%. The Open Compute servers that are deployed within the data center reacted to these extreme changes. Numerous servers were rebooted and a few were automatically shut down due to power supply unit failure.

WTF? 80 deg F (approx. 27 deg C) is quite warm in a data centre, especially in a "cold aisle", and 95% humidity is criminal.

Facebook learned from the mistakes, and now designs its servers with a seal around their power supply, or as Parikh calls it, "a rubber raincoat."

When designing a data centre you have to plan for a temperature range in which the equipment inside operates optimally. In addition you have to keep the humidity within the manufacturer's recommended limits: too low and you get static electricity; too high and you get condensation on the electrical equipment. Rubber seals may protect the power supplies, although I don't think they will protect them completely, but what about the rest of the equipment, such as the electronics and connectors, and what about backup systems, which are very susceptible to temperature and humidity, just to name a few?

My score for this design is zero out of ten and ten out of ten for LOL, welcome to cloud computing :)
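
To put a number on why 80F (about 27C) at 95% RH is so dangerous, a rough Python sketch of the dew point using the Magnus approximation (the constants are the usual textbook ones; the temperature and humidity are from the quoted report, everything else is illustrative):

    import math

    # Magnus approximation for dew point in deg C; plenty accurate for a sanity check.
    def dew_point_c(temp_c, rh_percent):
        a, b = 17.62, 243.12
        gamma = math.log(rh_percent / 100.0) + a * temp_c / (b + temp_c)
        return b * gamma / (a - gamma)

    # Cold-aisle conditions from the incident report: ~27 C (80 F) at 95% RH.
    print(dew_point_c(27.0, 95.0))  # ~26 C: any surface even a degree cooler than the air gets wet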

Re:Obligatory (1)

myowntrueself (607117) | about a year ago | (#43951271)

Do you actually know how the Facebook Oregon datacenter works?

It doesn't *have* HVAC. The building *is* HVAC: The entire building is one very very large HVAC unit.

HVAC units have 'efficiency of scale'; as you build them larger they get more efficient. That's what Facebook was aiming at. Unfortunately it's a relatively new development in building and datacenter design and clearly has some bugs to work out!

Re:Obligatory (3, Informative)

Anonymous Coward | about a year ago | (#43951379)

WTF 80 deg F (approx 27 deg C) is quite warm in a Data-centre especially in a "cold aisle" and 95% humidity is criminal.

You're used to classic datacentres, where the goal was "shove as much cold air into them as possible", i.e. "the lower the temperature the better". It all depends on how the datacentre was built, how its cooling system is/was engineered, and an almost indefinite number of variables. References for you to read (not skim) -- the study in the PDF will probably interest you the most:

http://www.datacenterknowledge.com/archives/2011/03/10/energy-efficiency-guide-data-center-temperature/ [datacenterknowledge.com]
http://www.geek.com/chips/googles-most-efficient-data-center-runs-at-95-degrees-1478473/ [geek.com]
http://blog.schneider-electric.com/datacenter/2013/05/06/getting-comfortable-with-elevated-data-center-temperatures/ [schneider-electric.com]
http://www.cs.toronto.edu/~nosayba/temperature_cam.pdf [toronto.edu] (PDF)
http://www.dummies.com/how-to/content/data-center-temperature-and-humidity-range-recomme.html [dummies.com]

TL;DR -- 80F is not "quite warm" for a datacentre designed/built within the past 10-11 years.

Re: Obligatory (1)

DigiShaman (671371) | about a year ago | (#43953211)

80F does cause the server fans to work harder via higher RPMs though. The higher your temps, the less margin of error you have to make corrections in a DC.

Re: Obligatory (0)

Anonymous Coward | about a year ago | (#43954011)

80F does cause the server fans to work harder via higher RPMs though. The higher your temps, the less margin of error you have to make corrections in a DC.

(Same AC person here...)

Your logic is correct, but you're making some very erroneous assumptions about server-grade hardware and do not appear to have a good grasp of hardware monitoring. What you say might be true for desktop systems, but is often not for server systems. I'll explain:

The types of fans used in actual server-grade hardware do not have on-die thermistors for temperature detection (meaning they will not throttle the incoming voltage used to drive the motor themselves). Most chassis fans are driven either off of a 4-pin Molex that goes right to the PSU (and many of these fans are rated at 10,000 rpm or 15,000 rpm), or off of a 3- or 4-pin header on the mainboard linked to a Super I/O chip that also offers H/W monitoring capability (Nuvoton/Winbond, Maxim LM75, ITE IT8712F, ADM1028, etc.), which is nice since then you can actually monitor the fans through software, usually via SMBus but sometimes through classic LPC (classic ISA ports via x86 inb/outb instructions).

In the case of the latter, the Super I/O chip is configurable in many different ways. For servers, the stock defaults in the BIOS are always "Full speed", meaning chassis temperature has no bearing on fan voltage/RPMs. If the BIOS (which configures the Super I/O chip pre-POST) is set to something that causes the chip to factor chassis temperature into its fan-control formula, then the server administrator needs to have his/her ass kicked. The only time you change from "Full speed" to something else is if your server is in an environment where noise is a concern (i.e. a 2-post telco rack sitting next to your desk). We're talking about actual enterprise-grade datacenters here, so that shouldn't be the case. The fans stay at a constant RPM regardless of workload.

This same advice applies to CPU cooling fans as well, and in fact some thinner chassis (1U) systems lack a CPU fan entirely, instead relying on nothing but a large "notched" or "sliced" copper or aluminium heatsink (with several 15,000rpm fans blowing air through/across it).

How do I know all this to be true? Because I write open-source software that monitors and manages the H/W monitoring capabilities of a large number of Nuvoton/Winbond Super I/O chips, specifically for server-class motherboards, and I am also a system administrator (23 years since I started learning the ropes, 18 years in the field professionally) who worked in datacenters for about 8 of those years.
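
As a concrete illustration of the monitoring side described above: on a Linux host those Super I/O sensors usually end up behind the kernel's hwmon sysfs interface, and a minimal sketch for reading fan tachometers looks something like this (device names and fan counts vary per board and driver, so treat the paths and chip names as assumptions, not anything specific to the poster's software):

    import glob, os

    # Enumerate every registered hwmon device and print its fan tachometer readings.
    # Fans appear as fanN_input (RPM) once a Super I/O or BMC driver is bound.
    for dev in sorted(glob.glob("/sys/class/hwmon/hwmon*")):
        with open(os.path.join(dev, "name")) as f:
            chip = f.read().strip()          # e.g. "nct6775", "w83627ehf", "coretemp"
        for fan in sorted(glob.glob(os.path.join(dev, "fan*_input"))):
            with open(fan) as f:
                rpm = int(f.read().strip())
            print(chip, os.path.basename(fan), rpm, "RPM")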

Re: Obligatory (1)

sjames (1099) | about a year ago | (#43954497)

Most of the servers I have seen default to throttling the fans based on temperature. That includes 1U systems where the fans are controlled by CPU temperature. Perhaps some enterprises run the fans at full speed, but others *INSIST* on throttling in the datacenter. I prefer to run them at full throttle, but I do acknowledge that in an imperfect world, after a few years there WILL be a few dead fans being actively ignored.

In a cluster, you can actually hear it when a job is submitted, it sounds a bit like a jet plane prepping for takeoff.

80F is a bit warm, 75-78 is more reasonable. The old school likes 65-70F.

As for an appeal to authority, I have written the software that configures those Super I/O chips (right after getting the main memory and the busses running).

Re: Obligatory (1)

OdinOdin_ (266277) | about a year ago | (#43958249)

I concur...

Re GPP: "Nuvoton/Winbond Super I/O chips": these sound like cheap chips; I don't think many of my servers use these. The servers under high temperature and high CPU load do indeed have sensors to throttle/control the fans to increase/decrease airflow, and they also have a fail-safe mechanism: if the sensor breaks, the fans run at full speed. On top of all this, the server unit has a motherboard-based thermal shutdown that turns the whole server off if the ambient temperature is over some limit.

I have had this happen before in just such a scenario, where the D/C aircon failed and the redundant capacity was under-rated for the area being cooled. The room temperature rose to the point that it exceeded the equipment's ambient-temperature set-points and other limits, triggering shutdowns to protect the hardware.

Re: Obligatory (1)

DigiShaman (671371) | about a year ago | (#43960711)

You're right about server-grade hardware being more robust and intelligently managed than a standard desktop. However, in this context, I was speaking of servers. I'm not sure what kind of units you're talking about, but I work exclusively with Dell PowerEdge units. Take a PE R710 (1U) for example. Its maximum warning threshold is set at 42C (108F), with a failure threshold of 47C (117F). The RPMs will throttle on their own (factory default) based on ambient temperature per sensor readings on the IPMI bus. Though it should be noted that excessively high temps will shorten the lives of HDDs and in some cases cause a premature SMART failure of the drives. And we haven't even begun looking into the theoretical reduction in MTBF. Perhaps for someone as large as Google or Amazon, the savings in energy are so great that they offset the cost of swapping hardware often. I wouldn't know. But for a small to medium business that requires its own off-site servers, the risk is not acceptable to me.
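
For reference, IPMI readings like the ones described above can be pulled from the OS with ipmitool; a minimal sketch (assuming ipmitool is installed and a local BMC is reachable, and noting that sensor names differ between PowerEdge generations, so the "Temp"/"Fan" filter is only illustrative):

    import subprocess

    # "ipmitool sensor" prints pipe-separated rows: name | reading | units | status | thresholds...
    out = subprocess.run(["ipmitool", "sensor"],
                         capture_output=True, text=True, check=True).stdout

    for line in out.splitlines():
        cols = [c.strip() for c in line.split("|")]
        if len(cols) >= 3 and ("Temp" in cols[0] or "Fan" in cols[0]):
            print(f"{cols[0]}: {cols[1]} {cols[2]}")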

Re: facebook learned fom its mistakes... (0)

Anonymous Coward | about a year ago | (#43951945)

no, it didn't.

Re:Obligatory (1)

faedle (114018) | about a year ago | (#43955511)

Only on the western third. Prineville is, in fact, on the northwestern edge of the Great Basin Desert, and in fact gets about 11 inches a year of rainfall.

By comparison, Phoenix, Arizona, gets 9.

Re:Obligatory (0)

Anonymous Coward | about a year ago | (#43958705)

Not in Prineville.

Lightning in that particular server room (1)

Anonymous Coward | about a year ago | (#43950627)

as well and I'll start believing in a just $deity

Re:Lightning in that particular server room (1)

maxwell demon (590494) | about a year ago | (#43951329)

There were some power units failing. I can imagine that this happened with sparks, which are a sort of small lightning.

Maybe Berndnaut Smilde snuck in: (5, Interesting)

Tablizer (95088) | about a year ago | (#43950715)

Humour (-1)

Anonymous Coward | about a year ago | (#43950731)

That was the funniest? Goodness, I must avoid that website.

www.AddaBazz.com (-1)

Anonymous Coward | about a year ago | (#43950739)

www.AddaBazz.com It is New Social Site

Tech Support (1)

Tablizer (95088) | about a year ago | (#43950745)

Caller: "There's a cloud in our cloud, come immediately!"

Support: "Speak loud, I cannot hear you."

Caller: "No, it's cloudy in our cloud."

Support: "Yes, speak loud."

Caller: "Yes, there is a big cloud."

Support: "Yes, you must speak loud, that's what I said."

Caller: "You must have a cloud also, nothing's making sense. Let's try this: bring some sun."

Support: "Come soon?"

Caller: "Yeah, that too. Come soon with sun."

Support: "I can't hear you, my connection is cloudy."

Long story in short ... (0)

Anonymous Coward | about a year ago | (#43950807)

Traditional air conditioners in data centres regulate temperature and humidity. Energy efficient air conditioners used in trendy data centres take care of the temperature, but ignore the humidity. This created condensation. The condensation was bad for operating electronics. Things went wrong.

Moral: hire more competent engineers, preferably ones who understand physics.

Re:Long story in short ... (2)

Mr. Freeman (933986) | about a year ago | (#43956663)

No, you are (mostly) wrong.

What happened was that the system malfunctioned which led to hot and humid air being circulated throughout the system. This normally would not cause condensation. However, all of the equipment was previously cold (because the system was working normally before it failed). The hot and humid air came into contact with cold components (various components in the power supply and computer casing). This caused condensation (because the hot and humid air contacted the cold components, cooled down, and had to ditch some of its water due to thermodynamics). The condensation on components of the power supply was then blown into various PCB components when the maintenance staff increased the airflow in the datacenter in an attempt to bring the temperature down. The condensation hitting the PCB caused what you would expect when dumping water onto electronics: stuff shorted out and failed.

You're correct insofar as humidity was not properly controlled, but this was hardly a case of idiots at the controls or at the drawing board. This was an unanticipated failure mode, triggered by a faulty control sequence.

TFA has an even better one (0)

Anonymous Coward | about a year ago | (#43950827)

"a few were automatically shut down due to power supply unit failure."

I'm not a modern IT guy -- possibly there's actually something that can be described that way? Till then, that's easily the best PR facepalm I've heard this week.

Streamed Hams (2, Funny)

Anonymous Coward | about a year ago | (#43950971)

Superintendent Chalmers: A rain storm? At this time of year? At this time of day? In this part of the country? Localized entirely within your datacentre?

Re:Streamed Hams (1)

crutchy (1949900) | about a year ago | (#43950987)

it's supernintendo chalmers... you insensitive clod!

Re: Streamed Hams (1)

Colourspace (563895) | about a year ago | (#43951197)

Can I see it? .... no. One of my favourite Simpsons sketches ever..

happened before (2)

crutchy (1949900) | about a year ago | (#43950983)

in NASA's Vehicle Assembly Building

Sorry for this but (0)

Anonymous Coward | about a year ago | (#43951071)

Yo dawg, we heard you like clouds, so we put a cloud in your cloud.

There are at least 3 clouds then. (3, Interesting)

VortexCortex (1117377) | about a year ago | (#43951099)

The first cloud would be the humidity-and-condensation sort. The second cloud would be the online service itself. The third cloud would be the open Internet between the endpoints in a network graph. [infront.com]

What do all these clouds have in common? They're dangerous. The fewer clouds in your diagram, the more you know about your network architecture, latency, and data integrity. The fewer clouds the better! When a packet goes into the shroud of the cloud in the diagram, there's a much higher chance we'll never see it again. This cloud is the one where we must encrypt our data and protect against spoofing, hacking, and all forms of data manipulation and latency. The receiving end must be very careful to sanitize the inputs and verify the requests vigorously, all because the packet has encountered the cloud. Likewise, if we want to interact with an online "cloud" service, it's no longer just a packet but "our stuff": our login credentials and even bank account info. We have to worry about availability and bandwidth caps when streaming, and about unwanted prying eyes from folks we may not want looking; everything becomes far more risky because our stuff touched the cloud service, far riskier than physically going to the bank or visiting a friend in person. If someone hacks an ATM, the entire bank doesn't lose everyone's credentials. As for the mist-filled variety of cloud: it can not only get you wet, but, if you have a big enough cloud, it can strike you with lightning. We must have surge protection and battery backups against this cloud too.

When I hear people talking about embracing the "cloud" I cringe. "To The Cloud!", in my mind means, "Danger Will Robinson!"

As somebody working on building energy topics (1)

drolli (522659) | about a year ago | (#43951247)

That this happens shows me that they really optimize their air conditioning for energy consumption.

Traditionally the approach would have been: "Don't think; cool down and re-heat the air constantly to dehumidify it sufficiently." So traditionally you do this dumbly, with a lot of energy, even if it's not needed at all times. What we probably see here is that some control could not (predict or) handle some drop in the inner load (electrical power) in the data center.
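
A toy Python sketch of that traditional cool-then-reheat dehumidification, using a Magnus-style saturation pressure approximation (all numbers are illustrative and say nothing about Facebook's actual plant):

    import math

    def sat_vapour_pressure_hpa(t_c):
        # Magnus-type approximation of saturation vapour pressure over water.
        return 6.112 * math.exp(17.62 * t_c / (243.12 + t_c))

    def cool_and_reheat_rh(t_in_c, rh_in, coil_c, t_out_c):
        # Air cooled to the coil temperature can hold at most the saturation pressure
        # at that temperature; the excess condenses on the coil. Reheating the now
        # drier air then lowers its relative humidity.
        e = min(rh_in / 100.0 * sat_vapour_pressure_hpa(t_in_c),
                sat_vapour_pressure_hpa(coil_c))
        return 100.0 * e / sat_vapour_pressure_hpa(t_out_c)

    # 27 C air at 95% RH, dried over a 13 C coil, reheated back to 27 C:
    print(cool_and_reheat_rh(27.0, 95.0, 13.0, 27.0))  # ~42% RH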

Re:As somebody working on building energy topics (3, Interesting)

myowntrueself (607117) | about a year ago | (#43951275)

That this happens shows me that they really optimize their air conditioning for energy consumption.

Traditionally the approach would have been: "Don't think; cool down and re-heat the air constantly to dehumidify it sufficiently." So traditionally you do this dumbly, with a lot of energy, even if it's not needed at all times. What we probably see here is that some control could not (predict or) handle some drop in the inner load (electrical power) in the data center.

The Facebook Oregon datacenter doesn't 'have' air conditioning.

The building is an 'air conditioner'. It's an experimental design...

Re:As somebody working on building energy topics (1)

drolli (522659) | about a year ago | (#43951553)

Right. I said "air conditioning"; I did not say "air conditioner".

Re:As somebody working on building energy topics (1)

faedle (114018) | about a year ago | (#43952655)

I work for another data center operator in Central Oregon. We also use ambient air cooling. I don't know if we use the same system Facebook does, to be honest.

Central Oregon is the northwestern edge of the Great Basin desert. Summers here are bone dry. Our data center gets so dry we actually have the opposite problem: it gets TOO dry.

Re:As somebody working on building energy topics (1)

Runaway1956 (1322357) | about a year ago | (#43953141)

Alright - what happens in a data center when it gets "TOO dry"? I would assume that people entering the building would become static electricity hazards. It would become essential that anyone handling or touching equipment must use a grounding strap. Anything else?

Re:As somebody working on building energy topics (1)

jones_supa (887896) | about a year ago | (#43953627)

Would hard drives work in very low humidity? Rubber mounting components (if any) in the servers might start crumbling down.

Re:As somebody working on building energy topics (2)

faedle (114018) | about a year ago | (#43955531)

Not just the people, but yes, static electricity is the primary concern. Also, I'm told by the people that manage these sorts of things that a "too dry" environment also makes air cooling less effective. Something to do with the fact that a little bit of moisture actually allows the air to carry more heat than if it was 100% dry.

Re:As somebody working on building energy topics (1)

Runaway1956 (1322357) | about a year ago | (#43957107)

Ahhh - never thought of that. Makes sense to anyone who has ever had cold survival training. A humid atmosphere leeches warmth from a human body much faster than a dry atmosphere, all other things being equal.

Re:As somebody working on building energy topics (0)

Anonymous Coward | about a year ago | (#43957805)

That this happens shows me that they really optimize their air conditioning for energy consumption.

Traditionally the approach would have been: "Don't think; cool down and re-heat the air constantly to dehumidify it sufficiently." So traditionally you do this dumbly, with a lot of energy, even if it's not needed at all times. What we probably see here is that some control could not (predict or) handle some drop in the inner load (electrical power) in the data center.

The Facebook Oregon datacenter doesn't 'have' air conditioning.

The building is an 'air conditioner'. It's an experimental design...

The 'building' is an air conditioner.

Fixed that for you.

Re:As somebody working on building energy topics (1)

jbengt (874751) | about a year ago | (#43951749)

Traditionally the approach would have been: "Don't think; cool down and re-heat the air constantly to dehumidify it sufficiently"

Actually, traditionally, the cooling and the reheat would each be cycled/modulated by the thermostat/humidistat, not run constantly.
The mistake in this case was not accounting for changes in temperature and humidity, including the fact that the dew point temperature of the air can change much more rapidly than the temperature of the solid objects in the room. It really was a boneheaded mistake, one that is hard to understand given the unusual nature of the HVAC for this facility; you would think that a lot of careful thought about the controls would have taken place during the design of something out of the ordinary.
I have designed around similar potential problems on much more mundane designs before.
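
A quick sketch of that lag: the air's dew point can jump within minutes while the chassis metal warms over tens of minutes, and the gap between the two is the condensation window (the first-order lag model and every number here are made-up illustrations, not incident data):

    import math

    AIR_C, DEW_C = 27.0, 26.0        # hot, humid air suddenly being recirculated
    SURFACE_START_C = 18.0           # metal still at the old cold-aisle temperature
    TAU_MIN = 30.0                   # assumed thermal time constant of the chassis

    for minute in range(0, 61, 5):
        # First-order lag: the surface temperature creeps toward the new air temperature.
        surface = AIR_C + (SURFACE_START_C - AIR_C) * math.exp(-minute / TAU_MIN)
        print(f"t={minute:2d} min  surface={surface:5.1f} C  condensing={surface < DEW_C}")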

Heat exchange (1)

Reliable Windmill (2932227) | about a year ago | (#43951377)

I'm thinking there has to have been great heat exchange in a system like this.

I severely doubt (1)

wisnoskij (1206448) | about a year ago | (#43951681)

That the roof was high enough to actually allow the formation of clouds. Also, I believe you need dust in the air to form clouds, and I would think that there would be a lot less than normal in a server room. Intense condensation on the roof causing rain is another thing.

Re:I severely doubt (0)

Anonymous Coward | about a year ago | (#43952161)

The only building I know of that's big enough to form clouds inside it is NASA's Vehicle Assembly Building.

-- hendrik

No, really, WTF? (2)

faedle (114018) | about a year ago | (#43952595)

Ok, I work for a data center operator. In Central Oregon.

Our data center is so damn dry that most of the time in the summer we're getting alerts about the humidity being too low. How did Facebook fuck this one up?

Re:No, really, WTF? (0)

Anonymous Coward | about a year ago | (#43967053)

Two words: "swamp coolers".

Phone Conversation (1)

FeatureSpace (1649197) | about a year ago | (#43952791)

Phone conversation between two data center techs:

Tech 1: "There's a cloud in the Facebook datacenter!"

Tech 2: "So? Facebook is built on cloud technology!"

Tech 1: "No I mean a real cloud!"

Tech 2: "Facebook is built on a server cloud architecture. It IS a real cloud you idiot!"

Tech 1: "There is a real cloud with real rain in the data center you geek retard! Its shutting down the servers!"

Tech 2: "Servers shutting down? Maybe the rainfall service is flooding the network with raindrop packets? That would be an emergency! Facebook's cloud is overloaded!"

Tech 1: "WTF are you talking about? There's real water in the servers. The water is causing electrical faults. Humidity is really high."

Tech 2: "There's a humidity problem? Probably the flood of raindrop packets interfered with the environmental control service.

Tech 1: "You might be on to something. Maybe that's what caused the cloud in the first place?"

Tech 2: "Caused the cloud? The environmental controls have nothing to do with the Facebook cloud!"

Tech 1: "You idiot! There is a real meteorological cloud in the data center, complete with real rain! Everything is getting wet! I'm not talking about the Facebook cloud!"

Tech 2: "So what's the problem? Just adjust the environmental controls and reduce the humidity. At least the Facebook's cloud and raindrop service are ok. You really had me worried about Facebook's cloud for a second."

Tech 1: "The entire data center is powering off you flaming moron!"

Tech 2: "Wow that's a real emergency! Facebook can't operate without its cloud!"

Tech 1: "Right. So back to the real cloud in the data center. I'm looking at it right now. Its unreal! You should see it!"

Tech 2: "Great! Now that the power is back I'll VPN in and check on the cloud. Are you seeing a lot of raindrop packets on the network?"

It's clouds (4, Funny)

rastos1 (601318) | about a year ago | (#43952905)

It's clouds all the way ... up?

I warned them ... (1)

PPH (736903) | about a year ago | (#43952907)

... they shouldn't have hired that Joe Btfsplk [wikipedia.org] guy for IT support.

Load balancing (4, Funny)

gmuslera (3436) | about a year ago | (#43953959)

Both clouds were leaking and pissing off users. Facebook must have real sysadmins [xkcd.com] .

Re:Load balancing (1)

ImprovOmega (744717) | about a year ago | (#43962727)

A better example [xkcd.com]

New entry to fortune database (0)

Anonymous Coward | about a year ago | (#43954235)

$fortune bofh-excuses
"Humidity event" in datacenter

You know your datacenter is too big... (1)

alanshot (541117) | about a year ago | (#43957505)

...When it develops its own atmospheric systems. (that include the water cycle)

Raining in Datacenters (0)

Anonymous Coward | about a year ago | (#43963525)

This actually happened to me while I was working in a containerized datacenter. It was -2 outside and probably around 45-55 degrees internally...

This actually happened to me around 1990 (1)

azav (469988) | about a year ago | (#43964223)

In the computer section of the UMass Dartmouth Library, back around 1990, we had a really humid day one spring/summer inside the library. Part of the library is open several stories up, with glass windows letting the light shine in. We actually had a few drops fall within the library that day. Luckily, no computers were harmed in this event. Super strange to be in the middle of one of these events, that's for sure.

may i call bullshit on this one? (0)

Anonymous Coward | about a year ago | (#43964261)

the condom on the power supply is also suspect

And data recovery was ..... (1)

dcpking (1463769) | about a year ago | (#43968483)

And I'm sure that they were happy to be able to ask their friends in the NSA for a backup copy of all their data for restoration :)