
The Data Dome: A Server Farm In a Geodesic Dome

samzenpus posted about a month ago | from the keeping-it-cool dept.

Data Storage

1sockchuck writes: In a unique approach to data center design, the new high-performance computing center in Oregon is housed in a geodesic dome. The new facility at the Oregon Health and Science University requires no mechanical air conditioning, using outside air to cool racks of servers reaching densities of 25 kW per cabinet. The design uses an aisle containment system to separate hot and cold air, and can recirculate server exhaust heat to adjust cold aisle temperatures in the winter. It's a very cool integration of many recent advances in data center design, combining elements of the Yahoo Chicken Coop and the server silo in Quebec. The school has posted a virtual tour that provides a deep technical dive.


Awesome (-1)

Anonymous Coward | about a month ago | (#47698255)

According to the ebola story, niggers should be able to invent the same things, right? What have they actually invented? The stick? Drinking corpse water?

Re:Awesome (-1)

Anonymous Coward | about a month ago | (#47698457)

What have they actually invented?

Loincloths to hide their big black dicks. Jealous much?

Re:Awesome (1)

maroberts (15852) | about a month ago | (#47702277)

Super Soaker water pistol (Lonnie G. Johnson)
Smoke protection hoods for firefighters & traffic signals (Garrett Morgan)
Electronic components and pacemaker controllers (Otis Boykin)
Carbon filament for lightbulbs (Lewis Latimer)
Various agricultural products (NOT peanut butter, though) (George Washington Carver)

If you're bored, read Wikipedia [wikipedia.org]

Moisture? (2)

Mr D from 63 (3395377) | about a month ago | (#47698341)

I wonder if they have any issues with moisture from constantly cycling in outside air? It's being heated, so I guess it won't condense, but it still seems like it could be a concern over the long term. Is the air filtered? Particulates would be another concern, or would they just do some sort of cleaning?

Re:Moisture? (4, Informative)

Dan East (318230) | about a month ago | (#47698367)

In the video the narrator specifically states that the incoming air is filtered.

Re:Moisture? (1)

JWSmythe (446288) | about a month ago | (#47698543)

Particle filtration does not mean it dehumidifies.

Re:Moisture? (3, Informative)

JWSmythe (446288) | about a month ago | (#47698507)

You can take a look at their official page. http://www.ohsu.edu/xd/about/initiatives/data-center-west.cfm [ohsu.edu]

The tour video and text mention plants outside for filtering. The video, around the 3-minute mark, shows additional filtering inside.

I suspect prevailing winds will really screw with the site cooling.

The "Virtual tour" has more details than the rest. Nothing about humidity.

Their security seems odd. They talk about the security being very strict. The video shows the inside of each "pod" to be open to the common hot air area in the upper part of the roof. So they have security, but you can get around it by not going through the doors. {sigh}

I never got the idea of sticking square boxes in a round hole. They're wasting a lot of good real estate by leaving all that extra space between the servers.

It seems like it was drawn up with an ideal world in mind, which usually doesn't translate well to the real world.

Wasting (1)

SuperKendall (25149) | about a month ago | (#47698669)

I never got the idea of sticking square boxes in a round hole. They're wasting a lot of good real estate by leaving all that extra space between the servers.

What you call "wasted space" I call "ventilation".

Also not factored in is how much space traditional HVAC equipment takes up in a normal data center.

Just the fact that this kind of building doesn't have the same power drain as HVAC-cooled facilities means you could have one in more places than a "normal" data center.

Re:Wasting (2)

JWSmythe (446288) | about a month ago | (#47698937)

As described, after looking at their materials, I don't see an advantage to the radial design over a grid design. There is nothing to that which would improve airflow, and it leaves huge underutilized areas.

On the other hand, a traditional grid design optimizes the space, and it would still allow for the same airflow.

It's not a matter of being round, or having dead space, it's simple things we teach children. Square boxes don't fit through round holes. Round objects don't stack optimally.

One of the Equinix datacenters in Los Angeles (previously Pihana Pacific) has all of its cooling on one side of the room, and returns on the other side. Each row is basically a wind tunnel. There is no appreciable temperature difference between the two sides. Both the front and back of the cabinets have the same airflow, and maintain roughly the same temperature.

As far as the total power load, they could keep the load the same, and have almost half of the building for just storage.

Of course, a square building, which the industry uses as a standard for this kind of work, would not make the news. No one would be talking about it.

I guess if they have money to burn and real estate to waste, it doesn't matter what shape they make it or how much space is underutilized.

Re:Wasting (1)

SuperKendall (25149) | about a month ago | (#47700267)

I guess if they have money to burn and real estate to waste

Domes are cheaper to build and use less material than a traditional building since you need no load-bearing walls (although the design they had was not a full dome).

You are simply handwaving away the significant energy savings this design brings to the table.

I've lived in a dome before; the round walls do not waste THAT much space. And you need room to move around equipment anyway.

Re:Wasting (1)

JWSmythe (446288) | about three weeks ago | (#47710081)

Did you look at their floorplan? There are huge wedge shaped gaps.

Or let's do math. For the sake of argument, let's say that the diagram in their virtual tour is to scale. We're also going to say that each rack is a standard 19" rack, taking up 22" each. That could be wrong, but it's what I'm using for measurement.

The entire circular structure has an area of 24,052 sq ft.
A square structure on the same property would be 30,625 sq ft.
The circular structure wastes 6,573 sq ft.

Each pod, with a 3' buffer on each end and a 3' buffer between rows, would have a footprint of 768.4 sq ft. Since I only included one aisle buffer on each (since they share a common aisle), add one more aisle at 102 sq ft.

The total datacenter rack space is really 3,944 sq ft.

In the difference between the round and square structure, you could put all the racks and aisles, and still have 26,681 sq ft.

Or about the size of two Olympic size swimming pools.

Or 0.017 LoC.

Or 53,362 bread boxes one layer deep.

Or you could tile the floor of the wasted space with approximately 106,724 AOL CDs, which coincidentally is about half of the total number of AOL CDs received in Centennial, Colorado in one bulk mailing. Unfortunately, it will be very ugly, because you're trying to tile a square floor with round objects which has lots of wasted space.

I could dazzle you with more numbers, but you've already started cursing me, and I really don't care.
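For anyone who wants to sanity-check the arithmetic, here is a minimal Python sketch that reproduces the figures above. The 175 ft footprint and the pod count of 5 are my assumptions, inferred from the quoted areas; the pod and aisle footprints are taken from the parent post.

    import math

    # Assumed footprint implied by the quoted figures: 175 ft across gives a
    # 175^2 = 30,625 sq ft square and a pi * 87.5^2 ~= 24,052 sq ft circle.
    SIDE_FT = 175.0

    circle_area = math.pi * (SIDE_FT / 2) ** 2   # ~24,052 sq ft
    square_area = SIDE_FT ** 2                   # 30,625 sq ft
    difference = square_area - circle_area       # ~6,573 sq ft "wasted"

    # Parent's figures for rack space (not re-derived here).
    pod_footprint = 768.4   # sq ft per pod, including 3 ft buffers
    extra_aisle = 102.0     # sq ft, shared-aisle correction
    pods = 5                # assumption: the count that reproduces 3,944 sq ft
    rack_space = pods * pod_footprint + extra_aisle       # ~3,944 sq ft

    leftover_in_square = square_area - rack_space         # ~26,681 sq ft

    print(f"circle {circle_area:,.0f} / square {square_area:,.0f} sq ft")
    print(f"difference {difference:,.0f}, racks {rack_space:,.0f} sq ft")
    print(f"square building minus racks: {leftover_in_square:,.0f} sq ft")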

Re:Moisture? (0)

Anonymous Coward | about a month ago | (#47702027)

It seems like it was drawn up with an ideal world in mind, which usually doesn't translate well to the real world.

You think?

"and can recirculate server exhaust heat to adjust cold aisle temperatures in the winter."

That sentence alone from the summary tells me they know nothing about computer thermals. You don't ever pump hot air back into the cold aisle, EVER! If the cold aisle drops below 68 F, that's a good thing, not a bad thing. You get more cooling for less cost. Last time I checked, unless we're talking about temperatures below 40 F, there's nothing wrong with the cold aisles getting colder than human comfort levels. With 25 kW (17 hair dryers on full blast in a phone-booth-sized space) I don't think they have to worry about below-40 F temps. Relative humidity is a major issue, but it sounds like they have that covered, although given the hot-air-in-the-cold-aisle thing I'd be skeptical that they did that correctly too!

Re:Moisture? (1)

jwdb (526327) | about a month ago | (#47702071)

I don't think they have to worry about below 40 F temps

Don't know why on earth you'd think that, considering that the average outdoor temperature in Portland in the winter is around 40...

Re:Moisture? (1)

turning in circles (2882659) | about a month ago | (#47700751)

Beaverton, Oregon, is near Portland, and it can get very humid there, so the additional humidity inherent in evaporative cooling could be a problem. Of course, since there is no air conditioning, you don't have a condensation problem. I looked at some other sites that use evaporative cooling to cool servers, and the systems only need to cycle on for ten minutes in thirty, even in 100 F weather, so you could use the fans to mitigate humidity to some extent.

That being said, I don't see this idea working in really hot and humid climates, such as Alabama.

Re:Moisture? (0)

Anonymous Coward | about a month ago | (#47702565)

In summer, prevailing winds are from the west and temperatures are mild. High pressure dropping down from Canada causes hot, dry wind from the east, but that's the perfect situation for evaporative cooling.

Re:Moisture? (1)

JWSmythe (446288) | about three weeks ago | (#47720067)

since there is no air conditioning, you don't have a condensation problem

No, it's not HVAC induced condensation. Meteorologists call it the dew point.

Right at this moment, the temp is 53.3F with a relative humidity of 78%. The dew point is 47F.

You're supposed to run a datacenter between 40% and 60% relative humidity. Without a system in place to dry the air, they're asking for corrosion on parts.
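As a rough cross-check of those numbers, the dew point can be estimated from dry-bulb temperature and relative humidity with the Magnus approximation; a minimal sketch, not anything from OHSU's monitoring:

    import math

    def dew_point_f(temp_f: float, rh_percent: float) -> float:
        """Magnus approximation for dew point; inputs and output in Fahrenheit."""
        t_c = (temp_f - 32.0) * 5.0 / 9.0
        b, c = 17.62, 243.12  # Magnus coefficients, degrees C
        gamma = math.log(rh_percent / 100.0) + b * t_c / (c + t_c)
        return (c * gamma / (b - gamma)) * 9.0 / 5.0 + 32.0

    # The conditions quoted above: 53.3 F at 78% RH comes out around 47 F,
    # i.e. any surface that drops to ~47 F will start collecting condensation.
    print(round(dew_point_f(53.3, 78), 1))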

You can't say computers are corrosion proof either. When I worked in a computer store, we had computers come in all the time that were in houses with no HVAC, so they were exposed to outdoor humidity.

I left some old gear in a friend's garage for a while. One of the units was a used Catalyst 5000, with cards I didn't really care about. When I put it in the garage, it was in functional condition.

I decided to bring it back up to play with. There was corrosion on the line card handles, and I'm sure corrosion inside. Nothing looked bright and clean. There was visible corrosion on the cat5 pins (for the cat5 ports). When I took it out, it barely worked with lots of errors. Reseating the cards didn't help at all. I don't know (or care) which parts went bad, I sent it off for electronic scrap recycling.

Someone's going to be really pissed off when they spent a fortune on servers that have to be trashed because they stop working properly.

There are other parts machines in the garage too. I only go to them for fans, power supplies, etc. I had already pulled out all the memory and CPUs. Sometimes they still work. Sometimes they don't.

Specs have some wild numbers on them. Some say they operate in 10% to 90% humidity. Sure, they *can* run in that for a while, but they aren't expected to survive in that kind of environment indefinitely. I've seen some specs that say they'll operate over 120 F. Sure, for a very short time. I had one place argue with me because the spec showed wild numbers, but they were already experiencing hardware failures for operating servers in an uncooled server room (the HVAC broke, and they didn't want to fix it).

Re:Moisture? (1)

turning in circles (2882659) | about three weeks ago | (#47725993)

Now there's a justification for me to keep the air conditioning on in my house all summer long. Wouldn't want to corrode the laptops . . . Thanks for the info.

Re:Moisture? (1)

JWSmythe (446288) | about three weeks ago | (#47726409)

It's good for your house too. I've seen houses where the homeowner never ran their A/C and they were proud that they saved money. They also had problems with mold, paint peeling, drywall falling apart, and various wood things in their house warping.

At one place I lived, there were ceiling fans throughout the house, which was nice. There were also some on our back porch. The ones inside stayed in almost original condition. The ones outside had rust on the metal parts, and the blades warped.

But this was a discussion about datacenters, so I talked about the corrosion problems with IT equipment.

Security (1)

Dan East (318230) | about a month ago | (#47698359)

Every time the video showed a door, the narrator had to say that the door is locked. I get it. Doors can be locked. It just seemed there was an agenda in the video to point out to some specific audience the trivial and standard physical security involved, as blatantly obvious as that should be to anyone.

Re:Security (1)

JWSmythe (446288) | about a month ago | (#47698585)

Did you notice that he talked about the doors to the warm side? Controlled and logged access. And just a couple seconds later he says the top of the pods are all open to the common upper area. I'd hope they'll have something in the way, but I doubt it would be anything that bolt cutters (or just tin snips) and a few minutes would have a problem with.

Re:Security (1)

Skuld-Chan (302449) | about a month ago | (#47701069)

What he didn't say: this data center is actually in a primate research lab. The entire campus is surrounded by a 20-foot-high electrified fence with mesh so tight it's difficult to scale.

Plus the entire place is covered in surveillance cameras (every fence pole has a cluster of several). I suspect you could leave the doors unlocked and it would probably be more secure than many data centers you read about.

No I don't work for OHSU, but I live close to this campus.

Re:Security (1)

JWSmythe (446288) | about three weeks ago | (#47717367)

I should make an obligatory reference to Jurassic Park. :)

I was guessing, from the fact that they have employees accessing the building and parking lots, that it was a facility with some sort of access control.

Sounds like the data center of the future, circa (1)

robstout (2873439) | about a month ago | (#47698407)

Sounds like the data center of the future, circa 1975. I wouldn't mind working in it, though I wonder how they control humidity. Lack of cooling may work well in their area, but I see problems in hotter/more humid places.

Re:Sounds like the data center of the future, cir (2)

hey! (33014) | about a month ago | (#47698489)

Oh, come on; everything's more futuristic in a geodesic dome.

Re:Sounds like the data center of the future, cir (1)

jd (1658) | about a month ago | (#47698759)

What about a TARDIS?

Re: Sounds like the data center of the future, ci (2)

jd2112 (1535857) | about a month ago | (#47699017)

TARDIS racks are great! I can get 932U of equipment in the space of a normal 42U rack, and I get the results before I enter my data!

Re: Sounds like the data center of the future, ci (1)

Obfuscant (592200) | about a month ago | (#47699073)

TARDIS racks are great! I can get 932U of equipment in the space of a normal 42U rack, and I get the results before I enter my data!

I don't need a rack of equipment; I just use the spare cycles in my screwdriver and let it cogitate for 400 years on the problem. Sometimes linear programming works better than parallel.

Re:Sounds like the data center of the future, cir (1)

Mr D from 63 (3395377) | about a month ago | (#47699003)

Oh, come on; everything's more futuristic in a geodesic dome.

Pyramids used to be the future.

Re:Sounds like the data center of the future, cir (3, Interesting)

jd (1658) | about a month ago | (#47698749)

1955. The Manchester Computing Centre was designed to be one gigantic heat sink for their computers in the basement, using simple convection currents, ultra-large corridors and strategically-placed doors to regulate the temperature. It worked ok. Not great, but well enough. The computers generated enormous heat all year round, reducing the need for heating in winter. (Manchester winters can be bitingly cold, as the Romans discovered. Less so, now that Global Warming has screwed the weather systems up.)

The design that Oregon is using is several steps up, yes, but it is based on the same principles and uses essentially the same set of tools to achieve the results. Nobody quite knows the thermal properties of the location where Alan Turing built the Manchester Baby; the laboratory was demolished a long time ago. Bastards. However, we know where his successors worked, because that's the location of the MCC/NCC. A very unpleasant building, ugly as hell, but "functional" for the purpose for which it was designed. Nobody is saying the building never got hot - it did - but the computers didn't generally burst into flames, which they would have done if there had been no cooling at all.

Also similar to Bucky Fuller's 1930 Dymaxion House (1)

Paul Fernhout (109597) | about a month ago | (#47701801)

http://en.wikipedia.org/wiki/D... [wikipedia.org]
"The Dymaxion was completed in 1930 after two years of development, and redesigned in 1945. Buckminster Fuller wanted to mass-produce a bathroom and a house. His first "Dymaxion" design was based on the design of a grain bin. ... The Siberian grain-silo house was the first system in which Fuller noted the "dome effect." Many installations have reported that a dome induces a local vertical heat-driven vortex that sucks cooler air downward into a dome if the dome is vented properly (a single overhead vent, and peripheral vents). Fuller adapted the later units of the grain-silo house to use this effect."

Internally, I like the radial design of the server clusters around the central networking core, which keeps the physical network paths within the system short and consistent.

In OEM specs? (3, Interesting)

mlts (1038732) | about a month ago | (#47698415)

Where the rubber meets the road is whether the machines stay within the temperature and humidity specifications for the equipment, so warranties are not voided.

If this is workable, even during the winter or when it is extremely rainy/humid, this might be a useful idea. However, there is only a limited set of climates this would work in. The PNW, with its moderate temperatures, makes sense for this. But if I attempted the same thing in Texas, come summertime, I'd have a building full of BBQ-ed servers.

100+F or 38+C typical annual high (3, Informative)

raymorris (2726007) | about a month ago | (#47698499)

In Portland, it's reasonably cool MOST OF THE TIME.
Temperatures reach or exceed 90 F (32 C) on 14 days per year and reach or exceed 100 F (38 C) on 1.4 days per year on average.

I'm thinking this project will last about 350 days.

Re:100+F or 38+C typical annual high (1)

stewsters (1406737) | about a month ago | (#47698617)

You then need an identical data center in South America, and switch which one you use every half year. The cloud.

Re:100+F or 38+C typical annual high (1)

Anonymous Coward | about a month ago | (#47698635)

From the article:

"When outside air temperature is too warm for free cooling, the data center’s adiabatic cooling system kicks in automatically to help out. Beaverton, Oregon (where the facility is located), experienced some 100 F days recently, and the evaporative-cooling system cycled for about 10 minutes at a time at 30-minute intervals, which was more than enough to keep supply-air temperature within ASHRAE’s current limits. Gleissman said he expects the adiabatic cooling system to kick in several weeks a year."

Re:100+F or 38+C typical annual high (0)

Anonymous Coward | about a month ago | (#47698721)

How do swamp coolers keep things operable if the humidity is 100%? There is a reason they are used only in dry climates.

adiabatic definition: does not get rid of heat (1)

raymorris (2726007) | about a month ago | (#47698869)

> From the article:
> "When outside air temperature is too warm for free cooling, the data center’s adiabatic cooling system

Which is funny, because the word adiabatic means something that does not get rid of heat, or draw in heat, from the outside.
An adiabatic system would cool the building by drawing a vacuum, sucking all of the air out of the building. The decreasing air pressure would lower the temperature for a few minutes. Since you can't keep lowering the air pressure below absolute vacuum, the servers would melt after a few minutes.

Perhaps they meant to say "diabatic cooling system". A diabatic system is one that gets rid of heat (or draws in heat). Of course that's also the definition of "cooling", so if that's what they meant, it's a snobbish way of saying "cooling cooling system". With the a- prefix, it's "non-cooling cooling system", which is gibberish. Unless of course by "adiabatic (non-cooling) cooling system", they mean a cooling system that doesn't cool, i.e. one that doesn't work. If on 100 F days they are relying on an adiabatic aka non-cooling aka broken cooling system, I don't think I want my servers there. I had a taste of that when I had servers at Alphared.

Re:adiabatic definition: does not get rid of heat (0)

Anonymous Coward | about a month ago | (#47700285)

Adiabatic doesn't refer to the air, it refers to the water vapor.

RTFA - 100+ handled by cooling incoming air (2)

SuperKendall (25149) | about a month ago | (#47698711)

The article explicitly states that when the temperatures REACHED (as in, it already happened) 100+, the water cooling units designed for that very purpose cycled on for 10 minutes out of every thirty to keep the incoming air within tolerance to cool the servers.

Re:100+F or 38+C typical annual high (0)

Anonymous Coward | about a month ago | (#47698713)

100 F is no problem if the dew point is low.
Then a water mist will bring the air temp down toward the wet-bulb temperature.
The technology is called a 'swamp cooler'.

Works great in the S/W, but in the S/E, I'm skeptical.

Presumably, they have compared historical climate data to their equipment specs.

Re:100+F or 38+C typical annual high (0)

jd (1658) | about a month ago | (#47698809)

Portland is cool, yes. But that's mostly down to the bookshops and tea shops. Temperature-wise, it doesn't get "hot" per-se, but it does get very humid. And the air is horribly polluted. Spent the time moving up there reading about dangerously high levels of mercury in the air, the deadly pollutants in the river, the partially dismantled nuclear reactor and highly toxic soil (the reactor has since gone, the soil remains drenched in contaminants), the extremely high levels of acid rain due to excessive cars (which are driven by maniacs) and the lethal toxins flowing through the rivers that have been built over to level out the ground.

In short, I landed there a nervous wreck.

Things didn't improve. I saw more dead bodies (yes, dead bodies) in Portland and heard more gunfire in my five years there than I heard in the suburbs of Manchester, England, in 27 years. You will find, if the archives let you get back that far, that I was almost normal before that time.

Re:100+F or 38+C typical annual high (1)

rogoshen1 (2922505) | about a month ago | (#47698955)

When you say 'high levels of mercury' are you sure you weren't confusing air pollution with printed page pollution? :) Not sure which is more dangerous to consume, best to avoid both

What part of NE Portland were you living in anyhow?

Re:100+F or 38+C typical annual high (1)

jd (1658) | about a month ago | (#47700469)

Southwest. Above Jake's Seafood (which, incidentally, serves terrible food at absurd prices for the benefit of the ambiance of an old brick building).

Re:100+F or 38+C typical annual high (1)

SpankyDaMonkey (1692874) | about a month ago | (#47700723)

There are 8,760 hours in a year. If, as you say, you exceed 32 C on 14 days per year, you don't exceed 32 C for the full 24-hour block; instead you may be over 32 C from late morning through to late afternoon - call that 11 AM to 5 PM, or 6 hours. That means a total of 84 hours out of the year where you have to run active cooling systems, which is approximately 1% of the year. If you can also specify that any hardware installed is certified at the upper limit of the ASHRAE standards, then your thermal window increases and you can drop that 1% even lower.

Free air cooling just makes sense, provided your filtering and air handling systems are up to it. Keeping within the limits for humidity is a little harder to manage, as you don't want to see condensation in your cold aisle in the summer, or to pick up static shocks when you touch anything in the winter.
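The arithmetic behind that ~1% figure, as a quick sketch (the 6 hours over threshold per hot day is the parent's assumption):

    HOURS_PER_YEAR = 365 * 24      # 8,760
    hot_days_per_year = 14         # days reaching or exceeding 32 C (90 F)
    hot_hours_per_day = 6          # assumed 11 AM to 5 PM above threshold

    active_cooling_hours = hot_days_per_year * hot_hours_per_day   # 84
    fraction = active_cooling_hours / HOURS_PER_YEAR               # ~0.96%

    print(f"{active_cooling_hours} h/year of active cooling, {fraction:.1%} of the year")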

That's way too high. Incoming != case (1)

raymorris (2726007) | about a month ago | (#47700869)

You realize datacenters normally run at 23-25C, right? In the middle of the DC. The incoming air is a couple degrees cooler.
You're thinking of the maximum allowable temperature inside the case, in a rack, and at the back of the case, by the hot aisle. The cold aisle needs to be cooler than the hot aisle. Those days when your cold aisle hits 90F are the days you're GUARANTEED to destroy hardware if you don't take action. Most of the rest of June - August you'll need cooling to stay within SAFE temps.

Re:That's way too high. Incoming != case (1)

SpankyDaMonkey (1692874) | about a month ago | (#47704867)

Basic ASHRAE standards have a recommended range of 18 C to 27 C, but a maximum allowable range of 15 C to 32 C. If you specify A2 or A3 ASHRAE compliance when buying your hardware, you can stretch that allowable range all the way up to 35 C (A2) or even 40 C (A3).

Most datacentres these days are looking closely at the ASHRAE limits and at monitoring, to raise the average cold aisle temperature and make major savings. There are a lot of steps on this path if you're bringing an older datacentre up to the modern way of thinking, including strict hot/cold aisle separation, re-alignment of hot aisles to match CRAC/CRAH units, implementation of live temperature, pressure and humidity monitoring, all the way up to a fluid dynamics analysis of airflow. You only have one chance to get it right and a huge number of ways to get it wrong, so it's a very conservative approach. On the other hand, being able to make a $500K annual saving by raising the overall cold aisle temperature by 2 C and still deliver the same service is the sort of number that makes a lot of sense.

In addition, the point I was making is that it's only during daylight that the external air temperature will mean you need additional cooling; the temperature drops at night, so you only need to run your cooling plant for a fraction of the day.

Disclaimer: I'm a Certified DC Designer and a Certified DC Management Professional with 8 years experience running a blue-chip datacentre, so I live this stuff every day
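A tiny sketch of how those envelopes stack up, using the allowable dry-bulb ranges quoted above (the A2/A3 lower bounds are my own recollection, not quoted by the parent, so treat them as illustrative):

    # Allowable cold-aisle dry-bulb ranges in degrees C, per the parent's figures
    # (A2/A3 lower bounds assumed; only their upper limits were quoted above).
    ALLOWABLE_C = {
        "recommended": (18.0, 27.0),
        "A1": (15.0, 32.0),
        "A2": (10.0, 35.0),
        "A3": (5.0, 40.0),
    }

    def within(envelope: str, temp_c: float) -> bool:
        lo, hi = ALLOWABLE_C[envelope]
        return lo <= temp_c <= hi

    # A 33 C cold aisle busts A1 but is still fine for A2/A3-rated hardware.
    for env in ALLOWABLE_C:
        print(env, within(env, 33.0))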

Re:In OEM specs? (0)

Anonymous Coward | about a month ago | (#47699021)

Ummm. He said BBQ. I must be from Texas, because I love BBQ. Oh, wait, I am.

Crystals, too (4, Funny)

PopeRatzo (965947) | about a month ago | (#47698417)

They should put the data center in a pyramid. Then, the servers would last forever!

Re:Crystals, too (0)

Anonymous Coward | about a month ago | (#47701727)

The hot air could be sent through the central soul canal to the underworld for the souls of Pharaohs to worry about, thereby lessening the entropy from the local universe. Brilliant!

seen it (0)

Anonymous Coward | about a month ago | (#47698435)

The exit is a wormhole at the bottom of a cliff at the end of a tunnel under the school.

Nonsense (0)

Anonymous Coward | about a month ago | (#47698459)

They should put servers in space. Space is cold, right? And private space means you can launch a 3D printer for peanuts and 3D print all the computers you need in orbit from asteroids, right?

Re:Nonsense (3, Informative)

bobbied (2522392) | about a month ago | (#47698761)

It's only cold in space when you are in the shade. Direct sunlight is pretty hot stuff, but if you use reflective surfaces it limits the absorbed energy.

The problem with space, though, is that it's a vacuum and you're usually weightless. No convective cooling, only radiative cooling. Which is why they put a huge ammonia-based cooling system on the ISS that drives external radiator plates they keep in the shade when they can. So apparently, cooling stuff in space isn't all that easy or cost effective.
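For a sense of scale, here is a rough Stefan-Boltzmann estimate of the radiator area needed to reject one 25 kW cabinet's worth of heat purely by radiation. The radiator temperature and emissivity are assumed values, and absorbed sunlight and Earthshine are ignored, which only makes things worse:

    SIGMA = 5.670e-8        # Stefan-Boltzmann constant, W / (m^2 K^4)
    EMISSIVITY = 0.9        # assumed radiator surface emissivity
    PLATE_TEMP_K = 300.0    # assumed radiator temperature (~27 C)

    heat_load_w = 25_000.0  # one 25 kW cabinet from the summary

    # P = emissivity * sigma * A * T^4  =>  A = P / (emissivity * sigma * T^4)
    area_m2 = heat_load_w / (EMISSIVITY * SIGMA * PLATE_TEMP_K ** 4)
    print(f"~{area_m2:.0f} m^2 of radiator per cabinet")   # roughly 60 m^2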

Chicken coop? (4, Insightful)

TWX (665546) | about a month ago | (#47698607)

Why did the chicken coop have two doors?


.


.


Because if it had four doors, it'd be a chicken sedan!

Re:Chicken coop? (-1)

Anonymous Coward | about a month ago | (#47698643)

Shut up and make me a chicken sandwich, bitch.

Re:Chicken coop? (0)

Anonymous Coward | about a month ago | (#47698779)

Go get your own from Chick-fil-A. They didn't invent the chicken, but they claim to have invented the chicken sandwich.

Re:Chicken coop? (1)

TWX (665546) | about a month ago | (#47698783)

Shut up and make me a chicken sandwich, bitch.

*waves magic wand*

*POOF!*

Congratulations, you are now a chicken sandwich!

Re:Chicken coop? (0)

Anonymous Coward | about a month ago | (#47700215)

Eat me. I'm full of hot sticky mayonnaise.

Insightful? (1)

antdude (79039) | about a month ago | (#47699421)

More like funny!

But why a dome? (1)

pz (113803) | about a month ago | (#47698787)

The article lists the requirements for the structure, which include things like massive air flow, high heat density, high electrical power density, etc. Constraints like that tend to point toward structures with high surface area to volume ratios. A sphere (or section of a sphere in this case) has the MINIMUM surface area to volume ratio. So why would you want to put this structure into a dome rather than a long, low building?

(And if you really insisted on getting all fancy, architecturally, you could still make the long low building into a ring and retain most of the advantages.)
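To put numbers on that, here is a rough comparison of envelope area per unit volume for a hemispherical dome versus a long, low building enclosing the same volume (all dimensions are made up for illustration; the floor is excluded from both envelopes):

    import math

    HEIGHT_M = 6.0        # assumed eave height for the "long, low" building
    DOME_RADIUS_M = 26.7  # assumed hemisphere radius (about a 175 ft diameter)

    # Hemispherical dome: curved shell area vs enclosed volume.
    dome_volume = (2.0 / 3.0) * math.pi * DOME_RADIUS_M ** 3
    dome_envelope = 2.0 * math.pi * DOME_RADIUS_M ** 2

    # Long, low rectangular building enclosing the same volume.
    width = 20.0
    length = dome_volume / (HEIGHT_M * width)
    slab_envelope = length * width + 2.0 * HEIGHT_M * (length + width)  # roof + walls

    print(f"dome: {dome_envelope / dome_volume:.3f} m^2 of envelope per m^3")
    print(f"slab: {slab_envelope / dome_volume:.3f} m^2 of envelope per m^3")
    # The slab exposes roughly 2-3x more envelope per unit volume, which is the
    # parent's point: a dome is the wrong shape if surface area is what you want.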

Re:But why a dome? (2)

NixieBunny (859050) | about a month ago | (#47698823)

A dome doesn't care which way the wind blows, because it's round. Your long low building might have issues with that.

Re:But why a dome? (1)

tomhath (637240) | about a month ago | (#47699079)

Looking at the picture (maybe an artist's drawing) I see a roundish 2 1/2 story structure sitting behind some trees. So I doubt the wind is a factor. Plus the article talks about fans pulling in outside air.

Not in my backyard.... (2)

bobbied (2522392) | about a month ago | (#47698819)

Not because I would object, though, but because it gets pretty hot here from time to time.

So, if you move it north, why not? Heck, the South Pole is pretty cold most of the year.

I have a better idea: how about we put server farms out at sea and use seawater from a few hundred feet down for cooling? That works great, even in the tropics.

much harder to become an evil overlord with these (1)

Provocateur (133110) | about a month ago | (#47699819)

I mean, come on, we're having such a hard time getting frickin' lasers for our sharks! Let me guess, an army of accountants to figure out how much we're going to save on our taxes?
