Slashdot: News for Nerds


Building a Data Center In 60 Days

kdawson posted more than 7 years ago | from the 170-racks-and-nothin'-on dept.

Businesses 117

miller60 writes "The facilities team at Australia's Pipe Networks is down to the wire in its bid to complete a data center in 60 days. And in an era when many major data-center projects are shrouded in secrecy, these guys are putting the entire effort online, with daily updates and photos on the company blog, a live webcam inside the facility, a countdown timer, and a punch-list of key tasks left to finish. Their goal is to complete the job by Friday morning eastern US time."

117 comments

why? (5, Interesting)

thedrunkensailor (992824) | more than 7 years ago | (#19489317)

Why not forget the deadline and get it right? TFA says this was an exec's idea....go figure

Re:why? (4, Interesting)

Aladrin (926209) | more than 7 years ago | (#19489391)

Why not? It's a challenge, not a true 'deadline'. Think of it as an episode of 'Monster House' where they get to keep the tools if they get the nearly-impossible project done on time. There's -always- work to be done afterwards to finish it off, but the work is complete as far as they were contracted.

It's not 'have a fully functional data center filled with customers.' It's only 'build it.'

Re:why? (1)

Professor_UNIX (867045) | more than 7 years ago | (#19489657)

Because it's exciting!!!!!! We're really into open source too so we're totally cool with you guys knowing all our internal security layout. In fact, the keypad combo for the data center gate for visitors is 31337! Welcome anytime 24/7! Please don't steal anything or mess with any cables though... we don't believe in cameras since they create a hostile work environment so it'd be really cool if you were good fellows and didn't mess anything up if you want to pop on down to check us out in the middle of the night. 'K? Thanks.

Re:why? (1)

raju1kabir (251972) | more than 7 years ago | (#19497307)

Why not forget the deadline and get it right?

No shit. Why on earth would I want to locate in a datacentre which was intentionally thrown together in a hurry? That's like buying a parachute that was made by the lowest bidder.

This seems like really bad advertising to me. Anyone who is careful in vendor selection will be unimpressed, and they'll have an uphill battle to convince these prospective customers that no corners were cut or harmful shortcuts taken.

Can be that great of a data center. (3, Funny)

Anonymous Coward | more than 7 years ago | (#19489329)

It's already Slashdotted.

Re:Can be that great of a data center. (1)

creimer (824291) | more than 7 years ago | (#19489519)

That's pretty bad for a data center under construction. Maybe they never counted on Slashdot finding out in the first place?

Re:Can be that great of a data center. (1)

EveryNickIsTaken (1054794) | more than 7 years ago | (#19489533)

Look at who submitted it. This is clearly a stress test.

Re:Can be that great of a data center. (1)

gbjbaanb (229885) | more than 7 years ago | (#19495445)

I don't think Datacentreknowledge.com cares about the DC, they care about the adverts on their link-scavenging 'blog'.

The link to the DC build is at http://www.pipenetworks.com/dc3/ [pipenetworks.com], but it's slashdotted. Come back tomorrow.

Re:Can be that great of a data center. (0)

Anonymous Coward | more than 7 years ago | (#19493901)

Indeed. I would go so far as to say it's technically impossible. Grandparent is clearly stupid, unless it was a joke, in which case you are.

pipenetworks.com isn't loading for me (1, Funny)

WrongSizeGlass (838941) | more than 7 years ago | (#19489333)

Either pipenetworks.com has been /.'d or their 'network of pipes' needs a little visit from RotoRooter. ;-)

Re:pipenetworks.com isn't loading for me (0)

Anonymous Coward | more than 7 years ago | (#19489379)

Clogged pipes? What Slashdot would be in such a metaphor isn't encouraging. D:

Re:pipenetworks.com isn't loading for me (0, Flamebait)

morgan_greywolf (835522) | more than 7 years ago | (#19489469)

Either pipenetworks.com has been /.'d or their 'network of pipes' needs a little visit from RotoRooter. ;-)

And if one of those pipes is the stink pipe, what does that make Slashdot? ;)

Re:pipenetworks.com isn't loading for me (-1, Offtopic)

Anonymous Coward | more than 7 years ago | (#19490083)

OH! OH! OH! I get it! That would make /. a poo ! See, you say "stink pipe", and then we have to guess that you mean the drain, but you've slipped in that word "stink", so now we're all thinking about things that smell really bad, and then you mention "Slashdot", and suddenly we're all thinking like, "posting on slashdot is like flinging virtual poo!" and everybody will start running around yelling "POO FLINGER!!" and "GET A PLUNGER!!"

And then we can start substituting /. all the places where we'd normally refer to poo, like "slashdot for brains", "slashdot on a stick", "HOLY SLASHDOT!!", "You smell like you slashdotted your shorts", "hot grits on Natalie Portman is hot as slashdot", "cramming 10 pounds of slashdot in a 5 pound bag", "when the slashdot hits the fan", "he's so anal retentive his dentist smells slashdot", "slashdot rolls downhill"...

Well, OK, there might be some truth to that last one.

What's burning? (4, Funny)

Max Romantschuk (132276) | more than 7 years ago | (#19489377)

Oh, it's just the server going up in smoke trying to serve a live webcam on Slashdot...

Re:What's burning? (2, Funny)

Tim82 (806662) | more than 7 years ago | (#19489439)

I guess they just need a bigger datacenter.

Re:What's burning? (4, Funny)

Thox (1031212) | more than 7 years ago | (#19489457)

60 day project set back to day 0 due to fire damage.

A couple black boxes (5, Interesting)

mikaelhg (47691) | more than 7 years ago | (#19489405)

They could also just have bought a couple of Sun Black Box [sun.com] datacenters in a truck container.

Re:A couple black boxes (0)

Anonymous Coward | more than 7 years ago | (#19489525)

>They could also just have bought a couple of Sun Black Box datacenters in a truck container.

westie.

Re:A couple black boxes (1)

bombastinator (812664) | more than 7 years ago | (#19490997)

I bet it would have been a lot cheaper if you count the labor. Even with the shipping.

Checklist (5, Funny)

WillRobinson (159226) | more than 7 years ago | (#19489435)

1. Get first DS3 up - check
2. Setup webcam - check
3. Setup webserver - check
4. Post on slashdot and soak the DS3 - check
5. Stress test in progress

Re:Checklist (1)

andyh3930 (605873) | more than 7 years ago | (#19489689)

5. Stress test: failed. Slashdotted at 13:50 UTC. Never underestimate the power of the /.

Re:Checklist (0)

Anonymous Coward | more than 7 years ago | (#19490113)

That's a pretty good idea actually. I wonder if the editors would let a story through that basically said

Our company has had the goal of building the most robust data center among our competitors. We've used *geeky operating system here* with *geeky drive setup* and *geeky DC power source* and now challenge you to A SLASHDOT CHALLENGE.

Re:Checklist (1)

MMC Monster (602931) | more than 7 years ago | (#19493163)

Frankly, I don't think there is a single website that isn't multi-located that can survive an overt /. challenge. Because you *know* that a bunch of us will put the site on autorefresh every second until it goes down.

Re:Checklist (1)

NIN1385 (760712) | more than 7 years ago | (#19490363)

Yeah, way to throw a wrench in their plans /.

Datacenter???? (5, Interesting)

Critical Facilities (850111) | more than 7 years ago | (#19489449)

Pipe's DC3 facility will be about 4,800 square feet and will be able to accommodate 170 server racks.

I'm sorry, but 4,800 square feet and room/capacity for 170 server racks is a SERVER ROOM not a DATACENTER. I'm not trying to troll here, but this mis-use of the word datacenter gets old. The time/effort/planning/money it takes to build a datacenter is exponentially more complicated than to upfit an area to accommodate a few server racks.

In short, sticking in a few Liebert CRACs and a little 150kva UPS does not constitute "building a datacenter".

Re:Datacenter???? (2, Informative)

walt-sjc (145127) | more than 7 years ago | (#19489561)

It depends on the community it is serving. Yeah, that is pretty pathetic compared to datacenters in major cities, but for a small city it would be perfectly fine.

When I first started using colocation back around 96, Exodus's colo room was 6 racks. They had explosive growth and by 2001 had massive datacenters in several cities around the globe. Anyway, give them time. If they do things right, they will grow.

Re:Datacenter???? (5, Insightful)

Xicarius (935198) | more than 7 years ago | (#19489885)

In Australia, its a datacentre. Comparative to the number of people in Au & the connections within Australia & to the rest of the world, its pretty big.

We only have two major cables out of Australia & capacity on them to the US costs hundreds of dollars a megabit/month.

Re:Datacenter???? (1, Funny)

Anonymous Coward | more than 7 years ago | (#19490783)

I see in Autralia they also do not use the apostrophe.

He didn't want to block the tubes! (4, Funny)

adamofgreyskull (640712) | more than 7 years ago | (#19491563)

Duh...he just said bandwidth was expensive.

Re:Datacenter???? (0)

Anonymous Coward | more than 7 years ago | (#19499799)

I see you don't use the "s" key.

Re:Datacenter???? (1)

waterford0069 (580760) | more than 7 years ago | (#19492341)

Ah... but does the Usenet feed still go back and forth across the Pacific on 8mm tapes, sitting in the passenger seat of an airliner? Never underestimate the bandwidth of a FedEx plane filled to the brim with magnetic tape.

Re:Datacenter???? (3, Informative)

zevans (101778) | more than 7 years ago | (#19490047)

But with blades, 1U pizza boxes, xSeries+VMware, LPARs, 156 and 288 GB spindles, etc., and the consolidation tools that all the vendors are pushing, data centres can and should be smaller than they were five years ago.

Re:Datacenter???? (2, Insightful)

walt-sjc (145127) | more than 7 years ago | (#19490237)

That depends on how the data center is designed. Is it the 300 W/sq ft that typical datacenters run at, or is it designed for high-density servers and the additional power and cooling they need? From the size of the generator, there is no way they can go that dense.
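A quick sanity check on the parent's figures, assuming the 300 W/sq ft applies across the full 4,800 sq ft floor from TFA (the only two inputs; everything else is arithmetic):

```python
# Back-of-the-envelope density check for Pipe's DC3 numbers.
area_sqft = 4800          # floor area from TFA
watts_per_sqft = 300      # "typical" density cited in the parent
racks = 170               # rack capacity from TFA

total_w = area_sqft * watts_per_sqft
per_rack_w = total_w / racks

print(f"Total critical load: {total_w / 1e6:.2f} MW")          # 1.44 MW
print(f"Implied per-rack budget: {per_rack_w / 1000:.1f} kW")  # ~8.5 kW
```

At 300 W/sq ft the whole floor would draw about 1.44 MW, or roughly 8.5 kW per rack, which supports the parent's point that the generator size is the real constraint on density.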

Re:Datacenter???? (2, Insightful)

Critical Facilities (850111) | more than 7 years ago | (#19490695)

I think that's a little bit of wishful thinking. With the shift to online apps, the increase in streaming media, and the general hunger for bandwidth/throughput (especially on corporate LANs), I'd say that while it's true that advances in server design and virtualization have enabled the IT industry to do more with current equipment, the "market" those products/services serve has stepped up its demand as well. The idea that datacenters are serving a static need just plainly isn't true. The demand for increased speed, storage, and processing power will continue to grow at least as fast as datacenters' ability to provide it.

Re:Datacenter???? (0)

Anonymous Coward | more than 7 years ago | (#19497101)

Well... sure, machines are smaller now than they were five years ago. But we're not processing data the way we were five years ago, either. The more ways our business has found to make use of our data - which generally involve more and more complex relationships - the more hardware we've needed just to perform the same number of transactions, and we have growth to factor in on top of that. The growth has slowed down the last two years; we've only seen 35%-50% annually, not the 100% that we kept up for the four years prior to that.

The scenario of "fifteen underutilized machines that can all be consolidated with VMWare" is something that vmware likes to tout, but when things get serious, you're more likely to find 8-way and 16-way machines that can barely keep up with a single task. And for the distributable work, you're likely to find clusters of at least a dozen, all working near capacity to keep up.

Re:Datacenter???? (0)

Anonymous Coward | more than 7 years ago | (#19490963)

Ah, shut up. Seriously, you realize your post is a proxy for your barely-unconscious desire to dick-wave, right? Right? OK, fine. We all recognize your self-perceived penile superiority.

170 racks is not a server room. Maybe it's not a large data center, but it's not a server room.

Besides, it's not the size that counts, it's the, uh, dwindle of the spindle? Oh God. It appears that my post is a proxy for my barely-unconscious desire to apologize for my self-perceived penile inadequacy!

Re:Datacenter???? (0, Troll)

Critical Facilities (850111) | more than 7 years ago | (#19491783)

Leave it to a lowly, trollish coward to try to make this into a pissing contest. I do not have an issue with there being facilities that are bigger/smaller than ones I have worked in, and I'm certainly not trying to impress anyone. That said, I'm simply taking issue with the idea that it's even POSSIBLE to build a datacenter in 60 days.

So, really, I'm arguing semantics and terminology here and not jockeying for "penile superiority". It is impressive if they can pull off this upfit in 60 days (although since it appears that the site's already slashdotted, I'm wondering if 60 days was enough time). But to try to equate it with building a full-on datacenter from the ground up in 60 days (grading, excavation, construction, generator order/installation, UPS order/installation, commissioning, etc.) is a bit of a stretch in my book.

Sorry if I stepped on your toes there, lil' fella.

How is this a troll?? (1)

Critical Facilities (850111) | more than 7 years ago | (#19492549)

Give me a break, mods. I was RESPONDING to a troll, and I thought I showed a decent bit of restraint in even responding. Ok, I took a little dig, but geez I still think I made a point with what I said.

Re:Datacenter???? (2, Informative)

jlf278 (1022347) | more than 7 years ago | (#19491851)

It is a data center; and actually, you're the one guilty of mis-use as datacenter is not, in fact, a word. http://en.wikipedia.org/wiki/Data_center [wikipedia.org]

Re:Datacenter???? (2, Informative)

afidel (530433) | more than 7 years ago | (#19492129)

You think a 150KVA UPS will service 170 racks?!?!? HAHAHAHAHA
You have lost all credibility to determine what a datacenter is. A 150KVA UPS would service about 50 moderately loaded (about half empty) racks with most current equipment. 170 racks could serve many midsized companies; my employer's an S&P 500 company and we have 11 racks, moderately full. Wikipedia defines a datacenter as:

A data center is a facility used for housing a large amount of electronic equipment, typically computers and communications equipment. As the name implies, a data center is usually maintained by an organization for the purpose of handling the data necessary for its operations

Which I would say 170 racks could definitely qualify as.
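The parent's "about 50 racks" claim checks out with some illustrative assumptions; the 0.9 power factor and 2.7 kW-per-rack load below are my own guesses at what "moderately loaded" means, not figures from the thread:

```python
# Rough UPS sizing check: how many half-empty racks does 150 kVA cover?
ups_kva = 150          # UPS rating from the parent comment
power_factor = 0.9     # assumed: typical for modern server PSUs
kw_per_rack = 2.7      # assumed: a "moderately loaded" half-empty rack

usable_kw = ups_kva * power_factor
racks_supported = usable_kw / kw_per_rack
print(f"{usable_kw:.0f} kW usable -> ~{racks_supported:.0f} racks")
```

135 kW of usable power at 2.7 kW per rack lands right on 50 racks, so 170 racks on one 150 kVA unit would indeed be wildly optimistic.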

Re:Datacenter???? (1)

Critical Facilities (850111) | more than 7 years ago | (#19492393)

You think a 150KVA UPS will service 170 racks?!?!?

No, I don't think that a 150kva UPS will service 170 modern rack servers, I was making an exaggerated example of what sometimes gets referred to as a datacenter/data center. However, by your Wikipedia "definition", a closet with a few servers and a window air conditioner would constitute a data center since "large amount of electronic equipment" is a very subjective term.

Re:Datacenter???? (1)

AK Marc (707885) | more than 7 years ago | (#19493765)

Well, aside from you calling 170 racks "a few", perhaps rather than whining that this isn't a datacenter, you could tell us how many racks are needed for a datacenter. Or is it services? You like to mention grading and such, so does a datacenter have to be a fresh build? The largest facility of this type in my state is a converted building. Of course, it was, at one time, the largest building of the largest bank, built with security and building strength in mind. All they had to do was add "a few" tons of lead acid batteries and other such minor "upfit" details.

Re:Datacenter???? (1)

Critical Facilities (850111) | more than 7 years ago | (#19494409)

I never called 170 server racks a "few". As far as it having to be a fresh build, no, it doesn't have to be, but the headline says "Building a Data Center in 60 days", so maybe I inferred a bit.

I'm not "whining" about anything at all, I'm just suggesting that the headline is a little misleading and sensational. The largest facility in your state that you refer to, did they build it in 60 days? In 180 days? In 365 days? If you've been around environments like these (which I'm assuming you have) you realize how much is involved with building and operating them, so it shouldn't be that big of a mental leap to see how I might think suggesting that it can be done on a fairly large scale (and done well) in 60 days is a bit exaggerated.

I'm not even going to get into this whole "how many racks are required" business, but suffice to say, there's a big difference between the facility being referred to in TFA and any of Google's data centers, and I think it's a bit disingenuous to try to lump them both into the same category. I don't think the facility in The Dalles, OR [yinfor.com] could be slapped together in 60 days, do you?

Re:Datacenter???? (1)

AK Marc (707885) | more than 7 years ago | (#19499111)

I'm sorry, but 4,800 square feet and room/capacity for 170 server racks is a SERVER ROOM not a DATACENTER. I'm not trying to troll here, but this mis-use of the word datacenter gets old. The time/effort/planning/money it takes to build a datacenter is exponentially more complicated than to upfit an area to accommodate a few server racks.

I never called 170 server racks a "few".

Well, you refer to 170 racks in one sentence, and while apparently on the same subject, you say that accommodating "a few" server racks doesn't make it a datacenter. Forgive me for the mistake. It seemed quite clear to me that you were calling 170 server racks "a few."

I'm just suggesting that the headline is a little misleading and sensational.

It's Slashdot. *Every* headline is misleading and sensational. The editors like it that way because they are paid, not to inform with news, but to agitate and generate page views and responses.

I'm not even going to get into this whole "how many racks are required" business, but suffice to say, there's a big difference between the facility being referred to in TFA and any of Google's data centers, and I think it's a bit disingenuous to try to lump them both into the same category.

What's Google have to do with this? Are you saying that they are the minimum for a datacenter, or are you saying that it's offensive to think of "datacenter" to cover both 170-rack installations as well as Google's facilities? With illogic like that, a Ferrari and a Yugo can't both be cars, since they are so dissimilar in action, ignoring that they both have four wheels and an engine. A "datacenter" could be quite small, if it has the proper features (at a minimum, I'd say constructed to withstand the natural disasters for the area, independent power with unlimited run-time, proper cooling, open racks available for public rental, and adequate connectivity to the outside world). I know people that ran "datacenters" out of their garage with these features and more, and they are no more or less than what you'd find at commercial locations (well, other than not wanting to give potential customers a tour of the facility).

It just seems senseless that you declare this to be insufficient to be a "datacenter" but are completely unable to define a datacenter.

Re:Datacenter???? (0)

Anonymous Coward | more than 7 years ago | (#19494423)

While I wouldn't call 170 racks "a few", I was recently involved in a job which had 150 racks and provision for 50 future.

That was not called a data center, it was a research and development lab for the data center.

(It did take well over 60 days to plan plus 6 months to construct, in an existing building)

Re:Datacenter???? (0)

Anonymous Coward | more than 7 years ago | (#19496869)

Actually, the data center that I have my servers in is smaller, even, than mentioned in the article.

And yes, it's a data center, not just a server room or colocation room. While there are only about 100 cabinets, the actual data portion is massively larger. The pipes start with a few OC192s, and work their way down from there. The wall of outbound DS3 patches alone is probably 30 feet long. There are pipes from every conceivable player coming into the center in one way or another.

Re:Datacenter???? (0)

Anonymous Coward | more than 7 years ago | (#19499521)

The time/effort/planning/money it takes to build a datacenter is exponentially more complicated than to upfit an area to accommodate a few server racks.

What the hell does this mean? With every extra computer there is a multiplicative increase in cost? For someone so dedicated to the defense of poorly defined terms, you seem to struggle with mathematical terms with basic and easily tested definitions.

Re:Datacenter???? (1)

Shadowlore (10860) | more than 7 years ago | (#19500353)

What, are you trying to compensate vicariously or something?
Most people's houses are less than 4,800 square feet. Most businesses fit in less than 4,800 square feet. By those accounts you can't call it a "server room" then, can you?

170 racks, assume 42U per rack, 1U servers will get you hmmm a damned lot of servers. Take a couple racks out for infrastructure and some SAN and it looks like a DC, sounds like a DC, quacks like a DC, and smells like a DC. I'd call it a DC.
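That "damned lot of servers" made concrete; the four racks carved out for infrastructure and SAN gear loosely follow the parent's "couple racks" and are my own assumption:

```python
# Best-case server count under the parent's 42U-per-rack, all-1U assumption.
racks = 170
u_per_rack = 42
infra_racks = 4    # assumed carve-out for switches, SAN, patching

servers = (racks - infra_racks) * u_per_rack
print(servers)     # 6972 1U servers, best case
```

Nearly seven thousand boxes is well past "server room" territory by most people's intuition, whatever the formal definition.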

Seriously, if you insist in being pedantic:

  * A "data centre" is a "centre for data", not "a giant room filled with thousands of computers". So by that account the two racks in my garage count. They've got RAID arrays, multiple servers and switches, UPS, etc.
  * "a few" is a lot less than 170.

As far as effort, I've seen more effort go into server rooms than "full fledged data centers". If you have the requisite knowledge, ability, and contacts a data center going into an existing building is not terribly difficult. Ever see a "retrofit" of an old bank vault or bunker into a DC? Pretty quick and smooth.

A data center is a data center based on its purpose, not the real or perceived effort that went into it or how many square feet it has.

Re:Datacenter???? (1)

DJMajah (738313) | more than 7 years ago | (#19500429)

If a shipping container can be a datacentre, this room definitely can.

Make a TV reality show (4, Funny)

yohanes (644299) | more than 7 years ago | (#19489479)

Everyone is doing that, so why shouldn't they? Then we can fire the managers we hate the most.

Trunk delivery timeframe (1)

EveryNickIsTaken (1054794) | more than 7 years ago | (#19489489)

Australia's telco(s) must be vastly more responsive than Verizon. To even get a DSL line up and running within 60 days here would be amazing - a DS3+ typically takes 4-5 months, minimum.

Re:Trunk delivery timeframe (1)

tbcpp (797625) | more than 7 years ago | (#19489867)

I have a friend from Australia. Seems like he told me that they have a country-wide provider for cable, telephone, internet... Any Aussies want to help me out? Or am I just off my rocker?

Re:Trunk delivery timeframe (1)

lazybeam (162300) | more than 7 years ago | (#19490229)

That would be Telstra: a company which sells "broadband" for $30/month which includes 200MB/month transfers (up+down) and 15c/MB excess use. You can get this price on either 8mbit cable (metro) or 256kbit ADSL (around 95% of the country). (And yes, people have had huge bills, and Telstra makes them pay.) See: http://bc.whirlpool.net.au/isp.cfm/Telstra-BigPond/1.html [whirlpool.net.au] You can get better deals off other companies, but all must give some money to Telstra (except in the very few non-Telstra-cabled areas).

Quickest I've had ADSL installed was 3 days, slowest 21 days. I've had it connected at about 10 different places with 8 different ISPs - all wholesale customers of Telstra.
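To see why "people have had huge bills": the $30 base, 200 MB quota, and 15c/MB excess rate are from the comment above, while the 5 GB monthly usage is a hypothetical of mine:

```python
# What the quoted Telstra plan costs for a fairly modest month of use.
base_aud = 30.00
included_mb = 200
excess_per_mb = 0.15
usage_mb = 5 * 1024          # hypothetical 5 GB month

excess_mb = max(0, usage_mb - included_mb)
bill = base_aud + excess_mb * excess_per_mb
print(f"AU${bill:.2f}")      # AU$768.00
```

One moderate month at those rates costs more than two years of the base plan, which explains the caution about the excess-use pricing.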

Re:Trunk delivery timeframe (4, Informative)

mcbridematt (544099) | more than 7 years ago | (#19490231)

Former government phone monopoly Telstra, now privatized and run by evil Americans, basically owns 99% of fixed-line infrastructure (as they are legislated to do). Capital cities got TWO separate cable networks during the 1990s: one from Optus (who got the first telco license after deregulation), and the other from Telstra, who built one thinking pay TV was the bomb. It wasn't, and both cable networks have actually shrunk by some degree since.

(Note the Optus cable network provided, and was designed to provide, fixed-line telephones from the start, which makes up the small percentage of non-Telstra fixed-line infrastructure around.)

However, Telstra, as a monopoly, MUST provide wholesale access to the fixed-line infrastructure, and as such most Australians are actually with internet providers who wholesale off Telstra, either over Telstra DSLAMs or their own. The wholesale prices have been ENFORCED, even DICTATED, by the Australian competition authorities, who among other things refuse to tolerate American crap such as "up to XXX mbps" (Australian consumers, unlike Americans, demand full line speed, no lousy contention or else), "unlimited... up to XXX GB", etc.

A federal election issue this year is an FTTN (fibre-to-the-node) rollout to every single location within these capital cities, and an assortment of regional centres. Two proposals are in play: one from Telstra, who set wholesale prices up high because they don't want to share and their shareholders (investment funds, a small % of mom'n'dad investors) want returns, and one from the "G9", favored by many, but the pricing still sucks.

As the majority of Australian internet traffic is to or through the US, Australian bandwidth pricing is dictated by capacity on submarine cables to the US, of which there is only one, and it is running out of "spare" capacity fast* despite only being turned on a decade ago. Some providers lease additional capacity via Japan, and there are three new submarine cables under planning that are attempting to remedy the bandwidth shortage, either by going to Guam to patch into Japanese capacity, or only up to Hawaii. As I've said, unlike Americans, Australian users, after suffering a few years of low broadband speeds, don't tolerate US-style bandwidth overselling (those that have tried failed miserably), and as such a lot of ISPs outside Telstra (who charge almost business rates anyway) were forced to raise prices due to the increasing use of bittorrent etc.

* Even worse, the operators of the cable in question, the Southern Cross cable, aren't in any particular hurry to upgrade either.

Re:Trunk delivery timeframe (0)

Anonymous Coward | more than 7 years ago | (#19490635)

You mention the Southern Cross Cable. Telstra has at least two cables to the East Coast. AT&T has a cable to Sydney. Tyco has one as well. There is a new cable to Indonesia. There are at least 3 on the west coast.

Which "only one" are you talking about?

Southern Cross was the first private cable group to talk openly about their cable. Everyone else keeps the details very quiet.

Oh... it's only $200mil to get your own cable to Guam now.

Re:Trunk delivery timeframe (1)

FixManTx (1115247) | more than 7 years ago | (#19499089)

If I'm reading things wrong I apologize, but I think there is some confusion here between DC3 and DS3. Pipe Networks is setting up a DC3, but it looks like somebody thought they were talking about a DS3 connection. Also, here in America, yes, we have to put up with "up to xxxkbps" speed claims, but hardly any of our ISPs limit how much transfer volume we get. I've heard about how most UK and Aussie ISPs limit how much volume you can use each month. Despite our ISPs fudging the speed numbers, I count myself lucky not to have to endure the monopolistic antics of a company such as Telstra. With my surfing habits I know I would hit a volume cap the first week of the month, sooner if I ran bittorrents a lot like some folks.

Re:Trunk delivery timeframe (1)

catprog (849688) | more than 7 years ago | (#19499829)

Instead you have to worry about a monopoly in each local area?

We have a range of ISPs that serve everybody:

  * Inre HOME-512-Elite: 80 GB, shaped, free, dynamic IP, $144.95/mo
  * Smart Choice 512^: no set limit (heavy downloaders are shaped during congestion), dynamic IP, $49.95/mo

Re:Trunk delivery timeframe (1)

FixManTx (1115247) | more than 7 years ago | (#19500319)

Most areas in the US have many ISPs to choose from. We don't have to buy our internet from the local phone company. For instance, I have internet from Speakeasy even though AT&T (formerly SBC) owns the actual wires. Of course I can get AT&T's service, but I prefer Speakeasy's features and unrestricted ports. It's a bit more expensive, but I run my FTP server from home with no problems. There are also lots of dialup services to choose from all over the country. There are some exceptions in rural areas where most of the big ISPs don't offer service, but that's changing quickly with newer technology coming online.

There are also the cable TV companies offering broadband. Most of them offer their own internet service, though Time Warner will let you pick from several ISPs such as AOL and Earthlink, or you can use TWC's RoadRunner. AT&T offers basic dynamic DSL 1.5/384 (I might be wrong on the upload) for around $15 US per month, no volume cap. The actual speeds vary due to different wire conditions and distance from the CO (Central Office). Here locally AT&T is installing fiber into neighborhoods, and Verizon is running FiOS in its coverage areas, so those speed differences are diminishing.

Yes, I'm sure some of the difference between advertised and actual is just number fudging by the companies involved. So far I haven't seen major discrepancies, either in my own service or for anybody I know who has broadband, cable or DSL. I'll take that minor discrepancy over a monthly volume cap any day. No disrespect intended, but I wonder why Australians tolerate such nonsense.

Re:Trunk delivery timeframe (0)

Anonymous Coward | more than 7 years ago | (#19490159)

ooo a whole 45mbps

Re:Trunk delivery timeframe (0)

Anonymous Coward | more than 7 years ago | (#19490789)

I'm not in Australia but I'd bet their prices are as high as ours or higher.
The last time I looked at a line about that size I received quotes of around $150,000 to $250,000 per year.

If it helps, I'm sure US internet prices will start to rise now AT&T is back in charge.

Re:Trunk delivery timeframe (1)

thogard (43403) | more than 7 years ago | (#19499419)

I could get you 45mb for about AU$100,000/y in Sydney, but it will cost you an extra $50k in Brisbane or Melbourne. If you're more than a few km outside of the core areas then the price goes up very rapidly because of tail charges. In Sydney it's about $200/mb to talk to Telstra, $200/mb to talk to the rest of the world, and $20/mb to talk to the peering points. Then it's about $100/mb to talk to the nearest major cities.

Re:Trunk delivery timeframe (0)

Anonymous Coward | more than 7 years ago | (#19490163)


Aussie telcos responsive?
We're years behind other Western countries; I still can't get mobile (cell) phone coverage on major roads, let alone a stable internet connection.
Please, INVEST IN OUR COMPANIES and help us out of the '90s!

Re:Trunk delivery timeframe (1)

thetable123 (936470) | more than 7 years ago | (#19490827)

To even get a DSL line up and running within 60 days here would be amazing.
Strange; also through Verizon here. Most of the time we can get a T1 within a couple of days, and our last DS3 was in under a week. I guess it's a case of YMMV.

Raw Knut LOL!!! (-1, Troll)

Anonymous Coward | more than 7 years ago | (#19489531)

Raw Knut LOL!!!

Pipes? (0)

Anonymous Coward | more than 7 years ago | (#19489565)

Don't they mean tubes?

slashdotted an interconnect??? (3, Funny)

spectrokid (660550) | more than 7 years ago | (#19489583)

The connection has timed out
The server at www.pipenetworks.com is taking too long to respond.

This does not look good!

Re:slashdotted an interconnect??? (0, Flamebait)

Professor_UNIX (867045) | more than 7 years ago | (#19489715)

I think you're vastly overestimating the power of the Slashdot effect these days. It's more likely they're just using some crappy default install on an underpowered box that hasn't had its Apache settings tweaked. The pages are dynamic and may be fed out of some database-driven content management system, probably using MySQL, which hasn't been tuned either. Either that or they got Dugg, in which case they're fukked. Digg has a ton more active users than Slashdot can ever hope for, which is sad considering they suck donkey balls and their comment threading was designed by a five-year-old who can't comprehend nesting comments more than 2 deep.

Sun has you covered there (4, Interesting)

invisik (227250) | more than 7 years ago | (#19489633)

Project Black Box. Just drop it off in the parking lot and plug it in.

http://www.sun.com/emrkt/blackbox/index.jsp [sun.com]

-m

Re:Sun has you covered there (1)

zerocool^ (112121) | more than 7 years ago | (#19490663)


Ok, I just spent like 20 minutes poking around that site. They really have thought of everything with that box. It's pretty amazing.

The scope is limited to specific applications, but if you need a data room on the go? I can think of a few places where it is exactly ideal. Post-disaster, i.e. New Orleans, would be one. NATO actions, or similar needs for mobile infrastructure in war zones? I mean, it's a really neat idea.

And I was like "they can't have solved the cooling problems". But, apparently, the box requires ethernet (fiber, probably), 208 V 3-phase 600 A power, and ... water. I would assume that's "cold municipal water", and I suppose you shouldn't be running a datacenter anywhere you can't get that. But, yeah, all water-cooled heat exchangers, closed system. It's pretty brilliant.

Color me impressed.

~Wx
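For a sense of scale, the quoted 208 V three-phase 600 A feed works out to roughly 216 kW. A quick sketch using the standard balanced three-phase line-to-line formula; unity power factor is assumed here, so real drawable power would be somewhat lower:

```python
import math

def three_phase_kw(volts_line_to_line, amps, power_factor=1.0):
    """Power of a balanced three-phase feed in kW: sqrt(3) * V * I * pf."""
    return math.sqrt(3) * volts_line_to_line * amps * power_factor / 1000

kw = three_phase_kw(208, 600)
print(f"{kw:.0f} kW")  # ~216 kW at unity power factor
```

That's on the order of 200+ racks' worth of 1 kW servers in a single shipping container, which is why the cooling plumbing matters so much.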

Re:Sun has you covered there (4, Informative)

eln (21727) | more than 7 years ago | (#19490867)

Actually, it requires "chilled water." I took a tour of this thing when it came to the local Sun campus, and it really is quite an amazing piece of engineering. Basically, you need one (small) cargo container for the data center itself, and a chiller for the water. They are able to carry the cargo container and a chiller around in a standard sized 18-wheeler. Obviously, if you were trying to take this into a disaster area, you'd need another truck or two to carry generators and fuel.

Inside the building, they had a bunch of photoshopped pictures of these black boxes in various locations like on top of an offshore oil rig, stacked 3 high in a warehouse, and sitting on top of a skyscraper. The photoshopping was fairly good, but you could tell the photos were faked, mostly because at the time only 2 black boxes had actually been built, and one of them was outside in the parking lot.

Re:Sun has you covered there (1)

drinkypoo (153816) | more than 7 years ago | (#19491933)

It's pretty brilliant.

The US (and I would assume basically every other) military has been doing this for longer than I've been alive. They have standardized connections for power, water, et cetera. They've got a darkroom-in-a-box, a hospital-in-a-box, et cetera.

And it's not a closed system if you have to pipe cooling water into and out of it, nor if it consumes electricity. It's a "sealed box" but not a closed system. I mean, Earth isn't a closed system.

Re:Sun has you covered there (1)

zerocool^ (112121) | more than 7 years ago | (#19493885)


Oh, right, right about the "closed system". I just meant that it recirculates the same air and doesn't rely on having air vents to the outside world (such as would be required for, say, a traditional air conditioner).

And it figures that the military would have something like this. In sci-fi novels, they often have like "city in a box" or whatever where it's a bunch of modular buildings that you snap together like legos and bam instant city.

Neat!

~Wx

Re:Sun has you covered there (1)

Barny (103770) | more than 7 years ago | (#19490683)

Just add time to ship it to Australia, get it through customs, and hope some dockworkers don't take a liking to it; probably 3-4 months after Sun sends it, it may get to its location and be plugged in.

Did you even read the summary?

Re:Sun has you covered there (0)

Anonymous Coward | more than 7 years ago | (#19494253)

If there is no time for shipments from the US then they are totally screwed and everyone knows this.

And since when does it take more than one week to get a shipping container from Sun's HQ in Cullyfornya (you know, our West Coast;) to Australia??

Don't be such a troll, you smelly dum-dum head.

Re:Sun has you covered there (1)

Stu101 (1031686) | more than 7 years ago | (#19491523)

Could Sun make it any plainer to the would-be thief: "High quality, expensive computers in here"? It's the exact opposite of what it should be, i.e. nondescript, so all the scrotes don't even think to try to bust it open.

Re:Sun has you covered there (1)

Wesley Felter (138342) | more than 7 years ago | (#19497457)

Only the demo Blackbox has Sun logos all over it; presumably the real ones are nondescript and rusted on the outside. :-)

Roughly translated (0)

Anonymous Coward | more than 7 years ago | (#19489713)

"The idea of a 60-day project originated with company executives with a history of rapid build-outs"

Roughly translated

The idea of a 60-day project originated when the sales force promised the earth to the end customer. Meanwhile the company's executives with a history of unrealistic time scale rapid rollouts quizzed a junior engineer, hypothetically speaking "how long do you think it would take to install a few racks in a data centre?"

crashed (-1, Redundant)

Anonymous Coward | more than 7 years ago | (#19489725)

You build it in 60 days...slashdot readers crash it in 60 seconds :)
what a bunch of noobs they are.

Re:crashed (0)

Anonymous Coward | more than 7 years ago | (#19489815)

You're an idiot; the DC is not running, it's being hosted off a SINGLE temporary ADSL connection that was only supposed to handle a handful of people at any one time.

Did ? (1)

mistralol (987952) | more than 7 years ago | (#19490033)

It did have a webcam. It did have a blog. Ahh well....

Time zones (4, Insightful)

Xiroth (917768) | more than 7 years ago | (#19490063)

Uh, the article says that they aim to be complete at 9 am EST. While that might mean an American time zone in America, in Australia it means an Australian time zone (specifically AEST, or GMT+10, aka their local time). So they're actually aiming to finish on Thursday evening, Eastern American time.

Just an FYI, unless there's clarification somewhere that they were speaking of the American EST.
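The ambiguity is easy to check mechanically. A minimal sketch, assuming a 9 am Friday deadline in Brisbane time (Pipe Networks is in Brisbane, which observes AEST with no daylight saving) and a hypothetical 2007-06-15 date:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+, needs the system tz database

# 9 am Friday in Australian Eastern Standard Time (UTC+10, no DST)...
deadline_aest = datetime(2007, 6, 15, 9, 0, tzinfo=ZoneInfo("Australia/Brisbane"))
# ...expressed on the US east coast (EDT, UTC-4, in June):
deadline_us = deadline_aest.astimezone(ZoneInfo("America/New_York"))

print(deadline_aest.strftime("%A %H:%M %Z"))  # Friday 09:00 AEST
print(deadline_us.strftime("%A %H:%M %Z"))    # Thursday 19:00 EDT
```

So a Friday-morning AEST deadline lands on Thursday evening in US Eastern time, which is why "Friday morning EST" can mean two moments 14 hours apart.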

Re:Time zones (1)

eln (21727) | more than 7 years ago | (#19490981)

It's a built-in safety mechanism. If they can't get it done by Friday morning American EST, they'll just claim they meant Australian EST the whole time.

Re:Time zones (0)

Anonymous Coward | more than 7 years ago | (#19496917)

definitely got that one backwards...Australia happens first

Re:Time zones (1)

CharlieHedlin (102121) | more than 7 years ago | (#19496989)

Your time zones are the wrong way around... (The international date line plays a role here.)
So I guess AEST is the first goal, but since they just said EST, they could always say they meant the US time zone.

And the judges say... (3, Funny)

VinB (936538) | more than 7 years ago | (#19490247)

This challenge would be great if they also had David Hasselhoff, Paula Abdul and John Schneider making comments after each piece of equipment installed.

Re:And the judges say... (0)

Anonymous Coward | more than 7 years ago | (#19491171)

... and a sniper practicing the 1000M shot on them.

Re:And the judges say... (0)

Anonymous Coward | more than 7 years ago | (#19492741)

That was the most retarded Slashdot comment I've ever seen. Stop wasting our oxygen.

A gentle bit of load testing perhaps? (1)

simong (32944) | more than 7 years ago | (#19490299)

Probably not. The blog server is probably running on an old P2 under someone's desk. And is currently leaking magic smoke.

Call me when they've kept it running for a year (2, Interesting)

progprog (1016317) | more than 7 years ago | (#19490375)

Building something in a hurry is not an accomplishment in itself. Keeping it well-maintained is the real challenge.

Would you rather slap together a DIY PC in 15 minutes or spend time ensuring your cables are positioned to allow good airflow, etc? Same principle applies.

Top Secret ? (1)

Bramantip (1054582) | more than 7 years ago | (#19490391)

And in an era when many major data-center projects are shrouded in secrecy, these guys are putting the entire effort online....

Really? Data-Center projects shrouded in secrecy ?

Maybe it is simply because you want it to work before the customers actually connect to it, unlike the actual datacenter, which can't handle the load of a few Slashdot users....

Re:Top Secret ? (1)

squizzar (1031726) | more than 7 years ago | (#19491589)

OK, much as it is a big joke that their datacentre has been slashdotted, ha ha, it does strike me as a little unlikely that they are running the blog etc. from the actual finished datacentre itself. After all, they haven't actually built it yet, right? I mean, if they actually managed to get a blog/webcam/whatever to work in an empty room _before_ they installed the hardware, I'd say they were pretty good at their jobs, non?

Re:Top Secret ? (1)

Bramantip (1054582) | more than 7 years ago | (#19496261)

My only real complaint is the premise they were starting from: that most data centers are created "in secret", when in fact not too many people are interested in webcamming an empty room. It would be more logical to set up the servers and then install the webcam and post to Slashdot, but I suppose that is just me. As nice as it is for them to show themselves hard at work, I still think this story is a lot of "much ado about nothing". Some of us do this type of work routinely, and it is just a little strange to call it "in secret" when in reality some of us just want to get a datacenter working, not webcam it.

If they had spent as much time setting up the datacenter as they had wasted in setting up the webcam, posting to Slashdot, reading the commentary on Slashdot and putting their server on ice, they probably would have finished by now....

Wouldn't it have made more sense .. (1)

talexb (223672) | more than 7 years ago | (#19490685)

.. if they'd moved hosting for the blog and webcam to North America or Europe once they were mentioned on Slashdot?

In this case, there is such a thing as too much publicity. And I'd hate to see their bandwidth bills for this month.

--Alex

This is a great PR piece! (2, Funny)

xxxJonBoyxxx (565205) | more than 7 years ago | (#19491547)

This is a great PR piece! Budding marketeers take note: "experiments" like this are a great way to get all kinds of free press. I hope the marketing team at Pipe gets a raise for this.

Virtual Reality (1)

NotQuiteReal (608241) | more than 7 years ago | (#19492797)

In the future we won't need to worry about all this secrecy. You can just do everything virtually, on the Internet. Use encryption if you have something to hide.

Oh, wait... never mind.

More details (2, Informative)

BDPrime (1012761) | more than 7 years ago | (#19492973)

More details in this story [techtarget.com]. It definitely seems like a "we promised the customer we'd be ready by this time, so you'd better get it done" type of deal. Demand for colo space is strong, but I don't know that it's so strong that Pipe Networks has to cobble together a data center as fast as it can. It probably could have doubled the time and it wouldn't have made a difference.

The story also says the 60-day period is just the construction time period, and not the planning behind it, etc. But whatever. They created some hype and it worked. It worked too well, apparently.

Must be TINY Pipesnetwork (1)

SkyDude (919251) | more than 7 years ago | (#19494143)

The /. effect is in full motion at 1:45 EST. The site is down or too busy.

Mirror (0)

Anonymous Coward | more than 7 years ago | (#19496731)

Site has been mirrored at www.cloggedpipenetworks.com

We do this regularly (0)

Anonymous Coward | more than 7 years ago | (#19497259)

Hi,
we pull off this kind of stuff regularly.

6 weeks from scratch to a completely installed company office with servers and clients (PCs), INCLUDING delivery of equipment, INCLUDING redesign of rooms - anywhere in Europe.

Unpack the delivered boxes, connect cables, start servers, start clients, company is up.
With the right tools (automatic software installation, not! using images) this is actually no problem at all.
With the right firms, who want your money and want to deliver your equipment in time, this is again no problem.
On-site you will only have to make sure the firewall works as expected and the ISP has actually delivered the Internet lines.

I'm sorry, but we have been doing this kind of stuff for 2 years now.
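The "automatic software installation, not images" workflow described above can be sketched as a declarative per-role manifest that drives the installer on each machine. Everything here (the role names, package lists, and apt as the installer) is a hypothetical illustration, not the poster's actual tooling:

```python
# Sketch of declarative, role-based software installation: each machine
# declares a role, and the manifest (not a disk image) defines its software.
MANIFEST = {
    "server": ["openssh-server", "postgresql", "samba"],
    "client": ["firefox", "cups-client"],
}

def install_commands(role):
    """Generate the idempotent install commands for one machine role."""
    return [f"apt-get install -y {pkg}" for pkg in MANIFEST[role]]

for cmd in install_commands("client"):
    print(cmd)
```

The advantage over images, as the parent implies, is that the same manifest works across differing hardware and stays current: rerunning it converges a box to its role instead of overwriting it.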